49 results for Electronics in criminal investigation
Abstract:
The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider those images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach comprising a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step build on those of the previous one. The methodology is not linear, however, but a cyclic, iterative progression towards knowledge about an event. The preliminary analysis is a pre-evaluation phase in which the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step covers reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses.
This methodology provides a sound basis for extending image use as evidence and, more generally, as clues in investigation and crime reconstruction processes.
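The cyclic, iterative progression described above can be sketched as a loop over the four steps; a minimal illustration only, in which the callables `assess`, `extract_clues` and `evaluate` are hypothetical stand-ins and not part of the paper:

```python
def reconstruct_event(images, assess, extract_clues, evaluate, max_rounds=5):
    """Illustrative sketch (not the authors' implementation) of the cyclic
    four-step image methodology: detect/collect, organise/assess,
    reconstruct, evaluate; iterate while the working hypotheses change."""
    hypotheses = set()
    for _ in range(max_rounds):
        collected = [img for img in images if assess(img)]            # steps 1-2
        clues = [c for img in collected for c in extract_clues(img)]  # step 3
        new_hypotheses = evaluate(clues)                              # step 4
        if new_hypotheses == hypotheses:
            break  # the reconstruction has stabilised
        hypotheses = new_hypotheses
    return hypotheses
```

The loop terminates when a pass over the images no longer changes the set of working hypotheses, mirroring the paper's iterative rather than linear progression.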
Abstract:
Fraud is as old as mankind. An enormous number of historical documents show the interplay between truth and untruth, so it is not really surprising that the prevalence of publication discrepancies is increasing. More surprising is that new cases, especially in the medical field, generate such astonishment. Financial mathematics offers a statistical tool for fraud detection based on the observations of Newcomb and Benford on the distribution of naturally occurring numbers: the distribution of leading digits is not uniform, and lower digits occur more frequently than higher ones. In this investigation, all numbers contained in the blinded abstracts of the 2009 annual meeting of the Swiss Society of Anesthesia and Resuscitation (SGAR) were recorded and analyzed with regard to this distribution; a deliberately manipulated abstract was also included. The χ²-test was used to determine statistical differences between expected and observed counts of numbers, with p<0.05 considered significant. The distribution of the 1,800 numbers in the 77 submitted abstracts followed Benford's law. The manipulated abstract was detected by statistical means (expected versus observed, p<0.05). Statistics cannot prove whether content is true or not, but can give serious hints to look into the details of such conspicuous material. These are the first results of a test of the distribution of numbers presented in medical research.
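The Newcomb-Benford screening described above is straightforward to reproduce: compare observed leading-digit counts with the Benford frequencies log10(1 + 1/d) using a χ²-statistic with 8 degrees of freedom. A minimal sketch, not the authors' code:

```python
import math
from collections import Counter

def leading_digit(x):
    """First significant digit of a nonzero number."""
    s = str(abs(x)).lstrip("0.")
    return int(s[0])

def benford_chi2(numbers):
    """Chi-square statistic comparing observed leading-digit counts with
    Benford's law; 8 degrees of freedom, critical value 15.51 at p = 0.05."""
    digits = [leading_digit(n) for n in numbers if n != 0]
    total = len(digits)
    counts = Counter(digits)
    chi2 = 0.0
    for d in range(1, 10):
        expected = total * math.log10(1 + 1 / d)  # Benford expected count
        chi2 += (counts.get(d, 0) - expected) ** 2 / expected
    return chi2
```

A Benford-conforming sample (e.g. powers of 2) stays below the critical value, while a sample dominated by one leading digit, mimicking a fabricated abstract, far exceeds it.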
Highlights in gastroenterology and hepatology 2010 [Actualités en gastroentérologie et hépatologie].
Abstract:
This review highlights recent advances in gastroenterology and hepatology, including the treatment of Crohn's disease, of eosinophilic esophagitis, of chronic hepatitis C, and of hepatic encephalopathy as well as the role of high resolution manometry in the investigation of esophageal motility disorders. These new developments will be summarized and discussed critically, with a particular emphasis on their potential implications for current and future clinical practice.
Abstract:
Micas are commonly used in Ar-40/Ar-39 thermochronological studies of variably deformed rocks, yet the physical basis by which deformation may affect radiogenic argon retention in mica is poorly constrained. This study examines the effect of deformation and deformation-induced microstructures on radiogenic argon retention in muscovite. A combination of furnace step-heating and high-spatial-resolution in situ UV-laser ablation Ar-40/Ar-39 analyses is reported for deformed muscovites sampled from a granitic pegmatite vein within the Siviez-Mischabel Nappe, western Swiss Alps (Penninic domain, Brianconnais unit). The pegmatite forms part of the Variscan (~350 Ma) Alpine basement and exhibits a prominent Alpine S-C fabric, including numerous mica 'fish' that developed under greenschist-facies metamorphic conditions during the dominant Tertiary Alpine tectonic phase of nappe emplacement. Furnace step-heating of milligram quantities of separated muscovite grains yields an Ar-40/Ar-39 age spectrum with two distinct staircase segments but without any statistical plateau, consistent with a previous study from the same area. A single (3 x 5 mm) muscovite porphyroclast (fish) was investigated by in situ UV-laser ablation. A histogram of 170 individual Ar-40/Ar-39 UV-laser ablation ages exhibits a range from 115 to 387 Ma, with modes at approximately 340 and 260 Ma. A variogram statistical treatment of the Ar-40/Ar-39 results reveals ages correlated with two directions: a highly correlated direction at 310 degrees and a lesser correlation at 0 degrees relative to the sense of shearing. Using the highly correlated direction, a statistically generated (kriging method) age contour map of the Ar-40/Ar-39 data reveals a series of elongated contours subparallel to the C-surfaces, which were formed during Tertiary nappe emplacement. Similar data distributions and slightly younger apparent ages are recognized in a smaller mica fish.
The observed intragrain age variations are interpreted to reflect the partial loss of radiogenic argon during Alpine (~35 Ma) greenschist-facies metamorphism. One-dimensional diffusion modelling results are consistent with the idea that the zones of youngest apparent age represent incipient shear-band development within the mica porphyroclasts, providing a network of fast diffusion pathways. During Alpine greenschist-facies metamorphism the incipient shear bands enhanced the intragrain loss of radiogenic argon. The structurally controlled intragrain age variations observed in this investigation imply that deformation has a direct control on the effective length scale for argon diffusion, which is consistent with the heterogeneous nature of deformation. (C) 2001 Elsevier Science B.V. All rights reserved.
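The partial-loss interpretation rests on standard diffusion theory. As an illustration only (not the authors' model), the fractional argon loss for one-dimensional diffusion out of a plane sheet follows the classical series solution found in diffusion texts, parameterised by the dimensionless group Dt/a²:

```python
import math

def fractional_loss(Dt_over_a2, terms=200):
    """Fractional loss for 1-D (plane-sheet) diffusion:
    f = 1 - (8/pi^2) * sum_n exp(-(2n+1)^2 pi^2 Dt/a^2) / (2n+1)^2.
    Dt_over_a2 is the dimensionless Dt/a^2; the series is truncated
    at `terms` terms, which is ample for Dt/a^2 >= 0."""
    s = sum(
        math.exp(-(2 * n + 1) ** 2 * math.pi ** 2 * Dt_over_a2) / (2 * n + 1) ** 2
        for n in range(terms)
    )
    return 1 - (8 / math.pi ** 2) * s
```

Shortening the effective diffusion length scale a (as incipient shear bands do) raises Dt/a² and hence the fractional loss, which is the qualitative point of the abstract.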
Abstract:
After the Second World War, the role of the victim in criminal conflict became an object of interest for academics. But it was only in the 1960s that the importance of providing protection and assistance to crime victims was highlighted, in particular by the victims' movement, which inaugurated a new era of criminal justice in systems throughout the world. Moving beyond the role of controlling crime and punishing the offender, the criminal justice system also began to contribute to victims' rehabilitation and to help the victim move on from the event psychologically and emotionally. Although some criminological research has been conducted, to date the effect that the criminal justice system and victim support services have on the well-being of crime victims is still uncertain. The current study sought to better understand the healing process of victims of crime, the potential consequences of their participation in the criminal justice system, and the support of victim centers. Moreover, it aimed to find out whether the existence of a Victim Support Act would change the treatment that the victim receives in the criminal justice system. This research was therefore conducted in two countries - Switzerland and Brazil - where the outcome of the victims' movement on the criminal justice system was different, as was the participation of the victim in the criminal justice system and the government's provision of support. To conduct this research we employed a qualitative method, the most efficient for gathering sensitive information. Interviews with crime victims were the main source of information; hearing observation and document research were used as complementary sources. The results of this research show that victims who have contact with the criminal justice system and victim services are not more likely to recover than those who had no contact.
That is to say, the support offered has no major effects; the influence of the criminal justice system and the victim support services on the emotional well-being of crime victims is rather neutral. However, considering that the sample is not representative, the findings are not expected to be generalized. Instead, they may give insight to practitioners or to future criminal justice policy makers, suggesting what may work to improve the emotional well-being of crime victims, as well as suggesting further studies.
Abstract:
Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and differentiate them from counterfeits. Detecting a counterfeit is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and elucidating the criminal phenomenon. Profiling seizures using chemical and packaging data constitutes a strong means of detecting organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule were studied. The results of the packaging and chemical analyses were gathered within an organised database. Strong linkage was found between the seizures at the different production steps, indicating the presence of a main counterfeit network dominating the market. The interpretation of the links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective has the potential to be generalised to other types of products. It may be the only reliable approach to help understand the organised crime phenomenon behind counterfeiting and to enable efficient strategic and operational decision making in an attempt to dismantle counterfeit networks.
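Profiling seizures by chemical and packaging data amounts to linking profiles that are sufficiently similar and grouping the links into networks. A minimal sketch under assumed choices (cosine similarity with an arbitrary 0.95 threshold, neither of which is from the paper):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two numeric profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def link_seizures(profiles, threshold=0.95):
    """Group seizure profiles into linked classes: any pair with cosine
    similarity >= threshold is joined via union-find; returns the
    connected components as sets of profile indices."""
    n = len(profiles)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if cosine(profiles[i], profiles[j]) >= threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), set()).add(i)
    return sorted(groups.values(), key=min)
```

Two near-identical profiles fall into one class while a dissimilar one stays apart, which is the elementary operation behind detecting a dominant production network across seizures.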
Abstract:
We analyzed the initial adhesion and biofilm formation of Staphylococcus aureus (ATCC 29213) and S. epidermidis RP62A (ATCC 35984) on various bone grafts and bone graft substitutes under standardized in vitro conditions. In parallel, microcalorimetry was evaluated as a real-time microbiological assay in the investigation of biofilm formation and in materials science research. The materials beta-tricalcium phosphate (beta-TCP), processed human spongiosa (Tutoplast) and poly(methyl methacrylate) (PMMA) were investigated and compared with polyethylene (PE). Bacterial counts (log10 cfu per sample) were highest on beta-TCP (S. aureus 7.67 +/- 0.17; S. epidermidis 8.14 +/- 0.05), while bacterial density (log10 cfu per surface area) was highest on PMMA (S. aureus 6.12 +/- 0.2; S. epidermidis 7.65 +/- 0.13). The detection time for S. aureus biofilms was shorter for the porous materials (beta-TCP and processed human spongiosa, p < 0.001) than for the smooth materials (PMMA and PE), with no differences between beta-TCP and processed human spongiosa (p > 0.05) or between PMMA and PE (p > 0.05). In contrast, for S. epidermidis biofilms the detection time differed (p < 0.001) between all materials except between processed human spongiosa and PE (p > 0.05). Quantitative culture after washing and sonication of the material demonstrated the importance of monitoring factors such as the specific surface or porosity of the test materials. Isothermal microcalorimetry proved to be a suitable tool for an accurate, non-invasive, real-time microbiological assay, allowing the detection of bacterial biomass without removing the biofilm from the surface.
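In isothermal microcalorimetry, the detection time reported above is simply the first moment the heat-flow signal crosses an instrument threshold; faster-growing biofilms cross it sooner. A minimal sketch (the threshold value of 10 heat-flow units is an illustrative assumption, not taken from the study):

```python
def detection_time(times, heat_flow, threshold=10.0):
    """Return the first time point at which the heat-flow signal reaches
    the detection threshold, or None if it never does.
    times and heat_flow are parallel sequences; shorter detection times
    indicate faster accumulation of metabolically active biomass."""
    for t, q in zip(times, heat_flow):
        if q >= threshold:
            return t
    return None
```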
Abstract:
Protein S (ProS) is an important negative regulator of blood coagulation. Its physiological importance is evident in purpura fulminans and other life-threatening thrombotic disorders typical of ProS-deficient patients. Our previous characterization of ProS deficiency in mouse models has shown similarities with the human phenotypes: heterozygous ProS-deficient mice (Pros+/-) had increased thrombotic risk, whereas homozygous deficiency in ProS (Pros-/-) was incompatible with life (Blood 2009; 114:2307-2314). In tissues, ProS exerts cellular functions by binding to and activating tyrosine kinase receptors of the Tyro3 (TAM) family on the cell surface. To extend the analysis of coagulation defects beyond the Pros-/- phenotype and add new insights into the sites of ProS synthesis and action, we generated mice with inactivated ProS in hepatocytes (Proslox/loxAlbCre+) as well as in endothelial and hematopoietic cells (Proslox/loxTie2Cre+). Both models resulted in a significant reduction of circulating ProS levels and a remarkably increased thrombotic risk in vivo. In a model of tissue factor (TF)-induced venous thromboembolism (VTE), only 17% of Proslox/loxAlbCre+ mice (n=12) and only 13% of Proslox/loxTie2Cre+ mice (n=14) survived, compared with 86% of Proslox/lox mice (n=14; P<0.001). To mimic a severe acquired ProS deficiency, the ProS gene was inactivated at the adult stage using the polyI:C-inducible Mx1-Cre system (Proslox/loxMx1Cre+). Ten days after polyI:C treatment, Proslox/loxMx1Cre+ mice developed disseminated intravascular coagulation with extensive lung and liver thrombosis. It is worth noting that no skin lesions compatible with purpura fulminans were observed in any of the above-described models of partial ProS deficiency. To shed light on the pathogenesis of purpura fulminans, we exposed the different ProS-deficient mice to warfarin (0.2 mg/day).
We observed that Pros+/-, Proslox/loxAlbCre+ and Proslox/loxTie2Cre+ mice developed retiform purpura (characterized by erythematous and necrotic lesions of the genital region and extremities) and died 3 to 5 days after the first warfarin administration. In humans, ProS is also synthesized by megakaryocytes and hence stored at high concentrations in circulating platelets (pProS). The role of pProS was investigated by generating a megakaryocyte ProS-deficient model using the PF4 promoter as Cre driver (Proslox/loxPf4Cre+). In the TF-induced VTE model, Proslox/loxPf4Cre+ mice (n=15) showed a significantly increased risk of thrombosis compared with Proslox/lox controls (n=14; survival rates 47% and 86%, respectively; P<0.05). Furthermore, preliminary results suggest that survival is associated with higher circulating ProS levels. To evaluate the potential role of pProS in thrombus formation, we investigated the thrombotic response to intravenous injection of collagen-epinephrine in vivo and platelet function in vitro. Both in vivo and in vitro experiments showed similar results between Proslox/loxPf4Cre+ and Proslox/lox mice, indicating that platelet reactivity was not influenced by the absence of pProS. These data suggest that pProS is delivered at the site of thrombosis to inhibit thrombin generation. We further investigated the ability of ProS to function as a ligand of TAM receptors by using mice homozygously and heterozygously deficient for both TAM ligands, ProS and Gas6. Gas6-/-Pros-/- mice died in utero and showed a dramatic bleeding and thrombotic phenotype comparable to that described for Pros-/- embryos. In conclusion, like complete ProS deficiency, double deficiency in ProS and Gas6 was lethal, whereas partial ProS deficiency was not. Mice partially deficient in ProS displayed a prothrombotic phenotype, including those deficient only in pProS.
Purpura fulminans did not occur spontaneously in mice with partial Pros deficiency but developed upon warfarin administration. Thus, the use of different mouse models of ProS deficiency can be instrumental in the study of its highly variable thrombotic phenotype and in the investigation of additional roles of ProS in inflammation and autoimmunity through TAM signaling.
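Survival comparisons of the kind reported above (small groups of survivors vs. non-survivors) are typically tested with Fisher's exact test on a 2x2 table. A self-contained sketch of the two-sided test, not the authors' analysis code:

```python
from math import comb

def fisher_exact(a, b, c, d):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]]
    (e.g. survived vs. died in two mouse groups). Returns the sum of
    probabilities of all tables, with the same margins, that are no more
    likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def hyper(k):
        # hypergeometric probability of k in the top-left cell
        return comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)

    p_obs = hyper(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    # small tolerance guards against floating-point ties
    return sum(hyper(k) for k in range(lo, hi + 1) if hyper(k) <= p_obs + 1e-12)
```

For example, comparing 2 of 12 survivors against 12 of 14 (counts inferred from the quoted percentages, so treat them as illustrative) yields a p-value well below 0.05.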
Abstract:
A criminal investigation requires searching for and interpreting the vestiges of a past criminal act. The forensic investigator acts in this context as a critical reader of the investigation scene, in search of physical traces that should enable her to tell a story of the offence or crime that allegedly occurred. The challenge for any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions is to provide a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths not often explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results are presented along a semiotic track, strongly influenced by Peirce's studies, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of a signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs whose meaning is culturally codified. The reasoning consists of three main steps: 1. What kind of source does the discovered physical trace refer to? 2. What cause or activity is at the origin of this source in the specific context of the case? 3. What story can be told from these observations?
Step 3 requires reasoning by creating hypotheses that explain the presence of the discovered trace as resulting from an activity, specifically the activity related to the investigated case. Validating these hypotheses depends on their ability to satisfy a rule of relevancy. The last step is the symbolisation of relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (is the link between the trace and the case recognised by the formulated hypothesis?) and appropriate relevancy (what investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and on conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from interactions between parameters of situational, structural (or organisational) and individual orders. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map illustrating three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace. Through the analysis of the investigation data and interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge, allowing an overview of reality at a specific moment; an important point, since relevancy is signified in a context.
E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. As for the parameter E, for experience, it exists in its relation to relevancy through the adjustment of intervention strategies (i.e. practical knowledge) by each practitioner, who has modulated her work in the light of successes and setbacks case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. Herein lies the whole issue of the meaning of what is relevant to each stakeholder of the investigation process. By proposing a step-by-step approach to the meaning process from physical trace to forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research thus presents a set of tools of both pedagogical and practical significance for crime scene management, while identifying key influences with individual, structural and situational dimensions.
Abstract:
Glioma has been considered resistant to chemotherapy and radiation. Recently, concomitant and adjuvant chemoradiotherapy with temozolomide has become the standard treatment for newly diagnosed glioblastoma. Conversely, (neo-)adjuvant PCV (procarbazine, lomustine, vincristine) failed to improve survival in the more chemoresponsive tumor entities of anaplastic oligoastrocytoma and oligodendroglioma. Preclinical investigations suggest synergism or additivity of radiotherapy and temozolomide in glioma cell lines. Although the relative contributions of the concomitant and the adjuvant chemotherapy cannot be assessed separately, the early introduction of chemotherapy and its simultaneous administration with radiotherapy appear to be key to the improvement in outcome. Epigenetic inactivation of the DNA repair enzyme methylguanine methyltransferase (MGMT) seems to be the strongest predictive marker for outcome in patients treated with alkylating-agent chemotherapy. Patients whose tumors lack MGMT promoter methylation are less likely to benefit from the addition of temozolomide chemotherapy and require alternative treatment strategies. The predictive value of MGMT promoter methylation is being validated in ongoing trials that aim to overcome this resistance by dose-dense continuous temozolomide administration or by combination with MGMT inhibitors. Understanding of molecular mechanisms allows for rational targeting of specific pathways of repair, signaling, and angiogenesis. The tyrosine kinase inhibitors vatalanib (PTK787) and vandetanib (ZD6474), the integrin inhibitor cilengitide, the monoclonal antibodies bevacizumab and cetuximab, the mammalian target of rapamycin inhibitors temsirolimus and everolimus, and the protein kinase C inhibitor enzastaurin, among other agents, are in clinical investigation, building on the established chemoradiotherapy regimen for newly diagnosed glioblastoma.
Resumo:
In this investigation, high-resolution (1 × 1 × 1 mm³) functional magnetic resonance imaging (fMRI) at 7 T is performed using a multichannel array head coil and a surface coil approach. Scan geometry was optimized for each coil separately to exploit the strengths of both coils. Acquisitions with the surface coil focused on partial brain coverage, while whole-brain fMRI experiments were performed with the array head coil. BOLD sensitivity in the occipital lobe was found to be higher with the surface coil than with the head array, suggesting that restricting signal detection to the area of interest may be beneficial for localized activation studies. Performing independent component analysis (ICA) decomposition of the fMRI data, we consistently detected BOLD signal changes and resting-state networks. In the surface coil data, a small negative BOLD response could be detected in these resting-state network areas. Also in the data acquired with the surface coil, two distinct components of the positive BOLD signal were consistently observed; these were tentatively assigned to tissue and venous signal changes.
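As a minimal illustration of why restricting signal detection to a coil's sensitive region can help, temporal SNR (mean signal divided by temporal standard deviation) is a common proxy for BOLD sensitivity. The sketch below compares it for two voxel time series; all numbers are invented for illustration and are not data from this study:

```python
import statistics

def tsnr(timeseries):
    """Temporal SNR of a voxel time series: mean signal / temporal std."""
    return statistics.mean(timeseries) / statistics.stdev(timeseries)

# Hypothetical voxel time series (arbitrary units); the surface-coil voxel
# has the same mean signal but lower temporal noise.
surface_coil = [1000, 1004, 998, 1002, 1001, 999, 1003, 997]
array_coil   = [1000, 1015, 985, 1010, 990, 1008, 992, 1005]

# A higher tSNR implies higher sensitivity to small BOLD signal changes.
print(tsnr(surface_coil) > tsnr(array_coil))  # -> True
```

In practice tSNR would be computed per voxel over a resting run, but the comparison logic is the same.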
Resumo:
Although the genomes of any two human individuals are more than 99.99% identical at the sequence level, some structural variation can be observed. Differences between genomes include single nucleotide polymorphisms (SNPs), inversions and copy number changes (gain or loss of DNA). The latter can range from submicroscopic events (CNVs, at least 1 kb in size) to complete chromosomal aneuploidies. Small copy number variations often have no (lethal) consequences for the cell, but a few have been associated with disease susceptibility and phenotypic variation. Larger rearrangements (e.g., complete chromosome gain) are frequently associated with more severe consequences for health, such as genomic disorders and cancer. High-throughput technologies like DNA microarrays enable the detection of CNVs in a genome-wide fashion. Since the initial catalogue of CNVs in the human genome in 2006, there has been tremendous interest in CNVs in the context of both population and medical genetics. Understanding CNV patterns within and between human populations is essential to elucidate their possible contribution to disease. But genome analysis is a challenging task: the technology evolves rapidly, creating a need for novel, efficient and robust analytical tools that must be compared with existing ones. Also, while the link between CNVs and disease has been established, the relative CNV contribution is not fully understood, and the predisposition to disease from CNVs of the general population has not yet been investigated. During my PhD thesis, I worked on several aspects related to CNVs. As I will report in chapter 3, I was interested in computational methods to detect CNVs in the general population. I had access to the CoLaus dataset, a population-based study with more than 6,000 participants from the Lausanne area. All these individuals were analysed on SNP arrays, and extensive clinical information was available.
My work explored existing CNV detection methods, and I developed a variety of metrics to compare their performance. Since these methods were not producing entirely satisfactory results, I implemented my own method, which outperformed two existing methods. I also devised strategies to combine CNVs from different individuals into CNV regions. I was also interested in the clinical impact of CNVs in common disease (chapter 4). Through an international collaboration led by the Centre Hospitalier Universitaire Vaudois (CHUV) and Imperial College London, I was involved as a main data analyst in the investigation of a rare deletion at chromosome 16p11 detected in obese patients. Specifically, we compared 8,456 obese patients and 11,856 individuals from the general population and found that the deletion accounted for 0.7% of morbid obesity cases and was absent in healthy non-obese controls. This highlights the importance of rare variants with strong impact and provides new insights into the design of clinical studies to identify the missing heritability in common disease. Furthermore, I was interested in the detection of somatic copy number alterations (SCNAs) and their consequences in cancer (chapter 5). This project was a collaboration initiated by the Ludwig Institute for Cancer Research and involved other groups from the Swiss Institute of Bioinformatics, the CHUV and the Universities of Lausanne and Geneva. The focus of my work was to identify genes with altered expression levels within SCNAs in seven metastatic melanoma cell lines, using CGH and SNP arrays, RNA-seq, and karyotyping. Very few SCNA genes were shared by even two melanoma samples, making it difficult to draw any conclusions at the individual gene level. To overcome this limitation, I used a network-guided analysis to determine whether any pathways, defined by amplified or deleted genes, were common among the samples.
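The step of combining per-individual CNV calls into shared CNV regions is not detailed here; one common illustrative approach (not necessarily the strategy actually devised in the thesis) is to merge overlapping intervals per chromosome:

```python
def merge_cnv_calls(calls, min_gap=0):
    """Merge per-individual CNV calls (chrom, start, end) into CNV regions
    by collapsing overlapping or near-adjacent intervals on a chromosome."""
    regions = []
    for chrom, start, end in sorted(calls):
        # Extend the last region if this call overlaps it (within min_gap)
        if regions and regions[-1][0] == chrom and start <= regions[-1][2] + min_gap:
            regions[-1][2] = max(regions[-1][2], end)
        else:
            regions.append([chrom, start, end])
    return [tuple(r) for r in regions]

# Calls from three hypothetical individuals; coordinates are invented
calls = [("16", 29_500_000, 30_100_000),
         ("16", 29_600_000, 30_200_000),
         ("16", 45_000_000, 45_050_000)]
print(merge_cnv_calls(calls))
# -> [('16', 29500000, 30200000), ('16', 45000000, 45050000)]
```

Real pipelines typically also require reciprocal overlap between calls and track carrier frequency per region, which this sketch omits.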
Six of the melanoma samples were potentially altered in four pathways, and five samples harboured copy number and expression changes in components of six pathways. In total, this approach identified 28 pathways. Validation with two external, large melanoma datasets confirmed all but three of the detected pathways and demonstrated the utility of network-guided approaches for the analysis of both large and small datasets.

Résumé:
Although the genomes of two individuals are more than 99.99% similar, structural differences can be observed. These differences include single nucleotide polymorphisms, inversions and copy number changes (gain or loss of DNA). The latter range from small, so-called submicroscopic events (at least 1 kb in size), called CNVs (copy number variants), to larger events that can affect entire chromosomes. Small variations are generally without consequence for the cell, although some have been implicated in predisposition to certain diseases and in phenotypic variation in the general population. Larger rearrangements (for example, an additional copy of a chromosome, commonly called trisomy) have more serious health repercussions, as in certain genomic syndromes and in cancer. High-throughput technologies such as DNA microarrays enable the detection of CNVs at the scale of the human genome. The 2006 mapping of CNVs in the human genome generated strong interest in population genetics and medical genetics. Detecting differences within and between populations is a key element in elucidating the possible contribution of CNVs to disease. However, genome analysis remains a difficult task: the technology evolves very rapidly, creating new needs for the development of tools, the improvement of existing ones, and the comparison of the different methods. Moreover, while the link between CNVs and disease has been established, their precise contribution is not yet understood, and studies of disease predisposition from CNVs detected in the general population have not yet been carried out. During my PhD, I concentrated on three main axes relating to CNVs. In chapter 3, I detail my work on methods for analysing DNA microarrays. I had access to data from the CoLaus project, a population-based study from Lausanne in which the genomes of more than 6,000 individuals were analysed with SNP arrays and extensive clinical information was collected. In this work, I used and compared several CNV detection methods. As the results were not entirely satisfactory, I implemented my own method, which performs better than two of the three other methods used. I also examined strategies for combining CNVs from different individuals into regions. I was also interested in the clinical impact of CNVs in common genetic diseases (chapter 4). This project was made possible by a close collaboration with the Centre Hospitalier Universitaire Vaudois (CHUV) and Imperial College London. I was one of the main analysts and worked on the clinical impact of a rare deletion of chromosome 16p11 present in obese patients. In this multidisciplinary collaboration, we compared 8,456 obese patients and 11,856 individuals from the general population and found that the deletion was involved in 0.7% of morbid obesity cases and was absent in healthy (non-obese) controls. Our study illustrates the importance of rare CNVs that can have a very strong clinical impact. It also suggests an alternative to association studies for improving our understanding of the aetiology of common genetic diseases. I also worked on the detection of somatic copy number alterations (SCNAs) and their consequences in cancer (chapter 5). This project was a collaboration initiated by the Ludwig Institute for Cancer Research involving the Swiss Institute of Bioinformatics, the CHUV and the Universities of Lausanne and Geneva. I focused on identifying genes affected by SCNAs and over- or under-expressed in cell lines derived from metastatic melanomas. The data were generated by DNA microarrays (CGH and SNP) and high-throughput transcriptome sequencing. My research showed that few genes recur across melanomas, which makes the results difficult to interpret. To circumvent this limitation, I used a network analysis to determine whether signalling networks enriched in amplified or lost genes were common to the different samples. Of the 28 networks detected, four were potentially deregulated in six melanomas, and six further networks were affected in five melanomas. Validation of these results with two large public datasets confirmed all but three of these networks, demonstrating the utility of this approach for the analysis of both small and large datasets.

Résumé grand public:
The advent of molecular biology, particularly over the last ten years, has revolutionised research in medical genetics. Thanks to the availability of the human reference genome from 2001, new technologies such as DNA microarrays have appeared and have made it possible to study the genome as a whole at a so-called submicroscopic resolution, previously impossible with traditional cytogenetic techniques. One of the most important examples is the study of structural variation in the genome, in particular the study of gene copy number. It was established as early as 1959, with the identification of trisomy 21 by Professor Jérôme Lejeune, that the gain of an extra chromosome causes genetic syndromes with serious repercussions for the patient's health. Similar observations have been made in oncology on cancer cells, which frequently accumulate copy number aberrations (such as the loss or gain of one or more chromosomes). From 2004, several research groups catalogued copy number changes in individuals from the general population (that is, without visible clinical symptoms). In 2006, Dr. Richard Redon established the first map of copy number variation in the general population. These discoveries demonstrated that genomic variation is frequent and that most of it is benign, that is, without clinical consequence for the individual's health. This generated great interest in understanding natural variation between individuals, and also in better understanding genetic predisposition to certain diseases. During my thesis, I developed new computational tools for DNA microarray analysis in order to map these variations at the genomic scale. I used these tools to characterise variation in the Swiss population, and I then devoted myself to studying factors that may explain predisposition to diseases such as obesity. This study, in collaboration with the Centre Hospitalier Universitaire Vaudois, led to the identification of a deletion on chromosome 16 explaining 0.7% of morbid obesity cases. This study has several repercussions. First of all, it makes it possible to test unborn children to determine their predisposition to obesity. Second, this locus encompasses some twenty genes, which allows new working hypotheses to be formulated and research to be directed towards improving our understanding of the disease, with the hope of discovering a new treatment. Finally, our study provides an alternative to genetic association studies, which have so far had only mixed success. In the last part of my thesis, I turned to the analysis of copy number aberrations in cancer. I chose to study melanoma, a skin cancer. Melanoma is a very aggressive tumour; it is responsible for 80% of skin cancer deaths and is often resistant to the treatments used in oncology (chemotherapy, radiotherapy). Within a collaboration between the Ludwig Institute for Cancer Research, the Swiss Institute of Bioinformatics, the CHUV and the Universities of Lausanne and Geneva, we sequenced the exome (the genes) and the transcriptome (gene expression) of seven metastatic melanomas, and performed copy number analyses with DNA microarrays and karyotyping. My work led to the development of new analysis methods adapted to cancer, established the list of cell signalling networks recurrently affected in melanoma, and identified two potential therapeutic targets previously overlooked in skin cancers.
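The network-guided idea described in this abstract, i.e. asking which pathways are hit by amplified or deleted genes in several samples rather than comparing samples gene by gene, can be sketched as a simple set-intersection count. The pathway and gene names below are illustrative only, not the study's actual gene sets:

```python
from collections import defaultdict

def recurrent_pathways(sample_scna_genes, pathways, min_samples):
    """For each pathway, count the samples carrying at least one amplified
    or deleted gene in that pathway; keep pathways hit in >= min_samples."""
    hits = defaultdict(set)
    for sample, genes in sample_scna_genes.items():
        for pathway, members in pathways.items():
            if genes & members:  # any SCNA gene falls in this pathway
                hits[pathway].add(sample)
    return {p: sorted(s) for p, s in hits.items() if len(s) >= min_samples}

# Hypothetical inputs: pathway memberships and per-sample SCNA gene sets
pathways = {"MAPK": {"BRAF", "NRAS", "MAP2K1"}, "PI3K": {"PTEN", "PIK3CA"}}
samples = {"mel1": {"BRAF", "PTEN"}, "mel2": {"NRAS"}, "mel3": {"CDKN2A"}}
print(recurrent_pathways(samples, pathways, min_samples=2))
# -> {'MAPK': ['mel1', 'mel2']}
```

Real network-guided analyses additionally weight genes by expression change and test enrichment statistically; this sketch only shows the recurrence-counting core.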
Resumo:
Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and differentiate them from counterfeits. Detecting counterfeits is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and helping to understand the criminal phenomenon. Profiling seizures using chemical and packaging data is a powerful way to detect organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule were studied. The results of the packaging and chemical analyses were gathered in an organised database. Strong linkage was found between the seizures at the different production steps, indicating the presence of a main counterfeit network dominating the market. Interpretation of the links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective has the potential to be generalised to other types of products. It may be the only reliable approach to understanding the organised crime phenomenon behind counterfeiting and to enabling efficient strategic and operational decision making in an attempt to dismantle counterfeit networks.
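A minimal sketch of what profiling-based linkage can look like, assuming each seizure is described by a normalized chemical profile and two seizures are linked when their similarity crosses a threshold; linked seizures are then grouped into candidate networks. The profiles, similarity measure and threshold are illustrative assumptions, not the method used in the paper:

```python
def cosine(a, b):
    """Cosine similarity between two chemical profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def link_seizures(profiles, threshold):
    """Link seizure pairs above the similarity threshold, then group
    linked seizures into networks (connected components via union-find)."""
    ids = list(profiles)
    parent = {i: i for i in ids}
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a, b = ids[i], ids[j]
            if cosine(profiles[a], profiles[b]) >= threshold:
                parent[find(a)] = find(b)
    groups = {}
    for i in ids:
        groups.setdefault(find(i), []).append(i)
    return sorted(sorted(g) for g in groups.values())

# Hypothetical normalized profiles (one value per measured compound)
profiles = {"S1": [0.9, 0.1, 0.0], "S2": [0.88, 0.12, 0.0], "S3": [0.1, 0.2, 0.9]}
print(link_seizures(profiles, threshold=0.95))
# -> [['S1', 'S2'], ['S3']]
```

In casework the packaging features would contribute a second, independent linkage layer, and thresholds would be calibrated on known-linked and known-unlinked pairs.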
Resumo:
The establishment of legislative rules on explosives in the 1980s reduced the illicit use of military and civilian explosives. However, bomb-makers have rapidly taken advantage of easily accessible substances intended for licit uses to produce their own explosives. This change in strategy has given rise to an increase in improvised explosive charges, further assisted by the ease of implementation of recipes widely available through open sources. While the nature of explosive charges has evolved, the instrumental methods currently used in routine casework, although more sensitive than before, have limited discriminating power and mostly allow determination of the chemical nature of the substance. Isotope ratio mass spectrometry (IRMS) has been applied to a wide range of forensic materials, and the majority of studies stress its high power of discrimination. Preliminary studies conducted so far on the isotopic analysis of intact (pre-blast) explosives have shown that samples with the same chemical composition but coming from different sources can be differentiated. The measurement of stable isotope ratios therefore appears to be a new and remarkable analytical tool for discriminating substances or identifying a substance with a definite source. However, much research is still needed to assess the validity of the results before they can be used either operationally or in court. Through the isotopic study of black powders and ammonium nitrates, this research aims to evaluate the contribution of isotope ratio mass spectrometry to the investigation of explosives, from both a pre-blast and a post-blast perspective. More specifically, the goal of the research is to provide the additional elements necessary for a valid interpretation of the results when used in explosives investigation.
This work includes a fundamental study of the variability of the isotopic profile of black powder and ammonium nitrate in both space and time. On the one hand, the inter-variability between manufacturers and, in particular, the intra-variability within a manufacturer have been studied. On the other hand, the stability of the isotopic profile over time has been evaluated by aging these substances under different environmental conditions. The second part of this project considers the applicability of this high-precision technology to traces and residues of explosives, taking into account the characteristics specific to the field, including their sampling, probable isotopic fractionation during the explosion, and interference from the matrix of the site.
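Isotope ratio measurements of the kind discussed above are conventionally reported in delta notation: the per-mil deviation of a sample's isotope ratio from an international reference standard. A small sketch, using the commonly cited VPDB ¹³C/¹²C reference ratio and invented sample ratios (the two "black powders" below are hypothetical):

```python
def delta_permil(r_sample, r_standard):
    """Delta notation: relative deviation of an isotope ratio from a
    reference standard, expressed in per mil (1/1000)."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.0112372  # 13C/12C of the VPDB standard (commonly cited value)

# Two hypothetical black powders, chemically identical but isotopically
# distinguishable through their carbon isotope ratios
r_a = 0.0109560
r_b = 0.0110100

d_a = delta_permil(r_a, R_VPDB)
d_b = delta_permil(r_b, R_VPDB)
print(round(d_a, 1), round(d_b, 1))
# -> -25.0 -20.2
```

A difference of a few per mil, as here, is far larger than typical IRMS measurement precision (a few tenths of a per mil), which is what makes discrimination between chemically identical sources possible.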
Resumo:
Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.
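The closing recommendation, quantifying dying and dead cells with multiple methodologically unrelated assays, can be operationalised very simply: combine the per-assay dead-cell fractions and flag estimates where the assays disagree beyond a tolerance. A hedged sketch; assay names, readouts and the tolerance are hypothetical, not values from these guidelines:

```python
def cell_death_estimate(assay_fractions, max_spread=0.15):
    """Combine dead-cell fractions from unrelated assays: return the mean
    fraction and whether the assays agree within max_spread of each other."""
    values = list(assay_fractions.values())
    mean = sum(values) / len(values)
    concordant = (max(values) - min(values)) <= max_spread
    return mean, concordant

# Hypothetical readouts from three unrelated assays on the same culture
assays = {"annexin_v": 0.42, "pi_uptake": 0.38, "caspase_activity": 0.45}
mean, ok = cell_death_estimate(assays)
print(round(mean, 2), ok)
# -> 0.42 True
```

Discordant assays (ok == False) would signal that the readouts capture different stages of the death process and that the combined estimate should not be trusted as a single number.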