151 results for Computer forensic analysis
Abstract:
Ethyl glucuronide (EtG) is a minor, direct metabolite of ethanol. EtG is incorporated into the growing hair, allowing retrospective investigation of chronic alcohol abuse. In this study, we report the development and validation of a method using gas chromatography-negative chemical ionization tandem mass spectrometry (GC-NCI-MS/MS) for the quantification of EtG in hair. EtG was extracted from about 30 mg of hair by aqueous incubation and purified by solid-phase extraction (SPE) on mixed-mode extraction cartridges, followed by derivatization with perfluoropentanoic anhydride (PFPA). The analysis was performed in selected reaction monitoring (SRM) mode using the transitions m/z 347→163 (for quantification) and m/z 347→119 (for identification) for EtG, and m/z 352→163 for EtG-d5, used as the internal standard. For validation, we prepared quality controls (QC) from hair samples taken post mortem from 2 subjects with a known history of alcoholism. These samples were confirmed by a proficiency test with 7 participating laboratories. Assay linearity for EtG was confirmed over the range from 8.4 to 259.4 pg/mg hair, with a coefficient of determination (r²) above 0.999. The limit of detection (LOD) was estimated at 3.0 pg/mg. The lower limit of quantification (LLOQ) of the method was fixed at 8.4 pg/mg. Repeatability and intermediate precision (relative standard deviation, RSD%), tested at 4 QC levels, were less than 13.2%. The analytical method was applied to several hair samples obtained from autopsy cases with a history of alcoholism and/or lesions caused by alcohol. EtG concentrations in hair ranged from 60 to 820 pg/mg hair.
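As a reading aid, here is a minimal sketch of the internal-standard calibration commonly used in quantitative GC-MS/MS assays of this kind: the EtG/EtG-d5 peak-area ratio is regressed against calibrator concentrations and the fit's r² is checked. The calibration range matches the abstract; all peak-area values are invented, and this is not the authors' actual data processing.

```python
import numpy as np

# Hypothetical internal-standard calibration for an EtG hair assay.
# Calibrators span the validated range (8.4-259.4 pg/mg); the area
# ratios below are invented for illustration.
conc = np.array([8.4, 25.0, 75.0, 150.0, 259.4])            # pg/mg hair
area_ratio = np.array([0.041, 0.122, 0.367, 0.733, 1.268])  # EtG / EtG-d5

slope, intercept = np.polyfit(conc, area_ratio, 1)
pred = slope * conc + intercept
r2 = 1 - ((area_ratio - pred) ** 2).sum() / ((area_ratio - area_ratio.mean()) ** 2).sum()
print(f"r^2 = {r2:.4f}")  # acceptance criterion in the study: r^2 > 0.999

def quantify(ratio: float) -> float:
    """Convert a measured EtG/EtG-d5 area ratio to pg/mg hair."""
    return (ratio - intercept) / slope

print(f"unknown sample: {quantify(0.50):.1f} pg/mg")
```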
Abstract:
BACKGROUND AND METHODS: The objectives of this article were to systematically describe and examine the novel roles and responsibilities assumed by nurses in a forensic consultation for victims of violence at a University Hospital in French-speaking Switzerland. Utilizing a case study methodology, information was collected from two main sources: (a) discussion groups with nurses and forensic pathologists and (b) a review of procedures and protocols. Following a critical content analysis, the roles and responsibilities of the forensic nurses were described and compared with the seven core competencies of advanced nursing practice as outlined by Hamric, Spross, and Hanson (2009). RESULTS: Advanced nursing practice competencies noted in the analysis included "direct clinical practice," "coaching and guidance," and "collaboration." The role of the nurse in terms of "consultation," "leadership," "ethics," and "research" was less evident in the analysis. DISCUSSION AND CONCLUSION: New forms of nursing are indeed practiced in the forensic clinical setting, and our findings suggest that nursing practice in this domain is following in the footsteps of an advanced nursing practice model. Further reflection is required to determine whether the role of the forensic nurse in Switzerland should be developed as that of a clinical nurse specialist or that of a nurse practitioner.
Abstract:
BACKGROUND: Findings from randomised trials have shown a higher early risk of stroke after carotid artery stenting than after carotid endarterectomy. We assessed whether white-matter lesions affect the perioperative risk of stroke in patients treated with carotid artery stenting versus carotid endarterectomy. METHODS: Patients with symptomatic carotid artery stenosis included in the International Carotid Stenting Study (ICSS) were randomly allocated to receive carotid artery stenting or carotid endarterectomy. Copies of baseline brain imaging were analysed by two investigators, who were masked to treatment, for the severity of white-matter lesions using the age-related white-matter changes (ARWMC) score. Randomisation was done with a computer-generated sequence (1:1). Patients were divided into two groups using the median ARWMC. We analysed the risk of stroke within 30 days of revascularisation using a per-protocol analysis. ICSS is registered with controlled-trials.com, number ISRCTN 25337470. FINDINGS: 1036 patients (536 randomly allocated to carotid artery stenting, 500 to carotid endarterectomy) had baseline imaging available. Median ARWMC score was 7, and patients were dichotomised into those with a score of 7 or more and those with a score of less than 7. In patients treated with carotid artery stenting, those with an ARWMC score of 7 or more had an increased risk of stroke compared with those with a score of less than 7 (HR for any stroke 2·76, 95% CI 1·17-6·51; p=0·021; HR for non-disabling stroke 3·00, 1·10-8·36; p=0·031), but we did not see a similar association in patients treated with carotid endarterectomy (HR for any stroke 1·18, 0·40-3·55; p=0·76; HR for disabling or fatal stroke 1·41, 0·38-5·26; p=0·607). Carotid artery stenting was associated with a higher risk of stroke compared with carotid endarterectomy in patients with an ARWMC score of 7 or more (HR for any stroke 2·98, 1·29-6·93; p=0·011; HR for non-disabling stroke 6·34, 1·45-27·71; p=0·014), but there was no risk difference in patients with an ARWMC score of less than 7. INTERPRETATION: The presence of white-matter lesions on brain imaging should be taken into account when selecting patients for carotid revascularisation. Carotid artery stenting should be avoided in patients with more extensive white-matter lesions, but might be an acceptable alternative to carotid endarterectomy in patients with less extensive lesions. FUNDING: Medical Research Council, the Stroke Association, Sanofi-Synthélabo, the European Union Research Framework Programme 5.
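For intuition, the subgroup analysis described above amounts to dichotomising patients at the median ARWMC score and estimating a hazard ratio for stroke within 30 days within each treatment arm. The sketch below, with invented toy data and hypothetical column names, shows the general shape of such an analysis using the lifelines library; it is not the trial's actual statistical code.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy data: follow-up capped at 30 days, 'stroke' = event indicator,
# 'arwmc' = age-related white-matter changes score. All values invented.
df = pd.DataFrame({
    "days_to_stroke": [30, 10, 25, 30, 30, 7, 30, 14],
    "stroke":         [0,  1,  1,  0,  0,  1,  0,  1],
    "arwmc":          [3,  9,  8,  11, 2,  12, 6,  5],
})
# Dichotomise at the median score, as in the study design.
df["high_arwmc"] = (df["arwmc"] >= df["arwmc"].median()).astype(int)

cph = CoxPHFitter()
cph.fit(df[["days_to_stroke", "stroke", "high_arwmc"]],
        duration_col="days_to_stroke", event_col="stroke")
print(cph.hazard_ratios_)  # HR for high vs. low ARWMC within one treatment arm
```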
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
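For intuition only: the automatic comparison step in this line of work reduces each HPTLC lane to a densitometric intensity profile and scores pairs of profiles with a similarity metric. The sketch below uses Pearson correlation on synthetic profiles as one plausible metric; the authors' actual algorithms are those described in Parts I and II cited above.

```python
import numpy as np

def profile_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two intensity profiles on a shared Rf axis."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Synthetic densitometric profiles: two Gaussian 'bands' per ink (invented).
rf = np.linspace(0.0, 1.0, 200)
questioned = np.exp(-(rf - 0.35) ** 2 / 0.002) + 0.60 * np.exp(-(rf - 0.70) ** 2 / 0.004)
reference  = np.exp(-(rf - 0.36) ** 2 / 0.002) + 0.55 * np.exp(-(rf - 0.70) ** 2 / 0.004)

# In a digital ink library search, reference inks would be ranked by this score.
print(f"similarity = {profile_similarity(questioned, reference):.3f}")
```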
Abstract:
Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, and (iii) the contribution of each individual item within a given group of pieces of evidence still represent fundamental areas of research. To some degree, this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items, if not masses, of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the main existing contributions in this area, the article presents instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change. These distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have bearing on how multiple items of evidence are meaningfully and appropriately set into context.
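A minimal numeric illustration of the joint-evaluation problem (all probabilities invented): with two items of evidence, the chain rule factorises the joint likelihood ratio into LR(E1) × LR(E2 | E1), and the interactions mentioned above (redundancy, synergy) appear as departures of LR(E2 | E1) from the stand-alone LR(E2). A Bayesian network is precisely a device for encoding such conditional dependencies.

```python
# Invented conditional probabilities for two items of evidence E1, E2
# under the propositions Hp (prosecution) and Hd (defence).
p_e1 = {"Hp": 0.90, "Hd": 0.10}           # P(E1 | H)
p_e2_given_e1 = {"Hp": 0.70, "Hd": 0.30}  # P(E2 | H, E1 observed)

lr_e1 = p_e1["Hp"] / p_e1["Hd"]                              # 9.0
lr_e2_given_e1 = p_e2_given_e1["Hp"] / p_e2_given_e1["Hd"]   # ~2.33
joint_lr = lr_e1 * lr_e2_given_e1                            # chain rule

print(f"LR(E1)      = {lr_e1:.2f}")
print(f"LR(E2 | E1) = {lr_e2_given_e1:.2f}")
print(f"joint LR    = {joint_lr:.2f}")
# Full redundancy would drive LR(E2|E1) towards 1 (E2 adds nothing beyond E1);
# synergy shows up as LR(E2|E1) exceeding the stand-alone LR(E2).
```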
Abstract:
A sensitive method was developed for quantifying a wide range of cannabinoids in oral fluid (OF) by liquid chromatography-tandem mass spectrometry (LC-MS/MS). These cannabinoids include Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH), cannabinol (CBN), cannabidiol (CBD), Δ9-tetrahydrocannabinolic acid A (THC-A), 11-nor-9-carboxy-Δ9-tetrahydrocannabinol glucuronide (THCCOOH-gluc), and Δ9-tetrahydrocannabinol glucuronide (THC-gluc). Samples were collected using a Quantisal™ device. The advantages of performing a liquid-liquid extraction (LLE) of KCl-saturated OF using heptane/ethyl acetate versus a solid-phase extraction (SPE) using HLB copolymer columns were determined. Chromatographic separation was achieved in 11.5 min on a Kinetex™ column packed with 2.6-µm core-shell particles. Both positive (THC, 11-OH-THC, CBN, and CBD) and negative (THCCOOH, THC-gluc, THCCOOH-gluc, and THC-A) electrospray ionization modes were employed with multiple reaction monitoring using a high-end AB Sciex API 5000™ triple quadrupole LC-MS/MS system. Unlike SPE, LLE failed to extract THC-gluc and THCCOOH-gluc. However, the LLE method was more sensitive for the detection of THCCOOH than the SPE method, wherein the limit of detection (LOD) and limit of quantification (LOQ) decreased from 100 to 50 pg/ml and from 500 to 80 pg/ml, respectively. The two extraction methods were successfully applied to OF samples collected from volunteers before and after they smoked a homemade cannabis joint. High levels of THC were measured soon after smoking, in addition to significant amounts of THC-A. Other cannabinoids were found in low concentrations. Glucuronide conjugate levels were lower than the method's LOD for most samples. Incubation studies suggest that glucuronides could be enzymatically degraded by glucuronidase prior to OF collection.
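As a side note on how LOD/LOQ figures like those above are typically applied when reporting results, here is a small hypothetical sketch; the THCCOOH limits are the LLE values quoted in the abstract, while the THC limits and the reporting helper are invented for illustration.

```python
# Per-analyte reporting thresholds in pg/ml. The THCCOOH values follow the
# LLE figures quoted above; the THC entry is invented for illustration.
limits = {
    "THCCOOH": {"lod": 50.0, "loq": 80.0},
    "THC":     {"lod": 100.0, "loq": 200.0},
}

def report(analyte: str, conc_pg_ml: float) -> str:
    """Apply LOD/LOQ thresholds when reporting a measured concentration."""
    lim = limits[analyte]
    if conc_pg_ml < lim["lod"]:
        return f"{analyte}: not detected (< LOD {lim['lod']:.0f} pg/ml)"
    if conc_pg_ml < lim["loq"]:
        return f"{analyte}: detected, below LOQ ({lim['loq']:.0f} pg/ml)"
    return f"{analyte}: {conc_pg_ml:.0f} pg/ml"

print(report("THCCOOH", 65.0))   # detected, below LOQ
print(report("THC", 3500.0))     # quantified
```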
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides presented by class I major histocompatibility complexes (MHCs) is the determining event in the specific cellular immune response against virus-infected cells or tumor cells. It is of great interest, therefore, to elucidate the molecular principles upon which the selectivity of a TCR is based. These principles can in turn be used to design therapeutic approaches, such as peptide-based immunotherapies of cancer. In this study, free energy simulation methods are used to analyze the binding free energy difference of a particular TCR (A6) for a wild-type peptide (Tax) and a mutant peptide (Tax P6A), both presented in HLA A2. The computed free energy difference is 2.9 kcal/mol, in good agreement with the experimental value. This makes possible the use of the simulation results for obtaining an understanding of the origin of the free energy difference which was not available from the experimental results. A free energy component analysis makes possible the decomposition of the free energy difference between the binding of the wild-type and mutant peptide into its components. Of particular interest is the fact that better solvation of the mutant peptide when bound to the MHC molecule is an important contribution to the greater affinity of the TCR for the latter. The results make possible identification of the residues of the TCR which are important for the selectivity. This provides an understanding of the molecular principles that govern the recognition. The possibility of using free energy simulations in designing peptide derivatives for cancer immunotherapy is briefly discussed.
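For readers unfamiliar with the method, the quantity computed in such free energy simulations is the relative binding free energy obtained from a thermodynamic cycle, in which the wild-type → mutant "alchemical" transformation is simulated in both the bound and the free state. This is the standard formulation; the notation below is ours, not quoted from the paper.

```latex
\Delta\Delta G_{\mathrm{bind}}
  = \Delta G_{\mathrm{bind}}^{\mathrm{P6A}} - \Delta G_{\mathrm{bind}}^{\mathrm{wt}}
  = \Delta G_{\mathrm{wt}\to\mathrm{P6A}}^{\mathrm{bound}}
  - \Delta G_{\mathrm{wt}\to\mathrm{P6A}}^{\mathrm{free}}
```

The 2.9 kcal/mol reported above corresponds to this ΔΔG; the component analysis mentioned in the abstract decomposes the same cycle into per-residue and solvation contributions.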
Abstract:
Adult neuronal plasticity refers to a set of biological mechanisms allowing a neuronal circuit to respond and adapt to modifications of the inputs it receives. The mystacial whiskers of the mouse are the starting point of a major sensory pathway that provides the animal with information about its immediate environment. Through whisking, information is gathered that allows the animal to orientate itself and to recognize objects. This sensory system is crucial for nocturnal behaviour, during which vision is of little use. Sensory information from the whiskers is sent via the brainstem and thalamus to the primary somatosensory area (S1) of the cerebral cortex in a strictly topological manner. Cell bodies in layer IV of S1 are arranged in ring-forming structures called barrels. As such, each barrel corresponds to the cortical representation in layer IV of a single whisker follicle. This histological feature makes it possible to identify with utmost precision the part of the cortex devoted to a given whisker and to study modifications induced by different experimental conditions. The condition used in the studies of my thesis is the passive stimulation of one whisker in the adult mouse for a period of 24 hours. It is performed by gluing a piece of metal onto one whisker and placing the awake animal in a cage surrounded by an electromagnetic coil that generates magnetic field bursts, inducing whisker movement at a given frequency for 24 hours. I analysed the ultrastructure of the barrel corresponding to the stimulated whisker using serial-section electron microscopy and computer-based three-dimensional reconstructions; analysis of neighbouring, unstimulated barrels as well as barrels from unstimulated mice served as controls. The following elements were structurally analysed: the spiny dendrites, the axons of excitatory and inhibitory cells, their connections via synapses, and the astrocytic processes. The density of synapses and spines is upregulated in a barrel corresponding to a stimulated whisker. This upregulation is absent in BDNF heterozygous mice, indicating that a certain level of activity-dependent BDNF release is required for synaptogenesis in the adult cerebral cortex. Synaptogenesis is correlated with a modification of the astrocytes, which position themselves closer to the excitatory synapses on spines. Biochemical analysis revealed that the astrocytes upregulate the expression of transporters by which they internalise glutamate, the neurotransmitter responsible for the excitatory response of cortical neurons. In the final part of my thesis, I show that synaptogenesis in the stimulated barrel is due to an increase in the size of excitatory axonal boutons, which become more frequently multisynaptic, whereas the inhibitory axons do not change their morphology but form more synapses with the spines apposed to them. Taken together, my thesis demonstrates that all the cellular elements present in the neuronal tissue of the adult brain contribute to activity-dependent cortical plasticity and form part of a mechanism by which the animal responds to a modified sensory experience. Throughout life, the neuronal circuit retains the ability to adapt its function. These adaptations are partially transitory, but some aspects remain and could be the structural basis of a memory trace in the cortical circuit.
Abstract:
OBJECTIVES: Many patients may believe that HIV screening is included in routine preoperative work-ups. We examined what proportion of patients undergoing preoperative blood testing believed that they had been tested for HIV. METHODS: All patients hospitalized for elective orthopaedic surgery between January and December 2007 were contacted and asked to participate in a 15-min computer-assisted telephone interview (n = 1330). The primary outcome was to determine which preoperative tests patients believed had been performed from a choice of glucose, clotting, HIV serology and cholesterol, and what percentage of patients interpreted the lack of result communication as a normal or negative test. The proportion of patients agreeable to HIV screening prior to future surgery was also determined. RESULTS: A total of 991 patients (75%) completed the questionnaire. Three hundred and seventy-five of the 991 patients (38%) incorrectly believed that they had been tested for HIV preoperatively. Younger patients were significantly more likely to believe that an HIV test had been performed (mean age 46 vs. 50 years for those who did not believe that an HIV test had been performed; P < 0.0001). Of the patients who believed that a test had been performed but received no result, 96% interpreted the lack of a result as a negative HIV test. Over 80% of patients surveyed stated that they would agree to routine HIV screening prior to future surgery. A higher acceptance rate was associated with younger age (mean age 47 years for those who would agree vs. 56 years for those who would not; P < 0.0001) and male sex (P < 0.009). CONCLUSIONS: Many patients believe that a preoperative blood test routinely screens for HIV. The incorrect assumption that a lack of result communication indicates a negative test may contribute to delays in HIV diagnoses.
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of applying IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.
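For context (a standard convention in the field, stated here rather than taken from this paper): IRMS results are reported in delta notation, the per-mil deviation of the sample's isotope ratio from an international reference standard, e.g. for carbon, where VPDB is the conventional standard:

```latex
\delta^{13}\mathrm{C} =
  \left( \frac{R_{\text{sample}}}{R_{\text{standard}}} - 1 \right) \times 1000\ \text{‰},
\qquad
R = \frac{{}^{13}\mathrm{C}}{{}^{12}\mathrm{C}}
```

Consistent calibration against such standards across sequences and laboratories is precisely what the framework proposed above seeks to formalise.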
Abstract:
False identity documents represent a serious threat through their production and use in organized crime and by terrorist organizations. The present-day fight against this criminal problem and threat to national security does not appropriately address the organized nature of this criminal activity: each fraudulent document is treated on its own during investigation and the judicial process, which causes linkage blindness and restricts analysis capacity. Given the drawbacks of this case-by-case approach, this article proposes an original model in which false identity documents are used to inform a systematic forensic intelligence process. The process aims to detect links, patterns, and tendencies among false identity documents in order to support strategic and tactical decision making, thus sustaining a proactive intelligence-led approach to fighting identity document fraud and the associated organized criminality. This article formalizes both the model and the process, using practical applications to illustrate its powerful capabilities. The model has a general application and can be transposed to other fields of forensic science facing similar difficulties.
Abstract:
The production and use of false identity and travel documents in organized crime represent a serious and evolving threat. However, the present-day fight against this criminal problem is essentially driven by a case-by-case perspective, which suffers from linkage blindness and a limited analysis capacity. To assist in overcoming these limitations, a process model was developed from a forensic perspective. It guides the systematic analysis and management of seized false documents to generate forensic intelligence that supports strategic and tactical decision-making in an intelligence-led policing approach. The model is articulated on a three-level architecture that aims to assist in detecting and following up on general trends, production methods and links between cases or series. Using analyses of a large dataset of counterfeit and forged identity and travel documents, it is possible to illustrate the model, its three levels and their contribution. Examples point out how the proposed approach assists in detecting emerging trends, in evaluating the black market's degree of structure, in uncovering criminal networks, in monitoring the quality of false documents, and in identifying their weaknesses to guide the design of more secure travel and identity documents. The process model proposed is thought to have a general application in forensic science and can readily be transposed to other fields of study.
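To make the lowest level of such a three-level architecture concrete, here is a hypothetical sketch (field names and records invented, not taken from the study's dataset) of how seized false documents profiled by production-method features can be grouped into candidate series:

```python
from collections import defaultdict

# Invented seizure records: each false document is profiled by a few
# production-method features (printing technique, typeface, security-seal
# imitation). Documents sharing a full profile form a candidate series.
seizures = [
    {"id": "D-001", "doc": "passport", "print": "inkjet", "font": "A", "seal": "S1"},
    {"id": "D-014", "doc": "passport", "print": "inkjet", "font": "A", "seal": "S1"},
    {"id": "D-022", "doc": "id_card",  "print": "offset", "font": "B", "seal": "S3"},
    {"id": "D-031", "doc": "passport", "print": "inkjet", "font": "A", "seal": "S1"},
]

series = defaultdict(list)
for d in seizures:
    series[(d["doc"], d["print"], d["font"], d["seal"])].append(d["id"])

for profile, ids in series.items():
    if len(ids) > 1:  # repeated production profile -> possible common source
        print(profile, "->", ids)
```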
Abstract:
Forensic intelligence is a distinct dimension of forensic science. Forensic intelligence processes have mostly been developed to address either a specific type of trace or a specific problem. Even though these empirical developments have led to successes, they are trace-specific in nature and contribute to the generation of silos, which hamper the establishment of a more general and transversal model. Forensic intelligence has shown some important perspectives, but more general developments are required to address persistent challenges. This will ensure the progress of the discipline as well as its widespread implementation in the future. This paper demonstrates that the description of forensic intelligence processes, their architectures, and the methods for building them can, at a certain level, be abstracted from the type of traces considered. A comparative analysis is made between two forensic intelligence approaches developed independently in Australia and in Europe for the monitoring of apparently very different kinds of problems: illicit drugs and false identity documents. An inductive effort is pursued to identify similarities and to outline a general model. Besides breaking barriers between apparently separate fields of study in forensic science and intelligence, this transversal model would assist in defining forensic intelligence, its role and place in policing, and in identifying its contributions and limitations. The model will facilitate the paradigm shift from the current case-by-case reactive attitude towards a proactive approach by serving as a guideline for the use of forensic case data in an intelligence-led perspective. A follow-up article (Part II) will specifically address comparison processes, decision points and organisational issues regarding forensic intelligence.
Abstract:
Background: In order to provide a cost-effective tool to analyse pharmacogenetic markers in malaria treatment, DNA microarray technology was compared with sequencing of polymerase chain reaction (PCR) fragments to detect single nucleotide polymorphisms (SNPs) in a larger number of samples. Methods: The microarray was developed to affordably generate SNP data for genes encoding the human cytochrome P450 enzyme family (CYP) and N-acetyltransferase-2 (NAT2), which are involved in antimalarial drug metabolism and have known polymorphisms, i.e. CYP2A6, CYP2B6, CYP2C8, CYP2C9, CYP2C19, CYP2D6, CYP3A4, CYP3A5, and NAT2. Results: For some SNPs, i.e. CYP2A6*2, CYP2B6*5, CYP2C8*3, CYP2C9*3/*5, CYP2C19*3, CYP2D6*4 and NAT2*6/*7/*14, agreement between the two techniques ranged from substantial to almost perfect (kappa index between 0.61 and 1.00), whilst for other SNPs, e.g. CYP2D6*17 (2850C>T), CYP3A4*1B and CYP3A5*3, a large variability from slight to substantial agreement (kappa index between 0.39 and 1.00) was found. Conclusion: The major limitation of the microarray technology for this purpose was its lack of robustness, with a large amount of missing data or incorrect specificity.
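The agreement measure quoted above is Cohen's kappa. As a reading aid, here is a minimal implementation applied to invented genotype calls from the two methods (the calls and sample size are illustrative only):

```python
def cohens_kappa(calls_a: list, calls_b: list) -> float:
    """Cohen's kappa for two raters/methods over the same samples."""
    n = len(calls_a)
    labels = set(calls_a) | set(calls_b)
    p_obs = sum(a == b for a, b in zip(calls_a, calls_b)) / n
    p_exp = sum((calls_a.count(l) / n) * (calls_b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented genotype calls for one SNP across eight samples.
microarray = ["CC", "CT", "TT", "CT", "CC", "CC", "TT", "CT"]
sequencing = ["CC", "CT", "TT", "CC", "CC", "CC", "TT", "CT"]

print(f"kappa = {cohens_kappa(microarray, sequencing):.2f}")  # ~0.81
```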
Abstract:
The flexibility of different regions of HIV-1 protease was examined by using a database consisting of 73 X-ray structures that differ in terms of sequence, ligands or both. The root-mean-square differences of the backbone for the set of structures were shown to have the same variation with residue number as those obtained from molecular dynamics simulations, normal mode analyses and X-ray B-factors. This supports the idea that observed structural changes provide a measure of the inherent flexibility of the protein, although specific interactions between the protease and the ligand play a secondary role. The results suggest that the potential energy surface of the HIV-1 protease is characterized by many local minima with small energetic differences, some of which are sampled by the different X-ray structures of the HIV-1 protease complexes. Interdomain correlated motions were calculated from the structural fluctuations and the results were also in agreement with molecular dynamics simulations and normal mode analyses. Implications of the results for the drug-resistance engendered by mutations are discussed briefly.
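For reference, the backbone comparison underlying this analysis is the root-mean-square difference between equivalent atoms after structural superposition. Below is a minimal sketch with invented coordinates; the superposition itself (e.g. by the Kabsch algorithm) is assumed to have been applied already.

```python
import numpy as np

def rmsd(a: np.ndarray, b: np.ndarray) -> float:
    """RMSD between two (N, 3) arrays of matched, superposed atom coordinates."""
    return float(np.sqrt(((a - b) ** 2).sum(axis=1).mean()))

# Invented backbone coordinates (in Angstroms) for three matched atoms.
x = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [3.0, 0.2, 0.1]])
y = np.array([[0.1, 0.0, 0.0], [1.4, 0.1, 0.0], [3.1, 0.1, 0.2]])

print(f"backbone RMSD = {rmsd(x, y):.3f} A")
```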