Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. Includes self-contained introductions to probability and decision theory. Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models. Features implementation of the methodology with reference to commercially and academically available software. Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases. Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning. Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them. Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background. Includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
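At its core, the evaluative framework this book builds on is Bayes' theorem applied to a hypothesis node and an evidence node of a network. A minimal sketch, with all probability values chosen purely for illustration (they are not taken from the book):

```python
def posterior(p_h, p_e_given_h, p_e_given_not_h):
    """Bayes' theorem for a two-node network H -> E:
    returns P(H | E) from the prior P(H) and the two likelihoods."""
    num = p_e_given_h * p_h
    return num / (num + p_e_given_not_h * (1.0 - p_h))

# Hypothetical trace-evidence numbers: a low prior, evidence much more
# probable if the hypothesis is true than if it is false.
p = posterior(0.01, 0.7, 0.001)
```

In a full Bayesian network the same computation is propagated automatically across many such nodes; this sketch only shows the single update underlying it.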
Abstract:
The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of the spectroscopic data comes from the fact that they are multivariate - a few thousand variables - and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, identify clusters, and detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence/absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, we observe six different clusters corresponding to the different pigment compositions when plotting the first two PCs, which account for 37% and 20% of the total variance respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and improved efficiency, yielding robust results with time-saving data treatments.
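The SNV pretreatment and PCA steps described above can be sketched as follows. The data here are random placeholders standing in for the 34 measured spectra, and the implementation is a generic one, not the authors' actual pipeline:

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: centre and scale each spectrum (row)
    to zero mean and unit standard deviation."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

def pca(X, n_components=4):
    """PCA via SVD of the column-centred data matrix; returns the
    scores and the explained-variance ratio per component."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    evr = s**2 / np.sum(s**2)
    return Xc @ Vt[:n_components].T, evr[:n_components]

# Placeholder data: 34 "spectra" over 1200 selected wavenumber variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(34, 1200))
scores, evr = pca(snv(X), n_components=4)
```

Plotting the first two columns of `scores` against each other is what produces the PC1/PC2 cluster plots the abstract refers to.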
Abstract:
There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support an effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process, (b) the analyses of investigative problems along principal perspectives: entities and their relationships, time and space, quantitative aspects and (c) visualisation methods as a mode of expression of a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role for supporting decisions. Among them, link-charts are scrutinised for their abilities to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyser for a collaborative approach integrating forensic data thus calls for better specifications.
Abstract:
Purpose/Aim: To learn about the development of post mortem CT angiography, its indications, benefits, pitfalls and practical application. Content Organization: A. Development of post mortem CT angiography B. Technical prerequisites C. Practical application of post mortem CT angiography (preparation of the body, injection of contrast agent, examination protocol) D. Indications and benefits (including a comparison with conventional autopsy) E. Interpretation of imaging data (with case demonstrations) F. Artifacts, pitfalls and limitations G. Current and potential future use. Summary: This exhibit demonstrates the development, application and interpretation of post mortem CT angiography. Teaching points: 1. Post mortem CT angiography is feasible and useful for identification of the cause of death. 2. Depending on the indication, it can be superior to autopsy. 3. Limitations and artifacts need to be known for interpretation.
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches. These are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here intends to address methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts such as the (net) value of sample information, the (expected) value of sample information or the (expected) decision loss. All of these aspects directly relate to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper will emphasise the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks. These can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.
The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
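As a minimal illustration of the kind of probability statement discussed above (that the proportion of a seizure presenting a particular feature exceeds a threshold), the following sketch uses a standard Beta-Binomial model; the prior, the counts and the threshold are hypothetical, and the decision-theoretic layer of the paper is not reproduced here:

```python
import numpy as np

def prob_proportion_exceeds(n_pos, n, threshold, a=1.0, b=1.0,
                            draws=200_000, seed=0):
    """Posterior P(theta > threshold) for a population proportion theta,
    under a Beta(a, b) prior, after observing n_pos positives among n
    analysed items (Monte Carlo over the Beta posterior)."""
    rng = np.random.default_rng(seed)
    theta = rng.beta(a + n_pos, b + n - n_pos, size=draws)
    return float((theta > threshold).mean())

# Hypothetical seizure: 9 of 10 inspected items contain the illegal
# substance; probability that more than half of the whole seizure does.
p = prob_proportion_exceeds(9, 10, 0.5)
```

A decision maker would then weigh such posterior statements against the losses attached to each possible decision, which is the step the paper's decision trees and Bayesian decision networks formalise.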
Abstract:
In recent years, modern techniques of medical imaging such as MDCT (multidetector computed tomography) and MRI (magnetic resonance imaging) have pioneered post mortem (pm) investigations, especially in forensic medicine. In particular, pm angiography permits investigating the vascular system in a way which is not possible by performing conventional autopsy alone. Besides these radiological methods, other modern visualizing techniques like the three-dimensional (3D) surface scan have been implemented in order to perform reconstructions of complex cases. By the use of pm imaging techniques, more objective and accurate documentation can be realized, permitting an increase in the quality of forensic investigations.
Abstract:
This study presents classification criteria for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether a plantation is legal or not. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e. prior probability distributions on cannabis type and consequences of classification measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criteria.
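The Bayes-factor decision rule described above can be sketched as follows. The univariate Gaussian models, the compound-ratio feature, the prior and the losses are all hypothetical placeholders, not the study's fitted models:

```python
from math import exp, pi, sqrt

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def classify(ratio, drug=(0.8, 0.1), fibre=(0.2, 0.1),
             prior_drug=0.5, loss_fp=1.0, loss_fn=1.0):
    """Classify a seedling from a (hypothetical) compound-ratio feature.
    The Bayes factor compares the two class models; the decision uses
    the decision maker's prior and the ratio of misclassification losses."""
    bf = gaussian_pdf(ratio, *drug) / gaussian_pdf(ratio, *fibre)
    prior_odds = prior_drug / (1.0 - prior_drug)
    # Decide 'drug' when the posterior odds exceed the loss-ratio threshold.
    return "drug" if bf * prior_odds > loss_fp / loss_fn else "fibre"
```

Varying the priors and losses and recording how the classification changes is, in essence, the sensitivity analysis the abstract mentions.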
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.
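IRMS results are conventionally reported in delta notation relative to an international reference material. As a small worked example (the sample ratio below is illustrative, and the VPDB 13C/12C reference ratio is quoted only approximately):

```python
def delta_per_mil(r_sample, r_standard):
    """Isotope delta value in per mil:
    delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# delta13C against VPDB (R_standard ~ 0.011180); the sample 13C/12C
# ratio here is an invented, illustrative value.
d13c = delta_per_mil(0.010900, 0.011180)
```

Consistent calibration of such delta values against common standards is precisely what makes the retrospective and interlaboratory comparisons mentioned above possible.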
Abstract:
BACKGROUND: The World Anti-Doping Agency (WADA) is introducing enhancements to doping investigations in its 2015 Code, which include improved sharing of information between antidoping organisations (including sporting bodies) and enhanced accountability of athlete support staff. These additions will improve the control of links between sports doping and organised crime. In February 2013 the Australian Crime Commission released a report that linked several professional sporting codes and professional athletes with organised crime, performance enhancing drugs and illicit substances. Following this report the Australian Football League (AFL) partnered with the Australian national antidoping organisation to investigate peptide use in Australian football. METHODS: This review compared the model proposed by Marclay, a hypothetical model for anti-doping investigations based on a forensic intelligence and analysis approach, with the AFL investigation, using the investigation's forensic capabilities to test the model's relevance to an actual case. RESULTS: The investigation uncovered the use of peptides to enhance athlete performance. The AFL investigation found a high risk of doping where athlete support staff existed in teams with weak corporate governance controls. A further finding was the need for the investigation to provide a timely response in professional team sports that are sensitive to competition timing. In the case of the AFL, the team was sanctioned prior to the finals as an interim outcome for allowing the risk of use of performance-enhancing substances. Doping violation charges are still being considered.
DISCUSSION: Antidoping strategies should include the investigation of corporate officers in team doping circumstances, the mandatory recording of all athlete substance use during competition and training phases, the wider sharing of forensic intelligence with non-sporting bodies, particularly law enforcement, and collaboration between antidoping and sporting organisations in doping investigations. CONCLUSIONS: The AFL investigation illustrated the importance of the 2015 WADA Code changes and highlighted the need for a systematic use of broad forensic intelligence activities in the investigation of doping violations.
Abstract:
The aims of this study were twofold. The first was to investigate the diagnostic performance of two biochemical markers, procalcitonin (PCT) and lipopolysaccharide-binding protein (LBP), considering each individually and then combined, for the postmortem diagnosis of sepsis. We also tested the usefulness of pericardial fluid for postmortem LBP determination. Two study groups were formed, a sepsis-related fatalities group of 12 cases and a control group of 30 cases. Postmortem native CT scans, autopsy, histology, neuropathology, and toxicology as well as other postmortem biochemical investigations were performed in all cases. Microbiological investigations were also carried out in the septic group. Postmortem serum PCT and LBP levels differed between the two groups. Both biomarkers, individually considered, allowed septic states to be diagnosed, whereas increases in both postmortem serum PCT and LBP levels were only observed in cases of sepsis. Similarly, normal PCT and LBP values in postmortem serum were identified only in non-septic cases. Pericardial fluid LBP levels do not correlate with the presence of underlying septic states. No relationship was observed between postmortem serum and pericardial fluid LBP levels in either septic or non-septic groups, or between pericardial fluid PCT and LBP levels.
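The combined-marker logic reported above (joint elevation of PCT and LBP observed only in septic cases, joint normality only in non-septic cases) can be expressed as a simple decision rule. The cut-off values below are hypothetical placeholders, not the study's thresholds:

```python
def postmortem_sepsis_flag(pct, lbp, pct_cut=2.0, lbp_cut=15.0):
    """Flag a case as consistent with sepsis only when BOTH postmortem
    serum markers are elevated, mirroring the combined-marker finding.
    pct in ng/mL, lbp in ug/mL; cut-offs are illustrative assumptions."""
    return pct > pct_cut and lbp > lbp_cut
```

A single elevated marker leaves the rule negative, which reflects why the study evaluated the two biomarkers in combination rather than individually.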
Abstract:
Isopropyl alcohol (IPA) is widely used as an industrial solvent and cleaning fluid. After ingestion or absorption, IPA is converted into acetone by alcohol dehydrogenase. However, in ketosis, acetone can be reduced to IPA. The aim of this study was to investigate blood IPA and acetone concentrations in a series of 400 medico-legal autopsies, including cases of diabetic ketoacidosis, hypothermia and alcohol misuse-related deaths, to illustrate the extent of ketosis at the time of death. Vitreous glucose, blood 3-β-hydroxybutyrate (3HB) and acetoacetate (AcAc) concentrations were also determined systematically. Additionally, vitreous and urine IPA, acetone, 3HB and AcAc concentrations, as well as other biochemical markers including glycated hemoglobin and carbohydrate-deficient transferrin (CDT), were determined in selected cases. The results of this study indicate that ketosis is characterized by the presence of IPA resulting from acetone metabolism and that IPA can be detected in several substrates. These findings confirm the importance of the systematic determination of IPA and acetone levels in order to quantify biochemical disturbances and assess the extent of ketosis at the time of death.
Abstract:
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve the analytical and search processes for ink samples. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library.
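The abstract does not detail the matching algorithm itself, but a library search over standardized HPTLC profiles can be sketched as a simple correlation ranking; the profile vectors and ink names below are invented for illustration:

```python
import numpy as np

def search_library(query, library, top_k=3):
    """Rank library entries by Pearson correlation between the query
    profile and each stored profile; a generic stand-in for the
    project's actual matching algorithm."""
    scores = {name: float(np.corrcoef(query, profile)[0, 1])
              for name, profile in library.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Invented, standardized intensity profiles for two reference inks.
query = np.array([1.0, 2.0, 3.0, 4.0])
library = {"ink_A": np.array([1.0, 2.0, 3.0, 4.0]),
           "ink_B": np.array([4.0, 3.0, 2.0, 1.0])}
best = search_library(query, library, top_k=1)
```

Calibration across HPTLC plates matters here because correlation-style matching is only meaningful when profiles from different plates are brought onto a common scale first.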