968 results for Forensic Science
Abstract:
This thesis studies means of formalisation that can assist the forensic expert in managing the factors that influence the evaluation of scientific evidence, while respecting established and acceptable inference procedures. According to a view advocated by a majority of the forensic and legal literature, and adopted here without reservation as a starting point, the conceptualisation of an evaluative procedure is said to be 'coherent' when it rests on a systematic implementation of probability theory. Often, however, the implementation of probabilistic reasoning does not follow automatically and may run into problems of complexity, due, for example, to limited knowledge of the domain in question or to the large number of factors that may come into play. In order to manage such complications, the present work proposes to investigate a formalisation of probability theory by means of a graphical environment known as Bayesian networks. The main hypothesis that this research sets out to examine is that Bayesian networks, together with certain auxiliary concepts (such as qualitative and sensitivity analyses), constitute a key resource available to the forensic expert for approaching inference problems coherently, both conceptually and practically. From this working hypothesis, individual problems were extracted, articulated and addressed in a series of distinct but interconnected studies, whose results, published in peer-reviewed journals, are presented as appendices. From a general point of view, this work provides three categories of results. A first group of results demonstrates, on the basis of numerous examples from various forensic domains, the compatibility and complementarity between Bayesian network models and existing probabilistic evaluation procedures. Building on these indications, the two other categories of results show, respectively, that Bayesian networks also make it possible to address domains previously left largely unexplored from a probabilistic point of view, and that the availability of so-called 'hard' numerical data is not an indispensable condition for implementing the approaches proposed in this work. The present work discusses these results with respect to the current literature and concludes by proposing Bayesian networks as a means of exploring new research directions, such as the study of various forms of evidence combination and the analysis of decision making. For this last aspect, the assessment of probabilities, in the way it is advocated in this work, constitutes a fundamental preliminary step as well as an operational means.
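The abstract does not reproduce any particular model, but the kind of reasoning it describes can be illustrated with a minimal sketch: the simplest possible network, a single proposition node H with an evidence node E, evaluated by direct application of Bayes' theorem and followed by a one-way sensitivity analysis. All node meanings and probability values below are hypothetical placeholders, not taken from the thesis, and no Bayesian network library is used.

```python
# Minimal sketch (not from the thesis): a two-node network H -> E, where H is
# the proposition "the suspect is the source of the trace" and E is the
# observed correspondence between trace and suspect material.
# All probability values are hypothetical placeholders.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) by direct application of Bayes' theorem."""
    num = prior_h * p_e_given_h
    den = num + (1.0 - prior_h) * p_e_given_not_h
    return num / den

prior_h = 0.5           # hypothetical prior probability of H
p_e_given_h = 0.99      # probability of the observation if H is true
p_e_given_not_h = 0.01  # probability of the observation if H is false

lr = p_e_given_h / p_e_given_not_h  # likelihood ratio reported by the expert
print(f"Likelihood ratio: {lr:.1f}")
print(f"P(H | E) = {posterior(prior_h, p_e_given_h, p_e_given_not_h):.4f}")

# One-way sensitivity analysis: how the posterior reacts when P(E | not H)
# is varied while the other assessments are kept fixed.
for p in (0.001, 0.01, 0.05, 0.1):
    print(f"P(E | not H) = {p:<5} -> P(H | E) = {posterior(prior_h, p_e_given_h, p):.4f}")
```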
Abstract:
The aim of this work is to study the influence of several analytical parameters on the variability of Raman spectra of paint samples. In the present study, two sample preparations are considered: microtome thin sections and direct analysis (no preparation). In order to evaluate their influence on the measurements, a fractional factorial experimental design with seven factors (including the sampling process) is applied, for a total of 32 experiments representing 160 measurements. Once the influence of sample preparation has been highlighted, a depth profile of a paint sample is carried out by changing the focusing plane in order to measure the colored layer under a clearcoat. This is undertaken in order to avoid sample preparation such as microtome sectioning. Finally, chemometric treatments such as principal component analysis are applied to the resulting spectra. The findings of this study indicate the importance of sample preparation, or more specifically of surface roughness, on the variability of measurements of the same sample. Moreover, the depth profile experiment highlights the influence of the refractive index of the upper layer (clearcoat) when measuring through a transparent layer.
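As a purely illustrative sketch (not the study's actual factors, generators or data), the counts in the abstract (seven two-level factors, 32 experiments, 160 measurements) are consistent with a 2^(7-2) fractional factorial design with five replicate measurements per run. The snippet below generates such a design with one common choice of generators and runs a PCA, via SVD, on simulated spectra; the simulated band and the factor assignments are assumptions.

```python
# Illustrative sketch only: a 2^(7-2) fractional factorial design (7 two-level
# factors in 32 runs) and a PCA of simulated spectra. The generators, the
# simulated band and the replicate scheme are assumptions, not the study's.
import itertools
import numpy as np

# Five base factors in a full 2^5 factorial (32 runs); two additional factors
# defined by generators (one common choice: F = ABCD, G = ABDE).
base = np.array(list(itertools.product([-1, 1], repeat=5)))  # columns A..E
A, B, C, D, E = base.T
F = A * B * C * D
G = A * B * D * E
design = np.column_stack([A, B, C, D, E, F, G])              # 32 x 7
print("Design matrix shape:", design.shape)

# Simulate 5 replicate spectra per run (32 * 5 = 160 measurements), where one
# factor (here column A, standing in for sample preparation) shifts a band.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(200, 1800, 400)
band = np.exp(-0.5 * ((wavenumbers - 1000) / 15) ** 2)
spectra = []
for run in design:
    for _ in range(5):
        intensity = (1.0 + 0.2 * run[0]) * band + 0.02 * rng.standard_normal(400)
        spectra.append(intensity)
X = np.array(spectra)                                        # 160 x 400

# PCA by SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                               # PCA scores
explained = s ** 2 / np.sum(s ** 2)
print("Variance explained by PC1:", round(float(explained[0]), 3))
```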
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework of the whole process using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality, sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
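For orientation only (a standard convention, not something specific to the proposed framework): IRMS results are conventionally expressed in the delta notation, i.e. as the relative deviation of the measured isotope ratio from that of an international reference standard,

\[
\delta = \left(\frac{R_{\text{sample}}}{R_{\text{standard}}} - 1\right) \times 1000,
\qquad R = \frac{{}^{13}\mathrm{C}}{{}^{12}\mathrm{C}} \ \text{(for carbon)},
\]

expressed in per mil (‰) relative to an international reference such as VPDB for carbon; calibration and normalisation to such reference standards is the kind of step the calibration and standards utilisation mentioned above refer to.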
Abstract:
Isotope ratio mass spectrometry (IRMS) has recently made its appearance in the forensic community. This high-precision technology has already been applied to a broad range of forensic fields such as illicit drugs, explosives and flammable liquids, where current, routinely used techniques have limited powers of discrimination. The conclusions drawn from the majority of these IRMS studies appear to be very promising. Used in a comparative process, as in food or drug authentication, the measurement of stable isotope ratios is a new and remarkable analytical tool for the discrimination or the identification of a substance with a definite source or origin. However, the research consists mostly of preliminary studies. The significance of this 'new' piece of information needs to be evaluated in light of a forensic framework to assess the actual potential and validity of IRMS, considering the characteristics of each field. Through the isotopic study of black powder, this paper aims at illustrating the potential of the method and the limitations of current knowledge in stable isotopes when facing forensic problems.
Abstract:
Heretofore, the issue of quality in forensic science has been approached through a quality management policy whose tenets are ruled by market forces. Despite some obvious advantages of the standardization of methods, allowing interlaboratory comparisons and the implementation of databases, this approach suffers from a serious lack of consideration for forensic science as a science. A critical study of its principles and foundations, which constitute its culture, makes it possible to consider the matter of scientific quality in a new dimension. A better understanding of what pertains to forensic science ensures a better application and improves elementary actions within the investigative and intelligence processes as well as the judicial process. This focuses attention on the core of the subject matter: the physical remnants of criminal activity, namely, the traces that produce information for understanding this activity. Adapting practices to the detection and recognition of relevant traces relies on an apprehension of the processes underlying forensic science tenets (Locard, Kirk, the relevancy issue) and a structured management of circumstantial information (direct/indirect information). This is influenced by forensic science education and training. However, the lack of homogeneity with regard to the scientific nature and culture of the discipline among forensic science practitioners and partners represents a real challenge. A sound and critical reconsideration of the forensic science practitioner's roles (investigator, evaluator, intelligence provider) and objectives (prevention, strategies, evidence provision) within the criminal justice system is a means to strengthen the understanding and the application of forensic science. Indeed, the whole philosophy is aimed at ensuring a high degree of excellence, namely, a dedicated scientific quality.
Abstract:
Isotope ratio mass spectrometry (IRMS) has been used in numerous fields of forensic science from a source inference perspective. This review compiles the studies published so far on the application of IRMS to the traditional fields of forensic science. It completes the review of Benson et al. [1] and synthesises the extent of knowledge already gathered in the following fields: illicit drugs, flammable liquids, human provenancing, microtraces, explosives and other specific materials (packaging tapes, safety matches, plastics, etc.). For each field, a discussion assesses the state of the science and highlights the relevance of the information in a forensic context. Through the different discussions which mark out the review, the potential and limitations of IRMS, as well as the needs and challenges of future studies, are emphasized. The paper elicits the various dimensions of the source which can be obtained from the isotope information and demonstrates the transversal nature of IRMS as a tool for source inference.
Abstract:
In this commentary, we argue that the term 'prediction' is overused when, in fact, referring to the foundational writings of de Finetti, the corresponding term should be inference. In particular, we intend (i) to summarize and clarify relevant subject matter on prediction from established statistical theory, and (ii) to point out the logic of this understanding with respect to practical uses of the term prediction. Written from an interdisciplinary perspective, associating statistics and forensic science as an example, this discussion also connects to related fields such as medical diagnosis and other areas of application where reasoning based on scientific results is practiced in societally relevant contexts. This includes forensic psychology, which uses prediction as part of its vocabulary when dealing with matters that arise in the course of legal proceedings.
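As a compact reminder of the point being made (standard Bayesian theory, not text from the commentary): what is loosely called a 'prediction' for a future observation is itself an inference, obtained by averaging the sampling model over the posterior distribution of the unknown parameter,

\[
P(X_{n+1} = x \mid x_1, \ldots, x_n) \;=\; \int P(X_{n+1} = x \mid \theta)\, p(\theta \mid x_1, \ldots, x_n)\, \mathrm{d}\theta,
\]

which, for exchangeable binary trials with a uniform prior on \(\theta\), reduces to Laplace's rule of succession:

\[
P(X_{n+1} = 1 \mid s \ \text{successes in} \ n \ \text{trials}) \;=\; \frac{s+1}{n+2}.
\]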
Abstract:
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the communities of practice at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and the participants' reflexivity.
Abstract:
This paper is concerned with the contribution of forensic science to the legal process by helping reduce uncertainty. Although it is now widely accepted that uncertainty should be handled by probability because it is a safeguard against incoherent proceedings, there remain diverging and conflicting views on how probability ought to be interpreted. This is exemplified by proposals in the scientific literature that call for procedures of probability computation that are referred to as "objective," suggesting that scientists ought to use them in their reporting to recipients of expert information. I find such proposals objectionable. They need to be viewed cautiously, essentially because the ensuing probabilistic statements can be perceived as making forensic science prescriptive. A motivating example from the context of forensic DNA analysis will be chosen to illustrate this. As a main point, it shall be argued that such constraining suggestions can be avoided by interpreting probability as a measure of personal belief, that is, subjective probability. Invoking references to foundational literature from mathematical statistics and the philosophy of science, the discussion will explore the consequences of this interdisciplinary viewpoint for the practice of forensic expert reporting. It will be emphasized that, as an operational interpretation of probability, the subjectivist perspective enables forensic science to add value to the legal process, in particular by avoiding inferential impasses to which other interpretations of probability may lead. Moreover, understanding probability from a subjective perspective can encourage participants in the legal process to take on more responsibility in matters regarding the coherent handling of uncertainty. This would assure more balanced interactions at the interface between science and the law. This, in turn, provides support for ongoing developments that can be called the "probabilization" of forensic science.
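The motivating DNA example discussed in the paper is not reproduced in the abstract; the following minimal sketch only illustrates the generic odds form of Bayes' theorem that underlies such expert reporting, with hypothetical numbers and simplifying assumptions (no typing error, no close relatives).

```python
# Minimal sketch with hypothetical numbers: the odds form of Bayes' theorem,
# where the scientist reports a likelihood ratio (LR) and the recipient of
# expert information combines it with prior odds.
prior_odds = 1 / 1000             # hypothetical prior odds for H1 ("suspect is the source")
random_match_probability = 1e-6   # hypothetical conditional genotype probability
lr = 1 / random_match_probability # LR under the simplifying assumptions stated above

posterior_odds = prior_odds * lr
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"LR = {lr:.3g}, posterior odds = {posterior_odds:.3g}, P(H1 | E) = {posterior_prob:.4f}")
```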
Abstract:
Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular the frequentist and the Bayesian, have promoted radically different solutions for taking a decision about the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.
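To make the contrast concrete (a textbook-style illustration, not an example taken from the paper): for k successes in n binomial trials, the same data can be summarised by a two-sided p-value for the point null θ = 0.5 and by the Bayes factor comparing that null to an alternative with a uniform prior on θ, and the two summaries need not point in the same direction.

```python
# Textbook-style illustration (not from the paper): the same binomial data
# summarised by a two-sided p-value and by a Bayes factor.
from math import comb

def two_sided_p_value(k, n, p0=0.5):
    """Sum the probabilities of all outcomes at least as extreme as k under H0."""
    pmf = [comb(n, i) * p0**i * (1 - p0) ** (n - i) for i in range(n + 1)]
    return sum(p for p in pmf if p <= pmf[k] + 1e-12)

def bayes_factor_01(k, n, p0=0.5):
    """Bayes factor of H0: theta = p0 against H1: theta ~ Uniform(0, 1)."""
    m0 = comb(n, k) * p0**k * (1 - p0) ** (n - k)  # likelihood under H0
    m1 = 1.0 / (n + 1)                             # marginal likelihood under a uniform prior
    return m0 / m1

k, n = 60, 100  # hypothetical data: 60 successes in 100 trials
print(f"two-sided p-value = {two_sided_p_value(k, n):.4f}")
print(f"Bayes factor BF01 = {bayes_factor_01(k, n):.3f}")
```

With these hypothetical numbers the p-value lies close to the conventional 0.05 threshold while the Bayes factor remains near 1, i.e. nearly uninformative, which is the kind of divergence the methodological debate turns on.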
Abstract:
Differential scanning calorimetry (DSC) was used to study the thermal behavior of hair samples and to assess the possibility of identifying an individual from DSC curves held in a data bank. Hair samples from students and staff of the Instituto de Química de Araraquara, UNESP, were obtained to build up the data bank. An individual participating incognito in this data bank was then identified using the DSC curves.
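The abstract gives no numerical matching procedure; as a hypothetical sketch of curve-based matching, an unknown DSC curve could be compared to each data bank entry by a simple distance on the curves. The donor names, simulated peaks and correlation distance below are invented for illustration only.

```python
# Hypothetical sketch of matching a DSC curve against a data bank by
# correlation distance; the data bank and curves are simulated, not the
# study's measurements.
import numpy as np

rng = np.random.default_rng(1)
temperature = np.linspace(50, 300, 500)  # degrees Celsius

def simulated_dsc_curve(peak_temp, peak_width, noise=0.02):
    """A single simulated endothermic peak plus noise, as a stand-in DSC curve."""
    curve = -np.exp(-0.5 * ((temperature - peak_temp) / peak_width) ** 2)
    return curve + noise * rng.standard_normal(temperature.size)

# Simulated data bank: one reference curve per donor.
data_bank = {f"donor_{i}": simulated_dsc_curve(200 + 5 * i, 10 + i) for i in range(10)}

def correlation_distance(a, b):
    """1 - Pearson correlation between two curves."""
    return 1.0 - np.corrcoef(a, b)[0, 1]

# Unknown curve: a fresh measurement resembling donor_3's reference.
unknown = simulated_dsc_curve(215, 13)
best = min(data_bank, key=lambda name: correlation_distance(unknown, data_bank[name]))
print("Closest data bank entry:", best)
```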