317 results for Accuracy, fingerprint identification, forensic science
Abstract:
Forensic science is generally defined as the application of science to address questions related to the law. Too often, this view restricts the contribution of science to a single process which ultimately aims at bringing individuals to court while minimising the risk of miscarriage of justice. In order to go beyond this paradigm, we propose to refocus attention on traces themselves, as remnants of a criminal activity, and on their information content. We postulate that traces contribute effectively to a wide variety of other informational processes that support decision making in many situations. In particular, they inform the actors of new policing strategies, who place the treatment of information and intelligence at the centre of their systems. The contribution of forensic science to these security-oriented models is still not well identified and captured. In order to create the best conditions for the development of forensic intelligence, we suggest a framework that connects forensic science to intelligence-led policing (part I). Crime scene attendance and processing can be envisaged within this view. This approach gives indications about how to structure the knowledge used by crime scene examiners in their effective practice (part II).
Abstract:
A criminal investigation requires searching for and interpreting the vestiges of a criminal act that happened in the past. The forensic investigator acts in this context as a critical reader of the investigation scene, in search of physical traces that should enable her to tell the story of the offence/crime which allegedly occurred. The challenge for any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths that are not often explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results are presented along a semiotic path, strongly influenced by Peirce's studies, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. Through the semiotic track, a macroscopic view is given of a signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs, whose meaning is culturally codified. The reasoning consists of three main steps: 1. What kind of source does the discovered physical trace refer to? 2. What cause/activity is at the origin of this source in the specific context of the case? 3. What story can be told from these observations?
Step 3 requires reasoning by creating hypotheses that should explain the presence of the discovered trace as resulting from an activity; specifically, the activity related to the investigated case. Validating these hypotheses would depend on their ability to satisfy a rule of relevancy. The last step is the symbolisation of relevancy. The rule would consist of two points: the recognition of factual/circumstantial relevancy (Is the link between the trace and the case recognised in the formulated hypothesis?) and of appropriate relevancy (What investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and on conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from the interactions between parameters of situational, structural (or organisational) and individual orders. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map that illustrates three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace. Through the analysis of the investigation data and interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge, allowing an overview of the reality at a specific moment; an important point, since relevancy is signified in a context.
E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. As for the second E, for experience, it exists in its relation to relevancy through the adjustments of intervention strategies (i.e. practical knowledge) of each practitioner, who has modulated her work in light of successes and setbacks, case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces, questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools critical to both the pedagogical and practical aspects of crime scene management, while identifying key influences of individual, structural and situational dimensions.
Abstract:
The present study was carried out to check whether classic osteometric parameters can be determined from 3D reconstructions of MSCT (multislice computed tomography) scans acquired in the context of the Virtopsy project. To this end, four isolated and macerated skulls were examined by six examiners. First, the skulls were conventionally (manually) measured using 32 internationally accepted linear measurements. Then the skulls were scanned by MSCT with slice thicknesses of 1.25 mm and 0.63 mm, and the measurements were determined virtually on the digital 3D reconstructions of the skulls. The results of the traditional and the digital measurements were compared for each examiner to identify variations. Furthermore, several parameters were measured on the cranium and postcranium during an autopsy and compared to the values that had been measured on a 3D reconstruction from a previously acquired postmortem MSCT scan. The results indicate that equivalent osteometric values can be obtained from digital 3D reconstructions of MSCT scans using a slice thickness of 1.25 mm and from conventional manual examinations. The measurements taken from a corpse during an autopsy could also be validated with the methods used for the digital 3D reconstructions in the context of the Virtopsy project. Future aims are the assessment and biostatistical evaluation with respect to sex, age and stature of all data sets stored in the Virtopsy project so far, as well as of future data sets. Furthermore, the definition of new parameters, measurable only with the aid of MSCT data, would be conceivable.
Abstract:
Both Bayesian networks and probabilistic evaluation are gaining increasingly widespread use within many professional branches, including forensic science. Notwithstanding, they constitute subtle topics with definitional details that require careful study. While many sophisticated developments of probabilistic approaches to the evaluation of forensic findings may readily be found in the published literature, there remains a gap with respect to writings that focus on foundational aspects and on how these may be acquired by interested scientists new to these topics. This paper takes this as a starting point to report on the learning about Bayesian networks for likelihood ratio based, probabilistic inference procedures in a class of master's students in forensic science. The presentation uses an example that relies on a casework scenario drawn from the published literature, involving a questioned signature. A complicating aspect of that case study, proposed to students in a teaching scenario, is the need to consider multiple competing propositions, a setting that may not readily be approached within a likelihood ratio based framework without drawing attention to some additional technical details. Using generic Bayesian network fragments from the existing literature on the topic, course participants were able to track the probabilistic underpinnings of the proposed scenario correctly, both in terms of likelihood ratios and of posterior probabilities. In addition, further study of the example allowed students to derive an alternative Bayesian network structure with a computational output equivalent to existing probabilistic solutions. This practical experience underlines the potential of Bayesian networks to support and clarify foundational principles of probabilistic procedures for forensic evaluation.
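The distinction between pairwise likelihood ratios and posterior probabilities in a multi-proposition setting can be sketched numerically. The propositions, priors and likelihoods below are invented for illustration; they are not the values from the questioned-signature case study:

```python
# Toy multi-proposition evaluation (hypothetical numbers, not the paper's case).
# With competing propositions H1..H3, a single LR is not enough; posteriors
# follow from Bayes' theorem over all propositions.

priors = {"H1": 0.5, "H2": 0.3, "H3": 0.2}        # prior probabilities P(H_i)
likelihoods = {"H1": 0.8, "H2": 0.1, "H3": 0.05}  # P(findings | H_i)

# Posterior probabilities: P(H_i | E) is proportional to P(E | H_i) * P(H_i)
joint = {h: likelihoods[h] * priors[h] for h in priors}
norm = sum(joint.values())
posteriors = {h: joint[h] / norm for h in joint}

# A pairwise likelihood ratio, e.g. H1 versus H2, depends only on the likelihoods
lr_12 = likelihoods["H1"] / likelihoods["H2"]

print(lr_12)       # 8.0
print(posteriors)  # H1 dominates
```

Note that the pairwise LR is prior-free, while the posteriors require the full set of priors over all competing propositions; this is exactly the additional detail the multi-proposition setting forces into view.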
Abstract:
The aim of this study was to compare the diagnostic value of post-mortem computed tomography angiography (PMCTA) to conventional ante-mortem computed tomography (CT), CT-angiography (CTA) and digital subtraction angiography (DSA) in detecting and localizing the source of bleeding in cases of acute hemorrhage with fatal outcomes. The medical records and imaging scans of nine individuals who underwent a conventional ante-mortem CT-scan, CTA or DSA and later died in hospital as a result of an acute hemorrhage were reviewed. Post-mortem computed tomography angiography, using multi-phase post-mortem CTA, as well as medico-legal autopsies, were performed. Localization accuracy of the bleeding was assessed by comparing the diagnostic findings of the different techniques. The results revealed that data from ante-mortem and post-mortem radiological examinations were similar, though PMCTA showed a higher sensitivity for detecting the hemorrhage source than did the ante-mortem radiological investigations. Comparing the results of PMCTA and conventional autopsy, a much higher sensitivity was noted for PMCTA in identifying the source of the bleeding: the vessels involved were identified in eight out of nine cases using PMCTA and in only three cases through conventional autopsy. Our study showed that PMCTA, like clinical radiological investigations, is able to precisely identify lesions of arterial and/or venous vessels and thus determine the source of bleeding in cases of acute hemorrhage with fatal outcomes.
Abstract:
The value of earmarks as an efficient means of personal identification is still subject to debate. It has been argued that the field lacks a firm, systematic and structured data basis to help practitioners form their conclusions. Typically, there is a paucity of research guiding practitioners as to the selectivity of the features used in the comparison process between an earmark and reference earprints taken from an individual. This study proposes a system for the automatic comparison of earprints and earmarks, operating without any manual extraction of key points or manual annotations. For each donor, a model is created using multiple reference prints, hence capturing the donor's within-source variability. For each comparison between a mark and a model, the images are automatically aligned and a proximity score, based on a normalized 2D correlation coefficient, is calculated. Appropriate use of this score allows deriving a likelihood ratio that can be explored under known states of affairs (both in cases where it is known that the mark was left by the donor who gave the model and, conversely, in cases where it is established that the mark originates from a different source). To assess the system's performance, a first dataset containing 1229 donors, compiled during the FearID research project, was used. Based on these data, for mark-to-print comparisons the system performed with an equal error rate (EER) of 2.3%, and about 88% of marks were found in the first 3 positions of a hitlist. For print-to-print transactions, the results show an equal error rate of 0.5%. The system was then tested using real-case data obtained from police forces.
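The proximity score at the core of such a system can be illustrated with a minimal normalized 2D correlation coefficient. This is a generic sketch in pure Python with toy 2x2 "images", assuming the images are already registered (the automatic alignment step described above is omitted):

```python
# Normalized 2D correlation coefficient as a proximity score in [-1, 1].
# Pure-Python sketch for clarity; a real system would use optimized
# array libraries and perform alignment first.

def ncc(a, b):
    """Normalized correlation of two 2D lists of equal shape."""
    xs = [v for row in a for v in row]   # flatten image a
    ys = [v for row in b for v in row]   # flatten image b
    mx = sum(xs) / len(xs)               # mean intensity of a
    my = sum(ys) / len(ys)               # mean intensity of b
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

mark  = [[10, 20], [30, 40]]
close = [[11, 19], [29, 41]]   # nearly identical pattern -> score near 1
other = [[40, 30], [20, 10]]   # reversed pattern -> score of -1

print(ncc(mark, close))
print(ncc(mark, other))
```

Subtracting the means and dividing by the norms makes the score invariant to global brightness and contrast changes, which is why this form of correlation is a common proximity measure between image pairs.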
Abstract:
In the realm of forensic pathology, β-tryptase measurement for diagnostic purposes is performed in postmortem serum obtained from femoral blood. This may be partially or completely unavailable in some specific cases, such as infant autopsies and severely damaged bodies. The aim of this study was to investigate the usefulness of determining β-tryptase levels for diagnostic purposes in alternative biological samples. Urine, vitreous humor and pericardial fluid were selected and measured in 94 subjects, including: fatal anaphylaxis following contrast material administration (6 cases), hypothermia (10 cases), diabetic ketoacidosis (10 cases), gunshot suicide (10 cases), heroin injection-related deaths (18 cases), trauma (10 cases), sudden death with minimal coronary atherosclerosis (10 cases), severe coronary atherosclerosis without myocardial infarction (10 cases) and severe coronary atherosclerosis with myocardial infarction (10 cases). Postmortem serum and pericardial fluid β-tryptase levels higher than the clinical reference value (11.4 ng/ml) were systematically identified in fatal anaphylaxis following contrast material administration and in 6 cases unrelated to anaphylaxis. β-tryptase concentrations in urine and vitreous humor were lower than the clinical reference value in all cases included in this study. Determination of β-tryptase in pericardial fluid appears to be a possible alternative to postmortem serum in the early postmortem period, when femoral blood cannot be collected during autopsy and biochemical investigations are required to demonstrate increased β-tryptase levels.
Abstract:
Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and differentiate them from counterfeits. Detecting counterfeits is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and helping to understand the criminal phenomenon. Profiling seizures using chemical and packaging data constitutes a strong way to detect organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule were studied. The results of the packaging and chemical analyses were gathered in an organised database. Strong linkage was found between the seizures at the different production steps, indicating the presence of a main counterfeiting network dominating the market. The interpretation of the links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective has the potential to be generalised to other types of products. It may be the only reliable approach to help understand the organised crime phenomenon behind counterfeiting and to enable efficient strategic and operational decision making in an attempt to dismantle counterfeiting networks.
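Seizure profiling of this kind typically reduces to comparing profiles pairwise and flagging pairs above a similarity threshold as potential links. The sketch below is a generic illustration of that idea; the seizure names, profile values and threshold are invented, and the paper's actual comparison metric is not specified here:

```python
# Hypothetical linkage sketch: each seizure is described by a chemical
# profile (e.g., relative peak areas); pairs whose cosine similarity
# exceeds a threshold are flagged as potential links.

def cosine(p, q):
    """Cosine similarity between two profile vectors."""
    num = sum(a * b for a, b in zip(p, q))
    den = (sum(a * a for a in p) ** 0.5) * (sum(b * b for b in q) ** 0.5)
    return num / den if den else 0.0

profiles = {
    "seizure_A": [0.50, 0.30, 0.15, 0.05],
    "seizure_B": [0.52, 0.28, 0.14, 0.06],  # close to A -> candidate link
    "seizure_C": [0.10, 0.10, 0.60, 0.20],  # distinct composition
}

THRESHOLD = 0.99  # illustrative cut-off, tuned on known linked/unlinked pairs
names = sorted(profiles)
links = [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
         if cosine(profiles[a], profiles[b]) >= THRESHOLD]
print(links)
```

In practice the threshold would be calibrated on pairs of seizures known to be linked or unlinked, and the resulting link hypotheses would be interpreted together with packaging and circumstantial data, as the abstract describes.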
Abstract:
Forensic scientists face increasingly complex inference problems when evaluating likelihood ratios (LRs) for an appropriate pair of propositions. Up to now, scientists and statisticians have derived LR formulae using an algebraic approach. However, this approach reaches its limits when addressing cases with an increasing number of variables and dependence relationships between these variables. In this study, we suggest using a graphical approach based on the construction of Bayesian networks (BNs). We first construct a BN that captures the problem, and then deduce from this model the expression for calculating the LR, to compare it with existing LR formulae. We illustrate this idea by applying it to the evaluation of an activity level LR in the context of the two-trace transfer problem. Our approach allows us to relax assumptions made in previous LR developments, produce a new LR formula for the two-trace transfer problem and generalize this scenario to n traces.
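The principle of deducing an LR from a network rather than from algebra can be shown on a deliberately tiny model. This toy network (invented probabilities, far simpler than the paper's two-trace model) has a matching trace E that may arise either from transfer T, possible only under the prosecution proposition Hp, or from background presence B; the LR is obtained by enumerating the hidden states:

```python
# Toy sketch: LR = P(E | Hp) / P(E | Hd) computed by enumeration over a
# small Bayesian-network-style model. All probabilities are illustrative.

from itertools import product

P_T_GIVEN_H = {"Hp": 0.6, "Hd": 0.0}  # transfer probability under each proposition
P_B = 0.05                             # background (innocent presence) probability

def p_evidence(h):
    """P(E | h): sum over the hidden states of transfer T and background B."""
    total = 0.0
    for t, b in product([True, False], repeat=2):
        e = t or b                     # trace observed if transferred or background
        if not e:
            continue                   # only terms consistent with the evidence
        pt = P_T_GIVEN_H[h] if t else 1 - P_T_GIVEN_H[h]
        pb = P_B if b else 1 - P_B
        total += pt * pb
    return total

lr = p_evidence("Hp") / p_evidence("Hd")
print(lr)   # 12.4
```

The same enumeration mechanics scale to richer networks; this is the sense in which a BN lets one "read off" an LR formula that would be tedious to derive algebraically once dependencies multiply.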
Abstract:
Situating events and traces in time is an essential problem in investigations. To date, among the typical questions raised in forensic science, time has generally been unexplored. The reason for this can be traced to the complexity of the overall problem, addressed by several scientists in very limited projects usually stimulated by a specific case. Considering that such issues are recurrent and transcend the treatment of each trace separately, the formalisation of a framework to address dating issues in criminal investigation is undeniably needed. Through an iterative process of extracting recurrent aspects discovered from the study of problems encountered by practitioners and reported in the literature, common mechanisms were identified that provide understanding of the underlying factors encountered in forensic practice. Three complementary approaches are thus highlighted and described to formalise a preliminary framework that can be applied to the dating of traces, objects, persons and, indirectly, events.
Abstract:
False identity documents represent a serious threat through their production and use in organized crime and by terrorist organizations. The present-day fight against this criminal problem and these threats to national security does not appropriately address the organized nature of the criminal activity: each fraudulent document is treated on its own during investigation and the judicial process, which causes linkage blindness and limits analysis capacity. Given the drawbacks of this case-by-case approach, this article proposes an original model in which false identity documents are used to inform a systematic forensic intelligence process. The process aims to detect links, patterns, and tendencies among false identity documents in order to support strategic and tactical decision making, thus sustaining a proactive intelligence-led approach to fighting identity document fraud and the associated organized criminality. This article formalizes both the model and the process, using practical applications to illustrate their capabilities. The model has general application and can be transposed to other fields of forensic science facing similar difficulties.
Abstract:
The production and use of false identity and travel documents in organized crime represent a serious and evolving threat. However, the present-day fight against this criminal problem is essentially driven by a case-by-case perspective, which suffers from linkage blindness and limited analysis capacity. To assist in overcoming these limitations, a process model was developed from a forensic perspective. It guides the systematic analysis and management of seized false documents to generate forensic intelligence that supports strategic and tactical decision-making in an intelligence-led policing approach. The model is articulated on a three-level architecture that aims to assist in detecting and following up on general trends, production methods and links between cases or series. Using analyses of a large dataset of counterfeit and forged identity and travel documents, it is possible to illustrate the model, its three levels and their contribution. Examples point out how the proposed approach assists in detecting emerging trends, in evaluating the black market's degree of structure, in uncovering criminal networks, in monitoring the quality of false documents, and in identifying their weaknesses to orient the design of more secure travel and identity documents. The proposed process model is thought to have general application in forensic science and can readily be transposed to other fields of study.
Abstract:
Recent years have been characterized by a series of publications in the field of firearms investigation questioning the reliability and objectivity of such examinations. This research investigates new solutions to decrease the subjective component affecting the evaluation that follows the comparison of impressions left by a firearm on the surface of spent cartridge cases. An automatic comparison system based on 3D measurements has been developed and coupled with a bivariate evaluative model that allows likelihood ratios to be assigned. Based on a dataset of 79 pistols (all SIG Sauer, 9 mm Luger caliber), the system shows a very high discriminating power, and the LRs it provides are strongly indicative of the true state under both the prosecution and the defense propositions. For example, likelihood ratios exceeding a billion are predominantly obtained when impressions originating from the same source are compared. The system is also characterized by relatively low rates (≤1%) of misleading evidence, depending on the firearm considered.
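The "rates of misleading evidence" used to characterize such a system can be computed directly from validation LR values: the proportion of same-source comparisons yielding LR < 1 (misleading in favor of the defense) and of different-source comparisons yielding LR > 1 (misleading in favor of the prosecution). The LR values below are simulated for illustration, not the paper's data:

```python
# Rates of misleading evidence from two sets of validation LRs
# (simulated values, not the study's results).

same_source_lrs      = [2e9, 5e8, 3e10, 0.4, 7e9]    # one value below 1
different_source_lrs = [1e-4, 2e-3, 0.5, 8.0, 1e-5]  # one value above 1

# Misleading in favor of the defense: same-source comparison, yet LR < 1
rmed = sum(lr < 1 for lr in same_source_lrs) / len(same_source_lrs)
# Misleading in favor of the prosecution: different-source comparison, yet LR > 1
rmep = sum(lr > 1 for lr in different_source_lrs) / len(different_source_lrs)

print(rmed)   # 0.2
print(rmep)   # 0.2
```

With a real validation set of thousands of comparisons per firearm, these two proportions are what a claim such as "rates of misleading evidence ≤ 1%" summarizes.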