979 results for forensic computer examination
Abstract:
This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in this area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, to discriminate between propositions of authorship by a given individual versus another, unknown individual), the investigative setting still remains largely beyond consideration in practice. This paper seeks to highlight that investigative settings, too, can represent an area of application in which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of an incriminating handwritten text is presented, analysed and discussed in this paper. The more general viewpoint according to which likelihood ratio analyses can be helpful in investigative proceedings is supported here through various simulations, which offer a characterisation of the robustness of the proposed likelihood ratio methodology.
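The abstract does not reproduce the computation itself. As a rough, hypothetical illustration of how Fourier descriptors of a character contour can be extracted (the function name, parameters and normalisation choices below are assumptions for the sketch, not taken from the paper):

import numpy as np

def fourier_descriptors(contour_xy, n_harmonics=8):
    # Represent the closed contour as a complex signal z = x + iy.
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                       # drop the DC term: translation invariance
    coeffs = coeffs / np.abs(coeffs[1])   # normalise by first harmonic: scale invariance
    # Keep the low-order harmonics from both ends of the spectrum as features.
    kept = np.concatenate([coeffs[1:n_harmonics + 1], coeffs[-n_harmonics:]])
    return np.concatenate([kept.real, kept.imag])

# Example: a noisy ellipse standing in for the loop of a handwritten "a".
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
loop = np.column_stack([3 * np.cos(t), 2 * np.sin(t)])
loop += np.random.normal(0, 0.05, loop.shape)
print(fourier_descriptors(loop)[:6])

The resulting feature vector is the kind of multivariate continuous datum on which a likelihood ratio analysis can then be run.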
Abstract:
This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion relates to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part of probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
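At its core, 'learning' a probability from data is Bayesian updating of a parameter from observed counts. A minimal sketch, assuming a conjugate Beta-Binomial model purely for illustration (the paper's actual procedures operate within Bayesian networks):

# Bayesian 'learning' of a probability from count data via Beta-Binomial
# updating; the prior and the counts below are invented for illustration.
from scipy.stats import beta

a_prior, b_prior = 1.0, 1.0        # uniform Beta(1, 1) prior on the probability
n_samples, n_with_trait = 40, 9    # e.g. toner samples showing a given characteristic

a_post = a_prior + n_with_trait
b_post = b_prior + (n_samples - n_with_trait)
posterior = beta(a_post, b_post)

print(f"posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")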
Abstract:
As a thorough combination of probability theory and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Part I of this series of papers intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g., the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches - with the help of Bayesian networks - to process their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) to specify graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e., results of the analysis of black toners present on printed or copied documents).
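For orientation, here is a hypothetical two-node fragment of the context-independent kind described, computed directly with Bayes' theorem rather than with dedicated network software; all probabilities are invented:

# A two-node fragment: H (sample originates from the source, yes/no)
# -> E (a characteristic is observed). Numbers are invented for illustration.
p_h = 0.5                 # prior P(H = yes)
p_e_given_h = 0.95        # P(E | H = yes)
p_e_given_not_h = 0.20    # P(E | H = no), e.g. frequency of the trait in a population

# Posterior P(H = yes | E) by Bayes' theorem -- the update any Bayesian
# network software would perform on this fragment when E is instantiated.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e
print(f"P(H | E) = {p_h_given_e:.3f}")   # 0.826

Larger inference models chain many such fragments, with the software propagating the updates through the graph.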
Abstract:
Multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) are increasingly used for forensic purposes. Drawing on broad experience in clinical neuroimaging, post-mortem MSCT and MRI were performed in 57 forensic cases with the goal of evaluating the radiological methods with respect to their usability for forensic head and brain examination. An experienced clinical radiologist evaluated the imaging data. The results were compared to the autopsy findings, which served as the gold standard with regard to common forensic neurotrauma findings such as skull fractures, soft tissue lesions of the scalp, various forms of intracranial hemorrhage, and signs of increased brain pressure. The sensitivity of the imaging methods ranged from 100% (e.g., heat-induced alterations, intracranial gas) to zero (e.g., mediobasal impression marks as a sign of increased brain pressure, plaques jaunes). The agreement between MRI and CT was 69%. The radiological methods generally failed to detect lesions smaller than 3 mm, whereas they were generally satisfactory for the evaluation of intracranial hemorrhage. Due to its advanced 2D and 3D post-processing possibilities, CT in particular possessed certain advantages over autopsy with regard to forensic reconstruction. In several cases, MRI showed forensically relevant findings not seen during autopsy. The partly limited sensitivity of imaging observed in this retrospective study was based on several factors: besides general technical limitations, it became apparent that clinical radiologists require a sound basic forensic background in order to detect specific signs. Focused teaching sessions will be essential to improve the outcome of future examinations. On the other hand, autopsy protocols should be further standardized to allow an exact comparison of imaging and autopsy data. In consideration of these facts, MRI and CT have the potential to play an important role in future forensic neuropathological examination.
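The sensitivity and agreement figures quoted are standard contingency-table quantities. With invented counts (the study's actual tallies are not in the abstract), they would be computed along these lines:

# Sensitivity against autopsy as the gold standard, and MRI-CT agreement;
# all counts below are invented for illustration.
true_pos, false_neg = 22, 4        # imaging detected / missed a finding present at autopsy
sensitivity = true_pos / (true_pos + false_neg)

n_readings, n_concordant = 57, 39  # paired MRI and CT readings that concur
agreement = n_concordant / n_readings

print(f"sensitivity: {sensitivity:.0%}, MRI-CT agreement: {agreement:.0%}")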
Abstract:
Based on only one objective sign and several subjective signs, the forensic classification of strangulation incidents with respect to their life-threatening quality can be problematic. Since it is almost impossible to detect internal injuries of the neck with the standard forensic external examination, we examined 14 persons who had survived manual strangulation, ligature strangulation, or forearm choke holds using MRI (1.5-T scanner). Two clinical radiologists evaluated the neck findings independently. The danger to life was evaluated based on the "classical" external findings alone and then in combination with the radiological data. We observed hemorrhaging in the subcutaneous fatty tissue of the neck in ten cases. Other frequent findings were hemorrhages of the neck and larynx muscles, the lymph nodes, the pharynx, and the larynx soft tissues. Based on the classical forensic strangulation findings combined with MRI, eight of the cases were declared life-endangering incidents, four of them without the presence of petechial hemorrhage but with further signs of impaired brain function due to hypoxia. The accuracy of future forensic classification of the danger to life will probably be increased when it is based not only on one objective and several subjective signs but also on evidence of internal neck injuries. However, further prospective studies including larger cohorts are necessary to clarify the value of internal neck injuries in the forensic classification of surviving strangulation victims.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
During their career, forensic document examiners will inevitably be confronted with handwriting produced under unusual conditions (UnC). A questioned document signed on top of a car or on a vertical surface such as a wall are two examples. These atypical circumstances may give rise to more variability in the signatures or written words, in particular if the body was in a non-traditional writing position without the usual support. Few studies have been devoted to handwriting variability under unusual writing conditions. The current study investigates whether individual variability changes under special writing conditions. In a previous study (Sciacca et al., 2009), we found that eight repetitions were sufficient to obtain a correct estimation of the variance. In the present study, twelve subjects were asked to write two word sets eight times in upper case and eight times in lower case, under different conditions: sitting and writing on a horizontal (usual condition, UC) or vertical support; and standing, kneeling or lying while writing on a horizontal or vertical support (unusual conditions, UnC). Words were written on a pen tablet, normalized in space and time, and then averaged. The variance of the eight words was measured under all conditions. Results showed an increase in variability only under the lying/vertical and kneeling/vertical UnC. Within the five other postural conditions tested, handwriting was shown to be very stable.
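A hedged sketch of the space/time normalisation and repetition-variance measurement described; the function names, resampling length and z-scoring are assumptions for illustration, not the study's actual pipeline:

import numpy as np

def normalise_word(xy, n_points=200):
    # Resample one pen-tablet trajectory to a fixed length (time
    # normalisation) and z-score each axis (space normalisation).
    t_old = np.linspace(0, 1, len(xy))
    t_new = np.linspace(0, 1, n_points)
    resampled = np.column_stack([np.interp(t_new, t_old, xy[:, d]) for d in range(2)])
    return (resampled - resampled.mean(axis=0)) / resampled.std(axis=0)

def repetition_variance(words):
    # Mean pointwise variance across the normalised repetitions of a word.
    stack = np.stack([normalise_word(w) for w in words])   # (n_reps, n_points, 2)
    return stack.var(axis=0).mean()

# Example with synthetic trajectories of unequal length standing in for
# the eight repetitions written under one postural condition.
rng = np.random.default_rng(0)
reps = [np.cumsum(rng.normal(size=(rng.integers(150, 260), 2)), axis=0) for _ in range(8)]
print(f"variability index: {repetition_variance(reps):.4f}")

Comparing this index across postural conditions is the kind of measurement that can reveal whether a condition inflates a writer's natural variability.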
Abstract:
To provide quantitative support for handwriting evidence evaluation, a new method was developed based on the computation of a likelihood ratio within a Bayesian approach. In the present paper, the methodology is briefly described and applied to data collected within a simulated case of a threatening letter. Fourier descriptors are used to characterise the shape of the loops of handwritten characters "a" from the threatening letter, which are then compared: 1) with reference characters "a" of the true writer of the threatening letter, and 2) with characters "a" of a writer who did not write the threatening letter. The findings show that the probabilistic methodology correctly supports either the hypothesis of authorship or the alternative hypothesis, as appropriate. Further developments will enable the handwriting examiner to use this methodology as helpful assistance in assessing the strength of evidence in handwriting casework.
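At the core of the method is the likelihood ratio. In generic notation (the paper's specific statistical model is not reproduced here), the quantity evaluated is

\[
\mathrm{LR} = \frac{p(\mathbf{y} \mid H_1)}{p(\mathbf{y} \mid H_2)}
\]

where \(\mathbf{y}\) collects the Fourier descriptors of the questioned loops, \(H_1\) is the proposition that the writer of reference wrote the questioned text, and \(H_2\) the proposition that another writer did. Values above 1 support \(H_1\); values below 1 support \(H_2\).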
Abstract:
This study examines the effectiveness of forensic computer examination reports (Laudos Periciais Criminais de Informática) in helping judges form the conviction needed to draft their rulings. To that end, the reports and the rulings that relied on them were surveyed, seeking the relationship between the two in order to assess the quality of the report produced and its importance for the judicial decision and, consequently, for the promotion of social justice. The study shows that, in most of the cases analysed, the expert work is relevant in assisting judges in their decision-making. The research also revealed that certain variables that do not depend on the expert work, such as the questions formulated by the requester of the report and the type of offence, are relevant to making forensic examinations even more effective and helpful in promoting justice. This research can serve as a management instrument for the Diretoria Técnico-Científica of the Departamento de Polícia Federal, filling a gap that exists today, given that federal forensic experts receive no feedback on their work, while also demonstrating the importance of expert work for proving offences. It will likewise help managers develop a methodology for drafting forensic computer examination reports that seek to establish the authorship and materiality of offences in their examinations. Society needs its public bodies to act in ways that promote social justice for citizens. In this scenario, the forensic computer examination report is one of the instruments that can help deliver justice more concretely.
Abstract:
Forensic document examination (Documentoscopia) is the largest forensic area within the Criminalistics of the Polícia Federal (PF), accounting for 24.49% of all reports produced in the Sistema Nacional de Criminalística. Despite this, there is no specific entrance examination or degree for the area, and the development of its competencies depends almost exclusively on training offered and carried out internally, within the institution and the workplace. Considering the strategic plans of the PF's Direção Geral and Diretoria Técnico-Científica, which stress the importance of valuing their staff through continuous training and competency management as a strategy for achieving their missions, the proper study and development of competencies in forensic document examination is clearly relevant. This study analyses whether the technical competencies of the PF's forensic document examiners, as listed in the PF's technical function matrix, are consistent with those listed by the UN for forensic document examiners, and whether these competencies are being developed in the training offered by the ANP for the area. Some gaps were identified, i.e., UN recommendations that have counterparts in the matrix but are not developed by the training activities, as well as discrepancies in course workloads. Suggestions for minimizing or eliminating these gaps are offered, along with further considerations, mainly concerning a greater supply of training, professional specialization, and the institution of proficiency testing and mentoring.
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions were fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics were evaluated, and the performance of the method was assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (Part II).
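A minimal sketch of the profile-comparison step with the Canberra metric; the hue_profile function below is a hypothetical stand-in for the paper's Hue-filter profiles, not its actual implementation:

import colorsys
import numpy as np
from scipy.spatial.distance import canberra

def hue_profile(rgb_image, n_bins=64):
    # Normalised hue histogram over a region of interest of a scanned document.
    pixels = rgb_image.reshape(-1, 3) / 255.0
    hues = np.array([colorsys.rgb_to_hsv(*p)[0] for p in pixels])
    hist, _ = np.histogram(hues, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()

# Two synthetic "regions of interest"; a low Canberra distance between their
# profiles would suggest a possible link between the documents.
rng = np.random.default_rng(1)
roi_a = rng.integers(0, 256, (32, 32, 3), dtype=np.uint8)
roi_b = (roi_a.astype(int) + rng.integers(-10, 11, roi_a.shape)).clip(0, 255).astype(np.uint8)

d = canberra(hue_profile(roi_a), hue_profile(roi_b))
print(f"Canberra distance: {d:.3f}")   # smaller -> more similar profiles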
Abstract:
In line with the Anglo-American tradition of interactional discourse analysis and adopting an ethnographic methodological approach, this project investigates the relations between oral texts, and between oral and written texts, linked in textual chains within an institutional communicative event in which such relations have direct consequences for the reliability of the information given and intervene in the construction of official knowledge. The communicative situation under study is the direct and cross-examination of lay witnesses during oral criminal trials, in the ordinary non-abbreviated format, in the jurisdiction of the city of Córdoba. In the face-to-face interactions with trial lawyers and judges in which testimony emerges, other oral texts (quotations of what was previously said by the same witness or by other people, references to rumours or collective opinions, etc.) and written texts (seizure records, expert reports, records of witness statements from the investigative stage, etc.) are brought into play. The focus of this research is on practices of intertextuality, since they condition the nature of the testimonial evidence produced before the judge. It is argued that trial lawyers deploy recognisable and recurrent local tactics and global strategies related to the treatment of different categories of previous texts. A further aim is to examine whether the judges' interventions in interaction with witnesses are significant enough to constitute an important mode of generating testimonial evidence. The general methodological approach is ethnographic and discourse-analytic. A trial for a serious offence will be selected, the courtroom proceedings observed, and the audio of all hearings recorded. The data to be analysed will be the interaction segments in which records of previous statements are read aloud or quoted, and the segments in which the reproduction of previous utterances is required as testimonial evidence. The analysis will proceed from the details of the textual surface and the pragmatics of the exchanges, exploiting the heuristic value of the concept of voice and seeking to identify recurrent patterns and the general mechanisms that govern them. On that basis, the verbal exchanges will be considered social interaction that emerges shaped by situational and institutional conditions and other factors, such as membership in social or professional groups. The study will provide a view of everyday practices associated with intertextuality that are of crucial importance for the character of the testimonial evidence produced before the judge. This will bring us closer to understanding how the administration of criminal justice is actually carried out and will allow patterns of conduct to be assessed in the light of procedural norms.
Abstract:
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve the analytical and search processes for ink samples. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library.
Abstract:
In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are: (a) develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when samples are analysed at different times, in different locations, and by different examiners; (b) compare ink samples automatically and objectively; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly used in biometrics studies. The results show that different algorithms are best suited to different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
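Performance tests of the biometrics kind typically report error rates at a decision threshold. A hedged sketch with invented data, using Pearson correlation as a stand-in for the report's actual comparison algorithms:

import numpy as np

def similarity(profile_a, profile_b):
    # One plausible comparison score for two HPTLC intensity profiles:
    # Pearson correlation (the report evaluates several such algorithms).
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

# Invented data: scores for pairs of same-ink and different-ink samples.
rng = np.random.default_rng(2)
base = rng.random((20, 100))
same_scores = [similarity(p, p + rng.normal(0, 0.05, 100)) for p in base]
diff_scores = [similarity(p, rng.random(100)) for p in base]

# Biometrics-style performance test: error rates at a decision threshold.
threshold = 0.8
false_non_match = np.mean(np.array(same_scores) < threshold)   # missed true pairs
false_match = np.mean(np.array(diff_scores) >= threshold)      # wrongly linked pairs
print(f"FNMR: {false_non_match:.2f}, FMR: {false_match:.2f}")

Sweeping the threshold over its range traces the trade-off between the two error rates, which is how different algorithms can be compared for different tasks.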
Abstract:
The ASTM standards on Writing Ink Identification (ASTM E1789-04) and on Writing Ink Comparison (ASTM E1422-05) are the most up-to-date guidelines that have been published on the forensic analysis of ink. The aim of these documents is to cover most aspects of the forensic analysis of ink evidence, from the analysis of ink samples and the comparison of their analytical profiles (with the aim of differentiating them or not), through to the interpretation of the results of the examination of these samples in a forensic context. Significant developments have occurred in recent years in the technology available to forensic scientists, in the quality assurance requirements imposed on them, and in the understanding of frameworks for interpreting forensic evidence. This article reviews the two standards in the light of these developments and proposes some practical improvements in terms of the standardization of the analyses, the comparison of ink samples, and the interpretation of ink examination results. Some of these suggestions have already been included in a DHS-funded project aimed at creating a digital ink library for the United States Secret Service.