889 results for Digital Forensics, Forensic Computing, Forensic Science
Abstract:
Blowflies utilize discrete and ephemeral breeding sites for larval nutrition. After the food is exhausted, larvae begin dispersing in search of sites to pupate or of additional food sources, a process referred to as postfeeding larval dispersal. Some of the most important aspects of this process were investigated in the blowfly Chrysomya albiceps, employing a circular arena that allows radial dispersal of larvae from the center. The results showed a positive correlation between burial depth and dispersal distance, and a negative correlation between distance and pupal weight. These results can be used in forensic entomology for estimating the postmortem interval of human corpses in medico-criminal investigations. (c) 2004 Elsevier B.V. All rights reserved.
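As a worked illustration of the correlation analysis described above, a minimal sketch in Python follows; the distances, burial depths and pupal weights are made-up values, not the study's data.

```python
# Minimal sketch of the dispersal-correlation analysis described above,
# on illustrative (not the study's) measurements.
import numpy as np
from scipy import stats

distance_cm = np.array([5, 12, 20, 28, 35, 41, 50, 58])              # radial dispersal distance
burial_depth_cm = np.array([1.0, 1.4, 2.1, 2.5, 3.0, 3.2, 3.9, 4.3])  # hypothetical depths
pupal_weight_mg = np.array([62, 60, 58, 55, 53, 52, 49, 47])          # hypothetical weights

# Positive correlation expected between dispersal distance and burial depth
r_depth, p_depth = stats.pearsonr(distance_cm, burial_depth_cm)
# Negative correlation expected between dispersal distance and pupal weight
r_weight, p_weight = stats.pearsonr(distance_cm, pupal_weight_mg)

print(f"distance vs burial depth: r = {r_depth:.3f} (p = {p_depth:.3g})")
print(f"distance vs pupal weight: r = {r_weight:.3f} (p = {p_weight:.3g})")
```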
Abstract:
Mitochondrial DNA (mtDNA) population data for forensic purposes are still scarce for some populations, which may limit the evaluation of forensic evidence, especially when the rarity of a haplotype needs to be determined in a database search. In order to improve the collection of mtDNA lineages from the Iberian and South American subcontinents, we report here the results of a collaborative study involving nine laboratories from the Spanish and Portuguese Speaking Working Group of the International Society for Forensic Genetics (GHEP-ISFG) and EMPOP. The individual laboratories contributed population data that were generated over the past 10 years but, in the majority of cases, had not been made available to the scientific community. A total of 1019 haplotypes from Iberia (the Basque Country, 2 general Spanish populations, 2 northern and 1 central Portuguese population) and Latin America (3 populations from São Paulo) were collected, reviewed and harmonized according to defined EMPOP criteria. The majority of the data ambiguities found during the reviewing process (41 in total) were transcription errors, confirming that documentation is still the most error-prone stage in reporting mtDNA population data, especially when performed manually. This GHEP-EMPOP collaboration has significantly improved the quality of the individual mtDNA datasets and adds mtDNA population data as a valuable resource to the EMPOP database (www.empop.org). (C) 2010 Elsevier B.V. All rights reserved.
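The rarity assessment mentioned above is typically based on a counting estimate with a conservative confidence bound. The sketch below is a generic illustration, not EMPOP's or the study's procedure; the counts and the Clopper-Pearson bound are only one common way to frame it.

```python
# Sketch of quantifying the rarity of an mtDNA haplotype after a database
# search: counting estimate plus an exact (Clopper-Pearson) upper bound.
# Counts are illustrative; the statistical treatment used in casework may differ.
from scipy.stats import beta

def haplotype_frequency_bounds(x, n, alpha=0.05):
    """Point estimate and upper confidence limit for a haplotype observed
    x times in a database of n sequences."""
    point = x / n
    if x == n:
        upper = 1.0
    else:
        upper = beta.ppf(1 - alpha / 2, x + 1, n - x)  # exact binomial upper limit
    return point, upper

# Example: haplotype observed 3 times among 1019 database haplotypes
p_hat, p_upper = haplotype_frequency_bounds(3, 1019)
print(f"observed frequency: {p_hat:.5f}, 95% upper bound: {p_upper:.5f}")
```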
Abstract:
A new voltammetric method for the determination of Δ9-tetrahydrocannabinol (Δ9-THC) is described. The voltammetric experiments were carried out in N,N-dimethylformamide/water (9:1, v/v), using 0.1 mol/L tetrabutylammonium tetrafluoroborate (TBATFB) as the supporting electrolyte and a glassy carbon disk electrode as the working electrode. The anodic peak current was observed at 0.0 V (vs. Ag/AgCl) after a 30 s pre-concentration step under an applied potential of -1.2 V (vs. Ag/AgCl). A linear response for Δ9-THC was obtained in the concentration range 2.4-11.3 ng/mL, with a linear correlation coefficient of 0.999 and a detection limit of 0.34 ng/mL. The voltammetric method was used to measure the Δ9-THC content of samples (hemp and hashish) confiscated by the police. Chemical interferences from the samples were readily eliminated by prior purification using TLC, with methanol/water (4:1, v/v) as the mobile phase. The results showed excellent correlation with those obtained by HPLC.
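The reported linearity and detection limit follow from a standard calibration-curve treatment. The sketch below uses hypothetical current readings and a common 3.3·s/slope LOD convention; the abstract does not state which criterion or raw data the authors used.

```python
# Sketch of a voltammetric calibration-curve treatment: fit peak current vs.
# concentration, then estimate a detection limit as 3.3 * residual_sd / slope
# (a common convention, not necessarily the one used by the authors).
# The current values below are illustrative, not the paper's data.
import numpy as np

conc_ng_ml = np.array([2.4, 4.5, 6.8, 9.0, 11.3])            # Delta9-THC standards
peak_current_nA = np.array([10.2, 19.1, 28.9, 38.6, 48.3])   # hypothetical responses

slope, intercept = np.polyfit(conc_ng_ml, peak_current_nA, 1)
predicted = slope * conc_ng_ml + intercept
residual_sd = np.sqrt(np.sum((peak_current_nA - predicted) ** 2) / (len(conc_ng_ml) - 2))

r = np.corrcoef(conc_ng_ml, peak_current_nA)[0, 1]   # linear correlation coefficient
lod = 3.3 * residual_sd / slope                      # estimated detection limit

print(f"slope = {slope:.3f} nA per ng/mL, r = {r:.4f}, LOD = {lod:.2f} ng/mL")
```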
Abstract:
Large inter-individual variability in drug response and toxicity, as well as in drug concentrations after administration of the same dosage, can be of genetic, physiological, pathophysiological, or environmental origin. The absorption, distribution and metabolism of a drug, and its interactions with its target, are often determined by genetic differences. Pharmacokinetic and pharmacodynamic variations can appear at the level of drug-metabolizing enzymes (e.g., the cytochrome P450 system), drug transporters, drug targets or other biomarker genes. Pharmacogenetics and toxicogenetics can therefore be relevant in forensic toxicology. This review presents the relevant aspects together with some examples from daily routine.
Abstract:
OBJECTIVE: To assess the reliability of computed tomography (CT) numbers, also known as Hounsfield units (HU), in the differentiation and identification of forensically relevant materials, and to provide instructions for improving the reproducibility of HU measurements in daily forensic practice. MATERIALS AND METHODS: We scanned a phantom containing non-organic materials (glass, rocks and metals) on three different CT scanners with standardized parameters. The t-test was used to assess the influence of the scanner, the size and shape of different types of regions of interest (ROI), the composition and shape of the object, and reader performance on HU measurements. The intra-class correlation coefficient (ICC) was used to assess intra- and inter-reader reliability. RESULTS: HU values did not change significantly as a function of ROI shape or size (p>0.05). Intra-reader reliability reached ICC values >0.929 (p<0.001). Inter-reader reliability was also excellent, with an ICC of 0.994 (p<0.001). Four of seven objects yielded significantly different CT numbers at different levels within the object (p<0.05). In 6 of 7 objects, the HU changed significantly from CT scanner to CT scanner (p<0.05). CONCLUSION: Reproducible CT number measurements can be achieved through correct ROI placement and repeated measurements within the object of interest. However, HU may differ from CT scanner to CT scanner. In order to obtain comparable CT numbers, we suggest that a dedicated Forensic Reference Phantom be developed.
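The reliability figures above rest on intra-class correlation. The abstract does not state which ICC model was used; the sketch below computes a one-way random-effects ICC(1,1) from ANOVA mean squares on illustrative HU readings, only to make the calculation concrete.

```python
# Sketch of an intra-class correlation for repeated HU readings.
# One-way random-effects ICC(1,1); the study's exact ICC model is not
# specified in the abstract, and the numbers below are illustrative.
import numpy as np

# rows = phantom objects, columns = repeated HU measurements (e.g. two readers)
hu = np.array([
    [2450.0, 2462.0],   # glass (hypothetical values)
    [1310.0, 1298.0],   # rock
    [7820.0, 7805.0],   # metal
    [  95.0,  101.0],   # holder material
])

n, k = hu.shape
grand_mean = hu.mean()
row_means = hu.mean(axis=1)

ss_between = k * np.sum((row_means - grand_mean) ** 2)   # between-object variation
ss_within = np.sum((hu - row_means[:, None]) ** 2)       # within-object variation
ms_between = ss_between / (n - 1)
ms_within = ss_within / (n * (k - 1))

icc_1_1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1,1) = {icc_1_1:.4f}")
```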
Abstract:
The lynx, which was reintroduced to Switzerland after being exterminated there at the beginning of the 20th century, is protected by Swiss law. However, poaching occurs from time to time, making criminal investigations necessary. In the case presented here, an illegally shot lynx was examined by conventional plain radiography and three-dimensional multislice computed tomography (3D MSCT), of which the latter yielded superior results with respect to documentation and reconstruction of the inflicted gunshot wounds. We believe that 3D MSCT, already described in human forensic-pathological casework, is also a suitable and promising technique for veterinary pathology.
Abstract:
Multislice computed tomography (MSCT) and magnetic resonance imaging (MRI) are increasingly used for forensic purposes. Building on broad experience in clinical neuroimaging, post-mortem MSCT and MRI were performed in 57 forensic cases with the goal of evaluating how suitable these radiological methods are for forensic head and brain examination. An experienced clinical radiologist evaluated the imaging data. The results were compared with the autopsy findings, which served as the gold standard for common forensic neurotrauma findings such as skull fractures, soft tissue lesions of the scalp, various forms of intracranial hemorrhage and signs of increased brain pressure. The sensitivity of the imaging methods ranged from 100% (e.g., heat-induced alterations, intracranial gas) to zero (e.g., mediobasal impression marks as a sign of increased brain pressure, plaques jaunes). The agreement between MRI and CT was 69%. The radiological methods predominantly failed to detect lesions smaller than 3 mm, whereas they were generally satisfactory for the evaluation of intracranial hemorrhage. Owing to its advanced 2D and 3D post-processing capabilities, CT in particular had certain advantages over autopsy with regard to forensic reconstruction. In several cases, MRI showed forensically relevant findings not seen during autopsy. The partly limited sensitivity of imaging observed in this retrospective study had several causes: besides general technical limitations, it became apparent that clinical radiologists require a sound basic forensic background in order to detect specific signs. Focused teaching sessions will be essential to improve the outcome of future examinations. Autopsy protocols, in turn, should be further standardized to allow an exact comparison of imaging and autopsy data. In view of these facts, MRI and CT have the potential to play an important role in future forensic neuropathological examination.
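The two headline figures in this kind of validation study, finding-level sensitivity against autopsy and raw CT/MRI agreement, reduce to simple proportions; the sketch below uses hypothetical counts, not the study's data.

```python
# Sketch of the finding-level evaluation described above: sensitivity of an
# imaging modality against autopsy as gold standard, and raw agreement
# between CT and MRI. All counts are illustrative.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Proportion of autopsy-confirmed findings also detected on imaging."""
    return true_positives / (true_positives + false_negatives)

def raw_agreement(concordant: int, total: int) -> float:
    """Proportion of findings on which CT and MRI agreed."""
    return concordant / total

# Hypothetical example: 18 of 20 skull fractures seen on CT; 69/100 concordant findings
print(f"sensitivity: {sensitivity(18, 2):.2f}")
print(f"CT/MRI agreement: {raw_agreement(69, 100):.2f}")
```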
Abstract:
Stable Isotope Ratio Analysis (SIRA) is the measurement of variation in the different isotopes of the same element in a material. The technique is well established in the natural sciences and has long been part of the methodological arsenal in fields such as geology and biology. More recently, it has begun to be used in the social sciences, moving from initial applications in anthropology to potential uses in geography, public health, forensic science, and other fields. This presentation will discuss the techniques behind SIRA, examples of current applications in the natural and social sciences, and potential avenues of future research.
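For reference, the isotope variation that SIRA measures is conventionally reported in delta notation relative to an international reference standard; this standard formula is general background, not taken from the presentation itself.

```latex
% Delta notation for a stable isotope ratio R (e.g. ^{13}C/^{12}C),
% expressed in per mil relative to an international standard:
\delta_{\text{sample}} = \left( \frac{R_{\text{sample}}}{R_{\text{standard}}} - 1 \right) \times 1000
```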
Abstract:
Forensic radiology is a new subspecialty that has arisen worldwide in the field of forensic medicine. Postmortem computed tomography (PMCT) and, to a lesser extent, PMCT angiography (PMCTA) are established imaging methods that have replaced outdated conventional X-ray imaging in morgues. However, these methods have not yet been standardized for postmortem imaging. This article therefore outlines the main elements of a recommended standard protocol for postmortem cross-sectional imaging, focusing on unenhanced PMCT and PMCTA. The review should facilitate the implementation of a high-quality protocol that enables standardized reporting in morgues, associated hospitals or private practices performing forensic scans, so that these scans provide the same quality in court that clinical scans provide.
Abstract:
Until now, most documentation of forensically relevant medical findings has been limited to traditional 2D photography, 2D conventional radiographs, sketches and verbal description. These classic documentation methods still have limitations in forensic science, especially when 3D documentation is necessary. The goal of this paper is to demonstrate new geometric documentation approaches based on real 3D data. Approaches are presented for the 3D geometric documentation of injuries on the body surface and of internal injuries, in both living and deceased cases. By combining modern imaging methods such as photogrammetry, optical surface scanning and radiological CT/MRI scanning, it was demonstrated that a full, individual 3D documentation of the body surface and internal structures, based on real data, is possible in a non-invasive and non-destructive manner. Using data merging/fusion and animation, it is possible to answer reconstructive questions about the dynamic development of patterned injuries (morphologic imprints) and to evaluate whether they can be matched or linked to suspected injury-causing instruments. For the first time, to our knowledge, optical and radiological 3D scanning was used to document forensically relevant injuries of the human body in combination with vehicle damage. With this complementary documentation approach, individual forensic analysis and animation based on real data became possible, linking body injuries to vehicle deformations or damage. These data allow conclusions to be drawn for automobile accident research, the optimization of vehicle safety (pedestrian and passenger) and the further development of crash dummies. Documentation based on real 3D data opens a new horizon for scientific reconstruction and animation, bringing added value and a real quality improvement to forensic science.
Abstract:
The extraordinary rise of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has brought great benefits to all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity theft and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal and professional worlds with respect to access to information. In both worlds, Cryptography has played a key role by providing the tools needed to ensure the confidentiality, integrity and availability of personal data and information. Biometrics, for its part, has proposed and offered different techniques to authenticate individuals through biometric traits such as fingerprints, iris, hand geometry, voice and gait. Each of these sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, and these solutions would be greatly strengthened if certain characteristics of both sciences were combined towards common objectives. It is therefore imperative to intensify research in this area, combining the mathematical algorithms and primitives of Cryptography with Biometrics in order to meet the growing demand for more secure and easier-to-use solutions that simultaneously improve data protection and user identification. In this combination, the concept of cancelable biometrics has become a cornerstone of user authentication and identification, since it provides revocation and cancellation properties for biometric traits.
The contribution of this thesis concerns the central aspect of Biometrics, i.e. the secure and efficient authentication of users through their biometric templates, addressed from three different approaches. The first is the design of a fuzzy crypto-biometric scheme that implements the principles of cancelable biometrics, exploiting the fuzziness of biometric templates while dealing with intra- and inter-user variability and without compromising the templates extracted from legitimate users. The second is the design of a new Similarity Preserving Hash Function (SPHF); such functions are currently used in the Digital Forensics field to find similarities in the content of different but similar files and to quantify the extent to which those files can be considered equal. The function designed in this research work, besides improving on the results of the main functions developed to date, aims to extend their use to iris template comparison. The third is the development of a new mechanism for comparing iris templates that treats them as signals and compares them using the Walsh-Hadamard transform (complemented by three other algorithms). The results obtained are excellent in view of the security and privacy requirements mentioned above.
Each of the three schemes has been implemented in order to run experiments and test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the process based on the Walsh-Hadamard transform was implemented in Matlab. The experiments used an iris image database (CASIA-IrisV2) to simulate a population of system users. The SPHF is a special case: before being applied to the Biometrics field, it was also tested for its applicability in the Digital Forensics field by comparing similar and dissimilar files and images. For each of the schemes, the efficiency and effectiveness ratios for user authentication, i.e. the False Non-Match Rate and False Match Rate, were calculated with different parameters and cases in order to analyse their behaviour.
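To make the third approach concrete, the sketch below shows the generic idea of treating binary iris-code-like vectors as signals and comparing their Walsh-Hadamard spectra. It is written in Python rather than the thesis's Matlab, does not reproduce the thesis's algorithm or the three complementary algorithms it mentions, and all data are synthetic.

```python
# Generic sketch: compare two binary iris-code-like vectors via the
# Walsh-Hadamard transform. NOT the thesis's implementation; illustrative only.
import numpy as np

def fwht(signal: np.ndarray) -> np.ndarray:
    """Fast Walsh-Hadamard transform; input length must be a power of two."""
    a = signal.astype(float).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            x = a[i:i + h].copy()
            y = a[i + h:i + 2 * h].copy()
            a[i:i + h] = x + y
            a[i + h:i + 2 * h] = x - y
        h *= 2
    return a

def spectral_distance(t1: np.ndarray, t2: np.ndarray) -> float:
    """Normalised Euclidean distance between the Walsh-Hadamard spectra."""
    s1, s2 = fwht(t1), fwht(t2)
    return float(np.linalg.norm(s1 - s2) / (np.linalg.norm(s1) + np.linalg.norm(s2) + 1e-12))

rng = np.random.default_rng(0)
template = rng.integers(0, 2, 256)                      # hypothetical 256-bit iris code
noisy = template.copy()
flips = rng.choice(256, size=10, replace=False)
noisy[flips] ^= 1                                       # same eye, 10 bits of acquisition noise
impostor = rng.integers(0, 2, 256)                      # unrelated eye

print("genuine distance :", round(spectral_distance(template, noisy), 3))
print("impostor distance:", round(spectral_distance(template, impostor), 3))
```

Because the transform is linear, spectra of genuine pairs stay close while impostor pairs diverge, which is what makes a simple spectral distance usable as a comparison score in this kind of scheme.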
Abstract:
Genetic analysis in animals has been used for many applications, such as kinship analysis, determining the sire of an offspring when a female has been exposed to multiple males, determining parentage when an animal switches offspring with another dam, extended lineage reconstruction, estimating inbreeding, identification in breed registries, and speciation. It is now also being used increasingly to characterize animal materials in forensic cases. As such, it is important to operate under a set of minimum guidelines that ensures all service providers have a template to follow for quality practices; none have yet been delineated for animal genetic identity testing. Based on the model for human forensic DNA analyses, a basic discussion of the issues and guidelines is provided for animal testing, covering analytical practices, data evaluation, nomenclature, allele designation, statistics, validation, proficiency testing, lineage markers, casework files, and reporting. These should provide a basis for professional societies and/or working groups to establish more formalized recommendations.
Abstract:
Consideration of regulatory issues covering the exclusionary DNA of forensic workers - the probative effect of eliminating extraneous DNA in a criminal prosecution - the current regulatory scheme leaves the legal position of forensic workers' exclusionary DNA obscure.