151 results for Computer forensic analysis

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of spectroscopic data comes from the fact that they are multivariate - a few thousand variables - and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to reveal structure in the data, identify clusters, and detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence or absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, six different clusters corresponding to the different pigment compositions are observed when plotting the first two PCs, which account for 37% and 20% of the total variance respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and improved efficiency, yielding robust results with time-saving data treatments.
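As a rough illustration of the pretreatment and exploration steps described above, the following Python sketch applies SNV normalisation followed by PCA. The spectra are synthetic stand-ins, not the paper's data, and scikit-learn is assumed to be available:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Synthetic stand-in for the 34 IR spectra (samples x wavenumber variables)
spectra = rng.normal(size=(34, 1200))

def snv(X):
    """Standard Normal Variate: centre and scale each spectrum individually."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

X = snv(spectra)
pca = PCA(n_components=4)
scores = pca.fit_transform(X)          # one point per sample in PC space
explained = pca.explained_variance_ratio_.sum()
print(scores.shape, round(float(explained), 2))
```

Plotting the first columns of `scores` against each other is what reveals clusters such as the binder-type groupings reported for the FTIR data.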

Relevance: 100.00%

Abstract:

The present work describes a fast gas chromatography/negative-ion chemical ionization tandem mass spectrometric assay (fast GC/NICI-MS/MS) for the analysis of tetrahydrocannabinol (THC), 11-hydroxy-tetrahydrocannabinol (THC-OH) and 11-nor-9-carboxy-tetrahydrocannabinol (THC-COOH) in whole blood. The cannabinoids were extracted from 500 microL of whole blood by a simple liquid-liquid extraction (LLE) and then derivatized using trifluoroacetic anhydride (TFAA) and hexafluoro-2-propanol (HFIP) as fluorinated agents. Mass spectrometric detection of the analytes was performed in selected reaction monitoring mode on a triple quadrupole instrument after negative-ion chemical ionization. The assay was found to be linear in the concentration range of 0.5-20 ng/mL for THC and THC-OH, and of 2.5-100 ng/mL for THC-COOH. Repeatability and intermediate precision were found to be less than 12% at all concentrations tested. Under standard chromatographic conditions, the run cycle time would have been 15 min; by using fast separation conditions, the analysis time was reduced to 5 min without compromising chromatographic resolution. Finally, a simple approach for estimating the measurement uncertainty is presented.
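The linearity claim above rests on an ordinary calibration curve. The sketch below, with invented peak-area ratios over the validated THC range, shows how an unknown concentration would be back-calculated from such a curve:

```python
import numpy as np

# Hypothetical calibration points spanning the validated THC range (0.5-20 ng/mL);
# the responses are illustrative peak-area ratios, not the paper's data.
conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0])          # ng/mL
response = np.array([0.05, 0.10, 0.26, 0.51, 1.02, 2.01])  # area ratio

slope, intercept = np.polyfit(conc, response, 1)  # least-squares line

def quantify(area_ratio):
    """Back-calculate a concentration from the linear calibration."""
    return (area_ratio - intercept) / slope

print(round(quantify(0.51), 2))  # concentration in ng/mL
```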

Relevance: 90.00%

Abstract:

Laser desorption ionisation mass spectrometry (LDI-MS) has been shown to be an excellent analytical method for the forensic analysis of inks on a questioned document. The ink can be analysed directly on its substrate (paper), which offers a fast method of analysis: sample preparation is kept to a minimum and, more importantly, damage to the document is minimised. LDI-MS has also previously been reported to provide a high power of discrimination in the statistical comparison of ink samples and has the potential to be introduced as part of routine ink analysis. This paper examines the methodology further and statistically evaluates the reproducibility and the influence of paper on black gel pen ink LDI-MS spectra by comparing spectra of three different black gel pen inks on three different paper substrates. Although generally minimal, the influences of sample homogeneity and paper type were found to be sample dependent. This should be taken into account to avoid the risk of false differentiation of black gel pen ink samples. Other statistical approaches such as principal component analysis (PCA) proved to be a good alternative to correlation coefficients for the comparison of whole mass spectra.
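The comparison of whole mass spectra mentioned above can be carried out with a simple correlation coefficient. A minimal sketch with synthetic spectra (not actual LDI-MS data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-ins for LDI-MS spectra (intensity per m/z bin)
ink_a = rng.random(500)
ink_b = ink_a + rng.normal(scale=0.05, size=500)  # replicate of the same ink, with noise
ink_c = rng.random(500)                            # a different ink

def pearson(x, y):
    """Pearson correlation between two whole mass spectra."""
    return float(np.corrcoef(x, y)[0, 1])

# A replicate of the same ink correlates far more strongly than a different ink
print(pearson(ink_a, ink_b) > pearson(ink_a, ink_c))
```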

Relevance: 90.00%

Abstract:

The ASTM standards on Writing Ink Identification (ASTM 1789-04) and on Writing Ink Comparison (ASTM 1422-05) are the most up-to-date guidelines that have been published on the forensic analysis of ink. These documents aim to cover most aspects of the forensic analysis of ink evidence, from the analysis of ink samples and the comparison of their analytical profiles (with the aim of differentiating them or not) through to the interpretation of the results of these examinations in a forensic context. In recent years there have been significant developments in the technology available to forensic scientists, in the quality assurance requirements placed upon them, and in the understanding of frameworks for interpreting forensic evidence. This article reviews the two standards in the light of these developments and proposes practical improvements in terms of the standardization of the analyses, the comparison of ink samples, and the interpretation of ink examination results. Some of these suggestions have already been included in a DHS-funded project aimed at creating a digital ink library for the United States Secret Service.

Relevance: 80.00%

Abstract:

In order to understand the relationships between executive and structural deficits in the frontal cortex in normal aging and Alzheimer's disease, we studied frontal pathological changes in young and old controls compared to cases with sporadic (AD) or familial Alzheimer's disease (FAD). We performed a semi-automatic, computer-assisted analysis of the distribution of beta-amyloid (Abeta) deposits revealed by Abeta immunostaining, as well as of neurofibrillary tangles (NFT) revealed by Gallyas silver staining, in Brodmann areas 10 (frontal polar), 12 (ventro-infero-median) and 24 (anterior cingulate), using tissue samples from 5 FAD, 6 sporadic AD and 10 control brains. We also performed densitometric measurements of glial fibrillary acidic protein (GFAP), the principal component of astrocytic intermediate filaments, and of phosphorylated neurofilament H and M epitopes in areas 10 and 24. All regions studied appeared almost completely spared in normal old controls, with only the oldest exhibiting a small percentage of beta-amyloid deposits and hardly any NFT. In contrast, all AD and FAD cases were severely damaged, as shown by significantly increased percentages of beta-amyloid deposits and by high numbers of NFT. FAD cases (all from the same family) had significantly more beta-amyloid and GFAP than sporadic AD cases in both areas 10 and 24, and significantly more NFT only in area 24. The correlation between the percentage of beta-amyloid and the number of NFT was significant only for area 24. Altogether, these data suggest that the frontal cortex can be spared by AD-type lesions in normal aging, but is severely damaged in sporadic and still more in familial Alzheimer's disease. The frontal regions appear to be differentially vulnerable: area 12 carries the least amyloid burden, area 24 the fewest NFT, and area 10 both the most amyloid and the most NFT.
This pattern of damage in frontal regions may provide a strong neuroanatomical basis for the deterioration of attention and cognitive capacities, as well as for the emotional and behavioral disturbances, observed in AD patients.
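The semi-automatic measurement of immunostained deposits described above typically amounts to estimating the fraction of a microscopic field occupied by above-threshold pixels. The following is a hypothetical sketch of that idea on a synthetic image, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic stand-in for a grey-level micrograph; stained deposits appear
# as high-intensity pixels against a darker tissue background.
image = rng.normal(loc=0.2, scale=0.05, size=(256, 256))
image[100:130, 100:130] = 0.9  # simulated deposit

def deposit_fraction(img, threshold=0.5):
    """Percentage of the field occupied by above-threshold (stained) pixels."""
    return 100.0 * np.count_nonzero(img > threshold) / img.size

print(round(deposit_fraction(image), 2))  # percentage of the field
```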

Relevance: 80.00%

Abstract:

PURPOSE: To explore whether triaxial accelerometric measurements can be used to accurately assess the speed and incline of running in free-living conditions. METHODS: Body accelerations during running were recorded at the lower back and at the heel by a portable data logger in 20 human subjects, 10 men and 10 women. After parameterizing the body accelerations, two neural networks were designed to recognize each running pattern and calculate speed and incline. Each subject ran 18 times on outdoor roads at various speeds and inclines; 12 runs were used to calibrate the neural networks, whereas the other 6 runs were used to validate the model. RESULTS: A small difference between the estimated and actual values was observed: the root mean square error (RMSE) was 0.12 m x s(-1) for speed and 0.014 radian (rad) (or 1.4% in absolute value) for incline. Multiple regression analysis allowed accurate prediction of speed (RMSE = 0.14 m x s(-1)) but not of incline (RMSE = 0.026 rad, or 2.6% slope). CONCLUSION: Triaxial accelerometric measurements allow an accurate estimation of running speed and terrain incline (the latter with more uncertainty). This will permit the validation of energetic results generated on the treadmill as applied to more physiological, unconstrained running conditions.
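The multiple-regression baseline and the RMSE figure of merit used above can be sketched as follows; the accelerometer features and speeds are synthetic and purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical accelerometer-derived features (e.g., step frequency, signal
# power) and running speeds; all values are invented for illustration.
features = rng.normal(size=(120, 4))
true_speed = features @ np.array([0.8, 0.3, -0.2, 0.1]) + 3.0 \
    + rng.normal(scale=0.1, size=120)

# Multiple linear regression via least squares (design matrix with intercept)
X = np.column_stack([np.ones(len(features)), features])
coef, *_ = np.linalg.lstsq(X, true_speed, rcond=None)
pred = X @ coef

rmse = float(np.sqrt(np.mean((pred - true_speed) ** 2)))  # in m/s, as in the paper
print(round(rmse, 3))
```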

Relevance: 80.00%

Abstract:

Our contribution aims to explore some intersections between forensic science and criminology through the notion of time. The two disciplines analyse the vestiges of illicit activities in order to reconstruct and understand the past, and occasionally to prevent future harms. While forensic science studies material and digital traces as signs of criminal activities and repetitions, criminology contributes to the acquisition of knowledge through its analysis of crime, its authors and victims, as well as social (re)actions to harmful behaviours. As an exploratory effort, our contribution proposes a conceptual delimitation of the notion of time, considering its importance in the study of criminality and harms. Through examples, we propose a "crimino-forensic" analysis of three types of social-control actions - prevention, investigation and intelligence - through their respective temporalities (before, close to or during, and after the criminal activity or harm). The temporal issues of the different methodologies developed to assess the efficiency of these actions are also addressed, to highlight the connections between forensic science and criminology. This attempt to classify the relations between different times and actions of social control is discussed in terms of the multiple benefits, but also the challenges, raised by formally integrating these two sciences.

Relevance: 50.00%

Abstract:

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or coming from the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions were fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics were evaluated, and the performance of the method was assessed using two calibration and validation sets of documents, made up of 101 Italian driving licences and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by comparison of the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
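The Canberra distance-based comparison of image profiles reported as most accurate above can be sketched as follows; the profiles are invented hue histograms, not actual document data:

```python
import numpy as np

def canberra(p, q):
    """Canberra distance between two image profiles (e.g., hue histograms)."""
    num = np.abs(p - q)
    den = np.abs(p) + np.abs(q)
    mask = den > 0          # skip bins that are empty in both profiles
    return float(np.sum(num[mask] / den[mask]))

# Hypothetical hue profiles extracted from scanned documents
doc_a = np.array([0.10, 0.40, 0.30, 0.20])
doc_b = np.array([0.12, 0.38, 0.29, 0.21])  # plausibly same source as doc_a
doc_c = np.array([0.50, 0.10, 0.10, 0.30])  # different source

# Smaller distances suggest a common source
print(canberra(doc_a, doc_b) < canberra(doc_a, doc_c))
```

The Canberra metric weights each bin's difference by the bin magnitudes, which makes it sensitive to proportional changes in small bins, a property often useful for profile comparison.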

Relevance: 40.00%

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and ASTM 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repeated analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
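One plausible way a standard dye ladder can calibrate HPTLC results is to map observed retention factors (Rf) onto a standardised scale by interpolation between the ladder's reference bands. This is a hypothetical sketch of the idea, not the authors' exact procedure; all Rf values are invented:

```python
import numpy as np

# Hypothetical dye ladder: nominal Rf values of the reference dyes, and the
# positions actually measured on a given plate (plate-to-plate drift).
nominal_rf = np.array([0.15, 0.35, 0.55, 0.75, 0.90])
measured_rf = np.array([0.17, 0.36, 0.57, 0.78, 0.92])

def calibrate(rf_observed):
    """Map an observed Rf onto the standard ladder scale by interpolation."""
    return float(np.interp(rf_observed, measured_rf, nominal_rf))

# An ink band measured at Rf 0.57 on this plate maps to 0.55 on the
# standardised scale, making runs from different plates comparable.
print(round(calibrate(0.57), 2))
```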

Relevance: 40.00%

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful tools to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models;
- features implementation of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.

The clear and accessible style of this second edition makes the book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
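At the core of the evaluative framework such books present is Bayes' theorem in odds form, where the likelihood ratio assigned to the scientific findings updates the prior odds on the propositions at issue. A minimal numerical illustration (all numbers invented):

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = LR x prior odds."""
    return likelihood_ratio * prior_odds

prior = 1 / 1000          # illustrative prior odds on the first proposition
lr = 5000                 # illustrative likelihood ratio for the findings
post = posterior_odds(prior, lr)
prob = post / (1 + post)  # convert odds back to a probability

print(round(post, 1), round(prob, 3))
```

A Bayesian network generalises this single update to many interdependent variables, propagating evidence through the whole graph rather than through one likelihood ratio at a time.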

Relevance: 40.00%

Abstract:

There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support effective collaboration between these partners are still poorly expressed, or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process; (b) the analysis of investigative problems along principal perspectives: entities and their relationships, time and space, and quantitative aspects; and (c) visualisation methods as a mode of expressing a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role in supporting decisions. Among them, link charts are scrutinised for their ability to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyst for a collaborative approach integrating forensic data thus calls for better specifications.
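Underneath a link chart sits a simple data structure: entities as nodes and investigative relations as labelled edges. The sketch below uses invented entities and relations purely to show that structure:

```python
from collections import defaultdict

# Toy link-chart structure: entities as nodes, relations as labelled edges.
# All names and relations are invented for illustration.
links = defaultdict(list)

def relate(a, relation, b):
    """Record an undirected, labelled relation between two entities."""
    links[a].append((relation, b))
    links[b].append((relation, a))

relate("suspect_1", "owns", "phone_A")
relate("phone_A", "contacted", "phone_B")
relate("phone_B", "recovered_at", "scene_2")

def neighbours(entity):
    """Entities directly connected to the given node on the chart."""
    return [other for _, other in links[entity]]

print(neighbours("phone_A"))
```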

Relevance: 40.00%

Abstract:

Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, many approaches allow one to derive probability statements relating to a population proportion, but the question of how a forensic decision maker - typically a client of a forensic examination, or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size has remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address concepts such as the (net) value of sample information, the (expected) value of sample information, and the (expected) decision loss. All of these aspects relate directly to questions regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.
The graphical devices invoked here also serve the purpose of supporting a discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
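A standard concrete instance of the Bayesian sampling reasoning discussed above is beta-binomial updating of a seizure proportion: with a uniform Beta(1, 1) prior, inspecting n items and finding k positives yields a Beta(1+k, 1+n-k) posterior. The counts below are illustrative, not taken from the paper:

```python
def posterior_mean(k, n, a=1, b=1):
    """Posterior mean of the proportion under a Beta(a, b) prior,
    after observing k positives in n inspected items."""
    return (a + k) / (a + b + n)

def prob_all_positive_next(k, n, m, a=1, b=1):
    """Posterior predictive probability that the next m items are all positive."""
    p = 1.0
    for i in range(m):
        p *= (a + k + i) / (a + b + n + i)
    return p

# Example: all 10 inspected items of a seizure tested positive
print(round(posterior_mean(10, 10), 3))
print(round(prob_all_positive_next(10, 10, 5), 3))
```

Decision-theoretic extensions then attach losses to wrong conclusions about the proportion and compare the expected loss of stopping now against the expected value of inspecting further items.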