889 results for Digital Forensics, Forensic Computing, Forensic Science
Abstract:
In arson cases, the collection and detection of traces of ignitable liquids on a suspect's hands can provide information to a forensic investigation. Police forces currently lack a simple, robust, efficient and reliable solution to perform this type of swabbing. In this article, we describe a study undertaken to develop a procedure for the collection of ignitable liquid residues from the hands of arson suspects. Sixteen different collection supports were considered, and their applicability for the collection of gasoline traces present on hands and their subsequent analysis in a laboratory was evaluated. Background contamination, consisting of volatiles emanating from the collection supports, and the collection efficiencies of the different sampling materials were assessed by passive headspace extraction with an activated charcoal strip (DFLEX device) followed by gas chromatography-mass spectrometry (GC-MS) analysis. After statistical treatment of the results, non-powdered latex gloves were retained as the most suitable sampling material. On the basis of the results obtained, a prototype sampling kit was designed and tested. The kit consists of a three-compartment multilayer bag enclosed in a sealed metal can and containing three pairs of non-powdered latex gloves: one to be worn by the sampler, one serving as a blank sample, and the last to be worn by the person suspected of having been in contact with ignitable liquids. The kit was designed to prevent external contamination and cross-contamination.
Abstract:
Fingerprint practitioners rely on level 3 features to make decisions in relation to the source of an unknown friction ridge skin impression. This research proposes to assess the strength of evidence associated with pores when shown in (dis)agreement between a mark and a reference print. Based on an algorithm designed to automatically detect pores, a metric is defined for comparing different impressions. From this metric, the weight of the findings is quantified using a likelihood ratio. The results obtained on four configurations and 54 donors show the significant contribution of the pore features and translate into statistical terms what latent fingerprint examiners have developed holistically through experience. The system provides LRs that are indicative of the true state under both the prosecution and the defense propositions. Not only does such a system bring transparency regarding the weight to assign to such features, but it also forces a discussion of the risks that such a model could mislead.
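The likelihood-ratio reasoning described in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' model: it assumes, purely hypothetically, that the pore-similarity metric is Gaussian-distributed under each proposition, and computes the LR as the ratio of the two densities at the observed score.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Normal density, used here to model the score distributions."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(score, mu_same, sd_same, mu_diff, sd_diff):
    """LR = P(score | same source) / P(score | different source)."""
    return gaussian_pdf(score, mu_same, sd_same) / gaussian_pdf(score, mu_diff, sd_diff)

# Hypothetical distribution parameters for a pore-similarity metric:
# same-source comparisons tend to score high, different-source comparisons low.
lr = likelihood_ratio(0.8, mu_same=0.85, sd_same=0.05, mu_diff=0.40, sd_diff=0.10)
print(lr > 1)  # a high score supports the same-source proposition
```

An LR above 1 supports the prosecution proposition and an LR below 1 the defense proposition; the distribution parameters here are invented for illustration only.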
Abstract:
The purpose of this research is to assess the vulnerabilities of a high-resolution fingerprint sensor when confronted with fake fingerprints. The study focused not on the decision outcome of the biometric device, but essentially on the scores obtained from the comparison between a query (genuine or fake) and a template using an AFIS system. To do this, fake fingerprints of 12 subjects were produced with and without their cooperation. These fake fingerprints were used alongside real fingers. The study led to three major observations. First, genuine fingerprints produced higher scores than fake fingers (reflecting a closer proximity), and this tendency is observed for each subject considered separately. Second, scores alone are nevertheless not sufficient to differentiate these samples (fake from genuine), given the variation due to the donors themselves; this explains why fingerprint readers without vitality detection can be fooled. Third, production methods and subjects greatly influence the scores obtained for fake fingerprints.
Abstract:
Allele and haplotype frequencies for 10 Y-chromosome STR loci (DYS19, DYS385 I/II, DYS389I, DYS389II, DYS390, DYS391, DYS392, DYS393, DYS438 and DYS439), included in the Y-Plex6 and Y-Plex5 kits, were determined for a Tunisian population sample of 100 male individuals.
Abstract:
An HPLC method is presented for the identification and quantification, in plasma and urine, of beta-adrenergic receptor antagonists (betaxolol, carteolol, metipranolol, and timolol) commonly prescribed in ophthalmology. An extraction method is described using pindolol as an internal standard. An RSIL 10 micron column was used. The lower detection limits of the beta-blockers were found to be 4-27 ng/ml. The method is simple, rapid and sensitive; moreover, it allows the determination of 8 other beta-blockers.
Abstract:
Recently, modern cross-sectional imaging techniques such as multi-detector computed tomography (MDCT) have pioneered post-mortem investigations, especially in forensic medicine. Such approaches can also be used to investigate bones non-invasively for anthropological purposes. Long bones are often examined in forensic cases because they are frequently discovered and transferred to medico-legal departments for investigation. To estimate their age, the trabecular structure must be examined. This study aimed to compare the performance of MDCT with conventional X-rays in investigating the trabecular structure of long bones. Fifty-two dry bones (24 humeri and 28 femora) from anthropological collections were first examined by conventional X-ray and then by MDCT. Trabecular structure was evaluated by seven observers (two experienced and five inexperienced in anthropology) who analyzed the images obtained by both radiological methods. The analyses comprised the measurement of one quantitative parameter (caput diameter of the humerus and femur) and the staging of the trabecular structure of each bone. The precision of each technique was assessed by describing areas of trabecular destruction and particularities of the bones, such as pathological changes. Concerning the quantitative parameter, the measurements demonstrated comparable results for the MDCT and conventional X-ray techniques. In contrast, the overall inter-observer reliability of the staging was low for both MDCT and conventional X-ray. Reliability increased significantly when only the staging results of the two experienced observers were compared, particularly for the MDCT analysis. Our results also indicate that MDCT appears better suited to a detailed examination of the trabecular structure. In our opinion, MDCT is an adequate tool with which to examine the trabecular structure of long bones. However, adequate methods should be developed, or existing methods adapted, for MDCT.
Abstract:
In forensic science, there is a strong interest in determining the post-mortem interval (PMI) of human skeletal remains up to 50 years after death. Currently, there are no reliable methods to resolve PMI, the determination of which relies almost exclusively on the experience of the investigating expert. Here we measured (90)Sr and (210)Pb ((210)Po) incorporated into bones through a biogenic process as indicators of the time elapsed since death. We hypothesised that the activity of radionuclides incorporated into trabecular bone will more accurately match the activity in the environment and the food chain at the time of death than the activity in cortical bone because of a higher remodelling rate. We found that determining (90)Sr can yield reliable PMI estimates as long as a calibration curve exists for (90)Sr covering the studied area and the last 50 years. We also found that adding the activity of (210)Po, a proxy for naturally occurring (210)Pb incorporated through ingestion, to the (90)Sr dating increases the reliability of the PMI value. Our results also show that trabecular bone is subject to both (90)Sr and (210)Po diagenesis. Accordingly, we used a solubility profile method to determine the biogenic radionuclide only, and we are proposing a new method of bone decontamination to be used prior to (90)Sr and (210)Pb dating.
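The decay arithmetic underlying radionuclide dating can be illustrated with a short sketch. This is not the authors' calibration-curve method: it assumes, for illustration only, that the biogenic activity at the time of death is known, and solves A(t) = A0·exp(-λt) for the elapsed time, using the approximate 28.8-year half-life of 90Sr.

```python
import math

SR90_HALF_LIFE_YEARS = 28.8  # approximate half-life of 90Sr

def elapsed_years(activity_at_death, activity_measured, half_life=SR90_HALF_LIFE_YEARS):
    """Time since death from radioactive decay: A(t) = A0 * exp(-lambda * t),
    so t = ln(A0 / A) / lambda with lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / half_life
    return math.log(activity_at_death / activity_measured) / decay_constant

# After one half-life the activity is halved, so the elapsed time equals the half-life.
print(round(elapsed_years(100.0, 50.0), 1))  # 28.8
```

In practice, as the abstract notes, the initial activity is not known directly; it must be read off a regional calibration curve of environmental 90Sr over the last 50 years, and diagenetic contamination must first be removed.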
Abstract:
Anti-doping authorities have high expectations of the athlete steroidal passport (ASP) for the detection of anabolic-androgenic steroid misuse. However, it is still limited to the monitoring of well-established compounds and might greatly benefit from the discovery of new relevant biomarker candidates. In this context, steroidomics opens the way to the untargeted simultaneous evaluation of a high number of compounds. Analytical platforms combining the performance of ultra-high pressure liquid chromatography (UHPLC) with the high mass-resolving power of quadrupole time-of-flight (QTOF) mass spectrometers are particularly well suited for this purpose. An untargeted steroidomic approach was proposed to analyse urine samples from a clinical trial for the discovery of relevant biomarkers of testosterone undecanoate oral intake. Automatic peak detection was performed, and a filter of reference steroid metabolite mass-to-charge ratio (m/z) values was applied to the raw data to ensure the selection of a subset of steroid-related features. Chemometric tools were applied for the filtering and analysis of UHPLC-QTOF-MS(E) data. Time kinetics could be assessed with N-way projections to latent structures discriminant analysis (N-PLS-DA), and a detection window was confirmed. Orthogonal projections to latent structures discriminant analysis (O-PLS-DA) classification models were evaluated in a second step to assess the predictive power of both known metabolites and unknown compounds. A shared and unique structure plot (SUS-plot) analysis was performed to select the most promising unknown candidates, and receiver operating characteristic (ROC) curves were computed to assess the specificity criteria applied in routine doping control. This approach underlined the pertinence of monitoring both glucuronide and sulphate steroid conjugates and including them in the athlete's passport, while promising biomarkers were also highlighted.
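The ROC evaluation mentioned in this abstract can be illustrated with a small sketch. The function below computes the area under the ROC curve via its rank interpretation (the probability that a randomly chosen positive sample outscores a randomly chosen negative one); the biomarker intensities are hypothetical placeholders, not data from the study.

```python
def roc_auc(positive_scores, negative_scores):
    """Area under the ROC curve, computed as the fraction of
    (positive, negative) pairs in which the positive scores higher
    (ties count as half a win)."""
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positive_scores) * len(negative_scores))

# Hypothetical biomarker intensities: post-administration vs. baseline urine samples.
post = [0.9, 0.8, 0.75, 0.6]
base = [0.5, 0.4, 0.55, 0.3]
print(roc_auc(post, base))  # 1.0 -> perfect separation
```

An AUC of 0.5 corresponds to a non-informative marker; doping-control use additionally requires choosing an operating point on the curve that meets the specificity criteria the abstract mentions.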
Abstract:
This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
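The two ingredients identified in this abstract, the probability of the state of nature and the decision maker's loss function, can be combined in a short sketch. This is an illustrative loss model, not the authors' influence diagram: it assumes, hypothetically, that correct decisions carry zero loss, and it prefers 'individualize' whenever that decision's expected loss is lower.

```python
def should_individualize(p_source, loss_false_individualization, loss_missed_individualization):
    """Minimum-expected-loss decision, assuming zero loss for correct decisions:
    expected loss of 'individualize'     = (1 - p) * loss of a false individualization
    expected loss of 'not individualize' = p * loss of a missed individualization
    """
    expected_loss_yes = (1.0 - p_source) * loss_false_individualization
    expected_loss_no = p_source * loss_missed_individualization
    return expected_loss_yes < expected_loss_no

# With a false individualization judged 99 times worse than a miss,
# the implied decision threshold on p is 99 / (99 + 1) = 0.99.
print(should_individualize(0.995, 99.0, 1.0))  # True
print(should_individualize(0.95, 99.0, 1.0))   # False
```

The loss values here are arbitrary; the point, as in the abstract, is that the decision depends jointly on the posterior probability and on the decision maker's preferences, not on the probability alone.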
Abstract:
The literature dealing with the interpretation of results of examinations performed on "printed" documents is very limited. The absence of published literature reflects the absence of formal guidelines to help scientists assess the relationship between a questioned document and a particular printing technology. Generally, every printout, independent of the printing technology, may bear traces induced by characteristics of manufacture and/or acquired features of the printing device. A logical approach to help the scientist in the formal interpretation of such findings involves the consideration of a likelihood ratio. Three examples aim to show the application of this approach.
Abstract:
The aim of this paper is to evaluate the risks associated with the use of fake fingerprints on a livescan equipped with a liveness detection method. The method is based on the optical properties of the skin: the sensor uses several polarizations and illuminations to capture information from the different layers of the human skin. The experiments also determine under which conditions the system is deceived, and whether the nature of the fake, the mould used for its production, or the individuals involved in the attack have an influence. These experiments showed that current multispectral sensors can be deceived by fake fingerprints created with or without the cooperation of the subject. Fakes created from direct casts perform better than those created from indirect casts. The results showed that the success of the attack is influenced by two main factors. The first is the quality of the fakes, and by extension the quality of the original fingerprint. The second is the combination of the general patterns involved in the attack, since an appropriate combination can strongly increase the rate of successful attacks.