887 results for Computer forensic analysis
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
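The abstract's claim can be illustrated numerically. The following sketch is not the paper's model, only a minimal illustration under assumed values: a uniform prior over N potential sources, a random match probability gamma, and a database of size n whose other n-1 members are excluded.

```python
def posterior_source(N, n, gamma):
    """Posterior probability that the matching suspect is the source,
    after a database search in which the other n-1 members are excluded.

    Illustrative model (not from the paper): uniform prior over N
    potential sources; gamma is the random match probability. The n-1
    excluded members drop out as competitors, so the only remaining
    alternatives are the N-n individuals outside the database.
    """
    return 1.0 / (1.0 + (N - n) * gamma)


def posterior_no_search(N, gamma):
    """Same model, but the suspect was found without a database search:
    all other N-1 individuals remain potential sources."""
    return 1.0 / (1.0 + (N - 1) * gamma)
```

With, say, N = 1,000,000, n = 10,000 and gamma = 1e-6, the searched-database posterior (about 0.503) is slightly higher than the no-search posterior (about 0.500): the exclusions strengthen rather than weaken the case against the suspect, consistent with the conclusion that no correction factor of the size of the database is needed.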
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
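The papers' actual comparison algorithms are described in Parts I and II; as a purely hypothetical illustration of what an automated, objective database search might look like, one could represent each ink as a vector of relative dye intensities and rank library inks by a correlation score:

```python
import math


def pearson(u, v):
    """Pearson correlation between two equally long ink profiles
    (e.g. vectors of relative dye intensities from HPTLC).
    Illustrative metric only; not the algorithm of the cited papers."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)


def rank_matches(questioned, library):
    """Rank reference inks by similarity to a questioned profile."""
    scores = {name: pearson(questioned, prof) for name, prof in library.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

Such a score gives every comparison the same, reproducible footing, which is the property the abstract argues is needed once the number of comparisons becomes large.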
Abstract:
The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects falling weight deflectometer (FWD) data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system for rapid processing of the FWD data, along with a user manual. The software system automatically reads the raw data collected by the JILS-20 FWD machine that the Iowa DOT owns, then processes and analyzes the collected data with the rapid prediction algorithms developed during the phase I study. The system smoothly integrates the FWD data analysis algorithms with the computer program used to collect the pavement deflection data. It can be used to assess pavement condition, estimate remaining pavement life, and eventually help the Iowa DOT pavement management team assess pavement rehabilitation strategies. This report describes the developed software in detail and can also serve as a user manual for conducting simulation studies and detailed analyses.
Abstract:
In the vast majority of bottom-up proteomics studies, protein digestion is performed using only mammalian trypsin. Although it is clearly the best enzyme available, the sole use of trypsin rarely leads to complete sequence coverage, even for abundant proteins. It is commonly assumed that this is because many tryptic peptides are either too short or too long to be identified by RPLC-MS/MS. We show through in silico analysis that 20-30% of the total sequence of three proteomes (Schizosaccharomyces pombe, Saccharomyces cerevisiae, and Homo sapiens) is expected to be covered by Large post-Trypsin Peptides (LpTPs) with M(r) above 3000 Da. We then established size exclusion chromatography to fractionate complex yeast tryptic digests into pools of peptides based on size. We found that secondary digestion of LpTPs followed by LC-MS/MS analysis leads to a significant increase in identified proteins and a 32-50% relative increase in average sequence coverage compared to trypsin digestion alone. Application of the developed strategy to analyze the phosphoproteomes of S. pombe and of a human cell line identified a significant fraction of novel phosphosites. Overall our data indicate that specific targeting of LpTPs can complement standard bottom-up workflows to reveal a largely neglected portion of the proteome.
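The in silico step described above can be sketched in a few lines. The cleavage rule and the residue masses below are standard textbook values, not taken from the paper, and are rounded for illustration:

```python
# Approximate average residue masses in Da (rounded; illustrative values).
RESIDUE_MASS = {
    "G": 57.05, "A": 71.08, "S": 87.08, "P": 97.12, "V": 99.13,
    "T": 101.10, "C": 103.14, "L": 113.16, "I": 113.16, "N": 114.10,
    "D": 115.09, "Q": 128.13, "K": 128.17, "E": 129.12, "M": 131.19,
    "H": 137.14, "F": 147.18, "R": 156.19, "Y": 163.18, "W": 186.21,
}
WATER = 18.02  # mass of the terminal H and OH added on hydrolysis


def tryptic_peptides(seq):
    """Cleave after K or R unless followed by P (the usual trypsin rule)."""
    peptides, start = [], 0
    for i, aa in enumerate(seq):
        if aa in "KR" and (i + 1 == len(seq) or seq[i + 1] != "P"):
            peptides.append(seq[start:i + 1])
            start = i + 1
    if start < len(seq):
        peptides.append(seq[start:])
    return peptides


def peptide_mass(pep):
    """Average mass of a peptide from its residue composition."""
    return sum(RESIDUE_MASS[aa] for aa in pep) + WATER


def lptp_coverage(seq, cutoff=3000.0):
    """Fraction of a protein sequence falling in peptides with Mr >= cutoff,
    i.e. the share of the sequence carried by Large post-Trypsin Peptides."""
    large = sum(len(p) for p in tryptic_peptides(seq)
                if peptide_mass(p) >= cutoff)
    return large / len(seq)
```

Applying `lptp_coverage` across a whole proteome is the kind of computation behind the 20-30% figure quoted in the abstract.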
Abstract:
This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used in a second part to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
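A question of the kind 'Is the quantity over the threshold?' has a direct Bayesian answer: the posterior probability that the true concentration exceeds the legal limit. The sketch below is not the paper's model; it assumes a normal likelihood with known measurement standard deviation and a vague normal prior, with all numbers illustrative:

```python
import math


def prob_over_threshold(measurements, sigma, threshold=1.5,
                        prior_mean=0.0, prior_sd=100.0):
    """Posterior P(true concentration > threshold) under a normal
    likelihood with known measurement SD `sigma` and a vague normal
    prior (conjugate update). Illustrative model, not the paper's."""
    n = len(measurements)
    xbar = sum(measurements) / n
    prec_prior = 1.0 / prior_sd ** 2          # prior precision
    prec_data = n / sigma ** 2                # data precision
    post_var = 1.0 / (prec_prior + prec_data)
    post_mean = post_var * (prec_prior * prior_mean + prec_data * xbar)
    z = (threshold - post_mean) / math.sqrt(post_var)
    # P(theta > threshold) = 1 - Phi(z), via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2.0))
```

For three replicate measurements around 1.8 μg/l with an assumed SD of 0.1, this posterior probability is essentially 1; a single noisy measurement of 1.0 μg/l gives a probability well below one half, so the decision would differ.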
Abstract:
BACKGROUND AND METHODS: The objectives of this article were to systematically describe and examine the novel roles and responsibilities assumed by nurses in a forensic consultation for victims of violence at a University Hospital in French-speaking Switzerland. Utilizing a case study methodology, information was collected from two main sources: (a) discussion groups with nurses and forensic pathologists and (b) a review of procedures and protocols. Following a critical content analysis, the roles and responsibilities of the forensic nurses were described and compared with the seven core competencies of advanced nursing practice as outlined by Hamric, Spross, and Hanson (2009). RESULTS: Advanced nursing practice competencies noted in the analysis included "direct clinical practice," "coaching and guidance," and "collaboration." The role of the nurse in terms of "consultation," "leadership," "ethics," and "research" was less evident in the analysis. DISCUSSION AND CONCLUSION: New forms of nursing are indeed practiced in the forensic clinical setting, and our findings suggest that nursing practice in this domain is following in the footsteps of an advanced nursing practice model. Further reflection is required to determine whether the role of the forensic nurse in Switzerland should be developed as that of a clinical nurse specialist or that of a nurse practitioner.
Abstract:
The aim of this work is to present some practical, postmortem biochemistry applications to illustrate the usefulness of this discipline and reassert the importance of carrying out biochemical investigations as an integral part of the autopsy process. Five case reports are presented, pertaining to: diabetic ketoacidosis in an adult who was not known to suffer from diabetes, in the presence of multiple psychotropic substances; fatal flecainide intoxication in a poor metabolizer who also presented impaired renal function; diabetic ketoacidosis showing severe postmortem changes; primary aldosteronism presenting with intracranial hemorrhage; and hypothermia showing severe postmortem changes. The cases herein presented can be considered representative examples of the importance of postmortem biochemistry investigations, which may provide significant information useful in determining the cause of death in routine forensic casework or contribute to understanding the pathophysiological mechanisms involved in the death process.
Abstract:
Forensic science is generally defined as the application of science to address questions related to the law. Too often, this view restricts the contribution of science to a single process which ultimately aims at bringing individuals to court while minimising the risk of miscarriage of justice. In order to go beyond this paradigm, we propose to refocus attention on traces themselves, as remnants of criminal activity, and on their information content. We postulate that traces contribute effectively to a wide variety of other informational processes that support decision making in many situations. In particular, they inform actors in new policing strategies that place the treatment of information and intelligence at the centre of their systems. This contribution of forensic science to these security-oriented models is still not well identified and captured. In order to create the best conditions for the development of forensic intelligence, we suggest a framework that connects forensic science to intelligence-led policing (part I). Crime scene attendance and processing can be envisaged within this view. This approach gives indications about how to structure the knowledge used by crime scene examiners in their effective practice (part II).
Abstract:
The reliable and objective assessment of chronic disease state has been and still is a very significant challenge in clinical medicine. An essential feature of human behavior related to the health status, the functional capacity, and the quality of life is the physical activity during daily life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors both extrinsic and intrinsic to the body, quantitative parameters only provide a partial assessment and do not allow for a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for the analysis of human activity pattern based on the definition of different physical activity time series with the appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated from healthy and chronic pain states.
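The symbolic-dynamics part of such an analysis can be sketched simply. The discretisation thresholds, word length, and entropy statistic below are generic choices for illustration, not the paper's specific metrics:

```python
import math
from collections import Counter


def symbolize(series, thresholds):
    """Map a numeric activity series to discrete symbols by threshold
    bins (e.g. lying / sitting-standing / walking intensity levels)."""
    return [sum(x >= t for t in thresholds) for x in series]


def word_entropy(symbols, word_len=3):
    """Shannon entropy (bits) of the distribution of overlapping words
    of length `word_len`: one simple symbolic-dynamics statistic.
    Low entropy = rigid, repetitive activity patterns; higher entropy =
    richer alternation of postures and transitions."""
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

A constant posture sequence yields entropy 0, while a series that alternates between two states already carries 1 bit per word; statistics of this kind are what allow normal and abnormal activity patterns to be separated where raw movement counts cannot.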
Abstract:
The overall system is designed to permit automatic collection of delamination field data for bridge decks. In addition to measuring and recording the data in the field, the system provides for transferring the recorded data to a personal computer for processing and plotting. This permits rapid turnaround from data collection to a finished plot of the results in a fraction of the time previously required for manual analysis of the analog data captured on a strip chart recorder. In normal operation the Delamtect provides an analog voltage for each of two channels which is proportional to the extent of any delamination. These voltages are recorded on a strip chart for later visual analysis. An event marker voltage, produced by a momentary push button on the handle, is also provided by the Delamtect and recorded on a third channel of the analog recorder.
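The report's actual processing algorithms are not detailed in the abstract; as a minimal sketch of the digital counterpart of reading the strip chart visually, one might threshold each channel's recorded voltage to flag contiguous delaminated stretches (the threshold and sample data here are assumptions for illustration):

```python
def delaminated_segments(voltages, threshold, min_len=1):
    """Flag contiguous runs of samples whose channel voltage exceeds a
    delamination threshold. Returns (start, end) index pairs, end
    exclusive. Illustrative processing step, not the system's algorithm."""
    segments, start = [], None
    for i, v in enumerate(voltages):
        if v > threshold:
            if start is None:
                start = i          # run of suspect readings begins
        elif start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(voltages) - start >= min_len:
        segments.append((start, len(voltages)))
    return segments
```

Run on each of the two Delamtect channels, with event-marker positions used to tie segment indices back to deck locations, this is the kind of step that turns the recorded voltages into a finished plot of delaminated areas.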
Abstract:
As a thorough combination of probability theory and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches, with the help of Bayesian networks, for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) for specifying graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
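The simplest context-independent fragment of the kind described is a single hypothesis node with one evidence node. Its inference step can be written out by exact enumeration; the states and numbers below are illustrative, not from the paper:

```python
def posterior(prior_h, likelihoods, evidence):
    """Exact inference in a two-node fragment H -> E by enumeration.

    prior_h:      {state: P(H=state)}
    likelihoods:  {state: {e: P(E=e | H=state)}}
    evidence:     the observed state of E

    Returns the posterior distribution over H. Larger networks chain
    fragments like this one, with message passing doing the bookkeeping.
    """
    joint = {h: prior_h[h] * likelihoods[h][evidence] for h in prior_h}
    z = sum(joint.values())          # normalising constant P(E=evidence)
    return {h: v / z for h, v in joint.items()}
```

With even priors on prosecution and defence hypotheses and a match probability of 0.01 under the defence hypothesis, observing a match lifts the prosecution hypothesis to about 0.99, i.e. the posterior odds equal the likelihood ratio of 100.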
Abstract:
Research has shown that one of the major contributing factors in early joint deterioration of portland cement concrete (PCC) pavement is the quality of the coarse aggregate. Conventional physical and freeze/thaw tests are slow and not satisfactory in evaluating aggregate quality. In the last ten years the Iowa DOT has been evaluating X-ray analysis and other new technologies to predict aggregate durability in PCC pavement. The objective of this research is to evaluate thermogravimetric analysis (TGA) of carbonate aggregate. The TGA testing has been conducted with a TA 2950 Thermogravimetric Analyzer. The equipment is controlled by an IBM-compatible computer. A "TA Hi-RES" (trademark) software package allows for rapid testing while retaining high resolution. The carbon dioxide is driven off the dolomite fraction between 705 deg C and 745 deg C and off the calcite fraction between 905 deg C and 940 deg C. The graphical plot of the temperature and weight loss using the same sample size and test procedure demonstrates that the test is very accurate and repeatable. A substantial number of both dolomites and limestones (calcites) have been subjected to TGA testing. The slopes of the weight loss plot prior to the dolomite and calcite transitions do correlate with field performance. The noncarbonate fraction, which correlates to the acid insolubles, can be determined by TGA for most calcites and some dolomites. TGA has provided information that can be used to help predict the quality of carbonate aggregate.
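The two weight-loss steps can be converted to mineral fractions by CO2 stoichiometry. The sketch below is an illustration, not the report's procedure, and rests on a simplifying assumption: the lower-temperature step releases one CO2 per dolomite formula unit, and the higher-temperature step releases the CO2 from calcite plus dolomite's second CO2.

```python
M_CO2, M_CAL, M_DOL = 44.01, 100.09, 184.40  # molar masses, g/mol


def carbonate_fractions(sample_mass, loss_step1, loss_step2):
    """Estimate dolomite, calcite, and non-carbonate mass fractions from
    the two TGA weight-loss steps (same mass units throughout).

    Assumption (illustrative): step 1 (~705-745 deg C) is one CO2 per
    dolomite CaMg(CO3)2; step 2 (~905-940 deg C) is CO2 from calcite
    CaCO3 plus the second CO2 of dolomite.
    """
    mol_dol = loss_step1 / M_CO2
    # subtract dolomite's second CO2 before attributing step 2 to calcite
    mol_cal = (loss_step2 - mol_dol * M_CO2) / M_CO2
    dol = mol_dol * M_DOL / sample_mass
    cal = mol_cal * M_CAL / sample_mass
    return dol, cal, max(0.0, 1.0 - dol - cal)
```

The non-carbonate remainder is the quantity the abstract notes correlates with the acid insolubles.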
Abstract:
FGFR1 mutations have been identified in both Kallmann syndrome and normosmic HH (nIHH). To date, few mutations in the FGFR1 gene have been structurally or functionally characterized in vitro to identify molecular mechanisms that contribute to the disease pathogenesis. We attempted to define the in vitro functionality of two FGFR1 mutants (R254W and R254Q), resulting from two different amino acid substitutions of the same residue, and to correlate the in vitro findings to the patient phenotypes. Two unrelated GnRH deficient probands were found to harbor mutations in FGFR1 (R254W and R254Q). Mutant signaling activity and expression levels were evaluated in vitro and compared to a wild type (WT) receptor. Signaling activity was determined by a FGF2/FGFR1 dependent transcription reporter assay. Receptor total expression levels were assessed by Western blot and cell surface expression was measured by a radiolabeled antibody binding assay. The R254W maximal receptor signaling capacity was reduced by 45% (p<0.01) while R254Q activity was not different from WT. However, both mutants displayed diminished total protein expression levels (40 and 30% reduction relative to WT, respectively), while protein maturation was unaffected. Accordingly, cell surface expression levels of the mutant receptors were also significantly reduced (35% p<0.01 and 15% p<0.05, respectively). The p.R254W and p.R254Q are both loss-of-function mutations as demonstrated by their reduced overall and cell surface expression levels suggesting a deleterious effect on receptor folding and stability. It appears that a tryptophan substitution at R254 is more disruptive to receptor structure than the more conserved glutamine substitution. No clear correlation between the severity of in vitro loss-of-function and phenotypic presentation could be assigned.