887 results for Computer forensic analysis
Abstract:
In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) develop a standard methodology for analysing ink samples by high-performance thin layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) compare ink samples automatically and objectively; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometric studies. The results show that different algorithms are best suited to different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
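The abstract does not spell out the comparison algorithms, but the general idea of scoring ink pairs and evaluating the scores with biometric-style performance tests can be sketched. Below is a minimal, purely illustrative Python sketch assuming HPTLC data reduced to densitometric intensity profiles: a Pearson-correlation similarity score and an approximate equal-error-rate computation of the kind used in biometric evaluations. None of these choices are taken from the report itself.

```python
import numpy as np

def profile_similarity(profile_a, profile_b):
    """Pearson correlation between two HPTLC densitometric traces
    (one plausible comparison metric; the report's own algorithms may differ)."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

def equal_error_rate(genuine_scores, impostor_scores):
    """Approximate biometric-style EER: the error level where the false accept
    rate and the false reject rate are closest to being equal."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best = 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)   # different inks scored as "same"
        frr = np.mean(genuine_scores < t)     # same ink scored as "different"
        best = min(best, max(far, frr))
    return best
```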
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% and a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
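As an illustration only, the following sketch shows how ICA-derived features could feed a simple classifier using scikit-learn; the paper's actual features, classifier and eye-blink compensation are not specified here, so every choice below is an assumption.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression

def ica_features(epochs, n_components=8):
    """epochs: array of shape (n_epochs, n_channels, n_samples).
    Decomposes each epoch with FastICA and summarises the components with
    order-invariant descriptors (sorted variances and peak amplitudes)."""
    feats = []
    for epoch in epochs:
        ica = FastICA(n_components=n_components, random_state=0)
        sources = ica.fit_transform(epoch.T)            # (n_samples, n_components)
        var = np.sort(sources.var(axis=0))[::-1]        # sorted so feature order
        amp = np.sort(np.abs(sources).max(axis=0))[::-1]  # does not depend on ICA
        feats.append(np.concatenate([var, amp]))
    return np.array(feats)

def train_detector(epochs, labels):
    """labels: 1 for epileptiform epochs, 0 otherwise (illustrative classifier)."""
    X = ica_features(epochs)
    return LogisticRegression(max_iter=1000).fit(X, labels)
```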
Abstract:
Black-box optimization problems (BBOPs) are defined as those optimization problems in which the objective function does not have an algebraic expression but is instead the output of a system (usually a computer program). This paper is focused on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions used to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal output of the reinsurance. Traditional optimization approaches cannot be applied to BBOPs, so new computational paradigms must be used to solve these problems. In this paper we show the performance of two evolutionary-based techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, where the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost required by inspection or enumeration methods.
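Particle Swarm Optimization needs only evaluations of the objective, which is why it suits black-box reinsurance models. A minimal, generic PSO sketch follows; the swarm size, inertia and acceleration constants are illustrative defaults, not the configuration used in the paper, and in practice the objective would wrap a call to the reinsurance simulator.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise a black-box objective with a basic particle swarm.
    bounds: list of (low, high) pairs, one per decision variable."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))     # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Usage (illustrative): best_x, best_val = pso(reinsurance_cost, [(0.0, 1.0)] * 5)
```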
Abstract:
Biochemical systems are commonly modelled by systems of ordinary differential equations (ODEs). A particular class of such models called S-systems have recently gained popularity in biochemical system modelling. The parameters of an S-system are usually estimated from time-course profiles. However, finding these estimates is a difficult computational problem. Moreover, although several methods have been recently proposed to solve this problem for ideal profiles, relatively little progress has been reported for noisy profiles. We describe a special feature of a Newton-flow optimisation problem associated with S-system parameter estimation. This enables us to significantly reduce the search space, and also lends itself to parameter estimation for noisy data. We illustrate the applicability of our method by applying it to noisy time-course data synthetically produced from previously published 4- and 30-dimensional S-systems. In addition, we propose an extension of our method that allows the detection of network topologies for small S-systems. We introduce a new method for estimating S-system parameters from time-course profiles. We show that the performance of this method compares favorably with competing methods for ideal profiles, and that it also allows the determination of parameters for noisy profiles.
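An S-system models each state variable as a difference of two power-law terms, dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij. The sketch below is a naive illustration that simulates such a system and fits all parameters by ordinary least squares with SciPy; it is not the Newton-flow method described in the abstract and ignores the search-space reduction that is the paper's contribution.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def s_system_rhs(x, t, alpha, beta, g, h):
    """dx_i/dt = alpha_i * prod_j x_j**g_ij - beta_i * prod_j x_j**h_ij."""
    x = np.maximum(x, 1e-9)                 # keep the power laws defined
    prod_g = np.prod(x ** g, axis=1)
    prod_h = np.prod(x ** h, axis=1)
    return alpha * prod_g - beta * prod_h

def fit_s_system(times, data, x0, n):
    """Crude least-squares fit of all S-system parameters to one time course.
    data: array of shape (len(times), n) of observed concentrations."""
    def unpack(p):
        alpha, beta = p[:n], p[n:2 * n]
        g = p[2 * n:2 * n + n * n].reshape(n, n)
        h = p[2 * n + n * n:].reshape(n, n)
        return alpha, beta, g, h

    def residuals(p):
        sim = odeint(s_system_rhs, x0, times, args=unpack(p))
        return (sim - data).ravel()

    p0 = np.concatenate([np.ones(2 * n), np.zeros(2 * n * n)])  # naive start
    fit = least_squares(residuals, p0)
    return unpack(fit.x)
```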
Abstract:
Because of the various matrices available for forensic investigations, the development of versatile analytical approaches allowing the simultaneous determination of drugs is challenging. The aim of this work was to assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) platform allowing the rapid quantification of colchicine in body fluids and tissues collected in the context of a fatal overdose. For this purpose, filter paper was used as a sampling support and was associated with an automated 96-well plate extraction performed by the LC autosampler itself. The developed method features a 7-min total run time including automated filter paper extraction (2 min) and chromatographic separation (5 min). The sample preparation was reduced to a minimum regardless of the matrix analyzed. This platform was fully validated for dried blood spots (DBS) in the toxic concentration range of colchicine. The DBS calibration curve was applied successfully to quantification in all other matrices (body fluids and tissues) except for bile, where an excessive matrix effect was found. The distribution of colchicine for a fatal overdose case was reported as follows: peripheral blood, 29 ng/ml; urine, 94 ng/ml; vitreous humour and cerebrospinal fluid, < 5 ng/ml; pericardial fluid, 14 ng/ml; brain, < 5 pg/mg; heart, 121 pg/mg; kidney, 245 pg/mg; and liver, 143 pg/mg. Although filter paper is usually employed for DBS, we report here the extension of this alternative sampling support to the analysis of other body fluids and tissues. The developed platform represents a rapid and versatile approach for drug determination in multiple forensic media.
Abstract:
A simple wipe sampling procedure was developed for determining surface contamination by ten cytotoxic drugs: cytarabine, gemcitabine, methotrexate, etoposide phosphate, cyclophosphamide, ifosfamide, irinotecan, doxorubicin, epirubicin and vincristine. Wiping was performed using Whatman filter paper on different surfaces such as stainless steel, polypropylene, polystyrene, glass, latex gloves, a computer mouse and coated paperboard. Wiping and desorption procedures were investigated: the same solution, containing 20% acetonitrile and 0.1% formic acid in water, gave the best results for both. After ultrasonic desorption and centrifugation, samples were analysed by a validated liquid chromatography-tandem mass spectrometry (LC-MS/MS) method in selected reaction monitoring mode. The whole analytical strategy, from wipe sampling to LC-MS/MS analysis, was evaluated to determine its quantitative performance. The lowest limit of quantification, 10 ng per wiping sample (i.e. 0.1 ng cm(-2)), was determined for the ten investigated cytotoxic drugs. The relative standard deviation for intermediate precision was always below 20%. As recovery depended on the tested surface for each drug, a correction factor was determined and applied to real samples. The method was then successfully applied at the cytotoxic production unit of the Geneva University Hospitals pharmacy.
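For orientation, the stated per-wipe limit of quantification (10 ng, i.e. 0.1 ng cm(-2)) implies a wiped area of about 100 cm2. The tiny sketch below shows how a surface-specific recovery correction factor might be applied when converting a measured amount to a surface load; the recovery values are hypothetical, not those determined in the study.

```python
# Illustrative only: the 100 cm2 wiped area follows from the stated LOQ
# (10 ng per wipe = 0.1 ng/cm2); the recovery values below are hypothetical.
WIPED_AREA_CM2 = 100.0
recovery = {"stainless_steel": 0.85, "glass": 0.90, "paperboard": 0.60}

def surface_concentration(measured_ng, surface):
    """Recovery-corrected surface load in ng/cm2 for one wipe sample."""
    corrected_ng = measured_ng / recovery[surface]
    return corrected_ng / WIPED_AREA_CM2

print(surface_concentration(10.0, "glass"))   # ~0.11 ng/cm2 at the LOQ
```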
Abstract:
Fatty acids are the basis of the so-called stearates that are frequently used as lubricants in the production of ecstasy tablets. Being a product added at the initial tablet production step, its composition does not change once compression has been performed. The analysis of fatty acids can therefore provide useful information for drug intelligence purposes. In this context, an appropriate analytical method was developed to improve on the results already obtained by routine analyses. Considering the small quantity of such fatty acids in ecstasy tablets (approximately 3%), the research focused on their extraction and concentration. Two different procedures were tested: (1) liquid/liquid extraction using dichloromethane followed by derivatisation and (2) in situ transesterification using boron trifluoride. Analyses were performed by GC-MS. The two procedures were optimised and applied to eight ecstasy seizures in order to choose one of them for application to a large ecstasy sample set. They were compared in terms of the number of peaks detected, the amount of sample needed, reproducibility and other technical aspects.
Abstract:
Deciding whether two fingerprint marks originate from the same source requires examination and comparison of their features. Many cognitive factors play a major role in such information processing. In this paper we examined the consistency (both between and within experts) in the analysis of latent marks, and whether the presence of a 'target' comparison print affects this analysis. Our findings showed that the context of a comparison print affected the analysis of the latent mark, possibly influencing the allocation of attention, visual search, and the threshold for determining a 'signal'. We also found that, even without the context of the comparison print, there was still a lack of consistency in analysing latent marks. Not only was this reflected in inconsistency between different experts; the same experts were also inconsistent with their own analyses at different times. However, the characterization of these inconsistencies depends on the standard used and the definition of what constitutes an inconsistency. Furthermore, these effects were not uniform; the lack of consistency varied across fingerprints and experts. We propose solutions to mediate variability in the analysis of friction ridge skin.
Abstract:
Tribulus terrestris is a nutritional supplement whose physiological and actual effects on the organism are highly debated. The main claimed effect is an increase in the anabolic and androgenic action of testosterone through the activation of endogenous testosterone production. Even though this biological pathway is not entirely proven, T. terrestris is regularly used by athletes. Recently, the analysis of two female urine samples by GC/C/IRMS (gas chromatography/combustion/isotope ratio mass spectrometry) conclusively revealed the administration of exogenous testosterone or its precursors, even though the testosterone glucuronide/epitestosterone glucuronide (T/E) ratio and steroid marker concentrations were below the cut-off values defined by the World Anti-Doping Agency (WADA). To argue against this adverse analytical finding, the athletes admitted having used T. terrestris in their diet. In order to test this hypothesis, two female volunteers ingested 500 mg of T. terrestris three times a day for two consecutive days. All spot urines were collected for 48 h after the first intake. The 13C/12C ratio of ketosteroids was determined by GC/C/IRMS, the T/E ratio and DHEA concentrations were measured by GC/MS, and LH concentrations were measured by radioimmunoassay. None of these parameters showed a significant variation or rose above the WADA cut-off limits. Hence, the short-term treatment with T. terrestris had no impact on the endogenous testosterone metabolism of the two subjects.
Abstract:
Urine samples from 20 male volunteers of European Caucasian origin were stored at 4 degrees C over a 4-month period in order to compare the identification potential of nuclear DNA (nDNA) and mitochondrial DNA (mtDNA) markers. The amount of nDNA recovered from the urines declined dramatically over time. Consequently, nDNA likelihood ratios (LRs) greater than 1,000 were obtained for 100, 70 and 55% of the urines analysed after 6, 60 and 120 days, respectively. For mtDNA, HVI and HVII sequences were obtained for all samples tested, whatever the storage period considered. Nevertheless, the highest mtDNA LR of 435 was relatively low compared with its nDNA equivalent. Indeed, LRs obtained with only three nDNA loci can easily exceed this value and are much easier to obtain. Overall, the joint use of nDNA and mtDNA markers enabled all 20 urine samples to be identified, even after the 4-month period.
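Why three nDNA loci can outweigh the best mtDNA result follows from how likelihood ratios combine: assuming independent loci, per-locus LRs multiply. The worked example below uses illustrative per-locus values, not figures from the study.

```latex
% Assuming independence across loci, per-locus likelihood ratios multiply:
\[
  \mathrm{LR}_{\mathrm{total}} \;=\; \prod_{i=1}^{k} \mathrm{LR}_i ,
  \qquad\text{e.g.}\quad 10 \times 10 \times 10 \;=\; 1000 \;>\; 435 .
\]
```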
Abstract:
RATIONALE: The choice of containers for the storage of aqueous samples between their collection, transport and water hydrogen (2H) and oxygen (18O) stable isotope analysis is a topic of concern for a wide range of fields in the environmental, geological, biomedical, food, and forensic sciences. The transport and separation of water molecules during water vapor or liquid uptake by sorption or solution, and the diffusive transport of water molecules through organic polymer material by permeation or pervaporation, may entail an isotopic fractionation. An experiment was conducted to evaluate the extent of such fractionation.
METHODS: Sixteen bottle-like containers of eleven different organic polymers, including low- and high-density polyethylene (LDPE and HDPE), polypropylene (PP), polycarbonate (PC), polyethylene terephthalate (PET), and perfluoroalkoxy-Teflon (PFA), of different wall thickness and size were completely filled with the same mineral water and stored for 659 days under the same conditions of temperature and humidity. Particular care was exercised to keep the bottles tightly closed and to prevent loss of water vapor through the seals.
RESULTS: Changes of up to +5 parts per thousand for δ2H values and +2.0 parts per thousand for δ18O values were measured for water after more than one year of storage within a plastic container, with the magnitude of change depending mainly on the type of organic polymer, the wall thickness, and the container size. The largest variations were measured for the PET and PC bottles. Waters stored in glass bottles with Polyseal (TM) cone-lined PP screw caps and in thick-walled HDPE or PFA containers with linerless screw caps having an integrally molded inner sealing ring preserved their original δ2H and δ18O values. The carbon, hydrogen, and oxygen stable isotope compositions of the organic polymeric materials were also determined.
CONCLUSIONS: The results of this study clearly show that, for precise and accurate measurements of the water stable isotope composition in aqueous solutions, rigorous sampling and storage procedures are needed both for laboratory standards and for unknown samples.
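For readers unfamiliar with the delta notation used above, the values are expressed in parts per thousand relative to a reference standard (conventionally VSMOW for water); a standard formulation is:

```latex
% Values in parts per thousand relative to the VSMOW reference standard.
\[
  \delta^{2}\mathrm{H} =
    \left(\frac{\left({}^{2}\mathrm{H}/{}^{1}\mathrm{H}\right)_{\mathrm{sample}}}
               {\left({}^{2}\mathrm{H}/{}^{1}\mathrm{H}\right)_{\mathrm{VSMOW}}} - 1\right)
    \times 1000 ,
  \qquad
  \delta^{18}\mathrm{O} =
    \left(\frac{\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{sample}}}
               {\left({}^{18}\mathrm{O}/{}^{16}\mathrm{O}\right)_{\mathrm{VSMOW}}} - 1\right)
    \times 1000
\]
```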
Abstract:
Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
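The key insight, that multivariate gene-level statistics can be recovered from single-variant summary results together with a correlation matrix of the test statistics, can be illustrated with a simple burden-style test. The sketch below is illustrative only; it is not the authors' software and does not cover their full family of gene-level tests.

```python
import numpy as np
from scipy.stats import norm, chi2

def gene_level_burden(p_values, effect_signs, weights, corr):
    """Gene-level burden-style test assembled from single-variant results.

    Per-variant z-scores are reconstructed from two-sided p values and the
    signs of the effect estimates; the correlation matrix of the single-variant
    statistics (e.g. estimated from one participating study or a public
    reference database) supplies the variance of the weighted sum.
    Illustrative sketch, not the authors' method in full."""
    z = norm.isf(np.asarray(p_values) / 2.0) * np.sign(effect_signs)
    w = np.asarray(weights, dtype=float)
    burden_z = w @ z                 # weighted combination of variant signals
    var = w @ corr @ w               # its null variance from the correlations
    stat = burden_z ** 2 / var
    return chi2.sf(stat, df=1)       # gene-level p value
```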
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in many fields, e.g. remote sensing, biometry and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty in using such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structure in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process. It allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, based on the presence or absence of products. Methods from graph theory were used to extract patterns from data consisting of links between products and the place and date of seizure. A data mining process carried out using graph techniques is called "graph mining". Patterns were detected that then had to be interpreted and compared with prior knowledge to establish their relevance. The illicit drug profiling process is in fact an intelligence process that uses preliminary illicit drug classes to classify new samples. The methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns. This new knowledge of a repeated structure may provide valuable complementary information for profiling and become a source of intelligence.
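As a rough illustration of the graph-mining step, the sketch below builds a bipartite graph linking seizures to the cutting agents they contain using networkx and reads off connected components as candidate structures for interpretation; the seizure records and product names are hypothetical.

```python
import networkx as nx

# Each seizure: an identifier, place, date, and the cutting agents detected.
# Records and product names here are hypothetical.
seizures = [
    {"id": "S1", "place": "Lausanne", "date": "2008-03", "products": {"caffeine", "paracetamol"}},
    {"id": "S2", "place": "Geneva",   "date": "2008-04", "products": {"caffeine", "paracetamol"}},
    {"id": "S3", "place": "Bern",     "date": "2008-05", "products": {"lidocaine"}},
]

# Bipartite graph: seizure nodes linked to the products they contain.
G = nx.Graph()
for s in seizures:
    G.add_node(s["id"], kind="seizure", place=s["place"], date=s["date"])
    for product in s["products"]:
        G.add_node(product, kind="product")
        G.add_edge(s["id"], product)

# Connected components group seizures sharing cutting agents -- candidate
# structures to interpret against prior intelligence.
for component in nx.connected_components(G):
    linked = sorted(n for n in component if G.nodes[n]["kind"] == "seizure")
    if len(linked) > 1:
        print(linked)   # e.g. ['S1', 'S2']
```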
Abstract:
The right to be treated humanely when detained is universally recognized. Deficiencies in detention conditions and violence, however, subvert this right. When this occurs, proper medico-legal investigations are critical irrespective of the nature of death. Unfortunately, the very context of custody raises serious concerns over the effectiveness and fairness of medico-legal examinations. The aim of this manuscript is to identify and discuss the practical and ethical difficulties encountered in the medico-legal investigation following deaths in custody. Data for this manuscript come from a larger project on Death in Custody that examined the causes of deaths in custody and the conditions under which these deaths should be investigated and prevented. A total of 33 stakeholders from forensic medicine, law, prison administration or national human rights administration were interviewed. Data obtained were analyzed qualitatively. Forensic experts are an essential part of the criminal justice process as they offer evidence for subsequent indictment and eventual punishment of perpetrators. Their independence when investigating a death in custody was deemed critical and lack thereof, problematic. When experts were not independent, concerns arose in relation to conflicts of interest, biased perspectives, and low-quality forensic reports. The solutions to ensure independent forensic investigations of deaths in custody must be structural and simple: setting binding standards of practice rather than detailed procedures and relying on preexisting national practices as opposed to encouraging new practices that are unattainable for countries with limited resources.
Abstract:
Signal search analysis is a general method to discover and characterize sequence motifs that are positionally correlated with a functional site (e.g. a transcription or translation start site). The method has played an instrumental role in the analysis of eukaryotic promoter elements. The signal search analysis server provides access to four different computer programs as well as to a large number of precompiled functional site collections. The programs offered allow: (i) the identification of non-random sequence regions under evolutionary constraint; (ii) the detection of consensus sequence-based motifs that are over- or under-represented at a particular distance from a functional site; (iii) the analysis of the positional distribution of a consensus sequence- or weight matrix-based sequence motif around a functional site; and (iv) the optimization of a weight matrix description of a locally over-represented sequence motif. These programs can be accessed at: http://www.isrec.isb-sib.ch/ssa/.
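Analysis (iii), the positional distribution of a consensus motif around a functional site, can be illustrated with a short sketch that counts exact motif matches at each offset relative to the site; IUPAC ambiguity codes and weight matrices, which the server supports, are omitted here for brevity.

```python
from collections import Counter

def positional_distribution(sequences, motif, site_index):
    """Count occurrences of a consensus motif at each position relative to a
    functional site (each sequence is aligned so the site sits at site_index).
    Simplified sketch: exact matches only, no IUPAC ambiguity codes."""
    counts = Counter()
    k = len(motif)
    for seq in sequences:
        for start in range(len(seq) - k + 1):
            if seq[start:start + k] == motif:
                counts[start - site_index] += 1
    return counts

# Example: a TATA-like motif upstream of a start site placed at index 50.
seqs = ["A" * 20 + "TATAAA" + "C" * 24 + "G" + "A" * 49]   # toy sequence
print(positional_distribution(seqs, "TATAAA", site_index=50))   # {-30: 1}
```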