999 results for 010403 Forensic Statistics


Relevance:

100.00%

Publisher:

Abstract:

This paper investigates the suitability of existing performance measures under the assumption of a clearly defined benchmark. A range of measures is examined, including the Sortino Ratio, the Sharpe Selection Ratio (SSR), the Student's t-test and a decay rate measure. A simulation study is used to assess the power and bias of these measures based on variations in sample size and mean performance of two simulated funds. The Sortino Ratio is found to be the superior performance measure, exhibiting more power and less bias than the SSR when the distribution of excess returns is skewed.
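As a sketch of the measure the paper finds superior: the Sortino Ratio divides mean return in excess of a benchmark by the downside deviation, so only shortfalls below the target are penalised. The function below is a minimal illustration of that definition, not the paper's simulation code, and the example returns are hypothetical.

```python
import numpy as np

def sortino_ratio(returns, target=0.0):
    """Sortino Ratio: mean return in excess of a target (the benchmark),
    divided by the downside deviation (root mean square of shortfalls
    below the target). Unlike the Sharpe ratio, upside variability
    is not penalised."""
    excess = np.asarray(returns, dtype=float) - target
    downside = np.minimum(excess, 0.0)          # keep only shortfalls
    downside_dev = np.sqrt(np.mean(downside ** 2))
    return excess.mean() / downside_dev

# Illustrative monthly returns for a hypothetical fund, benchmark = 0
print(sortino_ratio([0.10, -0.05, 0.20, -0.10]))  # ≈ 0.671
```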

Relevance:

80.00%

Publisher:

Abstract:

In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns"), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.

Relevance:

80.00%

Publisher:

Abstract:

In judicial decision making, the doctrine of chances takes explicitly into account the odds. There is more to forensic statistics, as well as various probabilistic approaches, which taken together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, I reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: 'The JAMA Model and Narrative Interpretation Patterns'), to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability were infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for Artificial Intelligence (AI) researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars; nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesio-skeptics (e.g. Ron Allen), among those legal scholars who are involved in the controversy, are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making. Not unlike Boole, the founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence the usefulness of the conceptual landscape I sketch here.

Relevance:

80.00%

Publisher:

Abstract:

A crucial aspect of evidential reasoning in crime investigation involves comparing the support that evidence provides for alternative hypotheses. Recent work in forensic statistics has shown how Bayesian Networks (BNs) can be employed for this purpose. However, the specification of BNs requires conditional probability tables describing the uncertain processes under evaluation. When these processes are poorly understood, it is necessary to rely on subjective probabilities provided by experts, and accurate probabilities of this type are normally hard to acquire. Recent work in qualitative reasoning has developed methods to perform probabilistic reasoning using coarser representations, but such approaches are too imprecise to compare the likelihoods of alternative hypotheses. This paper examines this shortcoming of the qualitative approaches when applied to the aforementioned problem, and identifies and integrates techniques to refine them.
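The kind of comparison the abstract describes can be sketched with elementary Bayes: given expert-supplied conditional probabilities for the evidence under each hypothesis, the posterior odds of the hypotheses equal the prior odds times the likelihood ratio. This is a minimal illustration of the underlying calculation, not the paper's BN method, and the probability values are hypothetical.

```python
def posterior_odds(prior_h1, lik_e_given_h1, lik_e_given_h2):
    """Posterior odds of H1 versus H2 after observing evidence E:
    prior odds multiplied by the likelihood ratio P(E|H1) / P(E|H2)."""
    prior_odds = prior_h1 / (1.0 - prior_h1)
    likelihood_ratio = lik_e_given_h1 / lik_e_given_h2
    return prior_odds * likelihood_ratio

# Hypothetical expert assessments: P(E|H1) = 0.8, P(E|H2) = 0.1,
# equal priors P(H1) = P(H2) = 0.5
print(posterior_odds(0.5, 0.8, 0.1))  # 8.0 — evidence favours H1 eight to one
```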

Relevance:

30.00%

Publisher:

Abstract:

Pharmacogenetics deals with genetically determined variation in drug response. In this context, three phase I drug-metabolizing enzymes, CYP2D6, CYP2C9, and CYP2C19, have a central role, affecting the metabolism of about 20-30% of clinically used drugs. Since genes coding for these enzymes in human populations exhibit high genetic polymorphism, they are of major pharmacogenetic importance. The aims of this study were to develop new genotyping methods for CYP2D6, CYP2C9, and CYP2C19 that would cover the most important genetic variants altering the enzyme activity, and, for the first time, to describe the distribution of genetic variation at these loci on global and microgeographic scales. In addition, pharmacogenetics was applied to a postmortem forensic setting to elucidate the role of genetic variation in drug intoxications, focusing mainly on cases related to tricyclic antidepressants, which are commonly involved in fatal drug poisonings in Finland. Genetic variability data were obtained by genotyping new population samples by the methods developed based on PCR and multiplex single-nucleotide primer extension reaction, as well as by collecting data from the literature. Data consisted of 138, 129, and 146 population samples for CYP2D6, CYP2C9, and CYP2C19, respectively. In addition, over 200 postmortem forensic cases were examined with respect to drug and metabolite concentrations and genotypic variation at CYP2D6 and CYP2C19. The distribution of genetic variation within and among human populations was analyzed by descriptive statistics and variance analysis and by correlating the genetic and geographic distances using Mantel tests and spatial autocorrelation. The correlation between phenotypic and genotypic variation in drug metabolism observed in postmortem cases was also analyzed statistically. The genotyping methods developed proved to be informative, technically feasible, and cost-effective. 
Detailed molecular analysis of CYP2D6 genetic variation in a global survey of human populations revealed that the pattern of variation was similar to those of neutral genomic markers. Most of the CYP2D6 diversity was observed within populations, and the spatial pattern of variation was best described as clinal. On the other hand, genetic variants of CYP2D6, CYP2C9, and CYP2C19 associated with altered enzymatic activity could reach extremely high frequencies in certain geographic regions. Pharmacogenetic variation may also be significantly affected by population-specific demographic histories, as seen within the Finnish population. When pharmacogenetics was applied to a postmortem forensic setting, a correlation between amitriptyline metabolic ratios and genetic variation at CYP2D6 and CYP2C19 was observed in the sample material, even in the presence of confounding factors typical of these cases. In addition, a case of doxepin-related fatal poisoning was shown to be associated with a genetic defect at CYP2D6. Each of the genes studied showed a distinct variation pattern in human populations and high frequencies of altered activity variants, which may reflect neutral evolution and/or selective pressures caused by dietary or environmental exposure. The results are also relevant from a clinical point of view, since the genetic variation at CYP2D6, CYP2C9, and CYP2C19 already has a range of clinical applications, e.g. in cancer treatment and oral anticoagulation therapy. This study revealed that pharmacogenetics may also contribute valuable information to the medicolegal investigation of sudden, unexpected deaths.
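One of the analyses named above, the Mantel test, correlates a genetic distance matrix with a geographic one and assesses significance by permutation. The sketch below is a generic textbook implementation given for illustration, not the study's actual analysis pipeline; the demonstration matrices are synthetic.

```python
import numpy as np

def mantel_test(d_genetic, d_geographic, n_perm=999, seed=0):
    """Mantel test: Pearson correlation between the upper triangles of
    two distance matrices, with a one-sided permutation p-value obtained
    by jointly permuting the rows and columns of one matrix."""
    rng = np.random.default_rng(seed)
    n = d_genetic.shape[0]
    iu = np.triu_indices(n, k=1)
    r_obs = np.corrcoef(d_genetic[iu], d_geographic[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r_perm = np.corrcoef(d_genetic[p][:, p][iu], d_geographic[iu])[0, 1]
        if r_perm >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

# Synthetic demo: "genetic" distances perfectly tracking geography
coords = np.arange(6.0)
d_geo = np.abs(coords[:, None] - coords[None, :])
r, p = mantel_test(2.0 * d_geo, d_geo)
print(r, p)
```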

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to examine the structure of the Statistics Anxiety Rating Scale. Responses from 650 undergraduate psychology students throughout the UK were collected through an online study. Based on previous research, three different models were specified and estimated using confirmatory factor analysis. Fit indices were used to determine whether each model fitted the data, and a likelihood ratio difference test was used to determine the best-fitting model. The original six-factor model was the best explanation of the data. All six subscales were intercorrelated and internally consistent. It was concluded that the Statistics Anxiety Rating Scale measures the six subscales it was designed to assess in a UK population.
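The likelihood ratio difference test used above compares nested models by differencing their fit chi-square statistics; the difference is itself chi-square distributed, with degrees of freedom equal to the difference in model df. A minimal sketch follows, using hypothetical fit statistics (the study's actual values are not reproduced here); the resulting delta is compared against the chi-square critical value for delta-df.

```python
def lr_difference_test(chisq_restricted, df_restricted, chisq_full, df_full):
    """Chi-square (likelihood ratio) difference test for nested models.
    The restricted model has fewer free parameters, hence more df; the
    difference in chi-squares is chi-square distributed on delta_df."""
    delta_chisq = chisq_restricted - chisq_full
    delta_df = df_restricted - df_full
    return delta_chisq, delta_df

# Hypothetical fit statistics: a one-factor model (restricted) versus a
# six-factor model; a delta far above the chi-square critical value for
# delta_df favours the less restricted (six-factor) model.
delta, ddf = lr_difference_test(1450.2, 594, 980.7, 579)
print(delta, ddf)
```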

Relevance:

30.00%

Publisher:

Abstract:

Statistics are regularly used to make some form of comparison between trace evidence or to deploy the exclusionary principle (Morgan and Bull, 2007) in forensic investigations. Trace evidence routinely comprises the results of particle size, chemical or modal analyses and as such constitutes compositional data. The issue is that compositional data, including percentages, parts per million, etc., carry only relative information. This may be problematic where a comparison of percentages and other constrained/closed data is deemed a statistically valid and appropriate way to present trace evidence in a court of law. The constant sum problem has been recognised since the seminal works of Pearson (1896) and Chayes (1960), and log-ratio techniques have been introduced to address it (Aitchison, 1986; Pawlowsky-Glahn and Egozcue, 2001; Pawlowsky-Glahn and Buccianti, 2011; Tolosana-Delgado and van den Boogaart, 2013). Nevertheless, the fact that a constant sum destroys the potential independence of variances and covariances required for correlation and regression analysis and for empirical multivariate methods (principal component analysis, cluster analysis, discriminant analysis, canonical correlation) is all too often not acknowledged in the statistical treatment of trace evidence. Yet the need for a robust treatment of forensic trace evidence analyses is obvious. This research examines the issues and potential pitfalls for forensic investigators if the constant sum constraint is ignored in the analysis and presentation of forensic trace evidence. Forensic case studies involving particle size and mineral analyses as trace evidence are used to demonstrate a compositional data approach using a centred log-ratio (clr) transformation and multivariate statistical analyses.
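The centred log-ratio transformation used in the case studies maps a composition onto coordinates that escape the constant-sum constraint: each part is divided by the geometric mean of the composition and logged, and the resulting coordinates always sum to zero. A minimal sketch, with hypothetical soil proportions:

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean of the composition. Parts must be strictly positive;
    the output coordinates sum to zero by construction."""
    x = np.asarray(composition, dtype=float)
    geometric_mean = np.exp(np.mean(np.log(x)))
    return np.log(x / geometric_mean)

# Hypothetical modal analysis of a soil sample (proportions summing to 1)
z = clr([0.60, 0.25, 0.10, 0.05])
print(z, z.sum())  # the clr coordinates sum to zero
```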

Relevance:

30.00%

Publisher:

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, current ink analytical practices, as defined in the ASTM standards on ink analysis, do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repetitive analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.

Relevance:

30.00%

Publisher:

Abstract:

Many people regard the concept of hypothesis testing as fundamental to inferential statistics. Various schools of thought, in particular frequentist and Bayesian, have promoted radically different solutions for deciding on the plausibility of competing hypotheses. Comprehensive philosophical comparisons of their advantages and drawbacks are widely available and continue to fuel extensive debate in the literature. More recently, a controversial discussion was initiated by the editorial decision of a scientific journal [1] to refuse any paper submitted for publication that contains null hypothesis testing procedures. Since the large majority of papers published in forensic journals propose the evaluation of statistical evidence based on so-called p-values, it is of interest to bring the discussion of this journal's decision to the forensic science community. This paper aims to provide forensic science researchers with a primer on the main concepts and their implications for making informed methodological choices.

Relevance:

30.00%

Publisher:

Abstract:

Between 30% and 90% of the prison population is estimated to have survived traumatic experiences such as sexual, emotional, and physical abuse prior to incarceration (Anonymous, 1999; Fondacaro, Holt, & Powell, 1999; Messina & Grella, 2006; Pollard & Baker, 2000; Veysey, De Cou, & Prescott, 1998). Similarly, information from the Bureau of Justice Statistics (as reported in Warren, 2001) estimated that more than half of the women in state prisons have experienced past physical and sexual abuse. Thus, given the astonishing number of inmates who appear to be victims of some kind of trauma, it seems likely that those who work with these inmates (e.g., prison staff, guards, and treatment providers) will in some way encounter challenges related to the inmates' trauma history. These difficulties may appear in any number of forms, including inmates' behavioral outbursts, increased emotionality, sensitivity to triggering situations, and chronic physical or mental health needs (Veysey, et al., 1998). It is also likely that these individuals with trauma histories would benefit greatly from treatment while incarcerated. This treatment could be utilized to minimize symptoms of posttraumatic stress, decrease behavioral problems, and help the inmate function more effectively in society when released from incarceration (Kokorowski & Freng, 2001; Tucker, Cosio, & Meshreki, 2003). Few studies have explored the types of trauma treatment that are effective with inmate populations or made specific suggestions for clinicians working in forensic settings (Kokorowski & Freng, 2001). Essentially, there appears to be a large gap between the need for trauma treatment for inmates and the literature addressing how to provide it. However, clinicians across the country seem to be quietly attempting to fulfill this need for trauma treatment with incarcerated populations. They are providing this greatly needed treatment every day, in the face of enormous challenges, and often without recognition or the opportunity to share their valuable work with the larger community.

Relevance:

30.00%

Publisher:

Abstract:

"NCJ-128567."

Relevance:

30.00%

Publisher:

Abstract:

Genetic analysis in animals has been used for many applications, such as kinship analysis, determining the sire of an offspring when a female has been exposed to multiple males, determining parentage when an animal switches offspring with another dam, extended lineage reconstruction, estimating inbreeding, identification in breed registries, and speciation. It is now also being used increasingly to characterize animal materials in forensic cases. As such, it is important to operate under a set of minimum guidelines that ensures all service providers have a template to follow for quality practices. None have been delineated for animal genetic identity testing. Based on the model for human DNA forensic analyses, a basic discussion of the issues and guidelines is provided for animal testing, covering analytical practices, data evaluation, nomenclature, allele designation, statistics, validation, proficiency testing, lineage markers, casework files, and reporting. These should provide a basis for professional societies and/or working groups to establish more formalized recommendations.
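As an illustration of the "statistics" item in the list above, one calculation routinely covered by such guidelines is the random-match probability of a genetic profile, computed under Hardy-Weinberg assumptions with the product rule across independent loci. This is a generic textbook sketch, not a recommendation from the guidelines discussed, and the allele frequencies below are hypothetical.

```python
def genotype_match_probability(p, q=None):
    """Random-match probability of a single-locus genotype under
    Hardy-Weinberg equilibrium: p^2 for a homozygote with allele
    frequency p, 2pq for a heterozygote with frequencies p and q."""
    return p * p if q is None else 2.0 * p * q

# Hypothetical allele frequencies at three independent loci; q=None
# marks a homozygous locus. The product rule multiplies the
# per-locus match probabilities into a profile probability.
loci = [(0.12, 0.08), (0.25, None), (0.05, 0.30)]
profile_prob = 1.0
for p, q in loci:
    profile_prob *= genotype_match_probability(p, q)
print(profile_prob)  # ≈ 3.6e-05
```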