915 results for: Database search, Evidential value, Bayesian decision theory, Influence diagrams
Abstract:
Estimating the time since discharge of a spent cartridge or a firearm can be useful in criminal situations involving firearms. The analysis of volatile gunshot residue remaining after shooting, using solid-phase microextraction (SPME) followed by gas chromatography (GC), has been proposed to meet this objective. However, current interpretative models suffer from several conceptual drawbacks that render them inadequate for assessing the evidential value of a given measurement. This paper aims to fill this gap by proposing a logical approach based on the assessment of likelihood ratios. A probabilistic model was thus developed and applied to a hypothetical scenario in which alternative hypotheses about the discharge time of a spent cartridge found at a crime scene were put forward. To estimate the parameters required to implement this solution, a non-linear regression model was proposed and applied to real published data. The proposed approach proved to be a valuable method for interpreting aging-related data.
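The likelihood-ratio comparison of two hypothesized discharge times can be sketched as follows. The exponential aging curve, its parameters, and the Gaussian measurement-error model are illustrative assumptions, not the paper's fitted regression model.

```python
import math

def residue_level(t, a=100.0, k=0.05):
    """Hypothetical exponential aging curve: expected residue signal after t hours."""
    return a * math.exp(-k * t)

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(measured, t_p, t_d, sigma=8.0):
    """LR = P(measurement | discharge at t_p) / P(measurement | discharge at t_d)."""
    return (normal_pdf(measured, residue_level(t_p), sigma)
            / normal_pdf(measured, residue_level(t_d), sigma))
```

A measurement close to the level expected for a recent discharge then yields an LR well above 1 in favour of the recent-discharge hypothesis, and vice versa.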
Abstract:
Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low-quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low-quality impressions will result in a significant increase in the workload of fingerprint units and ultimately in the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.
Abstract:
This paper presents a statistical model for the quantification of the weight of fingerprint evidence. In contrast to previous models (generative and score-based), our model estimates the probability distributions of the spatial relationships, directions, and types of minutiae observed on fingerprints for any given fingermark. Our model relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. The performance of our model was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations were investigated. The performance of our model indicates that the spatial relationships between minutiae carry more evidential weight than their type or direction. Our results also indicate that the AFIS component of our model directly enables us to assign weight to fingerprint evidence without the additional layer of complex statistical modeling involved in estimating the probability distributions of fingerprint features. In fact, the AFIS component appears to be more sensitive to the sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
Abstract:
This paper introduces a mixture model based on the beta distribution, without pre-established means and variances, to analyze a large set of Beauty-Contest data obtained from diverse groups of experiments (Bosch-Domenech et al. 2002). This model gives a better fit to the experimental data, and more precision to the hypothesis that a large proportion of individuals follow a common pattern of reasoning, described as iterated best reply (degenerate), than mixture models based on the normal distribution. The analysis shows that the means of the distributions across the groups of experiments are fairly stable, while the proportions of choices at different levels of reasoning vary across groups.
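A two-component beta mixture without pre-set means and variances can be fitted with a simple EM-style procedure. The sketch below uses method-of-moments updates for the beta parameters in the M-step, which is a common approximation rather than the paper's estimator, and the starting values are arbitrary.

```python
import math
import random

def beta_logpdf(x, a, b):
    return ((a - 1) * math.log(x) + (b - 1) * math.log(1 - x)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))

def moment_match(xs, ws):
    """Weighted method-of-moments estimate of beta parameters (a, b)."""
    total = sum(ws)
    m = sum(w * x for w, x in zip(ws, xs)) / total
    v = sum(w * (x - m) ** 2 for w, x in zip(ws, xs)) / total
    common = m * (1 - m) / v - 1
    return max(m * common, 1e-3), max((1 - m) * common, 1e-3)

def fit_two_beta_mixture(xs, iters=200):
    """EM-style fit of a two-component beta mixture on data in (0, 1)."""
    pi, (a1, b1), (a2, b2) = 0.5, (2.0, 5.0), (5.0, 2.0)
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation
        r = []
        for x in xs:
            p1 = pi * math.exp(beta_logpdf(x, a1, b1))
            p2 = (1 - pi) * math.exp(beta_logpdf(x, a2, b2))
            r.append(p1 / (p1 + p2) if p1 + p2 > 0 else 0.5)
        # M-step: mixing weight plus moment-matched beta parameters
        pi = sum(r) / len(r)
        a1, b1 = moment_match(xs, r)
        a2, b2 = moment_match(xs, [1 - t for t in r])
    return pi, (a1, b1), (a2, b2)
```

On synthetic data drawn from two well-separated beta components, the fitted component means recover the generating means closely.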
Abstract:
Changes in human lives are studied in psychology, sociology, and adjacent fields as outcomes of developmental processes, institutional regulations and policies, culturally and normatively structured life courses, or empirical accounts. However, such studies have used a wide range of complementary, but often divergent, concepts. This review has two aims. First, we report on the structure that has emerged from scientific life course research by focusing on abstracts from longitudinal and life course studies beginning with the year 2000. Second, we provide a sense of the disciplinary diversity of the field and assess the value of the concept of 'vulnerability' as a heuristic tool for studying human lives. Applying correspondence analysis to 10,632 scientific abstracts, we find a disciplinary divide between psychology and sociology, and observe indications of both similarities of, and differences between, studies, driven at least partly by the data and methods employed. We also find that vulnerability takes a central position in this scientific field, which leads us to suggest several reasons to see value in pursuing theory development for longitudinal and life course studies in this direction.
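Correspondence analysis of a table such as abstracts-by-discipline reduces to a singular value decomposition of standardized residuals. A minimal numpy sketch, using a made-up contingency table rather than the study's 10,632 abstracts:

```python
import numpy as np

def correspondence_analysis(N):
    """Row/column principal coordinates and principal inertias of a contingency table N."""
    P = N / N.sum()                           # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)       # row and column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = (U * sv) / np.sqrt(r)[:, None]     # row principal coordinates
    cols = (Vt.T * sv) / np.sqrt(c)[:, None]  # column principal coordinates
    return rows, cols, sv ** 2                # inertias; their sum equals chi2 / n
```

A useful sanity check is that the total inertia (the sum of squared singular values) equals the table's chi-square statistic divided by the grand total.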
Abstract:
The use of the Bayes factor (BF) or likelihood ratio as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles remains fragile, as it raises new problems about which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e., a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in the case at hand, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
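For a concrete sense of what reporting "a single value for the BF" means, the sketch below computes the Bayes factor for a binomial observation under a point null against a uniform-prior alternative. This is a standard textbook setting chosen for illustration, not an example taken from the article.

```python
import math

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bf01_binomial(k, n, theta0=0.5, a=1.0, b=1.0):
    """BF in favour of H0: theta = theta0 against H1: theta ~ Beta(a, b),
    for k successes in n trials. The binomial coefficient cancels in the ratio."""
    log_m0 = k * math.log(theta0) + (n - k) * math.log(1 - theta0)
    log_m1 = log_beta(k + a, n - k + b) - log_beta(a, b)
    return math.exp(log_m0 - log_m1)
```

With 50 successes in 100 trials and theta0 = 0.5, the marginal likelihoods give a single number (about 8 in favour of the null), which is the form of output the article argues scientists should report.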
Abstract:
Possible new ways in the pharmacological treatment of bipolar disorder and comorbid alcoholism. Azorin JM, Bowden CL, Garay RP, Perugi G, Vieta E, Young AH. Department of Psychiatry, CHU Sainte Marguerite, Marseilles, France. About half of all bipolar patients have an alcohol abuse problem at some point in their lifetime. However, only one randomized, controlled trial of pharmacotherapy (valproate) in this patient population had been published as of 2006. We therefore reviewed clinical trials in this indication from the last four years (covering mood stabilizers, atypical antipsychotics, and other drugs). Priority was given to randomized trials comparing drugs with placebo or an active comparator. Published studies were found through a systematic database search (PubMed, Scirus, EMBASE, Cochrane Library, Science Direct). In these last four years, the only randomized, clinically relevant study in bipolar patients with comorbid alcoholism is that of Brown and colleagues (2008), showing that quetiapine therapy decreased depressive symptoms in the early weeks of use without modifying alcohol use. Several other open-label trials have been generally positive and support the efficacy and tolerability of agents from different classes in this patient population. The efficacy of valproate in reducing excessive alcohol consumption in bipolar patients was confirmed, and new controlled studies revealed its therapeutic benefit in preventing relapse in newly abstinent alcoholics and in improving alcohol hallucinosis. Topiramate deserves to be investigated in bipolar patients with comorbid alcoholism, since this compound effectively improves the physical health and quality of life of alcohol-dependent individuals. In conclusion, randomized, controlled research is still needed to provide guidelines for the possible use of valproate and other agents in patients with a dual diagnosis of bipolar disorder and substance abuse or dependence.
Abstract:
Marketing has studied the permanence of a client within an enterprise because it is a key element in the study of the (economic) value of the client (CLV). The research developed to date is based on deterministic or stochastic models, which allow the permanence of the client, and hence the CLV, to be estimated. However, when these schemes cannot be applied because the panel data they require is unavailable, the length of a client's relationship with the enterprise is uncertain data. We consider the value of the current work to be an alternative way of estimating this period of time from subjective information, drawing on the theory of uncertainty.
Abstract:
Since there are some concerns about the effectiveness of highly active antiretroviral therapy in developing countries, we compared initial combination antiretroviral therapy with zidovudine and lamivudine plus either nelfinavir or efavirenz at a university-based outpatient service in Brazil. This was a retrospective comparative cohort study carried out in a tertiary-level hospital. A total of 194 patients receiving either nelfinavir or efavirenz were identified through our electronic database search, but only 126 patients met the inclusion criteria. Patients were included if they were older than 18 years, naive to antiretroviral therapy, and had at least one follow-up visit after starting the antiretroviral regimen. Fifty-one of the included patients were receiving a nelfinavir-based regimen and 75 an efavirenz-based regimen as outpatients. Antiretroviral therapy was prescribed to all patients according to current guidelines. By intention-to-treat analysis (missing/switch = failure), after a 12-month period, 65% of the patients in the efavirenz group reached a viral load <400 copies/mL compared to 41% of the patients in the nelfinavir group (P = 0.01). The mean CD4 cell count increase after a 12-month period was also greater in the efavirenz group (195 x 10(6) cells/L) than in the nelfinavir group (119 x 10(6) cells/L; P = 0.002). The efavirenz-based regimen was superior to the nelfinavir-based regimen. The low response rate in the nelfinavir group might be partially explained by the difficulty of using a regimen requiring greater patient compliance (12 vs. 3 pills a day) in a developing country.
Abstract:
We investigated the prognostic effects of high-flux hemodialysis (HFHD) and low-flux hemodialysis (LFHD) in patients with chronic kidney disease (CKD). Both an electronic and a manual search were performed based on our rigorous inclusion and exclusion criteria to retrieve high-quality, relevant clinical studies from various scientific literature databases. Comprehensive Meta-Analysis 2.0 (CMA 2.0) was used for the quantitative analysis. We initially retrieved 227 studies from the database search. Following a multi-step screening process, eight high-quality studies were selected for our meta-analysis. These eight studies included 4967 patients with CKD (2416 patients in the HFHD group, 2551 patients in the LFHD group). The results of our meta-analysis showed that the all-cause death rate in the HFHD group was significantly lower than that in the LFHD group (OR=0.704, 95%CI=0.533-0.929, P=0.013). Additionally, the cardiovascular death rate in the HFHD group was significantly lower than that in the LFHD group (OR=0.731, 95%CI=0.616-0.866, P<0.001). The results of this meta-analysis clearly showed that HFHD decreases all-cause and cardiovascular death rates in patients with CKD and that HFHD can therefore be considered one of the first-line therapy choices for CKD.
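The pooled odds ratios behind such summaries are typically obtained by inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch is below; the 2x2 counts in the usage are hypothetical, not the study data, and the code assumes no zero cells (no continuity correction), whereas CMA 2.0 also offers random-effects models.

```python
import math

def pooled_or(studies):
    """Fixed-effect inverse-variance pooled odds ratio with a 95% CI.

    studies: list of (events_treat, n_treat, events_ctrl, n_ctrl) tuples.
    Assumes all four cells of each 2x2 table are non-zero.
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        b, d = n1 - a, n2 - c                 # non-events in each arm
        log_or = math.log((a * d) / (b * c))  # per-study log odds ratio
        weight = 1.0 / (1/a + 1/b + 1/c + 1/d)  # inverse of Woolf variance
        num += weight * log_or
        den += weight
    est = num / den
    se = math.sqrt(1.0 / den)
    return math.exp(est), math.exp(est - 1.96 * se), math.exp(est + 1.96 * se)
```

Pooling two identical studies leaves the point estimate unchanged but narrows the confidence interval, as the weights add.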
Abstract:
We employ the theory of rational choice to examine whether observable choices from feasible sets of prospects can be generated by the optimization of some underlying decision criterion under uncertainty. Rather than focusing on a specific theory of choice, our objective is to formulate a general approach that is designed to cover the various decision criteria that have been proposed in the literature. We use a mild dominance property to define a class of suitable choice criteria. In addition to rationalizability per se, we characterize transitive and Suzumura consistent rationalizability in the presence of dominance.
Abstract:
Recent developments in decision theory have greatly enriched our understanding of the notion of Knightian uncertainty, usually called ambiguity. Nevertheless, these developments have been slow to be integrated into the core of economic theory. We suggest that the analysis of economic phenomena such as innovation and research and development would benefit from incorporating models of decision-making under ambiguity. We support this claim by analyzing the allocation of property rights over a discovery. The first two parts of the presentation draw on a model by Aghion and Tirole, The Management of Innovation, concerning the allocation of property rights between a research unit and an investor. It is shown that a disagreement between the agents about the research technology affects their effort levels, the allocation of property rights, and the allocation of subsequent revenues. Finally, we examine a situation in which several researchers compete, drawing on Savage's treatment of uncertainty. The presence of ambiguity affects the behavior of agents and the allocation of property rights in ways that are not captured under the assumption of risk.
Abstract:
The characterization and grading of glioma tumors via image-derived features, for diagnosis, prognosis, and treatment response, has been an active research area in medical image computing. This paper presents a novel method for the automatic detection and classification of glioma from conventional T2-weighted MR images. Automatic detection of the tumor was established using a newly developed method called the Adaptive Gray Level Algebraic Set Segmentation Algorithm (AGASA). Statistical features were extracted from the detected tumor texture using first-order statistics and gray-level co-occurrence matrix (GLCM) based second-order statistical methods. The statistical significance of the features was determined by t-test and the corresponding p-values. A decision system was developed for the grade detection of glioma using these selected features and their p-values. The detection performance of the decision system was validated using the receiver operating characteristic (ROC) curve. The diagnosis and grading of glioma using this non-invasive method may contribute promising results to medical image computing.
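The GLCM-based second-order features mentioned above can be computed as in this minimal numpy sketch. The offset, number of gray levels, and the three features shown are illustrative choices, not the paper's full feature set or its AGASA segmentation step.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Symmetric, normalized gray-level co-occurrence matrix.

    img: 2-D integer array with values in [0, levels);
    (dx, dy): pixel offset defining the co-occurrence direction.
    """
    g = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = img[y, x], img[y + dy, x + dx]
            g[i, j] += 1
            g[j, i] += 1  # symmetric counting
    return g / g.sum()

def glcm_features(p):
    """Contrast, energy, and homogeneity of a normalized GLCM p."""
    idx = np.arange(p.shape[0])
    ii, jj = np.meshgrid(idx, idx, indexing="ij")
    contrast = np.sum(p * (ii - jj) ** 2)
    energy = np.sum(p ** 2)
    homogeneity = np.sum(p / (1 + np.abs(ii - jj)))
    return contrast, energy, homogeneity
```

A perfectly uniform region gives zero contrast and maximal energy and homogeneity, while a checkerboard texture yields positive contrast, which is the kind of separation such features exploit for grading.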