62 results for: Database search, Evidential value, Bayesian decision theory, Influence diagrams
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way, and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
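The probabilistic assignment of evidential value described above follows the Bayesian likelihood-ratio framework. As a minimal numerical sketch (the figures are invented for illustration, not taken from the study):

```python
def likelihood_ratio(p_e_given_h1, p_e_given_h2):
    """Strength of evidence E: how much more probable E is under H1
    (same ink source) than under H2 (different ink sources)."""
    return p_e_given_h1 / p_e_given_h2

def posterior_odds(prior_odds, lr):
    """Bayes' theorem in odds form: posterior odds = LR x prior odds."""
    return lr * prior_odds

# Invented figures: the observed chromatographic match is 100 times more
# probable if the questioned and reference entries share the same ink.
lr = likelihood_ratio(0.99, 0.0099)
print(round(lr))                           # 100
print(round(posterior_odds(0.1, lr), 2))   # 10.0: prior odds of 1:10 become 10:1
```

The likelihood ratio is reported on its own; combining it with prior odds is left to the trier of fact, which is what makes the approach transparent.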
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims to participants in legal proceedings, the scientific community is remarkably heterogenous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think by all reason rightly do so, but they go against a mainstream of thinking that does not embrace-or is not aware of-the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life but also individuals involved at various levels in the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
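The coherence requirement invoked above can be made concrete with the classic Dutch-book argument (a generic illustration, not an example from the article): probability assignments that violate the probability axioms expose the assessor to a combination of bets that loses with certainty.

```python
def dutch_book_loss(p_a, p_not_a, stake=1.0):
    """Buy a bet paying `stake` if A occurs, priced at p_a * stake, and a bet
    paying `stake` if not-A occurs, priced at p_not_a * stake. Exactly one
    bet pays out, so the guaranteed net result is stake * (1 - p_a - p_not_a)."""
    cost = (p_a + p_not_a) * stake
    payout = stake  # one and only one of A, not-A occurs
    return payout - cost

# Incoherent assessor: P(A) = 0.7 and P(not-A) = 0.6 sum to 1.3,
# so accepting both bets loses 0.3 whatever the outcome.
print(round(dutch_book_loss(0.7, 0.6), 2))  # -0.3
# Coherent assessor: prices sum to 1; no sure loss can be extracted.
print(round(dutch_book_loss(0.7, 0.3), 2))  # 0.0
```

This is the sense in which the theory is normative: it prescribes coherence, not any particular numerical belief.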
Abstract:
Résumé: The use of ink as evidence in forensic science is described and framed by an abundant literature, including two standards from the American Society for Testing and Materials (ASTM). The vast majority of this literature is concerned with the analysis of the physical and chemical characteristics of inks. The ASTM standards propose some basic principles concerning the comparison and the interpretation of the evidential value of inks in forensic science. A review of this literature, and more particularly of the ASTM standards, in the light of the developments in the field of forensic evidence interpretation, shows that there is clear potential for improving the use of ink evidence and its impact on criminal investigation. This thesis proposes to interpret ink evidence within the framework defined by Bayes' theorem. This proposal required the development of a quality assurance process for the analysis and comparison of ink samples. This quality assurance process takes advantage of a newly defined theoretical framework. The methodology proposed in this work was tested comprehensively, using a dataset specially created for the purpose and tools imported from biometrics. This research convincingly answers a concrete problem generally encountered in forensic science: the information provided by the forensic scientist during the examination of traces is often limited, because the scientist tries to answer the wrong question. The use of an explicit theoretical framework, which defines and formalises the goal of the forensic examination, makes it possible to determine the technological and data requirements. The development of this technology and the collection of the relevant data can then be justified economically and carried out scientifically.
Abstract: The contribution of ink evidence to forensic science is described and supported by an abundant literature and by two standards from the American Society for Testing and Materials (ASTM). The vast majority of the available literature is concerned with the physical and chemical analysis of ink evidence. The relevant ASTM standards mention some principles regarding the comparison of pairs of ink samples and the evaluation of their evidential value. The review of this literature and, more specifically, of the ASTM standards in the light of recent developments in the interpretation of forensic evidence has shown some potential improvements, which would maximise the benefits of the use of ink evidence in forensic science. This thesis proposes to interpret ink evidence using the widely accepted and recommended Bayes' theorem. This proposition has required the development of a new quality assurance process for the analysis and comparison of ink samples, as well as the definition of a theoretical framework for ink evidence. The proposed technology has been extensively tested using a large dataset of ink samples and state-of-the-art tools commonly used in biometrics. Overall, this research successfully answers a concrete problem generally encountered in forensic science, where scientists tend to limit the usefulness of the information that is present in various types of evidence by trying to answer the wrong questions. The declaration of an explicit framework, which defines and formalises their goals and expected contributions to the criminal and civil justice system, enables the determination of their needs in terms of technology and data. The development of this technology and the collection of the data can then be justified economically, structured scientifically and carried out efficiently.
Abstract:
In this paper we propose an innovative methodology for automated profiling of illicit tablets by their surface granularity, a feature previously unexamined for this purpose. We make use of the tiny inconsistencies at the tablet surface, referred to as speckles, to generate a quantitative granularity profile of tablets. Euclidean distance is used as a measurement of (dis)similarity between granularity profiles. The frequency of observed distances is then modelled by kernel density estimation in order to generalize the observations and to calculate likelihood ratios (LRs). The resulting LRs are used to evaluate the potential of granularity profiles to differentiate between same-batch and different-batch tablets. Furthermore, we use the LRs as a similarity metric to refine database queries. We are able to derive reliable LRs within a scope that represents the true evidential value of the granularity feature. These metrics are used to refine candidate hit-lists from a database containing physical features of illicit tablets. We observe improved or identical ranking of candidate tablets in 87.5% of cases when granularity is considered.
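The LR computation sketched above, with distances modelled by kernel density estimation under each proposition, can be illustrated as follows (a minimal sketch on synthetic distances; the distributions and bandwidths are invented, not the authors' data):

```python
import random
from math import exp, pi, sqrt

random.seed(0)
# Synthetic Euclidean distances between granularity profiles:
# same-batch pairs tend to lie closer together than different-batch pairs.
same_batch = [random.gauss(1.0, 0.3) for _ in range(500)]
diff_batch = [random.gauss(3.0, 0.8) for _ in range(500)]

def kde(samples, bandwidth):
    """Gaussian kernel density estimate built from the observed distances."""
    n, h = len(samples), bandwidth
    def density(x):
        return sum(exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / (n * h * sqrt(2 * pi))
    return density

f_same = kde(same_batch, 0.15)   # density of distances for same-batch pairs
f_diff = kde(diff_batch, 0.40)   # density of distances for different-batch pairs

def likelihood_ratio(d):
    """LR > 1 supports same-batch origin; LR < 1 supports different batches."""
    return f_same(d) / f_diff(d)

print(likelihood_ratio(0.9) > 1)   # a small distance supports same batch
print(likelihood_ratio(3.5) < 1)   # a large distance supports different batches
```

Ranking candidate tablets by this LR rather than by raw distance is what lets the metric double as a database-query refinement.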
Abstract:
Interactive Choice Aid (ICA) is a decision aid, introduced in this paper, that systematically assists consumers with online purchase decisions. ICA integrates aspects of prescriptive decision theory, insights from descriptive decision research, and practical considerations, thereby combining pre-existing best practices with novel features. Instead of imposing an objectively ideal but unnatural decision procedure on the user, ICA assists the natural process of human decision-making by providing explicit support for the execution of the user's decision strategies. The application contains an innovative feature for in-depth comparisons of alternatives, through which users' importance ratings are elicited interactively and in a playful way. The usability and general acceptance of the choice aid were studied; the results show that ICA is a promising contribution and provide insights that may further improve its usability.
Abstract:
The knowledge of the relationship that links radiation dose and image quality is a prerequisite to any optimization of medical diagnostic radiology. Image quality depends, on the one hand, on physical parameters such as contrast, resolution, and noise, and, on the other hand, on characteristics of the observer who assesses the image. While the role of contrast and resolution is precisely defined and recognized, the influence of image noise is not yet fully understood. Its measurement is often based on imaging uniform test objects, even though real images contain anatomical backgrounds whose statistical nature is much different from that of the test objects used to assess system noise. The goal of this study was to demonstrate the importance of variations in background anatomy by quantifying their effect on a series of detection tasks. Several types of mammographic backgrounds and signals were examined by psychophysical experiments in a two-alternative forced-choice detection task. According to hypotheses concerning the strategy used by the human observers, their signal-to-noise ratio was determined. This variable was also computed for a mathematical model based on statistical decision theory. By comparing the theoretical model and the experimental results, the way that anatomical structure is perceived was analyzed. The experiments showed that the observer's behavior was highly dependent upon both the system noise and the anatomical background. The anatomy acts partly as a signal recognizable as such and partly as a pure noise that disturbs the detection process. This dual nature of the anatomy is quantified. It is shown that its effect varies according to its amplitude and the profile of the object being detected. The importance of the noisy part of the anatomy is, in some situations, much greater than the system noise. Hence, reducing the system noise by increasing the dose will not improve task performance.
This observation indicates that the tradeoff between dose and image quality might be optimized by accepting a higher system noise. This could lead to a better resolution, more contrast, or less dose.
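The observer's signal-to-noise ratio in a two-alternative forced-choice task relates to percent correct through standard statistical decision theory; a minimal sketch of that textbook relation (not the specific observer model used in the study):

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pc_2afc(d_prime):
    """Percent correct of an unbiased observer in a two-alternative
    forced-choice detection task: Pc = Phi(d' / sqrt(2)), where d' is the
    observer's signal-to-noise ratio."""
    return phi(d_prime / sqrt(2.0))

print(round(pc_2afc(0.0), 2))  # 0.5 : no signal, chance performance
print(round(pc_2afc(1.0), 2))  # 0.76: moderate signal-to-noise ratio
```

Because anatomical background adds noise that dose reduction cannot remove, raising the dose stops improving d', and hence Pc, beyond a point.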
Abstract:
BACKGROUND: Recent neuroimaging studies suggest that value-based decision-making may rely on mechanisms of evidence accumulation. However no studies have explicitly investigated the time when single decisions are taken based on such an accumulation process. NEW METHOD: Here, we outline a novel electroencephalography (EEG) decoding technique which is based on accumulating the probability of appearance of prototypical voltage topographies and can be used for predicting subjects' decisions. We use this approach for studying the time-course of single decisions, during a task where subjects were asked to compare reward vs. loss points for accepting or rejecting offers. RESULTS: We show that based on this new method, we can accurately decode decisions for the majority of the subjects. The typical time-period for accurate decoding was modulated by task difficulty on a trial-by-trial basis. Typical latencies of when decisions are made were detected at ∼500ms for 'easy' vs. ∼700ms for 'hard' decisions, well before subjects' response (∼340ms). Importantly, this decision time correlated with the drift rates of a diffusion model, evaluated independently at the behavioral level. COMPARISON WITH EXISTING METHOD(S): We compare the performance of our algorithm with logistic regression and support vector machine and show that we obtain significant results for a higher number of subjects than with these two approaches. We also carry out analyses at the average event-related potential level, for comparison with previous studies on decision-making. CONCLUSIONS: We present a novel approach for studying the timing of value-based decision-making, by accumulating patterns of topographic EEG activity at single-trial level.
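The diffusion-model account above, in which lower drift rates on 'hard' trials yield later decisions, can be illustrated with a toy simulation (arbitrary parameters, not the fitted model from the study):

```python
import random

def ddm_decision_time(drift, boundary=1.0, noise=0.5, dt=0.001, seed=None):
    """Accumulate noisy evidence x until it crosses +/-boundary and return
    the crossing time in seconds (a basic drift-diffusion process)."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while abs(x) < boundary:
        x += drift * dt + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        t += dt
    return t

# 'Easy' trials get a high drift rate, 'hard' trials a low one.
easy = sum(ddm_decision_time(2.0, seed=i) for i in range(200)) / 200
hard = sum(ddm_decision_time(0.5, seed=i) for i in range(200)) / 200
print(easy < hard)  # True: lower drift -> evidence accumulates more slowly
```

The correlation the authors report between decoded decision times and behavioural drift rates is the empirical counterpart of this relation.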
Abstract:
This paper presents and discusses further aspects of the subjectivist interpretation of probability (also known as the 'personalist' view of probabilities) as initiated in earlier forensic and legal literature. It shows that operational devices to elicit subjective probabilities - in particular the so-called scoring rules - provide additional arguments in support of the standpoint according to which categorical claims of forensic individualisation do not follow from a formal analysis under that view of probability theory.
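A scoring rule of the kind discussed above penalises a stated probability against the realised outcome; the quadratic (Brier) rule is the textbook example (a generic sketch, not a device from the paper). Because the rule is 'proper', reporting one's actual belief minimises the expected penalty, which is the formal reason categorical individualisation claims do not follow from subjective probabilities:

```python
def brier(p, outcome):
    """Quadratic (Brier) penalty for stating probability p when the event
    turns out true (outcome = 1) or false (outcome = 0)."""
    return (p - outcome) ** 2

def expected_brier(stated_p, true_p):
    """Expected penalty when the event really occurs with probability true_p."""
    return true_p * brier(stated_p, 1) + (1 - true_p) * brier(stated_p, 0)

# If your actual belief is 0.7, stating 0.7 minimises the expected penalty;
# inflating it to a near-categorical 0.99 makes the expected penalty worse.
honest = expected_brier(0.7, 0.7)
categorical = expected_brier(0.99, 0.7)
print(honest < categorical)  # True: honest reporting is the optimal policy
```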
Abstract:
Abstract The neo-liberal capitalist ideology has come under heavy fire, with anecdotal evidence indicating a link between these same values and unethical behavior. Academic institutions reflect social values and act as socializing agents for the young. Can this explain the high and increasing rates of cheating that currently prevail in education? Our first chapter examines the question of whether self-enhancement values of power and achievement, the individual-level equivalent of neo-liberal capitalist values, predict positive attitudes towards cheating. Furthermore, we explore the mediating role of motivational factors. Results of four studies reveal that self-enhancement value endorsement predicts the adoption of performance-approach goals, a relationship mediated by introjected regulation, namely the desire for social approval, and that self-enhancement value endorsement also predicts the condoning of cheating, a relationship mediated by performance-approach goal adoption. However, self-transcendence values prescribed by a normatively salient source have the potential to reduce the link between self-enhancement value endorsement and attitudes towards cheating. Normative assessment constitutes a key tool used by academic institutions to socialize young people to accept the competitive, meritocratic nature of a society driven by a neo-liberal capitalist ideology. As such, the manifest function of grades is to motivate students to work hard and to buy into the competitive ethos. Does normative assessment fulfill these functions? Our second chapter explores the reward-intrinsic motivation question in the context of grading, arguably a high-stakes reward. In two experiments, the relative capacity of graded high performance, as compared to the task autonomy experienced in an ungraded task, to predict post-task intrinsic motivation is assessed. Results show that whilst graded task performance predicts post-task appreciation, it fails to predict ongoing motivation.
However, the perceived autonomy experienced in the non-graded condition predicts both post-task appreciation and ongoing motivation. Our third chapter asks whether normative assessment inspires the spirit of competition in students. Results of three experimental studies reveal that the expectation of a grade for a task, compared to no grade, induces greater adoption of performance-avoidance, but not performance-approach, goals. Experiment 3 provides an explanatory mechanism for this, showing that reduced autonomous motivation experienced in previous graded tasks mediates the relationship between grading and the adoption of performance-avoidance goals in a subsequent task. The above results, when combined, provide evidence as to the deleterious effects of self-enhancement values and the associated practice of normative assessment in school on student motivation, goals and ethics. We conclude by using value and motivation theory to explore solutions to this problem.
Abstract:
Purpose: Heterogeneous results of single studies with photodynamic diagnosis (PDD) in bladder cancer have been reported. A meta-analysis of prospective studies has now been performed. Material and Methods: The effect of PDD in addition to white light cystoscopy (WLC) on (a) the diagnosis and (b) the therapeutic outcome of primary or recurrent non-muscle invasive bladder cancer (NMIBC) investigated by cystoscopy or transurethral resection was analysed. An electronic database search was performed. Trials were included if they prospectively compared WLC with PDD in bladder cancer. Primary endpoints were the additional detection rate, residual tumour at second resection and recurrence-free survival. Results: Significantly more tumour-positive patients were detected with PDD among all patients with non-muscle invasive tumours (20%) [95% confidence interval (CI): 8 to 35%] and among CIS patients (39%) (CI: 23 to 57%). Residual tumour was found significantly less often after PDD (odds ratio 0.28, CI: 0.15 to 0.52, p < 0.0001). Recurrence-free survival was significantly higher at 12 and 24 months in the PDD groups than in the WLC-only groups. Conclusions: More bladder tumour-positive patients are detected by PDD. The best results were found in CIS patients. Diagnosis with PDD results in a more complete resection and a longer recurrence-free survival.
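The pooled odds ratio quoted above is built from study-level 2x2 tables; the basic building block, an odds ratio with its 95% confidence interval, can be sketched as follows (hypothetical counts, not the trial data):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
       a = events in treatment,  b = non-events in treatment,
       c = events in control,    d = non-events in control."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: residual tumour in 10/100 PDD patients vs 28/100 controls.
or_, lo, hi = odds_ratio_ci(10, 90, 28, 72)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.29 0.13 0.63
```

A meta-analysis then pools such log-odds-ratios across trials, here with random-effects weights to absorb the reported heterogeneity.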
Abstract:
Many classifiers achieve high levels of accuracy but have limited applicability in real-world situations because they do not lead to a greater understanding of, or insight into, the way features influence the classification. In areas such as health informatics, a classifier that clearly identifies the influences on classification can be used to direct research and formulate interventions. This research investigates the practical applications of Automated Weighted Sum (AWSum), a classifier that provides accuracy comparable to other techniques whilst providing insight into the data. This is achieved by calculating a weight for each feature value that represents its influence on the class value. The merits of this approach in classification and insight are evaluated on Cystic Fibrosis and Diabetes datasets, with positive results.
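The weighted-sum idea described above can be sketched as follows (a simplified illustration of a per-feature-value influence classifier, not the published AWSum algorithm):

```python
from collections import defaultdict

def train_weights(rows, labels):
    """Score each (feature, value) pair by how strongly it is associated with
    class 1 versus class 0; the sign and magnitude of the weight show which
    class the value pushes the prediction towards."""
    counts = defaultdict(lambda: [0, 0])
    for row, y in zip(rows, labels):
        for feat_val in row.items():
            counts[feat_val][y] += 1
    return {fv: (c[1] - c[0]) / (c[0] + c[1]) for fv, c in counts.items()}

def predict(weights, row):
    """Classify by the sign of the summed per-feature-value weights."""
    score = sum(weights.get(fv, 0.0) for fv in row.items())
    return 1 if score > 0 else 0

# Hypothetical toy data with invented feature names.
rows = [{"bmi": "low", "smoker": "yes"}, {"bmi": "low", "smoker": "no"},
        {"bmi": "high", "smoker": "yes"}, {"bmi": "high", "smoker": "no"}]
labels = [1, 0, 1, 1]
w = train_weights(rows, labels)
print(w[("smoker", "yes")])  # 1.0: this value only ever occurs with class 1
print(predict(w, {"bmi": "low", "smoker": "yes"}))  # 1
```

The weights themselves are the interpretable output: a clinician can read off which feature values push towards which class, which is the insight the abstract emphasises.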
Abstract:
A body weight lower than 90% of the optimal value has an unfavorable influence on the prognosis of chronic obstructive pulmonary disease (COPD). Short-term studies of up to three months' duration have shown improved respiratory muscle function, exercise tolerance and immunologic parameters with an increased caloric intake of 45 kcal/kg body weight. In a randomized trial over twelve months, 14 of 30 patients with an average FEV1 of 0.8 l were instructed to take a high-calorie diet. For simplicity, part of the calories was administered as Fresubin, a fluid nutrient formula. Although a weight gain of 7 kg (p = 0.003) was obtained, the difference from the control group was not statistically significant (p = 0.08). The same was true for skin fold thickness (12.4 vs 5.7 mm), the change in ventilatory parameters and the 6-minute walking distance (-33 vs -86 m). Subjective improvement was, however, impressive in all patients with dietary intervention, probably explainable by the increased attention they received. Dietary counselling for an increased intake of calories, vitamins and also calcium is thus very important in the treatment of patients with COPD.
Abstract:
To perform a meta-analysis of FDG-PET performance in the diagnosis of large-vessel vasculitis (Giant Cell Arteritis (GCA), associated or not with Polymyalgia Rheumatica (PMR), and Takayasu). Materials and methods: MEDLINE, the Cochrane Library and Embase were searched for relevant original articles describing FDG-PET for vasculitis assessment, using the MeSH terms ("Giant Cell Arteritis or Vasculitis" AND "PET"). Criteria for inclusion were: (1) FDG-PET for diagnosis of vasculitis; (2) American College of Rheumatology criteria as reference standard; (3) control group. After data extraction, analyses were performed using a random-effects model. Results: Of 184 citations (database search and references screening), 70 articles were reviewed, of which 12 eligible studies were extracted (sensitivity range from 32% to 97%). 7 studies fulfilled all inclusion criteria. Owing to overlapping population, 1 study was excluded. Statistical heterogeneity justified the random-effects model. Pooled analysis of 6 studies (116 vasculitis, 224 controls) showed an 81% sensitivity (95% CI: 70-89%); an 89% specificity (95% CI: 77-95%); an 85% PPV (95% CI: 63-95%); a 90% NPV (95% CI: 79-95%); a 7.1 positive LR (95% CI: 3.4-14.9); a 0.2 negative LR (95% CI: 0.14-0.35) and a 90.1 DOR (95% CI: 18.6-437). Conclusion: FDG-PET has good diagnostic performance in the detection of large-vessel vasculitis. Its promising role could be extended to the follow-up of patients under treatment, but further studies are needed to confirm this possibility.
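The positive and negative likelihood ratios quoted above follow from sensitivity and specificity (the pooled estimates in the abstract were combined across studies, so they differ slightly from this point calculation):

```python
def diagnostic_lrs(sens, spec):
    """LR+ = sens / (1 - spec): how much a positive scan raises the odds of disease.
       LR- = (1 - sens) / spec: how much a negative scan lowers them."""
    return sens / (1 - spec), (1 - sens) / spec

lr_pos, lr_neg = diagnostic_lrs(0.81, 0.89)
print(round(lr_pos, 1), round(lr_neg, 2))  # 7.4 0.21 (abstract pools: 7.1, 0.2)
```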