82 results for Probability Pattern comparison Evaluation and interpretation

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

This letter to the Editor comments on the article "Practical relevance of pattern uniqueness in forensic science" by P.T. Jayaprakash (Forensic Science International, in press).

Relevance: 100.00%

Abstract:

Background: Cardiac magnetic resonance (CMR) is accepted as a method to assess suspected coronary artery disease (CAD). Nonetheless, invasive coronary angiography (CXA), combined or not with fractional flow reserve (FFR), remains the main diagnostic test to evaluate CAD. Few data exist on the economic impact of the use of these procedures in a population with a low to intermediate pre-test probability. Objective: To compare the costs of three decision strategies to revascularize a patient with suspected CAD: 1) a strategy guided by CMR; 2) a hypothetical strategy guided by CXA-FFR; 3) a hypothetical strategy guided by CXA alone.

Relevance: 100.00%

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methods minimizing the l2 or l1TV norm and incorporating prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically changed in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corrMCF = 0.95, r2MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of an extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
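
For orientation, the regularized single-orientation dipole inversion that underlies such methods can be written schematically as below. This is a generic formulation rather than the exact functionals compared in the study: lambda denotes the regularization weight, W a data-fidelity weighting, M an optional edge-prior mask, and p = 2 or p = 1 for the l2 and l1TV variants, respectively.

```latex
\chi^{*} = \arg\min_{\chi}\;
\left\| W\!\left( F^{-1} D F \chi - \delta \right) \right\|_{2}^{2}
+ \lambda \left\| M \nabla \chi \right\|_{p},
\qquad
D(\mathbf{k}) = \frac{1}{3} - \frac{k_{z}^{2}}{|\mathbf{k}|^{2}}
```

Here delta is the measured field perturbation and D the unit dipole kernel in k-space; COSMOS instead combines data from multiple head orientations to condition the inversion without spatial regularization.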

Relevance: 100.00%

Abstract:

Forensic scientists working in 12 state or private laboratories participated in collaborative tests to improve the reliability of the presentation of DNA data at trial. These tests were motivated by the growing criticism of the power of DNA evidence. The experts' conclusions in the tests are presented and discussed in the context of the Bayesian approach to interpretation. The use of a Bayesian approach and subjective probabilities in trace evaluation permits any revision of the measure of uncertainty in the light of new information to be integrated into the decision procedure in an easy and intuitive manner. Such an integration is especially useful with forensic evidence. Furthermore, we believe that this probabilistic model is a useful tool (a) to assist scientists in the assessment of the value of scientific evidence, (b) to help jurists in the interpretation of judicial facts and (c) to clarify the respective roles of scientists and of members of the court. Respondents to the survey were reluctant to apply this methodology in the assessment of DNA evidence.
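
As an illustration of the kind of Bayesian updating discussed here, the sketch below multiplies prior odds on a source-level hypothesis by a likelihood ratio to obtain posterior odds (Bayes' theorem in odds form). The numbers are made up for illustration and are not values from the collaborative tests.

```python
# Minimal sketch of Bayesian updating with a likelihood ratio (illustrative numbers only).

def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Posterior odds = prior odds * likelihood ratio (Bayes' theorem in odds form)."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds: float) -> float:
    """Convert odds in favour of a hypothesis into a probability."""
    return odds / (1.0 + odds)

# Hypothetical example: prior odds of 1:1000 that the suspect is the source,
# and a DNA correspondence with an assumed likelihood ratio of 1e6.
prior_odds = 1 / 1000
lr = 1e6
posterior_odds = update_odds(prior_odds, lr)
print(f"Posterior odds: {posterior_odds:.0f}:1, "
      f"posterior probability: {odds_to_probability(posterior_odds):.4f}")
```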

Relevance: 100.00%

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales.
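
A minimal sketch of the fixed-lid idea mentioned above, under stated assumptions: flow depth is taken as the difference between a planar (fixed) water surface and the local bed elevation, and a Manning-type relation converts depth and slope into unit discharge. The roughness, slope and elevations are illustrative placeholders, not values or code from the study.

```python
import numpy as np

def fixed_lid_depth(bed_elevation: np.ndarray, water_surface_plane: np.ndarray) -> np.ndarray:
    """Estimate flow depth as (planar water surface - bed), clipped at zero for dry cells."""
    return np.maximum(water_surface_plane - bed_elevation, 0.0)

def manning_unit_discharge(depth: np.ndarray, slope: float, n: float = 0.03) -> np.ndarray:
    """Manning-type unit discharge q = (1/n) * h^(5/3) * S^(1/2) (illustrative closure)."""
    return (1.0 / n) * depth ** (5.0 / 3.0) * np.sqrt(slope)

# Hypothetical 1D bed profile under a planar water surface with a very low slope,
# as is typical of large sand-bed rivers.
x = np.linspace(0.0, 5000.0, 6)                   # distance downstream [m]
bed = np.array([2.0, 1.5, 1.8, 1.2, 1.0, 0.8])    # bed elevation [m]
surface = 3.0 - 4e-5 * x                          # planar water surface, slope = 4e-5
h = fixed_lid_depth(bed, surface)
q = manning_unit_discharge(h, slope=4e-5)
print(h.round(2), q.round(3))
```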

Relevance: 100.00%

Abstract:

This paper presents the evaluation results of the methods submitted to Challenge US: Biometric Measurements from Fetal Ultrasound Images, a segmentation challenge held at the IEEE International Symposium on Biomedical Imaging 2012. The challenge was set up to compare and evaluate current fetal ultrasound image segmentation methods. It consisted of automatically segmenting fetal anatomical structures to measure standard obstetric biometric parameters from 2D fetal ultrasound images taken on fetuses at different gestational ages (21 weeks, 28 weeks, and 33 weeks) and with varying image quality to reflect data encountered in real clinical environments. Four independent sub-challenges were proposed, according to the objects of interest measured in clinical practice: abdomen, head, femur, and whole fetus. Five teams participated in the head sub-challenge and two teams in the femur sub-challenge, including one team that tackled both. Nobody attempted the abdomen and whole fetus sub-challenges. The challenge goals were two-fold: participants were asked to submit the segmentation results as well as the measurements derived from the segmented objects. Extensive quantitative (region-based, distance-based, and Bland-Altman measurements) and qualitative evaluation was performed to compare the results from a representative selection of current methods submitted to the challenge. Several experts (three for the head sub-challenge and two for the femur sub-challenge), with different degrees of expertise, manually delineated the objects of interest to define the ground truth used within the evaluation framework. For the head sub-challenge, several groups produced results that could potentially be used in clinical settings, with performance comparable to manual delineations. The femur sub-challenge showed inferior performance to the head sub-challenge because it is a harder segmentation problem and the techniques presented relied more on the femur's appearance.
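
To make the region-based part of the evaluation concrete, the sketch below computes the Dice overlap between a segmentation and a manual delineation. Dice is a standard metric of the region-based kind mentioned above; it is shown here as an illustration rather than as the challenge's exact evaluation code.

```python
import numpy as np

def dice_coefficient(segmentation: np.ndarray, ground_truth: np.ndarray) -> float:
    """Dice overlap between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    seg = segmentation.astype(bool)
    gt = ground_truth.astype(bool)
    intersection = np.logical_and(seg, gt).sum()
    denominator = seg.sum() + gt.sum()
    return 2.0 * intersection / denominator if denominator > 0 else 1.0

# Toy example: two overlapping rectangles standing in for a fetal head delineation.
a = np.zeros((100, 100), dtype=bool); a[20:80, 20:80] = True
b = np.zeros((100, 100), dtype=bool); b[30:90, 30:90] = True
print(f"Dice = {dice_coefficient(a, b):.3f}")
```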

Relevance: 100.00%

Abstract:

Background: The public health burden of coronary artery disease (CAD) is important. Perfusion cardiac magnetic resonance (CMR) is generally accepted to detect and monitor CAD. Few studies have so far addressed its costs and cost-effectiveness. Objectives: To compare, in a large CMR registry, the costs of a CMR-guided strategy vs two hypothetical invasive strategies for the diagnosis and the treatment of patients with suspected CAD. Methods: In 3,647 patients with suspected CAD included prospectively in the EuroCMR Registry (59 centers; 18 countries), costs were calculated for diagnostic examinations, revascularizations, and complication management over a 1-year follow-up. Patients with ischemia-positive CMR underwent an invasive X-ray coronary angiography (CXA) and revascularization at the discretion of the treating physician (=CMR+CXA strategy). Ischemia was found in 20.9% of patients, and 17.4% of them were revascularized. In patients with ischemia-negative CMR, cardiac death and non-fatal myocardial infarction occurred at a rate of 0.38%/y. In a hypothetical invasive arm, the costs were calculated for an initial CXA followed by FFR testing in vessels with ≥50% diameter stenoses (=CXA+FFR strategy). To model this hypothetical arm, the same proportion of ischemic patients and the same outcome were assumed as for the CMR+CXA strategy. The coronary stenosis-FFR relationship reported in the literature was used to derive the proportion of patients with ≥50% diameter stenoses (Psten) in the study cohort. The costs of a CXA-only strategy were also calculated. Calculations were performed from a third-payer perspective for the German, UK, Swiss, and US healthcare systems.
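
A minimal sketch of the kind of per-patient decision-tree cost calculation described, using the ischemia and revascularization proportions reported above but entirely hypothetical unit costs and stenosis proportion (all placeholder values, not tariffs from any of the four healthcare systems):

```python
# Sketch of a per-patient expected-cost comparison for two diagnostic strategies.
# Proportions follow the registry description; all unit costs are hypothetical placeholders.

p_ischemia = 0.209            # ischemia-positive CMR (reported proportion)
p_revasc_if_ischemia = 0.174  # revascularized among ischemia-positive patients (reported)
p_stenosis = 0.40             # assumed proportion with >=50% stenoses in the CXA+FFR arm (placeholder)

cost_cmr, cost_cxa, cost_ffr, cost_revasc = 800.0, 2500.0, 700.0, 12000.0  # placeholder costs

# CMR-guided strategy: everyone gets CMR; only ischemia-positive patients go on to CXA,
# and a fraction of those are revascularized.
cost_cmr_cxa = (cost_cmr
                + p_ischemia * cost_cxa
                + p_ischemia * p_revasc_if_ischemia * cost_revasc)

# Hypothetical invasive strategy: everyone gets CXA; FFR is added in vessels with >=50% stenoses;
# the same proportion of patients ends up revascularized.
cost_cxa_ffr = (cost_cxa
                + p_stenosis * cost_ffr
                + p_ischemia * p_revasc_if_ischemia * cost_revasc)

print(f"CMR+CXA strategy: {cost_cmr_cxa:.0f} per patient")
print(f"CXA+FFR strategy: {cost_cxa_ffr:.0f} per patient")
```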

Relevance: 100.00%

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- includes self-contained introductions to probability and decision theory;
- develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models;
- features implementation of the methodology with reference to commercial and academically available software;
- presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases;
- provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning;
- contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them;
- is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background;
- includes a foreword by Ian Evett.

The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
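
As a minimal sketch of the inference such networks perform, the toy two-node network below (a source-level hypothesis H with a single reported correspondence E) computes a posterior over H by enumeration; the probabilities are illustrative only and are not taken from the book.

```python
# Toy two-node Bayesian network: H (source hypothesis) -> E (reported correspondence).
# All probabilities are illustrative placeholders.

prior_H = {"suspect": 0.5, "other": 0.5}          # P(H)
p_E_given_H = {"suspect": 0.99, "other": 0.001}   # P(E = reported | H)

def posterior_given_E(prior: dict, likelihood: dict) -> dict:
    """Posterior P(H | E) by enumeration: normalize P(E | H) * P(H) over the states of H."""
    joint = {h: likelihood[h] * prior[h] for h in prior}
    evidence = sum(joint.values())                 # P(E)
    return {h: joint[h] / evidence for h in joint}

posterior = posterior_given_E(prior_H, p_E_given_H)
print({h: round(p, 4) for h, p in posterior.items()})
# With these numbers: P(suspect | E) is about 0.999 and P(other | E) about 0.001.
```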

Relevance: 100.00%

Abstract:

Fecal calprotectin is a small protein released mainly by neutrophils. It is recognized as a reliable, easy and non-invasive biomarker of gastro-intestinal inflammation. Normal values vary with age, with higher cut-off values during the first year of life (<350 µg/g) than in children (<275 µg/g) or adults (<50 µg/g). Fecal calprotectin can be a useful tool in the initial evaluation of recurrent abdominal pain, helping to distinguish between functional gastro-intestinal disorders, where it is normal, and inflammatory bowel disease (IBD). It is not a specific marker of IBD but is increased in other situations of gastro-intestinal inflammation. In patients with IBD, fecal calprotectin is used to monitor treatment response.

Relevance: 100.00%

Abstract:

This study was commissioned by the European Committee on Crime Problems at the Council of Europe to describe and discuss the standards used to assess the admissibility and appraisal of scientific evidence in various member countries. After documenting cases in which faulty forensic evidence seems to have played a critical role, the authors describe the legal foundations of the issues of admissibility and assessment of probative value in the field of scientific evidence, contrasting criminal justice systems of accusatorial and inquisitorial tradition and the various risks that they pose in terms of equality of arms. Special attention is given to communication issues between lawyers and scientific experts. The authors finally investigate possible ways of improving the system. Among these mechanisms, emphasis is put on the adoption of a common terminology for expressing the weight of evidence. It is also proposed to adopt a harmonized interpretation framework among forensic experts, rooted in good practices of logical inference.

Relevance: 100.00%

Abstract:

This paper reports on the purpose, design, methodology and target audience of E-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, interpretation of raw data and communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses focussed on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. In order to help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in the current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of independence and self-confidence to solve practical inference problems. E-learning was chosen as a general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and fields of work experience, such as reporting officers, (chief) scientists and forensic coordinators, but also lawyers, who can all interact directly from their personal workplaces without consideration of distances, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, but also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view with regard to interpretation across forensic disciplines, laboratories and judicial systems.

Relevance: 100.00%

Abstract:

At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and rightly so in our view, but they run against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory, and with the view according to which this theory is normative.

Relevance: 100.00%

Abstract:

Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.

Relevance: 100.00%

Abstract:

Among the numerous clinical syndromes observed after severe traumatic head injury, post-traumatic mutism is a disorder rarely reported in adults and not studied in any detail in children. We report seven children between the ages of 3 1/2 and 14 years who sustained severe head injury and developed post-traumatic mutism. We aim to give a precise clinical characterization of this disorder, discuss differential diagnosis and correlations with brain imaging, and suggest its probable neurological substrate. After a coma lasting from 5 to 25 days, the seven patients who suffered from post-traumatic mutism went through a period of total absence of verbal production lasting from 5 to 94 days, associated with the recovery of non-verbal communication skills and emotional vocalization. During the first days after the recovery of speech, all patients were able to produce correct small sentences with a hypophonic and monotonous voice, moderate dysarthria, and word-finding difficulties, but with no signs of aphasia and with preserved oral comprehension. The neurological signs in the acute phase (third cranial nerve paresis in three of seven patients, signs of autonomic dysfunction in five of seven patients), the results of the brain imaging and the experimental animal data all suggest that mesencephalic structures play a key role in the aetiology of post-traumatic mutism.