75 results for Native Vegetation Condition, Benchmarking, Bayesian Decision Framework, Regression, Indicators


Relevance:

100.00%

Publisher:

Abstract:

Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here applies methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information or the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper will emphasise the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.
The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
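The central decision-theoretic quantity named in this abstract, the expected value of sample information (EVSI), can be sketched in a few lines. The following is a minimal illustration with invented prior, losses and threshold, not the paper's worked examples: a Beta prior on the proportion of a seizure containing an illegal substance, a two-action declaration against a limiting value, and the expected reduction in loss from inspecting n items.

```python
from math import comb, exp, lgamma

# Illustrative numbers, not taken from the paper
a, b = 1, 1                # uniform Beta prior (integer parameters assumed)
threshold = 0.5            # limiting value on the proportion theta
L_over = 10.0              # loss for declaring 'over' when theta < threshold
L_under = 1.0              # loss for declaring 'under' when theta >= threshold

def prob_over(a, b, t):
    """P(theta >= t) under Beta(a, b) for integer a, b (binomial identity)."""
    n = a + b - 1
    return sum(comb(n, j) * t**j * (1 - t) ** (n - j) for j in range(a))

def expected_loss(a, b):
    """Expected loss of the optimal declaration under Beta(a, b)."""
    p = prob_over(a, b, threshold)
    return min((1 - p) * L_over, p * L_under)

def betaln(x, y):
    return lgamma(x) + lgamma(y) - lgamma(x + y)

def evsi(n):
    """Prior expected loss minus preposterior expected loss after n items."""
    preposterior = 0.0
    for k in range(n + 1):   # k = items found to contain the substance
        # Beta-binomial predictive probability of observing k out of n
        p_k = comb(n, k) * exp(betaln(a + k, b + n - k) - betaln(a, b))
        preposterior += p_k * expected_loss(a + k, b + n - k)
    return expected_loss(a, b) - preposterior
```

With these illustrative losses, evsi(n) is zero for very small n (no outcome could change the optimal declaration) and then grows with n; this is exactly the kind of quantity that the decision trees and Bayesian decision networks discussed in the paper make explicit.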

Relevance:

100.00%

Publisher:

Abstract:

This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses that are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples, and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in blood from car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In a first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. The model is then used, in a second part, to approach in a rational way the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/L?'. This is illustrated through a practical example.
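The two parts described here, Bayesian estimation of the concentration followed by a threshold decision, can be sketched with a conjugate normal model. All numbers below (measurements, analytical error, prior, losses) are invented for illustration; only the 1.5 μg/L threshold comes from the abstract.

```python
from math import erf, sqrt

measurements = [1.8, 1.6, 1.9]   # hypothetical replicate analyses, ug/L
sigma = 0.2                      # assumed known analytical s.d.
prior_mean, prior_sd = 1.0, 2.0  # weak prior on the true concentration theta
threshold = 1.5                  # legal threshold from the abstract, ug/L

# Conjugate normal update of the posterior on theta
n = len(measurements)
prec = 1 / prior_sd**2 + n / sigma**2
post_mean = (prior_mean / prior_sd**2 + sum(measurements) / sigma**2) / prec
post_sd = sqrt(1 / prec)

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

# Posterior probability that the true quantity exceeds the legal threshold
p_over = 1 - norm_cdf((threshold - post_mean) / post_sd)

# Decision component: losses for a wrong 'over' / wrong 'under' report
L_wrong_over, L_wrong_under = 10.0, 1.0
report_over = (1 - p_over) * L_wrong_over < p_over * L_wrong_under
```

The point of the decision step is that the report is not read off a point estimate against the threshold, but from the posterior probability weighted by the consequences of each kind of error.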

Relevance:

100.00%

Publisher:

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:

- Includes self-contained introductions to probability and decision theory.
- Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models.
- Features implementation of the methodology with reference to commercially and academically available software.
- Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases.
- Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning.
- Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them.
- Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background.
- Includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.

Relevance:

100.00%

Publisher:

Abstract:

What genotype should the scientist specify for conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? When the scientist answers this question, he or she makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering this question for one locus. This framework combines the probability distribution describing the uncertainty over the trace donor's possible genotypes with a loss function describing the scientist's preferences concerning the false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) the case of observing one peak for allele x_i on a single electropherogram, and (2) the case of observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points at which the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and in all laboratories.
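Case (1) can be sketched with a deliberately simplified model. Everything below (the logistic drop-out curve, the genotype priors, the losses, and the wildcard designation written as 'x_i/F') is invented for illustration, not the paper's model: one peak is observed for allele x_i, and the donor is either homozygous x_i/x_i or heterozygous x_i/x_j with x_j having dropped out.

```python
from math import exp

prior_hom, prior_het = 0.3, 0.7     # illustrative genotype priors

def p_dropout(h):
    """Drop-out probability as a decreasing logistic function of peak height h
    (an assumed form; the paper derives it from observed peak heights)."""
    return 1 / (1 + exp(0.02 * (h - 150)))

def designate(h, L_false_excl=10.0, L_false_incl=1.0):
    d = p_dropout(h)
    # Posterior weight of each explanation of the single observed peak
    w_hom = prior_hom * (1 - d ** 2)   # at least one x_i copy amplified
    w_het = prior_het * d * (1 - d)    # x_j dropped out, x_i did not
    p_hom = w_hom / (w_hom + w_het)
    # A homozygote designation risks falsely excluding a heterozygous donor;
    # a wildcard designation risks false inclusions from a broader search.
    el_hom = (1 - p_hom) * L_false_excl
    el_wild = p_hom * L_false_incl
    return "x_i/x_i" if el_hom < el_wild else "x_i/F"
```

A tall, clean peak favours the homozygote designation and a low peak favours the wildcard, and the height at which the designation switches moves with the losses and priors, mirroring the paper's conclusion that no single threshold fits all alleles, loci and laboratories.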

Relevance:

100.00%

Publisher:

Abstract:

A recent publication in this journal [Neumann et al., Forensic Sci. Int. 212 (2011) 32-46] presented the results of a field study that revealed the data provided by fingermarks not processed in a forensic science laboratory. In their study, the authors were interested in the usefulness of these additional data in order to determine whether such fingermarks would have been worth submitting to the fingermark processing workflow. Taking these ideas as a starting point, this communication places the fingermark in the context of a case brought before a court, and examines the question of processing or not processing a fingermark from a decision-theoretic point of view. The decision-theoretic framework presented provides an answer to this question in the form of a quantified expression of the expected value of information (EVOI) associated with the processed fingermark, which can then be compared with the cost of processing the mark.
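The EVOI comparison described here can be sketched as a preposterior analysis: compare the expected loss of the best decision made on the prior alone with the expected loss after a (hypothetical) comparison outcome, and process the mark only if the difference exceeds the processing cost. All probabilities, losses and costs below are invented, not the paper's values.

```python
prior_source = 0.5          # prior probability the suspect left the mark
# Assumed performance of the comparison if the mark is processed
p_match_if_source = 0.95
p_match_if_not = 0.001
L_false_link = 100.0        # loss of acting on a wrong association
L_missed_link = 10.0        # loss of missing a true association
cost_processing = 1.0       # cost of running the mark through the workflow

def expected_loss(p):
    """Expected loss of the optimal downstream decision at probability p."""
    return min((1 - p) * L_false_link, p * L_missed_link)

# Without processing: decide on the prior alone
el_prior = expected_loss(prior_source)

# With processing: average the optimal posterior losses over both outcomes
p_match = prior_source * p_match_if_source + (1 - prior_source) * p_match_if_not
post_match = prior_source * p_match_if_source / p_match
post_no_match = prior_source * (1 - p_match_if_source) / (1 - p_match)
el_with_info = (p_match * expected_loss(post_match)
                + (1 - p_match) * expected_loss(post_no_match))

evoi = el_prior - el_with_info          # expected value of the information
process_the_mark = evoi > cost_processing
```

The information has value precisely because its possible outcomes can change the optimal downstream decision; if no outcome could change it, evoi would be zero and processing would never be worth its cost.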

Relevance:

100.00%

Publisher:

Abstract:

This paper applies probability and decision theory, within the graphical interface of an influence diagram, to study the formal requirements of rationality that justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decisions of 'individualizing' and 'not individualizing'. They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in the specialized literature, which suggest points of view that are opposed to the results of our study.
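For the two-action problem the abstract describes, the rational rule reduces to a probability threshold fixed by the loss ratio. The sketch below uses illustrative losses, not the paper's:

```python
def individualize(p_source, loss_false_ind=1000.0, loss_missed_ind=1.0):
    """Individualize only if doing so has the lower expected loss."""
    el_individualize = (1 - p_source) * loss_false_ind   # risk of a false link
    el_refrain = p_source * loss_missed_ind              # risk of a missed link
    return el_individualize < el_refrain

# Equivalent threshold: individualize iff
#   p_source > L_false / (L_false + L_missed)
threshold = 1000.0 / (1000.0 + 1.0)
```

This makes the abstract's two ingredients concrete: the probability of the state of nature (p_source) and the decision maker's loss function jointly determine the decision, and neither alone suffices.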

Relevance:

100.00%

Publisher:

Abstract:

What determines the share of public employment, at a given size of the State, in countries with similar levels of economic development? While the theoretical and empirical literature on this issue has mostly considered technical dimensions (efficiency and political considerations), this paper emphasizes the role of culture and quantifies it. We build a representative database of contracting choices of municipalities in Switzerland and exploit the discontinuity at the Swiss language border, where the actual set of policies and institutions is identical, to analyze the causal effect of culture on the choice of how public services are provided. We find that French-speaking border municipalities are 50% less likely to contract with the private sector than their German-speaking adjacent municipalities. The effects of technical dimensions are much smaller by comparison. This result indicates that culture is a source of potential bias that distorts the optimal choice of public service delivery. Systematic differences in the level of confidence in public administration and private companies potentially explain this discrepancy in private sector participation in public services provision.

Relevance:

100.00%

Publisher:

Abstract:

Species distribution models (SDMs) are increasingly proposed to support conservation decision making. However, evidence of SDMs supporting solutions for on-ground conservation problems is still scarce in the scientific literature. Here, we show that successful examples exist but are still largely hidden in the grey literature, and thus less accessible for analysis and learning. Furthermore, the decision framework within which SDMs are used is rarely made explicit. Using case studies from biological invasions, identification of critical habitats, reserve selection and translocation of endangered species, we propose that SDMs may be tailored to suit a range of decision-making contexts when used within a structured and transparent decision-making process. To construct appropriate SDMs to more effectively guide conservation actions, modellers need to better understand the decision process, and decision makers need to provide feedback to modellers regarding the actual use of SDMs to support conservation decisions. This could be facilitated by individuals or institutions playing the role of 'translators' between modellers and decision makers. We encourage species distribution modellers to get involved in real decision-making processes that will benefit from their technical input; this strategy has the potential to better bridge theory and practice, and to contribute to improving both scientific knowledge and conservation outcomes.

Relevance:

100.00%

Publisher:

Abstract:

Mathematical methods combined with measurements of single-cell dynamics provide a means to reconstruct intracellular processes that are only partly or indirectly accessible experimentally. To obtain reliable reconstructions, the pooling of measurements from several cells of a clonal population is mandatory. However, cell-to-cell variability originating from diverse sources poses computational challenges for such process reconstruction. We introduce a scalable Bayesian inference framework that properly accounts for population heterogeneity. The method allows inference of inaccessible molecular states and kinetic parameters; computation of Bayes factors for model selection; and dissection of intrinsic, extrinsic and technical noise. We show how additional single-cell readouts such as morphological features can be included in the analysis. We use the method to reconstruct the expression dynamics of a gene under an inducible promoter in yeast from time-lapse microscopy data.

Relevance:

100.00%

Publisher:

Abstract:

Substantial investment in climate change research has led to dire predictions of the impacts and risks to biodiversity. The Intergovernmental Panel on Climate Change Fourth Assessment Report [1] cites 28,586 studies demonstrating significant biological changes in terrestrial systems [2]. Already high extinction rates, driven primarily by habitat loss, are predicted to increase under climate change [3-6]. Yet there is little specific advice or precedent in the literature to guide climate adaptation investment for conserving biodiversity within realistic economic constraints [7]. Here we present a systematic ecological and economic analysis of a climate adaptation problem in one of the world's most species-rich and threatened ecosystems: the South African fynbos. We discover a counterintuitive optimal investment strategy that switches twice between options as the available adaptation budget increases. We demonstrate that optimal investment is nonlinearly dependent on available resources, making the choice of how much to invest as important as determining where to invest and what actions to take. Our study emphasizes the importance of a sound analytical framework for prioritizing adaptation investments [4]. Integrating ecological predictions in an economic decision framework will help support complex choices between adaptation options under severe uncertainty. Our prioritization method can be applied at any scale to minimize species loss and to evaluate the robustness of decisions to uncertainty about key assumptions.
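The budget-dependent switching the abstract reports can be reproduced in miniature. The three options and their benefit curves below are entirely invented, not the fynbos analysis; they merely show how the argmax over options can change more than once as the budget grows.

```python
from math import sqrt

# Hypothetical adaptation options: expected benefit as a function of budget b
options = {
    "land_acquisition": lambda b: 4 * sqrt(b),               # strong early on
    "fire_management": lambda b: 30 * (1 - 2 ** (-b / 5)),   # saturates at 30
    "ex_situ_conservation": lambda b: 0.9 * b,               # linear in budget
}

def best_option(budget):
    """Option with the highest expected benefit at the given budget."""
    return max(options, key=lambda name: options[name](budget))
```

With these curves the optimum switches twice as the budget increases (land acquisition, then fire management, then ex situ conservation), so the choice of how much to invest indeed changes where to invest, which is the nonlinearity the study emphasizes.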

Relevance:

100.00%

Publisher:

Abstract:

Advances in flow cytometry and other single-cell technologies have enabled high-dimensional, high-throughput measurements of individual cells as well as the interrogation of cell population heterogeneity. However, in many instances, computational tools to analyze the wealth of data generated by these technologies are lacking. Here, we present a computational framework for unbiased combinatorial polyfunctionality analysis of antigen-specific T-cell subsets (COMPASS). COMPASS uses a Bayesian hierarchical framework to model all observed cell subsets and select those most likely to have antigen-specific responses. Cell-subset responses are quantified by posterior probabilities, and human subject-level responses are quantified by two summary statistics that describe the quality of an individual's polyfunctional response and can be correlated directly with clinical outcome. Using three clinical data sets of cytokine production, we demonstrate how COMPASS improves characterization of antigen-specific T cells and reveals cellular 'correlates of protection/immunity' in the RV144 HIV vaccine efficacy trial that are missed by other methods. COMPASS is available as open-source software.
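This is not the COMPASS model itself, but its core question per cell subset can be sketched with a minimal Bayesian comparison: is the cytokine-positive proportion higher in the stimulated sample than in the unstimulated control? The counts below are invented, and the uniform-prior Beta posteriors are a deliberate simplification of the paper's hierarchical model.

```python
import random

n_stim, k_stim = 1000, 30   # cells assayed / cytokine-positive, stimulated
n_ctrl, k_ctrl = 1000, 5    # same for the unstimulated control

def prob_response(seed=0, draws=20000):
    """Monte Carlo estimate of P(p_stim > p_ctrl) under Beta(1+k, 1+n-k)
    posteriors for the two proportions (uniform priors on each)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        p_s = rng.betavariate(1 + k_stim, 1 + n_stim - k_stim)
        p_c = rng.betavariate(1 + k_ctrl, 1 + n_ctrl - k_ctrl)
        hits += p_s > p_c
    return hits / draws

posterior_prob = prob_response()
```

COMPASS goes well beyond this by modelling all observed subsets jointly and sharing information across them, but the output has the same form: a posterior probability of antigen-specific response per subset.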

Relevance:

100.00%

Publisher:

Abstract:

The history of biodiversity is characterized by a continual replacement of branches in the tree of life. The rise and demise of these branches (clades) are ultimately determined by changes in speciation and extinction rates, often interpreted as a response to varying abiotic and biotic factors. However, understanding the relative importance of these factors remains a major challenge in evolutionary biology. Here we analyze the rich North American fossil record of the dog family Canidae and of other carnivores to tease apart the roles of competition, body size evolution, and climate change on the sequential replacement of three canid subfamilies (two of which have gone extinct). We develop a novel Bayesian analytic framework to show that competition from multiple carnivore clades successively drove the demise and replacement of the two extinct canid subfamilies by increasing their extinction rates and suppressing their speciation. Competitive effects have likely come from ecologically similar species from both canid and felid clades. These results imply that competition among entire clades, generally considered a rare process, can play a more substantial role than climate change and body size evolution in determining the sequential rise and decline of clades.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: Despite growing interest in measurement of health care quality and patient experience, the current evidence base largely derives from adult health settings, at least in part because of the absence of appropriately developed measurement tools for adolescents. To rectify this, we set out to develop a conceptual framework and a set of indicators to measure the quality of health care delivered to adolescents in hospital. METHODS: A conceptual framework was developed from the following four elements: (1) a review of the evidence around what young people perceive as "adolescent-friendly" health care; (2) an exploration with adolescent patients of the principles of patient-centered care; (3) a scoping review to identify core clinical practices around working with adolescents; and (4) a scoping review of existing conceptual frameworks. Using criteria for indicator development, we then developed a set of indicators that mapped to this framework. RESULTS: Embedded within the notion of patient- and family-centered care, the conceptual framework for adolescent-friendly health care (quality health care for adolescents) was based on the constructs of experience of care (positive engagement with health care) and evidence-informed care. A set of 14 indicators was developed, half of which related to adolescents' and parents' experience of care and half of which related to aspects of evidence-informed care. CONCLUSIONS: The conceptual framework and indicators of quality health care for adolescents set the stage to develop measures to populate these indicators, the next step in the agenda of improving the quality of health care delivered to adolescents in hospital settings.

Relevance:

40.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold-standard TDM approach but require computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the tools varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates, such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
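The a posteriori (Bayesian) dosage adjustment these tools perform can be sketched in miniature. The toy below uses a one-compartment steady-state model and entirely invented population parameters and observations; it is not any benchmarked program's model. A measured concentration updates a lognormal prior on the patient's clearance, and the MAP estimate then drives the dose proposal.

```python
from math import log

pop_cl_mean, pop_cl_cv = 5.0, 0.3   # population clearance (L/h) and CV
sigma_resid = 0.15                  # residual error s.d. on the log scale
dose, tau = 1000.0, 12.0            # current dose (mg), interval (h)
observed_css = 21.0                 # measured average steady-state conc (mg/L)
target_css = 15.0                   # therapeutic target (mg/L)

def css(cl):
    """Average steady-state concentration for clearance cl (L/h)."""
    return dose / (cl * tau)

def neg_log_post(cl):
    """Lognormal prior on CL plus lognormal residual error on the
    observed concentration (up to additive constants)."""
    prior = (log(cl) - log(pop_cl_mean)) ** 2 / (2 * pop_cl_cv ** 2)
    lik = (log(observed_css) - log(css(cl))) ** 2 / (2 * sigma_resid ** 2)
    return prior + lik

# MAP estimate of clearance on a simple grid from 2 to 12 L/h
grid = [2 + i * 0.01 for i in range(1001)]
cl_map = min(grid, key=neg_log_post)

# a posteriori dose proposal to reach the target concentration
new_dose = target_css * cl_map * tau
```

Because the measured concentration is above the target, the Bayesian update pulls the clearance estimate below the population mean and the proposed dose below the current one, which is the basic behaviour a clinician expects from a posteriori adjustment.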