920 results for Bayesian statistical decision theory


Relevance:

100.00%

Publisher:

Abstract:

A recent publication in this journal [Neumann et al., Forensic Sci. Int. 212 (2011) 32-46] presented the results of a field study on the data provided by fingermarks that are not processed in a forensic science laboratory. The authors were interested in whether this additional data would have made such fingermarks worth submitting to the fingermark processing workflow. Taking these ideas as a starting point, this communication places the fingermark in the context of a case brought before a court, and examines the question of whether or not to process a fingermark from a decision-theoretic point of view. The decision-theoretic framework presented answers this question with a quantified expression of the expected value of information (EVOI) associated with the processed fingermark, which can then be compared with the cost of processing the mark.
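
The EVOI comparison can be sketched numerically. The probability of an identification, the utilities and the processing cost below are invented for illustration, not values from the paper:

```python
# Hedged sketch of the EVOI decision rule; all numbers are assumptions.

def expected_value(probs, utilities):
    # Expected utility of acting once the processing outcome is known.
    return sum(p * u for p, u in zip(probs, utilities))

p_ident = 0.3                 # assumed probability processing yields an identification
u_ident, u_none = 100.0, 0.0  # assumed utilities (arbitrary units)

value_without = 0.0           # baseline: an unprocessed mark contributes nothing
value_with = expected_value([p_ident, 1 - p_ident], [u_ident, u_none])

evoi = value_with - value_without
cost_of_processing = 20.0     # assumed laboratory cost, same units

# Process the mark only if the expected value of information exceeds its cost.
decision = "process" if evoi > cost_of_processing else "do not process"
print(evoi, decision)
```

The rule is the paper's core idea: the quantified EVOI is directly comparable with the cost of processing.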

Relevance:

100.00%

Publisher:

Abstract:

This article describes an approach to optimal design of phase II clinical trials using Bayesian decision theory. The method proposed extends that suggested by Stallard (1998, Biometrics 54, 279–294), in which designs were obtained to maximize a gain function including the cost of drug development and the benefit from a successful therapy. Here, the approach is extended by the consideration of other potential therapies, the development of which is competing for the same limited resources. The resulting optimal designs are shown to have frequentist properties much more similar to those traditionally used in phase II trials.
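
The gain-function idea can be sketched as a search over sample sizes. The toy `power` curve, benefit and per-patient cost below are assumptions for illustration, not the paper's gain function:

```python
# Hedged sketch: pick the phase II sample size n maximizing expected
# benefit minus development cost.

def power(n):
    # Stand-in for the probability the trial correctly advances an effective
    # therapy: increases with n, with diminishing returns (assumed form).
    return 1 - 0.5 ** (n / 20)

def expected_gain(n, benefit=1000.0, cost_per_patient=2.0):
    return benefit * power(n) - cost_per_patient * n

# Search candidate sample sizes and keep the one with the highest gain.
best_n = max(range(10, 201, 10), key=expected_gain)
print(best_n, round(expected_gain(best_n), 1))
```

Competing therapies, as in the article, would enter by making `benefit` depend on the resources left for other candidates.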

Relevance:

100.00%

Publisher:

Abstract:

Most statistical analysis, in theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised.

When data are gathered sequentially, dynamic interim monitoring may be useful, as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme to capture the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application using small-sample, repeated-measures, normally distributed growth-curve data is presented.
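
The contrast between static and dynamic models can be sketched with a minimal one-parameter dynamic linear model, in which the parameter follows a random walk and the Bayesian update is applied observation by observation. The observation and evolution variances below are assumed tuning values, not the dissertation's model:

```python
# Hedged sketch of sequential Bayesian updating for a time-varying mean.

def dynamic_update(obs, m0=0.0, c0=1.0, v=1.0, w=0.5):
    # theta_t = theta_{t-1} + evolution noise (variance w); y_t ~ N(theta_t, v).
    m, c = m0, c0
    means = []
    for y in obs:
        r = c + w                # evolution: prior variance inflates over time
        k = r / (r + v)          # adaptive gain: weight on the new observation
        m = m + k * (y - m)      # posterior mean tracks recent data
        c = (1 - k) * r
        means.append(round(m, 3))
    return means

# A drifting series: a static (pooled) mean would lag; the dynamic model adapts.
print(dynamic_update([1.0, 1.0, 3.0, 3.0]))
```

A static model would estimate one fixed mean for all four observations; the dynamic posterior mean instead moves toward the later, larger values.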


Relevance:

100.00%

Publisher:

Abstract:

Neural networks are statistical models and learning rules are estimators. In this paper a theory for measuring generalisation is developed by combining Bayesian decision theory with information geometry. The performance of an estimator is measured by the information divergence between the true distribution and the estimate, averaged over the Bayesian posterior. This unifies the majority of error measures currently in use. The optimal estimators also reveal some intricate interrelationships among information geometry, Banach spaces and sufficient statistics.
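
The performance measure described above can be sketched for discrete distributions: the information (KL) divergence from the true distribution to the estimate, averaged over a (here discretized) posterior. The posterior weights and distributions are invented for illustration:

```python
# Hedged sketch of a posterior-averaged information-divergence error measure.

from math import log

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions.
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Posterior weight on each candidate "true" distribution (assumptions).
posterior = [(0.7, [0.5, 0.5]), (0.3, [0.8, 0.2])]
estimate = [0.6, 0.4]           # the estimator's output distribution

# Generalisation error: posterior-averaged divergence from truth to estimate.
risk = sum(w * kl(p, estimate) for w, p in posterior)
print(round(risk, 4))
```

An optimal estimator in this framework would minimize `risk` over the choice of `estimate`.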

Relevance:

100.00%

Publisher:

Abstract:

Bayesian decision theory is increasingly applied to support decision-making processes under environmental variability and uncertainty. Researchers from application areas like psychology and biomedicine have applied these techniques successfully. However, in the area of software engineering and specifically in the area of self-adaptive systems (SASs), little progress has been made in the application of Bayesian decision theory. We believe that techniques based on Bayesian Networks (BNs) are useful for systems that dynamically adapt themselves at runtime to a changing environment, which is usually uncertain. In this paper, we discuss the case for the use of BNs, specifically Dynamic Decision Networks (DDNs), to support the decision-making of self-adaptive systems. We present how such a probabilistic model can be used to support the decision making in SASs and justify its applicability. We have applied our DDN-based approach to the case of an adaptive remote data mirroring system. We discuss results, implications and potential benefits of the DDN to enhance the development and operation of self-adaptive systems, by providing mechanisms to cope with uncertainty and automatically make the best decision.
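
The decision step of such a network can be sketched as maximum-expected-utility selection over a belief state. The states, actions and utilities below are hypothetical, not taken from the paper's remote data-mirroring case study:

```python
# Hedged sketch of the DDN decision node: pick the adaptation action with
# maximum expected utility under the current belief.

belief = {"link_ok": 0.8, "link_degraded": 0.2}   # current belief state

utility = {  # assumed utility of each (action, state) pair
    ("keep_topology", "link_ok"): 10.0,
    ("keep_topology", "link_degraded"): -20.0,
    ("switch_topology", "link_ok"): 5.0,
    ("switch_topology", "link_degraded"): 4.0,
}

def expected_utility(action):
    return sum(p * utility[(action, s)] for s, p in belief.items())

best = max(("keep_topology", "switch_topology"), key=expected_utility)
print(best, expected_utility(best))
```

In a full DDN the belief would itself be updated over time from runtime monitoring evidence before each decision.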

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work is to report on the development of a multi-criteria methodology to support the assessment and selection of an Information System (IS) framework in a business context. The objective is to select a technological partner that provides the engine to be the basis for the development of a customized application for shrinkage reduction in supply chain management. Furthermore, the proposed methodology differs from most of those previously proposed in the sense that 1) it provides the decision makers with a set of pre-defined criteria, along with their description and suggestions on how to measure them, and 2) it uses a continuous scale with two reference levels, so that no normalization of the valuations is required. The methodology proposed here has been designed to be easy to understand and use, without the specific support of a decision-making analyst.
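
The scoring scheme can be sketched as a weighted sum over criteria valued on one common scale anchored by two reference levels, so no rescaling is needed. The criteria, weights and the two candidate frameworks are illustrative assumptions:

```python
# Hedged sketch of multi-criteria scoring with two reference levels.

NEUTRAL, GOOD = 0.0, 100.0  # the two reference levels of the common scale

weights = {"cost": 0.4, "scalability": 0.35, "support": 0.25}

candidates = {  # hypothetical valuations, all on the common scale
    "framework_A": {"cost": 80.0, "scalability": 60.0, "support": 50.0},
    "framework_B": {"cost": 40.0, "scalability": 90.0, "support": 70.0},
}

def overall(scores):
    # Weighted sum; valuations share one scale, so no normalization step.
    return sum(weights[c] * v for c, v in scores.items())

ranking = sorted(candidates, key=lambda k: overall(candidates[k]), reverse=True)
print(ranking)
```

Because every criterion is read against the same `NEUTRAL`/`GOOD` anchors, scores are comparable across criteria without per-criterion normalization.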

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The purpose of this study was to assess decision making in patients with multiple sclerosis (MS) at the earliest clinically detectable time point of the disease. METHODS: Patients with definite MS (n = 109) or with clinically isolated syndrome (CIS, n = 56), a disease duration of 3 months to 5 years, and no or only minor neurological impairment (Expanded Disability Status Scale [EDSS] score 0-2.5) were compared to 50 healthy controls using the Iowa Gambling Task (IGT). RESULTS: The performance of definite MS patients, CIS patients, and controls was comparable for the two main outcomes of the IGT (learning index: p = 0.7; total score: p = 0.6). The IGT learning index was influenced by educational level and the co-occurrence of minor depression. CIS and MS patients who developed a relapse during an observation period of 15 months from IGT testing demonstrated a lower learning index than patients who had no exacerbation (p = 0.02). When controlling for age, gender and education, the difference between relapsing and non-relapsing patients was at the limit of significance (p = 0.06). CONCLUSION: Decision making in a task mimicking real-life decisions is generally preserved in early MS patients as compared to controls. A possible effect of MS relapse activity on decision-making ability in the early phase of MS is also suspected.

Relevance:

100.00%

Publisher:

Abstract:

Compositional random vectors are fundamental tools in the Bayesian analysis of categorical data. Many of the issues that are discussed with reference to the statistical analysis of compositional data have a natural counterpart in the construction of a Bayesian statistical model for categorical data. This note builds on the idea of cross-fertilization of the two areas recommended by Aitchison (1986) in his seminal book on compositional data. Particular emphasis is put on the problem of what parameterization to use.
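
One parameterization choice for compositional data can be sketched with the additive log-ratio (alr) transform, which maps a composition (a probability vector) to an unconstrained vector and back. The example composition is arbitrary:

```python
# Hedged sketch of the additive log-ratio parameterization.

from math import exp, log

def alr(p):
    # Log-ratios of each component against the last; p must sum to 1.
    return [log(pi / p[-1]) for pi in p[:-1]]

def alr_inv(z):
    # Inverse transform: softmax with the last component as reference.
    expz = [exp(zi) for zi in z] + [1.0]
    total = sum(expz)
    return [e / total for e in expz]

p = [0.2, 0.3, 0.5]
back = alr_inv(alr(p))
print([round(x, 6) for x in back])  # recovers the original composition
```

Working in the unconstrained `z` space sidesteps the sum-to-one constraint when building a Bayesian model for the composition.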

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes to estimate the covariance matrix of stock returns by an optimally weighted average of two existing estimators: the sample covariance matrix and the single-index covariance matrix. This method is generally known as shrinkage, and it is standard in decision theory and in empirical Bayesian statistics. Our shrinkage estimator can be seen as a way to account for extra-market covariance without having to specify an arbitrary multi-factor structure. For NYSE and AMEX stock returns from 1972 to 1995, it can be used to select portfolios with significantly lower out-of-sample variance than a set of existing estimators, including multi-factor models.
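
The shrinkage estimator is a convex combination of the sample covariance matrix and a structured target. In the paper the weight is chosen optimally; in this sketch `delta` is fixed, and the 2x2 matrices standing in for the sample and single-index estimators are toy values:

```python
# Hedged sketch of covariance shrinkage: sigma = delta*F + (1 - delta)*S.

def shrink(S, F, delta):
    # Element-wise convex combination of two covariance matrices.
    n = len(S)
    return [[delta * F[i][j] + (1 - delta) * S[i][j] for j in range(n)]
            for i in range(n)]

S = [[4.0, 3.0], [3.0, 9.0]]   # noisy sample covariance (assumed)
F = [[4.0, 1.2], [1.2, 9.0]]   # structured single-index-style target (assumed)
sigma = shrink(S, F, 0.5)      # illustrative shrinkage intensity
print(sigma)                   # off-diagonals pulled toward the target
```

Shrinking the noisy off-diagonal entries toward a structured target is what reduces out-of-sample portfolio variance in the paper's setting.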

Relevance:

100.00%

Publisher:

Abstract:

The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
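
The multiplicative MLE-style update underlying such algorithms can be sketched on a tiny system, started from a uniform image as the abstract notes is the correct choice absent prior knowledge. The 2-pixel, 2-detector system matrix and counts are invented for illustration:

```python
# Hedged sketch of a multiplicative MLEM-style iteration for emission tomography.

A = [[0.8, 0.2],        # detector 0 sensitivity to pixels 0 and 1 (assumed)
     [0.3, 0.7]]        # detector 1 sensitivity to pixels 0 and 1 (assumed)
counts = [10.0, 20.0]   # measured detector counts (assumed data)
image = [1.0, 1.0]      # uniform initial image

def mlem_step(img):
    # Forward-project the current image, compare with measured counts,
    # and back-project the ratios as multiplicative corrections.
    proj = [sum(A[d][p] * img[p] for p in range(2)) for d in range(2)]
    ratio = [counts[d] / proj[d] for d in range(2)]
    sens = [sum(A[d][p] for d in range(2)) for p in range(2)]
    return [img[p] / sens[p] * sum(A[d][p] * ratio[d] for d in range(2))
            for p in range(2)]

for _ in range(500):
    image = mlem_step(image)
print([round(x, 2) for x in image])  # converges toward the exact solution [6.0, 26.0]
```

The FMAPE algorithm of the abstract adds an entropy prior on top of this likelihood-driven update; the iteration above is the plain MLE case.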

Relevance:

100.00%

Publisher:

Abstract:

A theory is presented to explain the statistical properties of the growth of dye-laser radiation. Results are in agreement with recent experimental findings. The different roles of pump-noise intensity and correlation time are elucidated.

Relevance:

100.00%

Publisher:

Abstract:

Interactive Choice Aid (ICA) is a decision aid, introduced in this paper, that systematically assists consumers with online purchase decisions. ICA integrates aspects from prescriptive decision theory, insights from descriptive decision research, and practical considerations, thereby combining pre-existing best practices with novel features. Instead of imposing an objectively ideal but unnatural decision procedure on the user, ICA assists the natural process of human decision-making by providing explicit support for the execution of the user's decision strategies. The application contains an innovative feature for in-depth comparisons of alternatives through which users' importance ratings are elicited interactively and in a playful way. The usability and general acceptance of the choice aid was studied; results show that ICA is a promising contribution and provides insights that may further improve its usability.

Relevance:

100.00%

Publisher:

Abstract:

In this article, the objective is to demonstrate the effects of different decision styles on strategic decisions and, likewise, on an organization. The technique presented in the study is based on the transformation of linguistic variables into numerical value intervals. In this model, the study benefits from fuzzy-logic methodology and fuzzy numbers. This fuzzy approach allows us to examine the relations between decision-making styles and strategic management processes under uncertainty. The purpose is to provide results that may help companies exercise the most appropriate decision-making style for their different strategic management processes. The study leaves open research topics for further studies that may be applied to other decision-making areas within the strategic management process.
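
The transformation of linguistic variables into numerical values can be sketched with triangular fuzzy numbers and centroid defuzzification. The three-term scale and the sample ratings are assumptions, not the article's instrument:

```python
# Hedged sketch: linguistic ratings -> triangular fuzzy numbers -> crisp score.

scale = {  # triangular fuzzy numbers (a, b, c) on a unit scale (assumed)
    "low": (0.0, 0.0, 0.5),
    "medium": (0.0, 0.5, 1.0),
    "high": (0.5, 1.0, 1.0),
}

def defuzzify(tfn):
    a, b, c = tfn
    return (a + b + c) / 3  # centroid of a triangular fuzzy number

ratings = ["high", "medium", "high"]  # hypothetical judgments on one decision style
score = sum(defuzzify(scale[r]) for r in ratings) / len(ratings)
print(round(score, 4))
```

Keeping the full triangles (rather than defuzzifying immediately) would preserve the value intervals the article works with; the centroid step is one common way to obtain a single comparable number.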

Relevance:

100.00%

Publisher:

Abstract:

At a time when disciplined inference and decision making under uncertainty represent common aims to participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.