920 results for Bayesian statistical decision theory


Relevance:

100.00%

Abstract:

The aim of phase II single-arm clinical trials of a new drug is to determine whether it has sufficiently promising activity to warrant its further development. Over the last several years, Bayesian statistical methods have been proposed and used for such trials. Bayesian approaches are ideal for earlier-phase trials because they take into account information that accrues during a trial. Predictive probabilities are then updated and so become more accurate as the trial progresses. Suitable priors can act as pseudo-samples, which make small-sample clinical trials more informative. Thus patients have a better chance of receiving better treatments. The goal of this paper is to provide a tutorial for statisticians who are using Bayesian methods for the first time, or for investigators who have some statistical background. In addition, real data from three clinical trials are presented as examples to illustrate how to carry out a Bayesian analysis for phase II single-arm clinical trials with binary outcomes.
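As an illustration of the mechanics behind such designs (not taken from the paper itself), the sketch below uses a Beta-Binomial model to compute the posterior probability that the response rate exceeds a target and the predictive probability of eventual success; the prior, target rate, interim data and sample sizes are all hypothetical.

    # Minimal sketch of Beta-Binomial monitoring for a single-arm binary-outcome trial.
    # All numbers (prior, target rate, interim data, final sample size) are hypothetical.
    from scipy.stats import beta, betabinom

    a0, b0 = 0.5, 0.5              # Beta(a0, b0) prior; acts as a pseudo-sample of a0 + b0 patients
    p_target = 0.20                # response rate the new drug must exceed
    n_interim, x_interim = 20, 6   # interim data: 6 responses in 20 patients
    n_total = 40                   # planned final sample size

    # Posterior for the response rate after the interim data.
    a_post, b_post = a0 + x_interim, b0 + (n_interim - x_interim)
    post_prob = 1.0 - beta.cdf(p_target, a_post, b_post)
    print(f"P(response rate > {p_target} | data) = {post_prob:.3f}")

    # Predictive probability of ending the trial with more than p_target * n_total responses,
    # using the Beta-Binomial distribution of the future responses.
    n_remaining = n_total - n_interim
    needed = int(p_target * n_total) + 1 - x_interim   # further responses still required
    pred_prob = sum(betabinom.pmf(y, n_remaining, a_post, b_post)
                    for y in range(max(0, needed), n_remaining + 1))
    print(f"Predictive probability of success = {pred_prob:.3f}")

As data accrue, both probabilities are recomputed at each interim look, which is what allows Bayesian designs to stop early for futility or efficacy.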

Relevance:

100.00%

Abstract:

Bayesian statistics allow scientists to easily incorporate prior knowledge into their data analysis. Nonetheless, the sheer amount of computational power that is required for Bayesian statistical analyses has previously limited their use in genetics. These computational constraints have now largely been overcome and the underlying advantages of Bayesian approaches are putting them at the forefront of genetic data analysis in an increasing number of areas.

Relevance:

100.00%

Abstract:

By eliminating the short range negative divergence of the Debye–Hückel pair distribution function, but retaining the exponential charge screening known to operate at large interparticle separation, the thermodynamic properties of one-component plasmas of point ions or charged hard spheres can be well represented even in the strong coupling regime. Predicted electrostatic free energies agree within 5% of simulation data for typical Coulomb interactions up to a factor of 10 times the average kinetic energy. Here, this idea is extended to the general case of a uniform ionic mixture, comprising an arbitrary number of components, embedded in a rigid neutralizing background. The new theory is implemented in two ways: (i) by an unambiguous iterative algorithm that requires numerical methods and breaks the symmetry of cross correlation functions; and (ii) by invoking generalized matrix inverses that maintain symmetry and yield completely analytic solutions, but which are not uniquely determined. The extreme computational simplicity of the theory is attractive when considering applications to complex inhomogeneous fluids of charged particles.
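For context, a standard result rather than one derived in the paper: the linearized Debye–Hückel pair distribution function for a classical ionic mixture is, in Gaussian units,

    g_{ij}(r) \simeq 1 - \frac{q_i q_j}{k_B T\, r}\, e^{-\kappa r},
    \qquad
    \kappa^2 = \frac{4\pi}{k_B T} \sum_s n_s q_s^2 ,

so for like charges g_{ij}(r) becomes negative as r \to 0 (the unphysical short-range divergence eliminated here), while the factor e^{-\kappa r} supplies the exponential charge screening retained at large interparticle separation.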

Relevance:

100.00%

Abstract:

A new complex network model is proposed which is founded on growth, with new connections being established in proportion to the current dynamical activity of each node; it can be understood as a generalization of the static Barabási-Albert model. By using several topological measurements, as well as optimal multivariate methods (canonical analysis and maximum-likelihood decision), we show that, among several other theoretical kinds of networks including Watts-Strogatz small-world networks, this new model provides the greatest compatibility with three real-world cortical networks.
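A minimal sketch of the kind of growth rule described, purely for illustration: here node "activity" is approximated by random-walk visit counts, which is an assumption and not necessarily the paper's definition of dynamical activity.

    # Illustrative sketch of growth with attachment proportional to current node activity.
    # "Activity" is approximated by random-walk visit counts, which is an assumption.
    import random

    def grow_activity_network(n_nodes=100, m_edges=2, walk_steps=200, seed=0):
        rng = random.Random(seed)
        adj = {0: {1}, 1: {0}}                 # start from a single edge
        for new in range(2, n_nodes):
            # Estimate each node's dynamical activity by random-walk visit counts.
            visits = {v: 1 for v in adj}       # +1 smoothing so every node can be chosen
            current = rng.choice(list(adj))
            for _ in range(walk_steps):
                current = rng.choice(list(adj[current]))
                visits[current] += 1
            # Attach the new node to m existing nodes, chosen proportionally to activity.
            nodes = list(visits)
            weights = [visits[v] for v in nodes]
            targets = set()
            while len(targets) < min(m_edges, len(nodes)):
                targets.add(rng.choices(nodes, weights=weights, k=1)[0])
            adj[new] = set(targets)
            for t in targets:
                adj[t].add(new)
        return adj

    network = grow_activity_network()
    print(sum(len(nbrs) for nbrs in network.values()) // 2, "edges")

If each node's activity were taken to be its degree, the rule would reduce to the usual Barabási-Albert preferential attachment.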

Relevance:

100.00%

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through elicitation of experts' quantiles. Posterior inference is available through Markov chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
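A schematic version of such a center-plus-tail density is sketched below; the gamma form for the center and all parameter values are illustrative assumptions, not necessarily the paper's choices.

    # Sketch of a mixture density with a parametric center below an unknown threshold u
    # and a GPD tail above it. The gamma center is an illustrative assumption.
    import numpy as np
    from scipy.stats import gamma, genpareto

    def mixture_pdf(x, shape, scale, u, xi, sigma):
        x = np.asarray(x, dtype=float)
        p_tail = gamma.sf(u, shape, scale=scale)        # mass the center assigns beyond u
        below = gamma.pdf(x, shape, scale=scale)        # parametric form below the threshold
        above = p_tail * genpareto.pdf(x - u, xi, scale=sigma)  # GPD for exceedances of u
        return np.where(x <= u, below, above)

    # Example evaluation on a grid; all parameter values are hypothetical.
    grid = np.linspace(0.1, 10.0, 5)
    print(mixture_pdf(grid, shape=2.0, scale=1.0, u=4.0, xi=0.2, sigma=1.0))

In the Bayesian formulation the threshold u is treated as another unknown parameter, with a prior elicited alongside those of the center and tail, and the posterior is explored by MCMC.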

Relevance:

100.00%

Abstract:

Using the functional integral formalism for the statistical generating functional in statistical (finite-temperature) quantum field theory, we prove the equivalence of many-photon Green's functions in the Duffin-Kemmer-Petiau and Klein-Gordon-Fock statistical quantum field theories. As an illustration, we calculate the one-loop polarization operators in both theories and demonstrate their coincidence.
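Schematically, using standard definitions rather than the paper's specific construction, the statistical generating functional and the Green's functions it generates are

    Z[J] = \int \mathcal{D}\phi \, \exp\!\Big(-S_\beta[\phi] + \int d^4x\, J(x)\,\phi(x)\Big),
    \qquad
    G^{(n)}(x_1,\dots,x_n) = \frac{1}{Z[0]}\,
    \frac{\delta^n Z[J]}{\delta J(x_1)\cdots\delta J(x_n)}\bigg|_{J=0},

so establishing the equivalence of the many-photon Green's functions amounts to showing that the two theories yield the same generating functional for external photon sources.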

Relevance:

100.00%

Abstract:

We contrast four distinct versions of the BCS-Bose statistical crossover theory according to the form assumed for the electron-number equation that accompanies the BCS gap equation. The four versions correspond to explicitly accounting for two-hole (2h) as well as two-electron (2e) Cooper pairs (CPs), for both in equal proportions, or for only either kind. This follows from a recent generalization of the Bose-Einstein condensation (GBEC) statistical theory that includes not boson-boson interactions but rather 2e- and also (without loss of generality) 2h-CPs interacting with unpaired electrons and holes in a single-band model that is easily converted into a two-band model. The GBEC theory is essentially an extension of the 1989 Friedberg-Lee BEC theory of superconductors, which excludes 2h-CPs. It can thus recover, when the numbers of 2h- and 2e-CPs in both BE-condensed and non-condensed states are separately equal, the BCS gap equation for all temperatures and couplings, as well as the zero-temperature BCS (rigorous-upper-bound) condensation energy for all couplings. But ignoring either 2h- or 2e-CPs, it can do neither; in particular, only half the BCS condensation energy is obtained in the two crossover versions that ignore either kind of CPs. We show how critical temperatures Tc from the original BCS-Bose crossover theory in 2D require unphysically large couplings for the Cooper/BCS model interaction in order to differ significantly from the Tc values of ordinary BCS theory (where the number equation is replaced by the assumption that the chemical potential equals the Fermi energy). (c) 2007 Published by Elsevier B.V.
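For reference, the two coupled equations the abstract refers to take their textbook BCS form (conventions assumed here):

    1 = V \sum_{\mathbf{k}} \frac{\tanh\!\big(E_{\mathbf{k}}/2k_B T\big)}{2E_{\mathbf{k}}},
    \qquad
    n = \sum_{\mathbf{k}} \left[\, 1 - \frac{\xi_{\mathbf{k}}}{E_{\mathbf{k}}}\,\tanh\!\Big(\frac{E_{\mathbf{k}}}{2k_B T}\Big) \right],
    \qquad
    E_{\mathbf{k}} = \sqrt{\xi_{\mathbf{k}}^2 + \Delta^2},

where \xi_{\mathbf{k}} is the single-particle energy measured from the chemical potential. Ordinary BCS theory drops the number equation and pins the chemical potential at the Fermi energy, whereas the crossover versions solve the two equations self-consistently.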

Relevance:

100.00%

Abstract:

In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(.) about one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are applicable only to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite et al. (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions that is applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(.); instead, we use nonparametric Bayesian inference, modelling f(.) with a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs and obtain the posterior distribution for f(.). Theoretical properties of the joint and marginal priors are derived, and numerical illustrations are given to demonstrate our approach. (C) 2010 Elsevier B.V. All rights reserved.
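A toy sketch of the underlying mechanism: a few elicited values of the unknown density at chosen points are treated as data and a Gaussian-process prior is conditioned on them. The kernel and all numbers are assumptions, and the actual method elicits richer summaries (mode, marginal quantiles, joint probabilities) rather than density values.

    # Toy sketch: condition a Gaussian-process prior on a few elicited values of an
    # unknown density f at chosen points. Kernel choice and all numbers are assumptions.
    import numpy as np

    def rbf_kernel(a, b, length=1.0, var=1.0):
        d = a[:, None] - b[None, :]
        return var * np.exp(-0.5 * (d / length) ** 2)

    # Points at which the expert is queried about f, and the elicited values (hypothetical).
    x_elicited = np.array([0.0, 1.0, 2.0, 3.0])
    f_elicited = np.array([0.05, 0.30, 0.40, 0.10])

    # GP posterior mean for f on a grid, given the elicited "data set" D.
    x_grid = np.linspace(-1.0, 4.0, 11)
    K = rbf_kernel(x_elicited, x_elicited) + 1e-6 * np.eye(len(x_elicited))  # jitter
    K_star = rbf_kernel(x_grid, x_elicited)
    posterior_mean = K_star @ np.linalg.solve(K, f_elicited)
    print(np.round(posterior_mean, 3))

In the full method the posterior over f(.) also carries uncertainty about the density, not just a posterior mean.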

Relevance:

100.00%

Abstract:

Cancer is the second leading cause of death in Brazil, and according to statistics released by the Brazilian National Cancer Institute (INCA), 466,730 new cases of cancer are forecast for 2008. The analysis of tumour tissues of various types together with patients' clinical data, genetic profiles, disease characteristics and epidemiological data may lead to more precise diagnoses, providing more effective treatments. In this work we present a clinical decision support system for cancer, which manages a relational database containing information about tumour tissues and their location in freezers, patients and medical forms. Furthermore, some problems encountered are discussed, such as database integration and the adoption of a standard for describing topography and morphology. The dynamic report generation functionality, which displays data in table and graph formats according to the user's configuration, is also discussed. © ACM 2008.

Relevance:

100.00%

Abstract:

Statistical modelling and statistical learning theory are two powerful analytical frameworks for analyzing signals and developing efficient processing and classification algorithms. In this thesis, these frameworks are applied to modelling and processing biomedical signals in two different contexts: ultrasound medical imaging systems and primate neural activity analysis and modelling. In the context of ultrasound medical imaging, two main applications are explored: deconvolution of signals measured from an ultrasonic transducer, and automatic image segmentation and classification of prostate ultrasound scans. In the former application, a stochastic model of the radio-frequency signal measured from an ultrasonic transducer is derived. This model is then employed, within a statistical framework, to develop a regularized deconvolution procedure for enhancing signal resolution. In the latter application, different statistical models are used to characterize images of prostate tissue, extracting different features. These features are then used to segment the images into regions of interest by means of an automatic procedure based on a statistical model of the extracted features. Finally, machine learning techniques are used for automatic classification of the different regions of interest. In the context of neural activity signals, an example of a bio-inspired dynamical network was developed to help in studies of motor-related processes in the brain of primate monkeys. The presented model aims to mimic the abstract functionality of a cell population in the parietal area 7a of primate monkeys during the execution of learned behavioural tasks.
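A generic illustration of regularized deconvolution, in the spirit of the first application: a frequency-domain Tikhonov/Wiener-style estimate. This is not the thesis's stochastic model; the transducer pulse, noise level and regularization weight are all assumptions.

    # Generic frequency-domain regularized deconvolution sketch; the transducer pulse,
    # reflectivity, noise level and regularization weight are all hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 256
    pulse = np.exp(-0.5 * ((np.arange(n) - 8) / 2.0) ** 2)           # assumed transducer pulse
    reflectivity = np.zeros(n)
    reflectivity[[40, 90, 150, 200]] = [1.0, -0.7, 0.5, 0.9]          # sparse scatterers
    measured = np.real(np.fft.ifft(np.fft.fft(pulse) * np.fft.fft(reflectivity)))
    measured += 0.01 * rng.standard_normal(n)                          # measurement noise

    H = np.fft.fft(pulse)
    lam = 1e-2                                                         # regularization weight
    estimate = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(measured) / (np.abs(H) ** 2 + lam)))
    print("largest recovered reflectors at samples:", np.argsort(-np.abs(estimate))[:4])

The regularization weight trades resolution enhancement against noise amplification, which is the balance the statistical formulation in the thesis is designed to handle.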

Relevance:

100.00%

Abstract:

This thesis tackles the problem of the automated detection of the atmospheric boundary layer (BL) height, h, from aerosol lidar/ceilometer observations. A new method, the Bayesian Selective Method (BSM), is presented. It implements a Bayesian statistical inference procedure which combines different sources of information in a statistically optimal way. First, atmospheric stratification boundaries are located from discontinuities in the ceilometer backscattered signal. The BSM then identifies the discontinuity edge that has the highest probability of effectively marking the BL height. Information from contemporaneous physical boundary-layer model simulations and a climatological dataset of BL height evolution is combined in the assimilation framework to assist this choice. The BSM algorithm has been tested on four months of continuous ceilometer measurements collected during the BASE:ALFA project and is shown to realistically diagnose the BL depth evolution under many different weather conditions. The BASE:ALFA dataset is then used to investigate the boundary-layer structure in stable conditions. Functions from Obukhov similarity theory are used as regression curves to fit observed velocity and temperature profiles in the lower half of the stable boundary layer. Surface fluxes of heat and momentum are the best-fitting parameters in this exercise and are compared with those measured by a sonic anemometer. The comparison shows remarkable discrepancies, which are more evident in cases where the bulk Richardson number turns out to be quite large. This analysis supports earlier results that surface turbulent fluxes are not the appropriate scaling parameters for profiles of mean quantities in very stable conditions. One practical consequence is that boundary-layer height diagnostic formulations which rely mainly on surface fluxes disagree with what is obtained by inspecting co-located radiosounding profiles.
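A highly simplified sketch of the selection step: candidate edges are located at strong negative gradients of the backscatter profile, and each is weighted by Gaussian terms centred on a model-predicted and a climatological BL height. The widths and the scoring rule are illustrative assumptions, not the BSM's actual formulation.

    # Simplified sketch of choosing, among backscatter-gradient candidates, the edge most
    # likely to mark the boundary-layer height. Weights and widths are illustrative only.
    import numpy as np

    def select_bl_height(z, backscatter, h_model, h_clim, sigma_model=200.0, sigma_clim=400.0):
        grad = np.gradient(backscatter, z)
        # Candidate edges: the strongest negative-gradient levels in the profile.
        candidates = z[np.argsort(grad)[:5]]
        # Score each candidate with Gaussian weights around model and climatological heights.
        scores = (np.exp(-0.5 * ((candidates - h_model) / sigma_model) ** 2)
                  * np.exp(-0.5 * ((candidates - h_clim) / sigma_clim) ** 2))
        return candidates[np.argmax(scores)]

    # Synthetic profile with an aerosol-laden layer up to roughly 800 m (values hypothetical).
    z = np.arange(0.0, 3000.0, 15.0)
    profile = 1.0 / (1.0 + np.exp((z - 800.0) / 60.0))
    profile += 0.02 * np.random.default_rng(1).standard_normal(z.size)
    print("diagnosed BL height:", select_bl_height(z, profile, h_model=900.0, h_clim=700.0), "m")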

Relevance:

100.00%

Abstract:

In this thesis we discuss a representation of quantum mechanics and quantum and statistical field theory based on a functional renormalization flow equation for the one-particle-irreducible average effective action, and we employ it to get information on some specific systems.
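The flow equation in question is presumably the standard Wetterich equation for the scale-dependent average effective action \Gamma_k,

    \partial_k \Gamma_k[\phi] = \frac{1}{2}\,\mathrm{Tr}\!\left[\big(\Gamma_k^{(2)}[\phi] + R_k\big)^{-1}\,\partial_k R_k\right],

where \Gamma_k^{(2)} is the second functional derivative of \Gamma_k and R_k is an infrared regulator that suppresses fluctuations below the scale k; the abstract only states that a flow equation for the one-particle-irreducible average effective action is used, so this identification is an assumption.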

Relevance:

100.00%

Abstract:

Decision strategies aim at enabling reasonable decisions in cases of uncertain policy decision problems which do not meet the conditions for applying standard decision theory. This paper focuses on decision strategies that account for uncertainties by deciding whether a proposed list of policy options should be accepted or revised (scope strategies) and whether to decide now or later (timing strategies). They can be used in participatory approaches to structure the decision process. As a basis, we propose to classify the broad range of uncertainties affecting policy decision problems along two dimensions, source of uncertainty (incomplete information, inherent indeterminacy and unreliable information) and location of uncertainty (information about policy options, outcomes and values). Decision strategies encompass multiple and vague criteria to be deliberated in application. As an example, we discuss which decision strategies may account for the uncertainties related to nutritive technologies that aim at reducing methane (CH4) emissions from ruminants as a means of mitigating climate change, limiting our discussion to published scientific information. These considerations not only speak in favour of revising rather than accepting the discussed list of options, but also in favour of active postponement or semi-closure of decision-making rather than closure or passive postponement.

Relevance:

100.00%

Abstract:

Group sequential methods and response-adaptive randomization (RAR) procedures have been applied in clinical trials for economical and ethical reasons. Group sequential methods are able to reduce the average sample size by allowing early stopping, but patients are allocated equally, with half assigned to the inferior arm. RAR procedures tend to allocate more patients to the better arm; however, they require a larger sample size to attain a given power. This study intended to combine these two procedures. We applied the Bayesian decision theory approach to define our group sequential stopping rules and evaluated the operating characteristics under the RAR setting. The results showed that the Bayesian decision theory method was able to preserve the type I error rate as well as achieve favorable power; further, by comparing with the error spending function method, we concluded that the Bayesian decision theory approach was more effective at reducing the average sample size.
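A toy simulation of the combination described: Beta-Binomial posteriors, allocation probabilities tied to the posterior probability that an arm is better, and a simple posterior-probability stopping rule at interim looks. Priors, thresholds and true response rates are assumptions rather than the study's actual design, which uses a formal Bayesian decision-theoretic stopping criterion.

    # Toy two-arm binary trial combining response-adaptive randomization with a simple
    # Bayesian stopping rule. Priors, thresholds and true response rates are hypothetical.
    import numpy as np

    def simulate_trial(p_true=(0.3, 0.5), max_n=200, look_every=20, stop_threshold=0.975, seed=0):
        rng = np.random.default_rng(seed)
        successes = np.zeros(2)
        failures = np.zeros(2)
        alloc = np.array([0.5, 0.5])                      # start with equal allocation
        p_arm1_better = 0.5
        for patient in range(1, max_n + 1):
            arm = rng.choice(2, p=alloc)
            if rng.random() < p_true[arm]:
                successes[arm] += 1
            else:
                failures[arm] += 1
            if patient % look_every == 0:
                # Posterior draws for each arm's response rate under Beta(1, 1) priors.
                draws = rng.beta(1 + successes, 1 + failures, size=(5000, 2))
                p_arm1_better = np.mean(draws[:, 1] > draws[:, 0])
                if p_arm1_better > stop_threshold or p_arm1_better < 1 - stop_threshold:
                    return patient, p_arm1_better          # early stopping at an interim look
                # Response-adaptive randomization: skew allocation toward the better arm.
                alloc = np.clip(np.array([1 - p_arm1_better, p_arm1_better]), 0.1, 0.9)
                alloc = alloc / alloc.sum()
        return max_n, p_arm1_better

    print(simulate_trial())

Repeating the simulation under the null (equal response rates) and under the alternative gives empirical type I error, power and average sample size, which is how operating characteristics of such designs are typically evaluated.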