980 results for Approximate Bayesian Computation


Relevance:

30.00%

Abstract:

A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
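As a rough, hedged illustration of the unbiased-likelihood idea (not the paper's actual estimator; the callables sample_random_effects and conditional_likelihood are hypothetical), the sketch below averages the conditional likelihood of a mixed effects model over random-effects draws:

import numpy as np

def mc_likelihood_estimate(y, theta, sample_random_effects, conditional_likelihood,
                           n_draws=1000, rng=None):
    # Unbiased Monte Carlo estimate of p(y | theta) = E_b[ p(y | b, theta) ]:
    # average the conditional likelihood over draws b ~ p(b | theta).
    # Quasi-Monte Carlo points or a Laplace-based importance proposal, as in
    # the abstract, would reduce the variance of this plain estimate.
    rng = np.random.default_rng() if rng is None else rng
    draws = sample_random_effects(theta, n_draws, rng)   # draws from p(b | theta)
    values = np.array([conditional_likelihood(y, b, theta) for b in draws])
    return values.mean()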

Relevance:

30.00%

Abstract:

This thesis advances Bayesian experimental design by developing novel methodologies and extensions to existing algorithms. Through these advancements, this thesis provides solutions to several important and complex experimental design problems, many of which have applications in biology and medicine. This thesis consists of a series of published and submitted papers. In the first paper, we provide a comprehensive literature review on Bayesian design. In the second paper, we discuss methods which may be used to solve design problems in which one is interested in finding a large number of (near) optimal design points. The third paper presents methods for finding fully Bayesian experimental designs for nonlinear mixed effects models, and the fourth paper investigates methods to rapidly approximate the posterior distribution for use in Bayesian utility functions.

Relevance:

30.00%

Abstract:

Provides an accessible foundation to Bayesian analysis using real world models. This book aims to present an introduction to Bayesian modelling and computation by considering real case studies drawn from diverse fields spanning ecology, health, genetics and finance. Each chapter comprises a description of the problem, the corresponding model, the computational method, results and inferences, as well as the issues that arise in the implementation of these approaches. Case Studies in Bayesian Statistical Modelling and Analysis:

• Illustrates how to do Bayesian analysis in a clear and concise manner using real-world problems.
• Each chapter focuses on a real-world problem and describes the way in which the problem may be analysed using Bayesian methods.
• Features approaches that can be used in a wide range of application areas, such as health, the environment, genetics, information science, medicine, biology, industry and remote sensing.

Case Studies in Bayesian Statistical Modelling and Analysis is aimed at statisticians, researchers and practitioners who have some expertise in statistical modelling and analysis and some understanding of the basics of Bayesian statistics, but little experience in its application. Graduate students of statistics and biostatistics will also find this book beneficial.

Relevance:

30.00%

Abstract:

Ship seakeeping operability refers to the quantification of motion performance in waves relative to mission requirements. This is used to make decisions about preferred vessel designs, but it can also be used as a comprehensive assessment of the benefits of ship-motion-control systems. Traditionally, operability computation aggregates statistics of motion computed over the envelope of likely environmental conditions in order to determine a coefficient in the range from 0 to 1 called operability. When used for assessment of motion-control systems, the increase of operability is taken as the key performance indicator. The operability coefficient is often given the interpretation of the percentage of time operable. This paper considers an alternative probabilistic approach to this traditional computation of operability. It characterises operability not as a number to which a frequency interpretation is attached, but as a hypothesis that a vessel will attain the desired performance in one mission considering the envelope of likely operational conditions. This enables the use of Bayesian theory to compute the probability that this hypothesis is true conditional on data from simulations. Thus, the metric considered is the probability of operability. This formulation not only adheres to recent developments in reliability and risk analysis, but also allows incorporating into the analysis more accurate descriptions of ship-motion-control systems, since the analysis is not limited to linear ship responses in the frequency domain. The paper also discusses an extension of the approach to the case of assessment of increased levels of autonomy for unmanned marine craft.
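A deliberately simplified sketch of what a "probability of operability" could look like, under an assumption of my own that each simulated mission is scored as a pass/fail Bernoulli trial (the paper's actual formulation over the environmental envelope will differ):

from scipy.stats import beta

def probability_of_operability(n_pass, n_missions, a=1.0, b=1.0):
    # Conjugate Beta-Binomial sketch: posterior over the per-mission
    # probability p of attaining the required performance, given simulated
    # mission outcomes and a Beta(a, b) prior.
    posterior = beta(a + n_pass, b + n_missions - n_pass)
    # The posterior mean of p is the probability that a further mission meets
    # the requirement; posterior.sf(p0) gives Pr(p > p0 | data) for a target p0.
    return posterior.mean()

print(probability_of_operability(n_pass=46, n_missions=50))   # about 0.90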

Relevance:

30.00%

Abstract:

A technique for computing the spectral and angular (both the zenith and azimuthal) distribution of the solar energy reaching the surface of the Earth and any other plane in the atmosphere has been developed. Here the computer code LOWTRAN is used for obtaining the atmospheric transmittances in conjunction with two approximate procedures, one based on the Eddington method and the other on van de Hulst's adding method, for solving the equation of radiative transfer to obtain the diffuse radiation in the cloud-free situation. The aerosol scattering phase functions are approximated by Henyey-Greenstein functions. When the equation of radiative transfer is solved using the adding method, the azimuthal and zenith angle dependence of the scattered radiation is evaluated, whereas when the Eddington technique is utilized only the total downward flux of scattered solar radiation is obtained. Results for the diffuse and beam components of solar radiation received at the surface of the Earth compare very well with those computed by other methods, such as more exact calculations using spherical harmonics. When atmospheric conditions corresponding to those prevailing locally in a tropical location (as in India) are used as inputs, the computed values agree closely with the measured values.
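For reference, the Henyey-Greenstein phase function mentioned above has the closed form p(cos θ) = (1 − g²) / [4π (1 + g² − 2g cos θ)^(3/2)]; a minimal stand-alone implementation (independent of the LOWTRAN-based code described here):

import numpy as np

def henyey_greenstein(cos_theta, g):
    # Henyey-Greenstein scattering phase function, normalised so that its
    # integral over the full sphere equals 1; g in (-1, 1) is the asymmetry
    # parameter (g > 0 gives forward-peaked scattering, typical of aerosols).
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)

# Forward-peaked aerosol scattering (g ~ 0.7) versus near-isotropic (g ~ 0.1).
angles = np.linspace(0.0, np.pi, 5)
print(henyey_greenstein(np.cos(angles), g=0.7))
print(henyey_greenstein(np.cos(angles), g=0.1))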

Relevance:

30.00%

Abstract:

This note is concerned with the problem of determining approximate solutions of Fredholm integral equations of the second kind. The solution of a given integral equation is approximated by a polynomial, which leads to an over-determined system of linear algebraic equations in the unknown coefficients; this system is then solved using the least-squares method. Several examples are examined in detail. (c) 2009 Elsevier Inc. All rights reserved.
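A minimal sketch of this kind of scheme, under assumed details not given in the abstract (a monomial basis, Gauss-Legendre quadrature for the integral, and more collocation points than unknown coefficients); kernel(x, t) and f(x) must accept NumPy arrays:

import numpy as np

def fredholm_second_kind_ls(kernel, f, lam=1.0, a=0.0, b=1.0,
                            deg=6, n_colloc=40, n_quad=80):
    # Approximate u(x) = f(x) + lam * int_a^b K(x, t) u(t) dt by a polynomial
    # u(x) ~ sum_j c_j x**j, collocate at more points than coefficients, and
    # solve the resulting over-determined linear system by least squares.
    x = np.linspace(a, b, n_colloc)                       # collocation points
    t, w = np.polynomial.legendre.leggauss(n_quad)        # quadrature on [-1, 1]
    t = 0.5 * (b - a) * t + 0.5 * (a + b)                 # map nodes to [a, b]
    w = 0.5 * (b - a) * w
    # Column j holds phi_j(x) - lam * int K(x, t) phi_j(t) dt, with phi_j(t) = t**j.
    A = np.column_stack([
        x**j - lam * (kernel(x[:, None], t[None, :]) * (t**j) * w).sum(axis=1)
        for j in range(deg + 1)
    ])
    coeffs, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    return coeffs                                         # coefficients c_0, ..., c_deg

# Example: K(x, t) = x*t, f(x) = x, lam = 0.5; the exact solution is u(x) = 6x/5.
print(fredholm_second_kind_ls(lambda x, t: x * t, lambda x: x, lam=0.5))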

Relevance:

30.00%

Abstract:

In this thesis the use of the Bayesian approach to statistical inference in fisheries stock assessment is studied. The work was conducted in collaboration with the Finnish Game and Fisheries Research Institute, using the problem of monitoring and predicting the juvenile salmon population in the River Tornionjoki as an example application. The River Tornionjoki is the largest salmon river flowing into the Baltic Sea. This thesis tackles the issues of model formulation and model checking, as well as computational problems related to Bayesian modelling, in the context of fisheries stock assessment. Each article of the thesis provides a novel method either for extracting information from data obtained via a particular type of sampling system or for integrating the information about the fish stock from multiple sources in terms of a population dynamics model. Mark-recapture and removal sampling schemes and a random catch sampling method are covered for the estimation of the population size. In addition, a method for estimating the stock composition of a salmon catch based on DNA samples is also presented. For most of the articles, Markov chain Monte Carlo (MCMC) simulation has been used as a tool to approximate the posterior distribution. Problems arising from the sampling method are also briefly discussed and potential solutions for these problems are proposed. Special emphasis in the discussion is given to the philosophical foundation of the Bayesian approach in the context of fisheries stock assessment. It is argued that the role of the subjective prior knowledge needed in practically all parts of a Bayesian model should be recognised and consequently fully utilised in the process of model formulation.

Relevance:

30.00%

Abstract:

We propose a method to compute a probably approximately correct (PAC) normalized histogram of observations with a refresh rate of $\Theta(1)$ time units per histogram sample on a random geometric graph with noise-free links. The delay in computation is $\Theta(\sqrt{n})$ time units. We further extend our approach to a network with noisy links. While the refresh rate remains $\Theta(1)$ time units per sample, the delay increases to $\Theta(\sqrt{n \log n})$. The number of transmissions in both cases is $\Theta(n)$ per histogram sample. The achieved $\Theta(1)$ refresh rate for PAC histogram computation is a significant improvement over the refresh rate of $\Theta(1/\log n)$ for histogram computation in noiseless networks. We achieve this by operating in the supercritical thermodynamic regime where large pathways for communication build up, but the network may have more than one component. The largest component, however, will have an arbitrarily large fraction of nodes in order to enable approximate computation of the histogram to the desired level of accuracy. Operation in the supercritical thermodynamic regime also reduces energy consumption. A key step in the proof of our achievability result is the construction of a connected component having bounded degree and any desired fraction of nodes. This construction may also prove useful in other communication settings on the random geometric graph.

Relevance:

30.00%

Abstract:

This paper presents an algorithm for generating the Interior Medial Axis Transform (iMAT) of 3D objects with free-form boundaries. The proposed algorithm uses the exact representation of the part and generates an approximate rational spline description of the iMAT. The algorithm generates the iMAT by a tracing technique that marches along the object's boundary. The level of approximation is controlled by the choice of the step size in the tracing procedure. Criteria based on the distance and local curvature of boundary entities are used to identify the junction points, and the search for these junction points is done in an efficient way. The algorithm also works for multiply-connected objects. Results of the implementation are provided. (C) 2010 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Abstract:

Approximate Nearest Neighbour Field (ANNF) maps are commonly used by the computer vision and graphics communities to deal with problems like image completion, retargeting and denoising. In this paper, we extend the scope of usage of ANNF maps to medical image analysis, more specifically to optic disk detection in retinal images. In the analysis of retinal images, optic disk detection plays an important role since it simplifies the segmentation of the optic disk and other retinal structures. The proposed approach uses FeatureMatch, an ANNF algorithm, to find the correspondence between a chosen optic disk reference image and any given query image. This correspondence provides a distribution of patches in the query image that are closest to patches in the reference image. The likelihood map obtained from the distribution of patches in the query image is used for optic disk detection. The proposed approach is evaluated on five publicly available databases, DIARETDB0, DIARETDB1, DRIVE, STARE and MESSIDOR, with a total of 1540 images. We show, experimentally, that our proposed approach achieves an average detection accuracy of 99% and an average computation time of 0.2 s per image. (C) 2013 Elsevier Ltd. All rights reserved.
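The sketch below is a toy, brute-force stand-in for an ANNF map (FeatureMatch itself uses compact patch features and a far faster search): for every patch in a grayscale query image it records the location of the closest patch in the reference image, which is the kind of patch correspondence the likelihood map is built from.

import numpy as np

def brute_force_annf(query, ref, patch=7, stride=4):
    # Toy nearest-neighbour field: match each query patch to its closest
    # reference patch by L2 distance on raw intensities. Quadratic cost, so
    # only suitable for small or downsampled grayscale images.
    def patches(img):
        coords, feats = [], []
        for i in range(0, img.shape[0] - patch + 1, stride):
            for j in range(0, img.shape[1] - patch + 1, stride):
                coords.append((i, j))
                feats.append(img[i:i + patch, j:j + patch].ravel())
        return np.array(coords), np.array(feats, dtype=float)
    qc, qf = patches(query)
    rc, rf = patches(ref)
    d2 = ((qf[:, None, :] - rf[None, :, :]) ** 2).sum(axis=-1)   # pairwise distances
    return qc, rc[d2.argmin(axis=1)]   # query patch coords -> matched reference coords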

Relevance:

30.00%

Abstract:

This paper considers the problem of energy-based, Bayesian spectrum sensing in cognitive radios under various fading environments. Under the well-known central limit theorem based model for energy detection, we derive analytically tractable expressions for near-optimal detection thresholds that minimize the probability of error under lognormal, Nakagami-m, and Weibull fading. For the Suzuki fading case, a generalized gamma approximation is provided, which saves on the computation of an integral. In each case, the accuracy of the theoretical expressions as compared to the optimal thresholds is illustrated through simulations.
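As a hedged illustration of the underlying set-up only (a no-fading, Gaussian-signal assumption of my own; the paper's contribution is closed-form thresholds under the listed fading models), the sketch below finds the error-minimising energy-detection threshold numerically under the CLT approximation:

import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

def error_prob(tau, n, noise_var, snr, p_h1=0.5):
    # Probability of error of an n-sample energy detector under the Gaussian
    # (CLT) approximation of the test statistic, at a fixed received SNR.
    m0, s0 = n * noise_var, np.sqrt(2 * n) * noise_var
    m1, s1 = n * noise_var * (1 + snr), np.sqrt(2 * n) * noise_var * (1 + snr)
    p_fa = norm.sf(tau, m0, s0)    # false alarm: decide H1 when H0 holds
    p_md = norm.cdf(tau, m1, s1)   # missed detection: decide H0 when H1 holds
    return (1 - p_h1) * p_fa + p_h1 * p_md

# Numerically locate the near-optimal threshold minimising the error probability.
n, noise_var, snr = 100, 1.0, 0.5
res = minimize_scalar(error_prob, bounds=(n * noise_var, n * noise_var * (1 + snr)),
                      args=(n, noise_var, snr), method="bounded")
print(f"threshold: {res.x:.1f}, error probability: {res.fun:.4f}")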

Relevance:

30.00%

Abstract:

A semi-weight function method is developed to solve the plane problem of two bonded dissimilar materials containing a crack along the bond. From the equilibrium equations, the stress-strain relationship, and the conditions of continuity across the interface and on the free crack surfaces, the stress and displacement fields were obtained; the eigenvalue of these fields is $\lambda$. Semi-weight functions were obtained as virtual displacement and stress fields with eigenvalue $-\lambda$. Integral expressions for the fracture parameters $K_I$ and $K_{II}$ were obtained from the reciprocal work theorem, using the semi-weight functions together with approximate displacement and stress values on any integration path around the crack tip. The results of the example calculations show that the semi-weight function method is simple, convenient and highly accurate.

Relevance:

30.00%

Abstract:

Based on the sub-region generalized variational principle, a sub-region mixed version of the newly-developed semi-analytical 'finite element method of lines' (FEMOL) is proposed in this paper for accurate and efficient computation of stress intensity factors (SIFs) of two-dimensional notches/cracks. The circular regions surrounding notch/crack tips are taken as the complementary energy region in which a number of leading terms of singular solutions for stresses are used, with the sought SIFs being among the unknown coefficients. The rest of the arbitrary domain is taken as the potential energy region in which FEMOL is applied to obtain approximate displacements. A mixed system of ordinary differential equations (ODEs) and algebraic equations is derived via the sub-region generalized variational principle. A singularity removal technique that eliminates the stress parameters from the mixed equation system eventually yields a standard FEMOL ODE system, the solution of which is no longer singular and is simply and efficiently obtained using a standard general-purpose ODE solver. A number of numerical examples, including bi-material notches/cracks in anti-plane and plane elasticity, are given to show the generally excellent performance of the proposed method.

Relevance:

30.00%

Abstract:

Sequential Monte Carlo (SMC) methods are popular computational tools for Bayesian inference in non-linear, non-Gaussian state-space models. For this class of models, we propose SMC algorithms to compute the score vector and observed information matrix recursively in time. We propose two different SMC implementations, one with computational complexity $\mathcal{O}(N)$ and the other with complexity $\mathcal{O}(N^{2})$, where $N$ is the number of importance sampling draws. Although cheaper, the performance of the $\mathcal{O}(N)$ method degrades quickly in time, as it inherently relies on the SMC approximation of a sequence of probability distributions whose dimension increases linearly with time. In particular, even under strong mixing assumptions, the variance of the estimates computed with the $\mathcal{O}(N)$ method increases at least quadratically in time. The $\mathcal{O}(N^{2})$ method is a non-standard SMC implementation that does not suffer from this rapid degradation. We then show how both methods can be used to perform batch and recursive parameter estimation.
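A minimal sketch of the $\mathcal{O}(N)$ idea for a toy scalar model of my own (a Gaussian random walk observed in noise with unknown observation standard deviation; not the authors' implementation): by Fisher's identity the score is a smoothed additive functional, so each particle carries the running sum of gradient terms along its ancestral path and resampling duplicates those sums.

import numpy as np

def score_wrt_sigma(y, sigma, n_particles=1000, seed=0):
    # O(N) path-space SMC estimate of d/d(sigma) log p(y_{1:T} | sigma) for the
    # toy model x_t = x_{t-1} + N(0, 1), y_t = x_t + N(0, sigma^2).
    # Each particle carries the cumulative gradient of log g(y_t | x_t; sigma)
    # along its ancestral path; the variance of the estimate grows with T
    # because of path degeneracy, as noted in the abstract.
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)      # particles for x_1
    s = np.zeros(n_particles)                  # running gradient sums
    for t, yt in enumerate(y):
        if t > 0:
            x = x + rng.normal(0.0, 1.0, n_particles)          # bootstrap proposal
        s = s + (-1.0 / sigma + (yt - x) ** 2 / sigma ** 3)     # d/dsigma log g(y_t | x_t)
        logw = -0.5 * ((yt - x) / sigma) ** 2 - np.log(sigma)   # bootstrap weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        score = np.dot(w, s)                                    # current score estimate
        idx = rng.choice(n_particles, n_particles, p=w)         # multinomial resampling
        x, s = x[idx], s[idx]
    return score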