124 results for Likelihood function
in the Cambridge University Engineering Department Publications Database
Abstract:
Approximate Bayesian computation (ABC) is a popular technique for analysing data for complex models where the likelihood function is intractable. It involves using simulation from the model to approximate the likelihood, with this approximate likelihood then being used to construct an approximate posterior. In this paper, we consider methods that estimate the parameters by maximizing the approximate likelihood used in ABC. We give a theoretical analysis of the asymptotic properties of the resulting estimator. In particular, we derive results analogous to those of consistency and asymptotic normality for standard maximum likelihood estimation. We also discuss how sequential Monte Carlo methods provide a natural method for implementing our likelihood-based ABC procedures.
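A minimal sketch of the underlying idea, assuming a toy Gaussian model in place of a genuinely intractable one (the simulator, summary statistic and grid search below are illustrative, not the paper's procedure): the approximate likelihood at each parameter value is estimated by the fraction of simulated summary statistics falling within a tolerance of the observed one, and this estimate is then maximized.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n, rng):
    # Toy stand-in for an intractable model: n draws from N(theta, 1).
    return rng.normal(theta, 1.0, size=n)

def abc_likelihood(theta, obs_stat, n_sims, eps, rng):
    # Monte Carlo ABC likelihood estimate: the proportion of simulated
    # summary statistics landing within eps of the observed statistic.
    stats = np.array([simulate(theta, 50, rng).mean() for _ in range(n_sims)])
    return np.mean(np.abs(stats - obs_stat) < eps)

obs_stat = simulate(1.5, 50, rng).mean()   # observed summary (true theta = 1.5)

grid = np.linspace(0.0, 3.0, 61)           # maximize over a parameter grid
lik = [abc_likelihood(t, obs_stat, 500, 0.05, rng) for t in grid]
print(f"ABC maximum-likelihood estimate: {grid[int(np.argmax(lik))]:.2f}")
```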
Abstract:
Approximate Bayesian computation (ABC) has become a popular technique to facilitate Bayesian inference from complex models. In this article we present an ABC approximation designed to perform biased filtering for a Hidden Markov Model when the likelihood function is intractable. We use a sequential Monte Carlo (SMC) algorithm to both fit and sample from our ABC approximation of the target probability density. This approach is shown, empirically, to be more accurate with respect to the original filter than competing methods. The theoretical bias of our method is investigated; it is shown that the bias goes to zero at the expense of increased computational effort. Our approach is illustrated on a constrained sequential lasso for portfolio allocation to 15 constituents of the FTSE 100 share index.
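A hedged sketch of the general mechanism on a toy state-space model (not the paper's portfolio example): a bootstrap particle filter in which the intractable observation density is replaced by an ABC kernel comparing pseudo-observations simulated per particle with the real observation. The tolerance eps controls the bias, which vanishes as eps shrinks at the cost of more rejected particles.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: x_t = 0.9 x_{t-1} + N(0, 0.5^2), observed through
# heavy-tailed noise playing the role of an intractable observation density.
T, N = 50, 1000
x_true, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0, 0.5)
    y[t] = x_true[t] + rng.standard_t(df=3) * 0.3

eps = 0.3                                    # ABC tolerance: bias -> 0 as eps -> 0
particles = rng.normal(0, 1, N)
filtered = []
for t in range(1, T):
    # Propagate every particle through the transition kernel.
    particles = 0.9 * particles + rng.normal(0, 0.5, N)
    # ABC weighting: simulate a pseudo-observation per particle and weight
    # with a uniform kernel on its distance to the real observation.
    pseudo = particles + rng.standard_t(df=3, size=N) * 0.3
    w = (np.abs(pseudo - y[t]) < eps).astype(float) + 1e-12
    w /= w.sum()
    filtered.append(np.sum(w * particles))
    # Multinomial resampling.
    particles = particles[rng.choice(N, size=N, p=w)]

print("first few filtered means:", np.round(filtered[:5], 3))
```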
Abstract:
Electron multiplication charge-coupled devices (EMCCD) are widely used for photon counting experiments and measurements of low intensity light sources, and are extensively employed in biological fluorescence imaging applications. These devices have a complex statistical behaviour that is often not fully considered in the analysis of EMCCD data. Robust and optimal analysis of EMCCD images requires an understanding of their noise properties, in particular to exploit fully the advantages of Bayesian and maximum-likelihood analysis techniques, whose value is increasingly recognised in biological imaging for obtaining robust quantitative measurements from challenging data. To improve our own EMCCD analysis and in an effort to aid that of the wider bioimaging community, we present, explain and discuss a detailed physical model for EMCCD noise properties, giving a likelihood function for image counts in each pixel for a given incident intensity, and we explain how to measure the parameters for this model from various calibration images. © 2013 Hirsch et al.
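A minimal sketch of such a pixel likelihood under the commonly used approximation (Poisson photoelectrons, Gamma-distributed electron-multiplication output, additive Gaussian read noise); the parameter names and values below are illustrative, and the read noise is folded in only for the zero-electron term rather than via the full convolution the paper's calibrated model would use.

```python
import numpy as np
from scipy import stats

def emccd_loglik(x, mu, gain=300.0, sigma_read=10.0, offset=100.0, kmax=30):
    """Sketch of an EMCCD pixel log-likelihood for counts x at intensity mu:
    Poisson photoelectrons, Gamma EM-register output, Gaussian read noise."""
    x = np.atleast_1d(x).astype(float) - offset
    # Zero photoelectrons: output is pure Gaussian read noise.
    p = stats.poisson.pmf(0, mu) * stats.norm.pdf(x, 0.0, sigma_read)
    for k in range(1, kmax + 1):
        # Output of the EM register for k input electrons ~ Gamma(k, scale=gain);
        # the read-noise convolution is omitted here for simplicity.
        p += stats.poisson.pmf(k, mu) * stats.gamma.pdf(
            np.maximum(x, 1e-9), a=k, scale=gain)
    return np.log(p + 1e-300).sum()

print(emccd_loglik([450.0, 720.0], mu=1.2))
```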
Abstract:
This paper presents a Bayesian probabilistic framework to assess soil properties and model uncertainty to better predict excavation-induced deformations using field deformation data. The potential correlations between deformations at different depths are accounted for in the likelihood function needed in the Bayesian approach. The proposed approach also accounts for inclinometer measurement errors. The posterior statistics of the unknown soil properties and the model parameters are computed using the Delayed Rejection (DR) method and the Adaptive Metropolis (AM) method. As an application, the proposed framework is used to assess the unknown soil properties of multiple soil layers using deformation data at different locations and for incremental excavation stages. The developed approach can be used for the design of optimal revisions for supported excavation systems. © 2010 ASCE.
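The Adaptive Metropolis step can be sketched as follows on a stand-in two-dimensional posterior (the delayed-rejection stage is omitted, and the target, names and tuning constants are illustrative rather than the paper's soil model): the proposal covariance is adapted from the chain history, following the standard scaling 2.38²/d.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post(theta):
    # Illustrative stand-in for the soil-parameter posterior: a correlated
    # 2-D Gaussian (the real target combines the deformation likelihood,
    # measurement-error model and prior).
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    return -0.5 * theta @ np.linalg.solve(cov, theta)

def adaptive_metropolis(log_post, theta0, n_iter=5000, adapt_start=500):
    d = len(theta0)
    chain = np.zeros((n_iter, d))
    chain[0] = theta0
    cov = np.eye(d) * 0.1                     # initial proposal covariance
    for i in range(1, n_iter):
        if i > adapt_start:                   # adapt from the chain history
            cov = np.cov(chain[:i].T) * (2.38**2 / d) + 1e-6 * np.eye(d)
        prop = rng.multivariate_normal(chain[i - 1], cov)
        if np.log(rng.uniform()) < log_post(prop) - log_post(chain[i - 1]):
            chain[i] = prop
        else:
            chain[i] = chain[i - 1]
    return chain

chain = adaptive_metropolis(log_post, np.zeros(2))
print("posterior mean estimate:", chain[2500:].mean(axis=0).round(2))
```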
Abstract:
Vibration and acoustic analysis at higher frequencies faces two challenges: computing the response without using an excessive number of degrees of freedom, and quantifying its uncertainty due to small spatial variations in geometry, material properties and boundary conditions. Efficient models make use of the observation that when the response of a decoupled vibro-acoustic subsystem is sufficiently sensitive to uncertainty in such spatial variations, the local statistics of its natural frequencies and mode shapes saturate to universal probability distributions. This holds irrespective of the causes that underlie these spatial variations and thus leads to a nonparametric description of uncertainty. This work deals with the identification of uncertain parameters in such models by using experimental data. One of the difficulties is that both experimental errors and modeling errors, due to the nonparametric uncertainty that is inherent to the model type, are present. This is tackled by employing a Bayesian inference strategy. The prior probability distribution of the uncertain parameters is constructed using the maximum entropy principle. The likelihood function that is subsequently computed takes the experimental information, the experimental errors and the modeling errors into account. The posterior probability distribution, which is computed with the Markov Chain Monte Carlo method, provides a full uncertainty quantification of the identified parameters, and indicates how well their uncertainty is reduced, with respect to the prior information, by the experimental data. © 2013 Taylor & Francis Group, London.
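A minimal sketch of the inference loop under strong simplifying assumptions (a toy dispersion relation in place of the vibro-acoustic model, an exponential maximum-entropy prior for a positive parameter with known mean, and independent Gaussian experimental and modeling error terms; all names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def f_model(stiffness, modes=np.arange(1, 6)):
    # Toy dispersion relation standing in for the vibro-acoustic model.
    return modes * np.sqrt(stiffness)

# Synthetic "experimental" natural frequencies (truth: stiffness = 4.0).
f_exp = f_model(4.0) + rng.normal(0, 0.1, 5)

def log_prior(k):
    # Maximum-entropy prior for a positive parameter with known mean m
    # is exponential with rate 1/m (here m = 5, illustrative).
    return -k / 5.0 if k > 0 else -np.inf

def log_lik(k, sigma_exp=0.1, sigma_mod=0.05):
    # Experimental and modeling errors enter as independent Gaussian terms.
    var = sigma_exp**2 + sigma_mod**2
    r = f_exp - f_model(k)
    return -0.5 * np.sum(r**2) / var

# Random-walk Metropolis over the stiffness parameter.
k, samples = 1.0, []
for _ in range(20000):
    prop = k + rng.normal(0, 0.2)
    if np.log(rng.uniform()) < (log_prior(prop) + log_lik(prop)) \
            - (log_prior(k) + log_lik(k)):
        k = prop
    samples.append(k)

print("posterior mean stiffness:", np.mean(samples[5000:]).round(2))
```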
Abstract:
This paper introduces a new technique called species conservation for evolving parallel subpopulations. The technique is based on the concept of dividing the population into several species according to their similarity. Each of these species is built around a dominating individual called the species seed. Species seeds found in the current generation are saved (conserved) by moving them into the next generation. Our technique has proved to be very effective in finding multiple solutions of multimodal optimization problems. We demonstrate this by applying it to a set of test problems, including some problems known to be deceptive to genetic algorithms.
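A compact sketch of the core mechanism on a one-dimensional multimodal test function (GA settings, radius and mutation scale are illustrative): species seeds are identified in decreasing fitness order, each founding a species if it lies beyond a distance threshold from every existing seed, and each seed is conserved by copying it into the next generation over the worst offspring.

```python
import numpy as np

rng = np.random.default_rng(4)

def fitness(x):
    # Multimodal test function with five equal peaks on [0, 1].
    return np.sin(5 * np.pi * x) ** 6

def find_species_seeds(pop, fit, radius=0.08):
    # Scan individuals in decreasing fitness order; each one farther than
    # `radius` from every existing seed founds a new species.
    seeds = []
    for i in np.argsort(-fit):
        if all(abs(pop[i] - pop[j]) > radius for j in seeds):
            seeds.append(i)
    return seeds

pop_size = 60
pop = rng.uniform(0, 1, pop_size)
for gen in range(200):
    fit = fitness(pop)
    seed_idx = find_species_seeds(pop, fit)
    # Ordinary GA step: binary tournament selection plus Gaussian mutation.
    a, b = rng.integers(0, pop_size, (2, pop_size))
    children = np.clip(pop[np.where(fit[a] > fit[b], a, b)]
                       + rng.normal(0, 0.02, pop_size), 0, 1)
    # Species conservation: each seed overwrites one of the worst children,
    # so every discovered peak survives into the next generation.
    worst = np.argsort(fitness(children))[:len(seed_idx)]
    children[worst] = pop[seed_idx]
    pop = children

final_seeds = find_species_seeds(pop, fitness(pop))
print("conserved peaks:", np.sort(np.round(pop[final_seeds], 2)))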
Abstract:
We demonstrate a parameter extraction algorithm based on a theoretical transfer function, which takes into account a converging THz beam. Using this, we successfully extract material parameters from data obtained for a quartz sample with a THz time domain spectrometer. © 2010 IEEE.
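For reference, the standard collimated-beam (plane-wave) extraction that the paper's converging-beam transfer function refines looks roughly like this; the sample thickness, frequency range and quartz-like index below are illustrative, and the sample is kept thin so the phase at the lowest frequency carries no 2π ambiguity.

```python
import numpy as np

c = 2.998e8          # speed of light, m/s
d = 0.1e-3           # sample thickness, m (illustrative thin quartz window)

def extract_n_kappa(freq, H_meas):
    # Invert H(w) = 4n/(n+1)^2 * exp(-i (n-1) w d / c) for the refractive
    # index n and extinction coefficient kappa.
    w = 2 * np.pi * freq
    phase = -np.unwrap(np.angle(H_meas))
    n = 1 + c * phase / (w * d)
    kappa = -(c / (w * d)) * np.log(np.abs(H_meas) * (n + 1) ** 2 / (4 * n))
    return n, kappa

# Synthetic test: lossless quartz-like sample with n = 2.1.
freq = np.linspace(0.2e12, 2.0e12, 50)
w = 2 * np.pi * freq
n_true = 2.1
H = (4 * n_true / (n_true + 1) ** 2) * np.exp(-1j * (n_true - 1) * w * d / c)
n_est, kappa_est = extract_n_kappa(freq, H)
print("recovered index:", n_est.mean().round(3))
```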
Abstract:
BACKGROUND: GABA(A) receptors are members of the Cys-loop family of neurotransmitter receptors, proteins which are responsible for fast synaptic transmission, and are the site of action of a wide range of drugs. Recent work has shown that Cys-loop receptors are present on immune cells, but their physiological roles and the effects of drugs that modify their function in the innate immune system are currently unclear. We are interested in how and why anaesthetics increase infections in intensive care patients; a serious problem as more than 50% of patients with severe sepsis will die. As many anaesthetics act via GABA(A) receptors, the aim of this study was to determine if these receptors are present on immune cells, and could play a role in immunocompromising patients. PRINCIPAL FINDINGS: We demonstrate, using RT-PCR, that monocytes express GABA(A) receptors constructed of α1, α4, β2, γ1 and/or δ subunits. Whole cell patch clamp electrophysiological studies show that GABA can activate these receptors, resulting in the opening of a chloride-selective channel; activation is inhibited by the GABA(A) receptor antagonists bicuculline and picrotoxin, but not enhanced by the positive modulator diazepam. The anaesthetic drugs propofol and thiopental, which can act via GABA(A) receptors, impaired monocyte function in classic immunological chemotaxis and phagocytosis assays, an effect reversed by bicuculline and picrotoxin. SIGNIFICANCE: Our results show that functional GABA(A) receptors are present on monocytes with properties similar to CNS GABA(A) receptors. The functional data provide a possible explanation as to why chronic propofol and thiopental administration can increase the risk of infection in critically ill patients: their action on GABA(A) receptors inhibits normal monocyte behaviour. The data also suggest a potential solution: monocyte GABA(A) receptors are insensitive to diazepam, thus the use of benzodiazepines as an alternative anaesthetising agent may be advantageous where infection is a life-threatening problem.
Abstract:
We investigate how sensitive Gallager's codes are, when decoded by the sum-product algorithm, to the assumed noise level. We have found a remarkably simple function that fits the empirical results as a function of the actual noise level at both high and low noise levels. © 2004 Elsevier B.V.
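The setup can be sketched as follows, assuming a toy parity-check matrix rather than a true Gallager ensemble: the sum-product decoder forms its channel log-likelihood ratios from an assumed noise level that may differ from the true one, and the block error rate is measured as a function of that mismatch.

```python
import numpy as np

rng = np.random.default_rng(5)

# Tiny illustrative parity-check matrix (not a real Gallager ensemble).
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 0, 1, 1, 0, 1]])

def sum_product(llr_ch, H, iters=20):
    # Standard log-domain sum-product decoding on the Tanner graph of H.
    m, n = H.shape
    M = np.zeros((m, n))                      # check -> variable messages
    for _ in range(iters):
        V = llr_ch + M.sum(axis=0) - M        # variable -> check messages
        T = np.tanh(np.where(H, V, 0) / 2)
        T[H == 0] = 1.0                       # non-edges contribute 1 to products
        prod = T.prod(axis=1, keepdims=True)
        extr = np.clip(prod / np.where(T == 0, 1, T), -0.999999, 0.999999)
        M = np.where(H, 2 * np.arctanh(extr), 0)
    return (llr_ch + M.sum(axis=0)) < 0       # hard decision: True = bit 1

# All-zeros codeword over AWGN with true noise sigma_true; the decoder
# forms channel LLRs with an *assumed* sigma, as in the paper.
sigma_true, n_trials = 0.8, 2000
for sigma_assumed in (0.4, 0.8, 1.6):
    errs = 0
    for _ in range(n_trials):
        y = 1.0 + rng.normal(0, sigma_true, H.shape[1])   # BPSK: bit 0 -> +1
        llr = 2 * y / sigma_assumed**2                    # mismatched LLRs
        errs += sum_product(llr, H).any()
    print(f"assumed sigma {sigma_assumed}: block error rate {errs/n_trials:.3f}")
```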
Abstract:
The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our method to competing methods on synthetic data and apply it to several real-world data sets.
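A minimal sketch of the generative view that makes this tractable, using a fixed smooth function in place of an actual Gaussian process draw (the function and parameter values are illustrative): events are obtained by thinning a homogeneous Poisson process at the sigmoid-bounded maximum intensity.

```python
import numpy as np

rng = np.random.default_rng(6)

def sample_sgcp(g, lam_max, T):
    # Sample an inhomogeneous Poisson process on [0, T] by thinning: draw a
    # homogeneous process at rate lam_max, then keep each point t with
    # probability sigmoid(g(t)), yielding intensity lam_max * sigmoid(g(t)).
    # The sigmoid bound on the random intensity is what enables the SGCP's
    # exact MCMC inference.
    n = rng.poisson(lam_max * T)
    t = rng.uniform(0, T, n)
    keep = rng.uniform(size=n) < 1 / (1 + np.exp(-g(t)))
    return np.sort(t[keep])

# Illustrative smooth function standing in for a GP draw.
g = lambda t: 2 * np.sin(t)
events = sample_sgcp(g, lam_max=5.0, T=10.0)
print(f"{len(events)} events; first few: {np.round(events[:5], 2)}")
```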
Abstract:
The Vi capsular polysaccharide is a virulence-associated factor expressed by Salmonella enterica serotype Typhi but absent from virtually all other Salmonella serotypes. In order to study this determinant in vivo, we characterised a Vi-positive S. Typhimurium (C5.507 Vi(+)), harbouring the Salmonella pathogenicity island (SPI)-7, which encodes the Vi locus. S. Typhimurium C5.507 Vi(+) colonised and persisted in mice at similar levels compared to the parent strain, S. Typhimurium C5. However, the innate immune response to infection with C5.507 Vi(+) and SGB1, an isogenic derivative not expressing Vi, differed markedly. Infection with C5.507 Vi(+) resulted in a significant reduction in cellular trafficking of innate immune cells, including PMN and NK cells, compared to SGB1 Vi(-) infected animals. C5.507 Vi(+) infection stimulated reduced numbers of TNF-α-, MIP-2- and perforin-producing cells compared to SGB1 Vi(-). The modulating effect associated with Vi was not observed in MyD88(-/-) mice and was reduced in TLR4(-/-) mice. The presence of the Vi capsule also correlated with induction of the anti-inflammatory cytokine IL-10 in vivo, a factor that impacted on chemotaxis and the activation of immune cells in vitro.