9 results for Marginal
in the Cambridge University Engineering Department Publications Database
Abstract:
Marginal utility theory prescribes the relationship between the objective property of the magnitude of rewards and their subjective value. Despite its pervasive influence, however, there is remarkably little direct empirical evidence for such a theory of value, let alone of its neurobiological basis. We show that human preferences in an intertemporal choice task are best described by a model that integrates marginally diminishing utility with temporal discounting. Using functional magnetic resonance imaging, we show that activity in the dorsal striatum encodes both the marginal utility of rewards, over and above that which can be described by their magnitude alone, and the discounting associated with increasing time. In addition, our data show that dorsal striatum may be involved in integrating subjective valuation systems inherent to time and magnitude, thereby providing an overall metric of value used to guide choice behavior. Furthermore, during choice, we show that anterior cingulate activity correlates with the degree of difficulty associated with dissonance between value and time. Our data support an integrative architecture for decision making, revealing the neural representation of distinct subcomponents of value that may contribute to impulsivity and decisiveness.
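The kind of model described above can be sketched with common functional forms. A minimal illustration follows, assuming power-law utility and hyperbolic discounting; the specific forms and the parameter values `alpha` and `k` are assumptions for illustration, not taken from the paper:

```python
def subjective_value(magnitude, delay, alpha=0.7, k=0.05):
    """Subjective value of a delayed reward: marginally diminishing
    (power-law) utility combined with hyperbolic temporal discounting.
    alpha, k and both functional forms are illustrative assumptions."""
    utility = magnitude ** alpha        # diminishing marginal utility
    discount = 1.0 / (1.0 + k * delay)  # hyperbolic temporal discounting
    return utility * discount

# Doubling a reward less than doubles its utility, and a long delay
# can make a larger-later reward worth less than a smaller-sooner one:
sooner = subjective_value(20, delay=0)
later = subjective_value(40, delay=60)
```

In an intertemporal choice task, the model predicts the option with the higher integrated subjective value is chosen, so `sooner` would be preferred here despite its smaller magnitude.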
Abstract:
In this paper we address the problem of the separation and recovery of convolutively mixed autoregressive processes in a Bayesian framework. Solving this problem requires the ability to solve integration and/or optimization problems over complicated posterior distributions. We thus propose efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) methods. We present three algorithms. The first is a classical Gibbs sampler that generates samples from the posterior distribution. The other two are stochastic optimization algorithms that make it possible to optimize either the marginal distribution of the sources, or the marginal distribution of the parameters of the sources and mixing filters, conditional upon the observations. Simulation results are presented.
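To illustrate the Gibbs-sampling idea in isolation (this is a toy example, not the paper's convolutive-mixture model), here is a minimal sampler for a zero-mean bivariate normal with correlation `rho`, alternating draws from each full conditional:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal
    with correlation rho. Both full conditionals are univariate normal:
      x | y ~ N(rho * y, 1 - rho^2),   y | x ~ N(rho * x, 1 - rho^2)."""
    x, y = 0.0, 0.0
    sd = math.sqrt(1.0 - rho * rho)
    samples = []
    for i in range(burn_in + n_samples):
        x = random.gauss(rho * y, sd)   # draw x from its full conditional
        y = random.gauss(rho * x, sd)   # draw y from its full conditional
        if i >= burn_in:                # discard burn-in sweeps
            samples.append((x, y))
    return samples
```

Sampling only one coordinate per step from its full conditional is what makes the method tractable for complicated joint posteriors like the one in the paper, where direct sampling is infeasible.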
Abstract:
We provide a comprehensive overview of many recent algorithms for approximate inference in Gaussian process models for probabilistic binary classification. The relationships between several approaches are elucidated theoretically, and the properties of the different algorithms are corroborated by experimental results. We examine both 1) the quality of the predictive distributions and 2) the suitability of the different marginal likelihood approximations for model selection (selecting hyperparameters), and compare them to a gold standard based on MCMC. Interestingly, some methods produce good predictive distributions although their marginal likelihood approximations are poor. Strong conclusions are drawn about the methods: the Expectation Propagation algorithm is almost always the method of choice unless the computational budget is very tight. We also extend existing methods in various ways, and provide unifying code implementing all approaches.
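The role of the marginal likelihood in model selection is easiest to see in GP regression, where it is exact; in the classification setting of the paper, EP, Laplace, and related methods approximate the analogous quantity. A minimal sketch, assuming an RBF kernel and illustrative hyperparameter values:

```python
import numpy as np

def gp_log_marginal_likelihood(X, y, lengthscale, noise=0.1):
    """Exact log marginal likelihood of GP regression with an RBF
    kernel -- the tractable analogue of the quantity that EP, Laplace,
    etc. approximate for classification. lengthscale and noise are
    illustrative hyperparameters."""
    sq = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-0.5 * sq / lengthscale ** 2) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)                            # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    return (-0.5 * y @ alpha                     # data-fit term
            - np.log(np.diag(L)).sum()           # -0.5 * log |K|
            - 0.5 * len(X) * np.log(2 * np.pi))  # normalization

# Model selection: pick the hyperparameter with the highest evidence.
X = np.linspace(0, 5, 30)
y = np.sin(X)
best = max([0.1, 0.5, 1.0, 2.0],
           key=lambda l: gp_log_marginal_likelihood(X, y, l))
```

The evidence automatically trades off data fit against model complexity, which is why a poor marginal likelihood approximation can mislead hyperparameter selection even when predictions look reasonable, as the abstract notes.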