12 results for probability distribution
in the Cambridge University Engineering Department Publications Database
Abstract:
The vibro-acoustic response of built-up structures, consisting of stiff components with low modal density and flexible components with high modal density, is sensitive to small imperfections in the flexible components. In this paper, the uncertainty of the response is considered by modeling the low modal density master system as deterministic and the high modal density subsystems in a nonparametric stochastic way, i.e., carrying a diffuse wave field, and by subsequently computing the response probability density function. The master system's mean squared response amplitude follows a singular noncentral complex Wishart distribution conditional on the subsystem energies. For a single degree of freedom, this is equivalent to a chi-square or an exponential distribution, depending on the loading conditions. The subsystem energies follow approximately a chi-square distribution when their relative variance is smaller than unity. The results are validated by application to plate structures, and good agreement with Monte Carlo simulations is found. © 2012 Acoustical Society of America.
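The single-degree-of-freedom case mentioned above can be checked numerically. The sketch below (illustrative, not the paper's code) assumes a diffuse-field response whose real and imaginary parts are independent zero-mean Gaussians, so the mean squared amplitude is exponentially distributed, i.e. a chi-square with two degrees of freedom up to scale:

```python
import numpy as np

rng = np.random.default_rng(0)

# Complex Gaussian response of a single degree of freedom (diffuse-field
# assumption): real and imaginary parts i.i.d. N(0, 1/2), so the mean
# squared amplitude is 1.
z = (rng.normal(0.0, np.sqrt(0.5), 100_000)
     + 1j * rng.normal(0.0, np.sqrt(0.5), 100_000))
e = np.abs(z) ** 2   # squared response amplitude

# A unit-mean exponential is a chi-square with 2 degrees of freedom
# scaled by 1/2; its mean and variance are both 1.
print(e.mean(), e.var())
```

Both empirical moments come out close to 1, consistent with the exponential (chi-square) limit of the Wishart distribution for one degree of freedom.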
Resumo:
Changepoints are abrupt variations in the generative parameters of a data sequence. Online detection of changepoints is useful in modelling and prediction of time series in application areas such as finance, biometrics, and robotics. While frequentist methods have yielded online filtering and prediction techniques, most Bayesian papers have focused on the retrospective segmentation problem. Here we examine the case where the model parameters before and after the changepoint are independent and we derive an online algorithm for exact inference of the most recent changepoint. We compute the probability distribution of the length of the current "run," or time since the last changepoint, using a simple message-passing algorithm. Our implementation is highly modular so that the algorithm may be applied to a variety of types of data. We illustrate this modularity by demonstrating the algorithm on three different real-world data sets.
Resumo:
Modern technology has allowed real-time data collection in a variety of domains, ranging from environmental monitoring to healthcare. Consequently, there is a growing need for algorithms capable of performing inferential tasks in an online manner, continuously revising their estimates to reflect the current status of the underlying process. In particular, we are interested in constructing online and temporally adaptive classifiers capable of handling the possibly drifting decision boundaries arising in streaming environments. We first make a quadratic approximation to the log-likelihood that yields a recursive algorithm for fitting logistic regression online. We then suggest a novel way of equipping this framework with self-tuning forgetting factors. The resulting scheme is capable of tracking changes in the underlying probability distribution, adapting the decision boundary appropriately and hence maintaining high classification accuracy in dynamic or unstable environments. We demonstrate the scheme's effectiveness in both real and simulated streaming environments. © Springer-Verlag 2009.
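A minimal sketch of the recursive core is given below. It uses a fixed forgetting factor (the paper's self-tuning mechanism is omitted) and a Sherman-Morrison update of an inverse-curvature matrix; the class name and all parameter values are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

class OnlineLogit:
    """Recursive logistic regression with exponential forgetting (sketch)."""
    def __init__(self, d, lam=0.98):
        self.w = np.zeros(d)
        self.P = 10.0 * np.eye(d)   # inverse-curvature estimate
        self.lam = lam              # forgetting factor in (0, 1]
    def update(self, x, y):
        p = sigmoid(self.w @ x)
        h = max(p * (1.0 - p), 1e-6)          # per-sample curvature
        self.P /= self.lam                    # discount old information
        Px = self.P @ x
        self.P -= np.outer(Px, Px) * h / (1.0 + h * (x @ Px))
        self.w += self.P @ ((y - p) * x)      # Newton-style step
    def predict(self, x):
        return sigmoid(self.w @ x)

# Drifting stream: the true boundary flips halfway through.
rng = np.random.default_rng(0)
clf, correct = OnlineLogit(2), []
for t in range(600):
    x = rng.standard_normal(2)
    w_true = np.array([2.0, -1.0]) if t < 300 else np.array([-2.0, 1.0])
    y = float(w_true @ x > 0)
    correct.append((clf.predict(x) > 0.5) == bool(y))
    clf.update(x, y)
acc_after_drift = np.mean(correct[500:])   # tracked the new boundary
```

The forgetting step `self.P /= self.lam` is what keeps the effective sample size bounded, so the decision boundary can follow the drift.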
Resumo:
A pivotal problem in Bayesian nonparametrics is the construction of prior distributions on the space M(V) of probability measures on a given domain V. In principle, such distributions on the infinite-dimensional space M(V) can be constructed from their finite-dimensional marginals, the most prominent example being the construction of the Dirichlet process from finite-dimensional Dirichlet distributions. This approach is both intuitive and applicable to the construction of arbitrary distributions on M(V), but also hamstrung by a number of technical difficulties. We show how these difficulties can be resolved if the domain V is a Polish topological space, and give a representation theorem directly applicable to the construction of any probability distribution on M(V) whose first moment measure is well-defined. The proof draws on a projective limit theorem of Bochner, and on properties of set functions on Polish spaces to establish countable additivity of the resulting random probabilities.
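The consistency condition at the heart of such projective-limit constructions can be illustrated numerically: aggregating cells of a finite-dimensional Dirichlet yields a Dirichlet on the coarser partition. The sketch below checks this by matching Beta moments (an illustration, not the paper's machinery):

```python
import numpy as np

rng = np.random.default_rng(0)
a = np.array([1.0, 2.0, 3.0])
p = rng.dirichlet(a, size=200_000)

# Merging cells of a Dirichlet gives a Dirichlet on the coarser
# partition -- the marginal-consistency property that the projective
# limit construction relies on.
q = p[:, 0] + p[:, 1]                 # merge the first two cells
# (q, p3) should then be Dirichlet(a1 + a2, a3), i.e. q ~ Beta(3, 3):
# theoretical mean 3/6 = 0.5, variance 9/(36*7) ~= 0.0357.
print(q.mean(), q.var())
```

The empirical mean and variance of the merged coordinate match the Beta(3, 3) values, as the aggregation property predicts.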
Resumo:
Accurate simulation of rolling-tyre vibrations, and the associated noise, requires knowledge of road-surface topology. Full scans of the surface types in common use are, however, not widely available, and are likely to remain so. Ways of producing simulated surfaces from incomplete starting information are thus needed. In this paper, a simulation methodology based solely on line measurements is developed, and validated against a full two-dimensional height map of a real asphalt surface. First the tribological characteristics (asperity height, curvature and nearest-neighbour distributions) of the real surface are analysed. It is then shown that a standard simulation technique, which matches the (isotropic) spectrum and the probability distribution of the height measurements, is unable to reproduce these characteristics satisfactorily. A modification, whereby the inherent granularity of the surface is enforced at the initialisation stage, is introduced, and found to produce simulations whose tribological characteristics are in excellent agreement with the measurements. This method will thus make high-fidelity tyre-vibration calculations feasible for researchers with access to line-scan data only. In addition, the approach to surface tribological characterisation set out here provides a template for efficient cataloguing of road textures, as long as the resulting information can subsequently be used to produce sample realisations. A third simulation algorithm, which successfully addresses this requirement, is therefore also presented. © 2011 Elsevier B.V.
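The "standard simulation technique" referred to above can be sketched as: synthesise a Gaussian profile with a prescribed spectrum via random phases, then rank-map it onto a target height distribution. The power-law spectrum and exponential heights below are illustrative stand-ins for measured road data, and the paper's granularity-enforcing modification is not shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Target (one-sided) roughness spectrum: a simple power law standing in
# for a measured road spectrum (illustrative assumption).
k = np.fft.rfftfreq(n, d=1e-3)          # spatial frequency, 1 mm sampling
S = np.where(k > 0, k ** -2.0, 0.0)

# 1) Gaussian profile with the prescribed spectrum (random phases)
phase = rng.uniform(0, 2 * np.pi, len(k))
gauss = np.fft.irfft(np.sqrt(S) * np.exp(1j * phase), n)

# 2) Rank-map onto a target non-Gaussian height distribution, here
# exponential-tailed as a stand-in for asphalt asperity heights.
# Rank mapping matches the height distribution exactly while only
# approximately retaining the spectrum.
target = np.sort(rng.exponential(1.0, n))
profile = target[np.argsort(np.argsort(gauss))]
```

By construction, `profile` has exactly the target height distribution and preserves the spatial ordering (hence roughly the spectrum) of the Gaussian draw.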
Resumo:
Motor behavior may be viewed as a problem of maximizing the utility of movement outcome in the face of sensory, motor and task uncertainty. Viewed in this way, and allowing for the availability of prior knowledge in the form of a probability distribution over possible states of the world, the choice of a movement plan and strategy for motor control becomes an application of statistical decision theory. This point of view has proven successful in recent years in accounting for movement under risk, inferring the loss function used in motor tasks, and explaining motor behavior in a wide variety of circumstances.
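A toy instance of this decision-theoretic view can be computed directly. The reward layout and noise level below are invented for the example (a reward circle next to a penalty circle, Gaussian motor noise), in the spirit of movement-under-risk pointing tasks:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.4   # motor noise standard deviation (illustrative)

def expected_gain(aim, n=200_000):
    """Monte Carlo expected utility of aiming at `aim`: +1 for landing
    in the unit target circle at the origin, -5 for landing in the unit
    penalty circle centred at (-1, 0)."""
    pts = aim + sigma * rng.standard_normal((n, 2))
    in_target = (pts ** 2).sum(axis=1) < 1.0
    in_penalty = ((pts - np.array([-1.0, 0.0])) ** 2).sum(axis=1) < 1.0
    return (1.0 * in_target - 5.0 * in_penalty).mean()

# Choose the movement plan (aim point) maximising expected gain.
aims = [(x, 0.0) for x in np.linspace(-0.5, 1.0, 16)]
best = max(aims, key=expected_gain)
# The optimal aim shifts away from the penalty region, i.e. best[0] > 0,
# even though aiming at the target centre maximises the hit probability.
```

This is the core of the statistical-decision-theory account: the chosen plan trades hit probability against the loss function, given the noise distribution.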
Oxygen carrier dispersion in inert packed beds to improve performance in chemical looping combustion
Abstract:
Various packed beds of copper-based oxygen carriers (CuO on Al2O3) were tested over 100 cycles of low temperature (673K) Chemical Looping Combustion (CLC) with H2 as the fuel gas. The oxygen carriers were uniformly mixed with alumina (Al2O3) in order to investigate the level of separation necessary to prevent agglomeration. It was found that a mass ratio of 1:6 oxygen carrier to alumina gave the best performance in terms of stable, repeating hydrogen breakthrough curves over 100 cycles. In order to quantify the average separation achieved in the mixed packed beds, two sphere-packing models were developed. The hexagonal close-packing model assumed a uniform spherical packing structure, and based the separation calculations on a hypergeometric probability distribution. The more computationally intensive full-scale model used discrete element modelling to simulate random packing arrangements governed by gravity and contact dynamics. Both models predicted that average 'nearest neighbour' particle separation drops to near zero for oxygen carrier mass fractions of x≥0.25. For the packed bed systems studied, agglomeration was observed when the mass fraction of oxygen carrier was above this threshold. © 2013 Elsevier B.V.
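The hypergeometric part of the close-packing model can be sketched with a few lines of arithmetic. Treating the x = 0.25 threshold as a number fraction is an illustrative simplification (mass and number fractions differ in the real bed), and the function below is not the paper's code:

```python
from math import comb

def p_isolated(N, K, n=12):
    """Probability that none of a sphere's n nearest-neighbour sites
    (n = 12 for hexagonal close packing) holds an oxygen carrier, when
    K of the N spheres are carriers: a hypergeometric draw of n sites
    without replacement."""
    return comb(N - K, n) / comb(N, n)

# With 25% carrier spheres, a given sphere almost surely touches at
# least one carrier, consistent with agglomeration setting in around
# the x >= 0.25 threshold.
p_touch = 1.0 - p_isolated(N=10_000, K=2_500)
```

For large beds this approaches 1 - (1 - x)^12 ≈ 0.97 at x = 0.25, so near-zero nearest-neighbour separation at that loading is unsurprising.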
Abstract:
Vibration and acoustic analysis at higher frequencies faces two challenges: computing the response without using an excessive number of degrees of freedom, and quantifying its uncertainty due to small spatial variations in geometry, material properties and boundary conditions. Efficient models make use of the observation that when the response of a decoupled vibro-acoustic subsystem is sufficiently sensitive to uncertainty in such spatial variations, the local statistics of its natural frequencies and mode shapes saturate to universal probability distributions. This holds irrespective of the causes that underlie these spatial variations and thus leads to a nonparametric description of uncertainty. This work deals with the identification of uncertain parameters in such models by using experimental data. One of the difficulties is that both experimental errors and modeling errors, due to the nonparametric uncertainty that is inherent to the model type, are present. This is tackled by employing a Bayesian inference strategy. The prior probability distribution of the uncertain parameters is constructed using the maximum entropy principle. The likelihood function that is subsequently computed takes the experimental information, the experimental errors and the modeling errors into account. The posterior probability distribution, which is computed with the Markov Chain Monte Carlo method, provides a full uncertainty quantification of the identified parameters, and indicates how well their uncertainty is reduced, with respect to the prior information, by the experimental data. © 2013 Taylor & Francis Group, London.
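The inference loop can be sketched with a generic random-walk Metropolis sampler. The scalar model below (an exponential maximum-entropy prior on a positive parameter, a Gaussian likelihood, synthetic data) is an invented stand-in for the paper's vibro-acoustic model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with true parameter value 2 and noise sd 0.5.
y = rng.normal(2.0, 0.5, size=50)

def log_post(theta):
    """Log posterior: exponential (maximum-entropy given a known mean
    on [0, inf)) prior with mean 2, plus Gaussian log-likelihood."""
    if theta <= 0:
        return -np.inf
    log_prior = -theta / 2.0
    log_lik = -0.5 * np.sum((y - theta) ** 2) / 0.5 ** 2
    return log_prior + log_lik

# Random-walk Metropolis: accept a proposal with probability
# min(1, posterior ratio).
theta, chain = 1.0, []
for _ in range(20_000):
    prop = theta + 0.2 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
post = np.array(chain[5000:])   # discard burn-in
```

The retained samples approximate the posterior, whose spread relative to the prior shows how much the (synthetic) data reduce the parameter uncertainty.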
Abstract:
Optimal Bayesian multi-target filtering is, in general, computationally impractical owing to the high dimensionality of the multi-target state. The Probability Hypothesis Density (PHD) filter propagates the first moment of the multi-target posterior distribution. While this reduces the dimensionality of the problem, the PHD filter still involves intractable integrals in many cases of interest. Several authors have proposed Sequential Monte Carlo (SMC) implementations of the PHD filter. However, these implementations are the equivalent of the Bootstrap Particle Filter, and the latter is well known to be inefficient. Drawing on ideas from the Auxiliary Particle Filter (APF), we present a SMC implementation of the PHD filter which employs auxiliary variables to enhance its efficiency. Numerical examples are presented for two scenarios, including a challenging nonlinear observation model.
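The PHD-specific machinery (random finite sets, intensity functions) is beyond a short sketch, but the auxiliary-variable idea itself can be shown on a scalar single-target model. Everything below (the AR(1) model, the noise levels, the particle count) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A, Q, Rv = 0.9, 1.0, 0.1   # state transition, process var, obs var

def apf(ys, n=500):
    """Auxiliary particle filter on a scalar toy model (sketch)."""
    xs = rng.standard_normal(n)
    w = np.full(n, 1.0 / n)
    est = []
    for y in ys:
        mu = A * xs                                   # predicted means
        # Stage 1: pre-weight by an approximate predictive likelihood,
        # then resample auxiliary indices so propagation is steered
        # towards particles likely to explain the new observation.
        lam = w * np.exp(-0.5 * (y - mu) ** 2 / (Rv + Q))
        lam /= lam.sum()
        idx = rng.choice(n, size=n, p=lam)
        xs = mu[idx] + np.sqrt(Q) * rng.standard_normal(n)
        # Stage 2: correct for the stage-1 approximation.
        w = np.exp(-0.5 * (y - xs) ** 2 / Rv
                   + 0.5 * (y - mu[idx]) ** 2 / (Rv + Q))
        w /= w.sum()
        est.append(w @ xs)
    return np.array(est)

# Toy run: track a latent AR(1) state from noisy observations.
T = 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = A * x_true[t - 1] + rng.standard_normal()
ys = x_true + np.sqrt(Rv) * rng.standard_normal(T)
rmse = np.sqrt(np.mean((apf(ys) - x_true) ** 2))
```

The stage-1 weighting is what distinguishes the APF from the bootstrap filter: particles are resampled before propagation, using the observation, rather than blindly propagated and then weighted.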
Abstract:
This paper is concerned with the probability density function of the energy of a random dynamical system subjected to harmonic excitation. It is shown that if the natural frequencies and mode shapes of the system conform to the Gaussian Orthogonal Ensemble, then under common types of loading the distribution of the energy of the response is approximately lognormal, providing the modal overlap factor is high (typically greater than two). In contrast, it is shown that the response of a system with Poisson natural frequencies is not approximately lognormal. Numerical simulations are conducted on a plate system to validate the theoretical findings and good agreement is obtained. Simulations are also conducted on a system made from two plates connected with rotational springs to demonstrate that the theoretical findings can be extended to a built-up system. The work provides a theoretical justification of the commonly used empirical practice of assuming that the energy response of a random system is lognormal.
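The GOE-versus-Poisson contrast can be seen directly in the eigenvalue statistics. The sketch below (illustrative, not the paper's plate model) samples a GOE matrix and compares nearest-neighbour spacings against Poisson (exponential) spacings, exposing the level repulsion that distinguishes the two ensembles:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a GOE matrix: symmetrise a Gaussian matrix with the standard
# normalisation, then take the bulk of its spectrum.
n = 400
Amat = rng.standard_normal((n, n))
H = (Amat + Amat.T) / np.sqrt(2 * n)
ev = np.linalg.eigvalsh(H)              # ascending eigenvalues
mid = ev[n // 4: 3 * n // 4]            # central half of the spectrum
s_goe = np.diff(mid) / np.diff(mid).mean()   # unit-mean spacings

# Poisson natural frequencies have exponential spacings.
s_poi = rng.exponential(1.0, len(s_goe))

# Level repulsion: very small spacings are rare for GOE but not
# for Poisson.
frac_goe = (s_goe < 0.1).mean()
frac_poi = (s_poi < 0.1).mean()
```

The scarcity of near-degenerate GOE frequencies is the statistical ingredient behind the different response distributions of the two ensembles.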
Abstract:
A location- and scale-invariant predictor is constructed which exhibits good probability matching for extreme predictions outside the span of data drawn from a variety of (stationary) general distributions. It is constructed via the three-parameter (μ, σ, ξ) Generalized Pareto Distribution (GPD). The predictor is designed to provide exact probability matching for the GPD in both the extreme heavy-tailed limit and the extreme bounded-tail limit, whilst giving a good approximation to probability matching at all intermediate values of the tail parameter ξ. The predictor is valid even for small sample sizes N, even as small as N = 3. The main purpose of this paper is to present the somewhat lengthy derivations which draw heavily on the theory of hypergeometric functions, particularly the Lauricella functions. Whilst the construction is inspired by the Bayesian approach to the prediction problem, it considers the case of vague prior information about both parameters and model, and all derivations are undertaken using sampling theory.
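Probability matching itself can be illustrated on a simpler scale family than the GPD (an illustrative analogue, not the paper's construction): for exponential data, the scale-invariant predictor below satisfies P(X_{N+1} > q̂) = p exactly, for any true scale and even for N = 3. The Monte Carlo check confirms the exact coverage:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scale-invariant upper-tail predictor for exponential data:
# qhat = S * (p^(-1/N) - 1), with S the sample sum. Averaging over
# data and the next observation, P(X_new > qhat) = p exactly.
N, p, trials = 3, 0.1, 200_000
lam = 1.7                                       # arbitrary true rate
x = rng.exponential(1 / lam, size=(trials, N))
qhat = x.sum(axis=1) * (p ** (-1.0 / N) - 1.0)  # predicted p-exceedance level
x_new = rng.exponential(1 / lam, trials)
coverage = (x_new > qhat).mean()                # ~= p, regardless of lam
```

The paper's contribution is to engineer an analogous matching property across the whole ξ range of the GPD, which is where the Lauricella-function derivations come in.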