169 results for Statistical variance


Relevance: 20.00%

Abstract:

Atlases and statistical models play important roles in the personalization and simulation of cardiac physiology. For the study of the heart, however, the construction of comprehensive atlases and spatio-temporal models is faced with a number of challenges, in particular the need to handle large and highly variable image datasets, the multi-region nature of the heart, and the presence of complex as well as small cardiovascular structures. In this paper, we present a detailed atlas and spatio-temporal statistical model of the human heart based on a large population of 3D+time multi-slice computed tomography sequences, and the framework for its construction. It uses spatial normalization based on nonrigid image registration to synthesize a population mean image and establish the spatial relationships between the mean and the subjects in the population. Temporal image registration is then applied to resolve each subject-specific cardiac motion and the resulting transformations are used to warp a surface mesh representation of the atlas to fit the images of the remaining cardiac phases in each subject. Subsequently, we demonstrate the construction of a spatio-temporal statistical model of shape such that the inter-subject and dynamic sources of variation are suitably separated. The framework is applied to a 3D+time data set of 138 subjects. The data is drawn from a variety of pathologies, which benefits its generalization to new subjects and physiological studies. The obtained level of detail and the extendability of the atlas present an advantage over most cardiac models published previously. © 1982-2012 IEEE.
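The separation of inter-subject and dynamic (cardiac-motion) variation described above can be sketched with a simple two-level principal component analysis. Everything below is illustrative: the array sizes, the synthetic "shapes", and the helper `pca_modes` are invented stand-ins, not the paper's registration-based pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sizes; the paper uses 138 subjects and dense surface meshes.
n_subj, n_phase, n_coord = 12, 5, 30

# Synthetic stand-in for registered cardiac shapes:
# subjects x cardiac phases x flattened vertex coordinates.
shapes = rng.normal(size=(n_subj, n_phase, n_coord))

# Separate the two sources of variation: per-subject time-averages carry
# inter-subject variation, deviations from them carry cardiac dynamics.
subj_mean = shapes.mean(axis=1)                         # (n_subj, n_coord)
dynamic = (shapes - subj_mean[:, None, :]).reshape(-1, n_coord)

def pca_modes(X, k):
    """Top-k principal modes (rows) of the mean-centred rows of X."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k]

inter_subject_modes = pca_modes(subj_mean, k=3)
dynamic_modes = pca_modes(dynamic, k=3)
```

Fitting the two mode sets to disjoint portions of the data is what keeps the two sources of variation "suitably separated" in the sense of the abstract.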

Relevance: 20.00%

Abstract:

Amplitude demodulation is an ill-posed problem and so it is natural to treat it from a Bayesian viewpoint, inferring the most likely carrier and envelope under probabilistic constraints. One such treatment is Probabilistic Amplitude Demodulation (PAD), which, whilst computationally more intensive than traditional approaches, offers several advantages. Here we provide methods for estimating the uncertainty in the PAD-derived envelopes and carriers, and for learning free parameters such as the time-scale of the envelope. We show how the probabilistic approach can naturally handle noisy and missing data. Finally, we indicate how to extend the model to signals which contain multiple modulators and carriers.
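For context, the "traditional approaches" that PAD is compared against include the classical analytic-signal (Hilbert) envelope. The sketch below implements that baseline on a clean synthetic signal; it is not PAD itself, and it illustrates why the problem is ill-posed: the decomposition is exact here only because the envelope is positive and spectrally well below the carrier.

```python
import numpy as np

def hilbert_envelope(x):
    """Envelope via the analytic signal, computed with an FFT:
    zero the negative frequencies, double the positive ones."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:(N + 1) // 2] = 2.0
    if N % 2 == 0:
        h[N // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)   # slow modulator
carrier = np.sin(2 * np.pi * 100 * t)              # fast carrier
est = hilbert_envelope(envelope * carrier)
```

With noise or missing samples this estimator degrades, which is where the probabilistic treatment in the abstract earns its extra computational cost.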

Relevance: 20.00%

Abstract:

An existing hybrid finite element (FE)/statistical energy analysis (SEA) approach to the analysis of the mid- and high frequency vibrations of a complex built-up system is extended here to a wider class of uncertainty modeling. In the original approach, the constituent parts of the system are considered to be either deterministic, and modeled using FE, or highly random, and modeled using SEA. A non-parametric model of randomness is employed in the SEA components, based on diffuse wave theory and the Gaussian Orthogonal Ensemble (GOE), and this enables the mean and variance of second order quantities such as vibrational energy and response cross-spectra to be predicted. In the present work the assumption that the FE components are deterministic is relaxed by the introduction of a parametric model of uncertainty in these components. The parametric uncertainty may be modeled either probabilistically, or by using a non-probabilistic approach such as interval analysis, and it is shown how these descriptions can be combined with the non-parametric uncertainty in the SEA subsystems to yield an overall assessment of the performance of the system. The method is illustrated by application to an example built-up plate system which has random properties, and benchmark comparisons are made with full Monte Carlo simulations. © 2012 Elsevier Ltd. All rights reserved.
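The combination of parametric (FE) and non-parametric (SEA ensemble) uncertainty can be sketched via the law of total variance: sample the parametric uncertainty by Monte Carlo, and at each sample take the ensemble mean and variance that the GOE-based model would supply. The function `sea_ensemble_stats` below is a hypothetical stand-in for a full hybrid FE/SEA solve, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def sea_ensemble_stats(stiffness):
    """Hypothetical stand-in for one hybrid FE/SEA solve: for fixed FE
    parameters, return the ensemble mean and variance of a subsystem
    energy as the GOE-based non-parametric model would predict them."""
    mean = 1.0 / stiffness
    var = 0.1 * mean**2          # illustrative ensemble relative variance
    return mean, var

# Parametric (probabilistic) uncertainty in the FE component: a stiffness
# known only to lie in an interval, sampled by Monte Carlo.
stiff = rng.uniform(0.8, 1.2, size=20000)
means, variances = sea_ensemble_stats(stiff)

# Law of total variance: non-parametric (within-ensemble) contribution
# plus parametric (between-sample) contribution.
total_mean = means.mean()
total_var = variances.mean() + means.var()
```

An interval (non-probabilistic) description of the FE parameters would instead propagate the endpoints of `stiff` through the same solve to bound the ensemble statistics.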

Relevance: 20.00%

Abstract:

Statistical dialog systems (SDSs) are motivated by the need for a data-driven framework that reduces the cost of laboriously handcrafting complex dialog managers and that provides robustness against the errors created by speech recognizers operating in noisy environments. By including an explicit Bayesian model of uncertainty and by optimizing the policy via a reward-driven process, partially observable Markov decision processes (POMDPs) provide such a framework. However, exact model representation and optimization are computationally intractable. Hence, the practical application of POMDP-based systems requires efficient algorithms and carefully constructed approximations. This review article provides an overview of the current state of the art in the development of POMDP-based spoken dialog systems. © 1963-2012 IEEE.
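The "explicit Bayesian model of uncertainty" at the heart of a POMDP dialog manager is belief tracking: a probability distribution over hidden user goals, updated after every noisy observation. The following is a minimal discrete Bayes-filter sketch; the two-state transition and observation models are invented for illustration, not taken from any real dialog system.

```python
import numpy as np

# T[s, s'] = P(s' | s): the user's hidden goal tends to persist.
T = np.array([[0.9, 0.1],
              [0.1, 0.9]])
# O[s', o] = P(o | s'): noisy ASR evidence about the goal.
O = np.array([[0.8, 0.2],
              [0.3, 0.7]])

def belief_update(b, obs):
    """b'(s') ∝ P(obs | s') * sum_s P(s' | s) b(s)."""
    predicted = T.T @ b              # prediction step
    updated = O[:, obs] * predicted  # correction step
    return updated / updated.sum()   # renormalize

b = np.array([0.5, 0.5])   # uninformative prior over user goals
b = belief_update(b, obs=0)
```

A dialog policy then maps beliefs like `b` (rather than raw ASR hypotheses) to system actions, which is what makes the framework robust to recognition errors; the intractability mentioned in the abstract arises because realistic state, action, and observation sets are vastly larger than this.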

Relevance: 20.00%

Abstract:

We report an empirical study of n-gram posterior probability confidence measures for statistical machine translation (SMT). We first describe an efficient and practical algorithm for rapidly computing n-gram posterior probabilities from large translation word lattices. These probabilities are shown to be a good predictor of whether or not the n-gram is found in human reference translations, motivating their use as a confidence measure for SMT. Comprehensive n-gram precision and word coverage measurements are presented for a variety of different language pairs, domains and conditions. We analyze the effect on reference precision of using single or multiple references, and compare the precision of posteriors computed from k-best lists to those computed over the full evidence space of the lattice. We also demonstrate improved confidence by combining multiple lattices in a multi-source translation framework. © 2012 The Author(s).
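The n-gram posterior idea can be sketched over a k-best list: each hypothesis gets a normalized posterior weight from its model score, and an n-gram's posterior is the total weight of hypotheses containing it. The paper's contribution is doing this efficiently over full lattices rather than k-best lists; the toy hypotheses and scores below are invented.

```python
import numpy as np
from collections import defaultdict

def ngram_posteriors(kbest, n=2):
    """Posterior probability that each n-gram appears in the translation,
    estimated from (hypothesis, log-probability) pairs."""
    logps = np.array([lp for _, lp in kbest])
    weights = np.exp(logps - logps.max())   # stable softmax over hypotheses
    weights /= weights.sum()
    post = defaultdict(float)
    for (hyp, _), w in zip(kbest, weights):
        toks = hyp.split()
        grams = {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}
        for g in grams:                     # count each n-gram once per hyp
            post[g] += w
    return dict(post)

kbest = [("the cat sat", -1.0), ("the cat slept", -1.5), ("a cat sat", -2.0)]
post = ngram_posteriors(kbest)
```

N-grams shared by many high-scoring hypotheses (here `("the", "cat")`) get posteriors near 1, which is precisely why they are good confidence predictors of appearing in reference translations.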

Relevance: 20.00%

Abstract:

When used correctly, Statistical Energy Analysis (SEA) can provide good predictions of high frequency vibration levels in built-up structures. Unfortunately, the assumptions that underlie SEA break down as the frequency of excitation is reduced, and the method does not yield accurate predictions at "medium" frequencies (and neither does the Finite Element Method, which is limited to low frequencies). A basic problem is that parts of the system have a short wavelength of deformation and meet the requirements of SEA, while other parts of the system do not - this is often referred to as the "mid-frequency" problem, and there is a broad class of mid-frequency vibration problems that are of great concern to industry. In this paper, a coupled deterministic-statistical approach referred to as the Hybrid Method (Shorter & Langley, 2004) is briefly described, and some results that demonstrate how the method overcomes the aforementioned difficulties are presented.
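The statistical side of the Hybrid Method is standard SEA: a steady-state power balance between subsystem energies, solved as a small linear system. The sketch below shows that balance for two coupled subsystems; the loss factors and input power are illustrative values, not taken from the paper.

```python
import numpy as np

# SEA power balance for two coupled subsystems:
#   P1 = omega * (eta1*E1 + eta12*E1 - eta21*E2)
#   P2 = omega * (eta2*E2 + eta21*E2 - eta12*E1)
omega = 2 * np.pi * 1000.0        # analysis frequency, rad/s
eta1, eta2 = 0.01, 0.02           # damping loss factors (illustrative)
eta12, eta21 = 0.005, 0.003       # coupling loss factors (illustrative)

A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12, eta2 + eta21]])
P = np.array([1.0, 0.0])          # external power into subsystem 1 only
E = np.linalg.solve(A, P)         # subsystem vibrational energies
```

In the Hybrid Method the deterministic (FE) components enter this balance through modified coupling terms, which is how the short- and long-wavelength parts of a mid-frequency problem are treated within one model.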

Relevance: 20.00%

Abstract:

This paper presents a statistical approach to the electromagnetic analysis of a system that lies within a reverberant cavity that has random or uncertain properties. The need to solve Maxwell's equations within the cavity is avoided by employing a relation known as the diffuse field reciprocity principle, which leads directly to the ensemble mean squared response of the system; all that is required is the impedance matrix of the system associated with radiation into infinite space. The general theoretical approach is presented, and the analysis is then applied to a five-cable bundle in a reverberation room. © 2013 EMC Europe Foundation.
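A rough numerical sketch of how the principle is applied, assuming the commonly quoted form of the diffuse field reciprocity relation, S_ff = (4E/(πωn)) Im(D), where D is the direct-field (free-space radiation) dynamic stiffness of the system. The 2x2 matrix, cavity energy, and modal density below are all invented numbers, not the paper's five-cable bundle.

```python
import numpy as np

omega = 2 * np.pi * 1e8            # analysis frequency, rad/s (assumed)
E_cav, n_modes = 1e-6, 1e-5        # cavity energy and modal density (assumed)

# Invented direct-field dynamic stiffness of the system (radiation into
# infinite space); its imaginary part is symmetric positive definite.
D = np.array([[1.0 + 0.2j, 0.1j],
              [0.1j, 1.5 + 0.3j]]) * 1e3

# Diffuse field reciprocity: blocked-force cross-spectrum from Im(D).
S_ff = (4 * E_cav / (np.pi * omega * n_modes)) * D.imag

# Ensemble mean response cross-spectrum of the in-cavity system.
Dinv = np.linalg.inv(D)
S_qq = Dinv @ S_ff @ Dinv.conj().T
mean_sq_response = np.real(np.diag(S_qq))
```

The key point matches the abstract: the cavity never has to be meshed or solved; only the free-space radiation properties of the system (`D`) and the cavity's gross statistics (`E_cav`, `n_modes`) enter.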

Relevance: 20.00%

Abstract:

Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
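The core statistical move, adding shape modes as confounds in the per-vertex model, can be sketched with a small regression. All data below are synthetic and constructed so that the response is driven purely by shape-dependent misregistration: without the shape confound the covariate shows a spurious effect, with it the effect vanishes.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100   # hypothetical cohort size (the case study uses 268 femurs)

shape_score = rng.normal(size=n)    # dominant mode of shape variation
# A covariate of interest that happens to correlate with shape.
covariate = 0.8 * shape_score + rng.normal(scale=0.5, size=n)
# Vertex value driven only by shape-dependent misregistration.
y = 2.0 * shape_score + rng.normal(scale=0.1, size=n)

# Per-vertex linear model, without and with shape as a confound.
X_naive = np.column_stack([np.ones(n), covariate])
X_shape = np.column_stack([np.ones(n), covariate, shape_score])
beta_naive = np.linalg.lstsq(X_naive, y, rcond=None)[0]
beta_shape = np.linalg.lstsq(X_shape, y, rcond=None)[0]
```

Here `beta_naive[1]` reports a strong apparent covariate effect that `beta_shape[1]` correctly shrinks toward zero; the converse case from the abstract, a true effect masked by systematic misregistration, follows the same mechanics with the roles reversed.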