151 results for statistical discrimination
Abstract:
Amplitude demodulation is an ill-posed problem, and so it is natural to treat it from a Bayesian viewpoint, inferring the most likely carrier and envelope under probabilistic constraints. One such treatment is Probabilistic Amplitude Demodulation (PAD), which, whilst computationally more intensive than traditional approaches, offers several advantages. Here we provide methods for estimating the uncertainty in the PAD-derived envelopes and carriers, and for learning free parameters such as the time-scale of the envelope. We show how the probabilistic approach can naturally handle noisy and missing data. Finally, we indicate how to extend the model to signals which contain multiple modulators and carriers.
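The core idea can be illustrated with a minimal MAP sketch, assuming a multiplicative model y_t = a_t c_t with a log-Gaussian smoothness prior on the envelope and a unit-variance Gaussian carrier. This is a toy stand-in, not the authors' PAD implementation; the smoothness weight `lam` plays the role of the envelope time-scale.

```python
# Toy MAP amplitude demodulation (illustrative sketch only, not the authors'
# PAD code). Model: y_t = a_t * c_t, a_t = exp(x_t), c_t ~ N(0, 1), with a
# Gaussian smoothness prior on x whose weight `lam` sets the time-scale.
import numpy as np
from scipy.optimize import minimize

def pad_map(y, lam=100.0):
    """Return the MAP envelope a and the implied carrier y / a."""
    def objective(x):
        c2 = (y * np.exp(-x)) ** 2                 # squared implied carrier
        d = np.diff(x)
        f = 0.5 * c2.sum() + x.sum() + 0.5 * lam * (d ** 2).sum()
        g = 1.0 - c2                               # gradient of the data terms
        g[:-1] -= lam * d                          # gradient of the prior
        g[1:] += lam * d
        return f, g

    x0 = np.log(np.abs(y) + 1e-6)
    res = minimize(objective, x0, jac=True, method="L-BFGS-B")
    a = np.exp(res.x)
    return a, y / a

# Usage: recover a slow envelope from a fast amplitude-modulated sinusoid.
t = np.linspace(0.0, 1.0, 500)
y = (1.0 + 0.8 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 50 * t)
envelope, carrier = pad_map(y)
```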
Abstract:
Statistical dialog systems (SDSs) are motivated by the need for a data-driven framework that reduces the cost of laboriously handcrafting complex dialog managers and that provides robustness against the errors created by speech recognizers operating in noisy environments. By including an explicit Bayesian model of uncertainty and by optimizing the policy via a reward-driven process, partially observable Markov decision processes (POMDPs) provide such a framework. However, exact model representation and optimization is computationally intractable. Hence, the practical application of POMDP-based systems requires efficient algorithms and carefully constructed approximations. This review article provides an overview of the current state of the art in the development of POMDP-based spoken dialog systems. © 1963-2012 IEEE.
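The computational heart of such a system is the belief-state update performed after every user turn. A minimal sketch of the exact update follows; real systems use factored or approximate representations precisely because this exact form is intractable for realistic state spaces, and the toy matrices here are hypothetical.

```python
# Exact POMDP belief update: b'(s') ∝ O(o | s', a) * sum_s T(s' | s, a) * b(s).
import numpy as np

def belief_update(b, a, o, T, O):
    """b: belief over states; T[a][s, s']: transition matrix for action a;
    O[a][s', o]: observation probabilities. Returns the updated belief."""
    b_new = O[a][:, o] * (b @ T[a])   # predict with T, then correct with O
    return b_new / b_new.sum()        # renormalise

# Usage on a toy 3-state dialog with one action and two observations.
T = {0: np.full((3, 3), 1.0 / 3.0)}
O = {0: np.array([[0.8, 0.2],
                  [0.5, 0.5],
                  [0.1, 0.9]])}
b = np.array([0.6, 0.3, 0.1])
b = belief_update(b, a=0, o=1, T=T, O=O)
```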
Abstract:
We report an empirical study of n-gram posterior probability confidence measures for statistical machine translation (SMT). We first describe an efficient and practical algorithm for rapidly computing n-gram posterior probabilities from large translation word lattices. These probabilities are shown to be a good predictor of whether or not the n-gram is found in human reference translations, motivating their use as a confidence measure for SMT. Comprehensive n-gram precision and word coverage measurements are presented for a variety of different language pairs, domains and conditions. We analyze the effect on reference precision of using single or multiple references, and compare the precision of posteriors computed from k-best lists to those computed over the full evidence space of the lattice. We also demonstrate improved confidence by combining multiple lattices in a multi-source translation framework. © 2012 The Author(s).
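As a rough illustration of the quantity involved, the sketch below estimates n-gram posteriors from a k-best list, the simpler evidence space the paper compares against; the lattice algorithm described in the paper is considerably more efficient, and the hypothesis scores here are hypothetical log-probabilities.

```python
# Sketch: n-gram posterior probabilities from a k-best translation list.
# The posterior of an n-gram is the total normalised probability mass of
# the hypotheses containing it (counted once per hypothesis).
import math
from collections import defaultdict

def ngram_posteriors(kbest, n=2):
    """kbest: list of (tokens, log_score) pairs. Returns {ngram: posterior}."""
    z = max(score for _, score in kbest)           # for numerical stability
    weights = [math.exp(score - z) for _, score in kbest]
    total = sum(weights)
    post = defaultdict(float)
    for (toks, _), w in zip(kbest, weights):
        grams = {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}
        for g in grams:
            post[g] += w / total
    return dict(post)

# Usage with toy hypotheses and hypothetical scores.
kbest = [("the cat sat".split(), -1.0),
         ("the cat sits".split(), -1.5),
         ("a cat sat".split(), -2.0)]
print(ngram_posteriors(kbest, n=2))
```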
Abstract:
When used correctly, Statistical Energy Analysis (SEA) can provide good predictions of high-frequency vibration levels in built-up structures. Unfortunately, the assumptions that underlie SEA break down as the frequency of excitation is reduced, and the method does not yield accurate predictions at "medium" frequencies (and neither does the Finite Element Method, which is limited to low frequencies). A basic problem is that parts of the system have a short wavelength of deformation and meet the requirements of SEA, while other parts of the system do not; this is often referred to as the "mid-frequency" problem, and there is a broad class of mid-frequency vibration problems that are of great concern to industry. In this paper, a coupled deterministic-statistical approach referred to as the Hybrid Method (Shorter & Langley, 2004) is briefly described, and some results that demonstrate how the method overcomes the aforementioned difficulties are presented.
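For context, classical SEA rests on a linear power balance between subsystem energies. Below is a minimal sketch in standard textbook loss-factor notation; the numbers are hypothetical, and this is the SEA half of the picture, not the Hybrid Method itself.

```python
# Two-subsystem SEA power balance:
#   P_i = omega * (eta_i * E_i + sum_j (eta_ij * E_i - eta_ji * E_j)).
# Solving the linear system gives the subsystem vibrational energies E.
import numpy as np

omega = 2.0 * np.pi * 1000.0       # analysis frequency, rad/s
eta = np.array([0.01, 0.02])       # damping loss factors (hypothetical)
eta_c = np.array([[0.0, 0.005],    # coupling loss factors eta_ij (hypothetical)
                  [0.003, 0.0]])
P = np.array([1.0, 0.0])           # input powers, W: only subsystem 1 driven

# SEA matrix: A[i, i] = eta_i + sum_j eta_ij, A[i, j] = -eta_ji (j != i).
A = np.diag(eta + eta_c.sum(axis=1)) - eta_c.T
E = np.linalg.solve(omega * A, P)  # subsystem energies, J
print(E)
```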
Abstract:
This paper presents a statistical approach to the electromagnetic analysis of a system that lies within a reverberant cavity that has random or uncertain properties. The need to solve Maxwell's equations within the cavity is avoided by employing a relation known as the diffuse field reciprocity principle, which leads directly to the ensemble mean squared response of the system; all that is required is the impedance matrix of the system associated with radiation into infinite space. The general theoretical approach is presented, and the analysis is then applied to a five-cable bundle in a reverberation room. © 2013 EMC Europe Foundation.
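The key relation, as it is commonly written in the hybrid FE/SEA literature (the notation here is ours and may differ from the paper's):

```latex
% Diffuse field reciprocity: the cross-spectral density of the blocked
% reverberant forces at the connection degrees of freedom follows from the
% direct-field (radiation) dynamic stiffness matrix D_dir alone, with E the
% reverberant field energy, n its modal density and omega the frequency.
S_{ff}^{\mathrm{rev}} \;=\; \frac{4E}{\pi\,\omega\, n}\,
    \operatorname{Im}\!\left\{ D_{\mathrm{dir}} \right\}
```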
Abstract:
Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
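A minimal sketch of the shape-as-confound idea, assuming per-vertex data already mapped to the template and shape represented by PCA mode scores; the variable names and dimensions are ours, purely illustrative.

```python
# Per-vertex GLM with the cohort's dominant shape-mode scores as confounds:
# effects that are really systematic misregistration load onto the shape
# columns instead of being attributed to the covariate of interest.
import numpy as np

rng = np.random.default_rng(0)
n_subj, n_vert, n_modes = 40, 500, 3

Y = rng.normal(size=(n_subj, n_vert))         # data mapped onto the template
covariate = rng.normal(size=n_subj)           # covariate of interest, e.g. age
shape = rng.normal(size=(n_subj, n_modes))    # dominant shape-mode scores (PCA)

X = np.column_stack([np.ones(n_subj), covariate, shape])   # design matrix
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

resid = Y - X @ beta
dof = n_subj - X.shape[1]
sigma2 = (resid ** 2).sum(axis=0) / dof
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_covariate = beta[1] / se                    # per-vertex t-statistic
```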
Abstract:
This paper is concerned with the development of efficient algorithms for propagating parametric uncertainty within the context of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) approach to the analysis of complex vibro-acoustic systems. This approach models the system as a combination of SEA subsystems and FE components; it is assumed that the FE components have fully deterministic properties, while the SEA subsystems have a high degree of randomness. The method has recently been generalised by allowing the FE components to possess parametric uncertainty, leading to two ensembles of uncertainty: a non-parametric one (SEA subsystems) and a parametric one (FE components). The SEA subsystem ensemble is dealt with analytically, while the effect of the additional FE component ensemble can be handled by Monte Carlo simulation. However, this approach can be computationally intensive when applied to complex engineering systems having many uncertain parameters. Two different strategies are proposed: (i) the combination of the hybrid FE/SEA method with the First Order Reliability Method, which allows the probability of the non-parametric ensemble average of a response variable exceeding a barrier to be calculated, and (ii) the combination of the hybrid FE/SEA method with Laplace's method, which allows the evaluation of the probability of a response variable exceeding a limit value. The proposed approaches are illustrated using two built-up plate systems with uncertain properties, and the results are validated against direct integration and Monte Carlo simulations of the FE and hybrid FE/SEA models. © 2013 Elsevier Ltd.
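To make strategy (i) concrete, here is a generic FORM sketch, assuming a limit state already transformed to standard normal space; the limit-state function below is a hypothetical stand-in for "response exceeds a barrier", not the FE/SEA model itself.

```python
# First Order Reliability Method (FORM): find the design point u* that
# minimises ||u|| on the limit-state surface g(u) = 0 in standard normal
# space; then P_f is approximated by Phi(-beta) with beta = ||u*||.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    # Hypothetical limit state: "failure" (barrier exceeded) when g(u) <= 0.
    return 3.0 - u[0] - 0.5 * u[1] ** 2

res = minimize(lambda u: u @ u,                 # squared distance to the origin
               x0=np.array([0.1, 0.1]),
               constraints={"type": "eq", "fun": g})
beta = np.linalg.norm(res.x)
print("beta =", beta, " P_f ~", norm.cdf(-beta))
```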
Abstract:
A novel launch scheme is proposed for multimode-fiber (MMF) links. Enhanced performance in 10 Gb/s MMF links using electronic equalization is demonstrated by statistical analysis of installed-base fiber and an experimental investigation. © 2007 Optical Society of America.
Abstract:
Rigorous statistical analysis is applied for the first time to identify optimal launch conditions and carrier frequencies for SCM transmission over worst-case MMF. The feasibility of multichannel schemes for 10 Gb/s over 300 m is demonstrated. © 2005 Optical Society of America.