989 results for Statistical methodologies


Relevance: 20.00%

Publisher:

Abstract:

In the Climate Change Act of 2008 the UK Government pledged to reduce carbon emissions by 80% by 2050. As one step towards this, regulations are being introduced requiring all new buildings to be ‘zero carbon’ by 2019. These are defined as buildings which emit net zero carbon during their operational lifetime. However, in order to meet the 80% target it is necessary to reduce the carbon emitted during the whole life-cycle of buildings, including that emitted during the processes of construction. These elements make up the ‘embodied carbon’ of the building. While there are no regulations yet in place to restrict embodied carbon, a number of different approaches have been taken. There are several existing databases of embodied carbon and embodied energy. Most provide data for the material extraction and manufacturing only, the ‘cradle to factory gate’ phase. In addition to the databases, various software tools have been developed to calculate embodied energy and carbon of individual buildings. A third source of data comes from the research literature, in which individual life cycle analyses of buildings are reported. This paper provides a comprehensive review, comparing and assessing data sources, boundaries and methodologies. The paper concludes that the wide variations in these aspects produce incomparable results. It highlights the areas where existing data is reliable, and where new data and more precise methods are needed. This comprehensive review will guide the future development of a consistent and transparent database and software tool to calculate the embodied energy and carbon of buildings.
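The ‘cradle to factory gate’ bookkeeping described above can be sketched as a simple tally over a bill of materials. The carbon factors and quantities below are placeholder assumptions for illustration only, not values from any of the databases the review assesses:

```python
# Minimal cradle-to-gate embodied carbon tally for a building's bill of
# materials: quantity (kg) times an embodied carbon factor (kgCO2e/kg),
# summed over materials. All numbers here are illustrative assumptions.

def embodied_carbon(bill_of_materials, carbon_factors):
    """Total embodied carbon in kgCO2e."""
    return sum(qty * carbon_factors[mat]
               for mat, qty in bill_of_materials.items())

# Hypothetical factors and quantities (assumptions, not database values).
factors = {"concrete": 0.10, "steel": 1.40, "timber": 0.30}  # kgCO2e/kg
bom = {"concrete": 200_000, "steel": 15_000, "timber": 5_000}  # kg

total = embodied_carbon(bom, factors)
print(f"Cradle-to-gate embodied carbon: {total / 1000:.1f} tCO2e")
```

The review's point is that the factor table is exactly where databases diverge: different system boundaries and methodologies give different factors, making such totals incomparable across sources.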

Relevance: 20.00%

Publisher:

Abstract:

Amplitude demodulation is an ill-posed problem and so it is natural to treat it from a Bayesian viewpoint, inferring the most likely carrier and envelope under probabilistic constraints. One such treatment is Probabilistic Amplitude Demodulation (PAD), which, whilst computationally more intensive than traditional approaches, offers several advantages. Here we provide methods for estimating the uncertainty in the PAD-derived envelopes and carriers, and for learning free-parameters like the time-scale of the envelope. We show how the probabilistic approach can naturally handle noisy and missing data. Finally, we indicate how to extend the model to signals which contain multiple modulators and carriers.
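The ill-posedness mentioned above is easy to exhibit: any positive envelope paired with carrier c(t) = y(t)/a(t) reconstructs the signal exactly. The sketch below uses a classical smoothed-magnitude demodulator as a baseline, not the PAD algorithm itself, to show both the exact factorisation and one conventional way of resolving it:

```python
import numpy as np

# Amplitude demodulation is ill-posed: for ANY positive envelope a(t),
# the carrier c(t) = y(t)/a(t) gives an exact factorisation y = a*c.
# A classical (non-probabilistic) baseline picks a(t) as a smoothed
# signal magnitude. This is a sketch of the problem PAD regularises,
# not the PAD algorithm from the abstract.

t = np.linspace(0, 1, 2000)
true_env = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow modulator
carrier = np.sin(2 * np.pi * 100 * t)              # fast carrier
y = true_env * carrier

# Baseline envelope: moving average of |y|, rescaled by pi/2 to undo
# the mean of |sin|.
win = 101
env_hat = np.convolve(np.abs(y), np.ones(win) / win,
                      mode="same") * (np.pi / 2)

# Any positive envelope yields an exact reconstruction of y.
c_hat = y / np.maximum(env_hat, 1e-9)
assert np.allclose(env_hat * c_hat, y)
```

The smoothing window length plays the role of the envelope time-scale that PAD instead treats as a free parameter to be learned.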

Relevance: 20.00%

Publisher:

Abstract:

Flows throughout different zones of turbines have been investigated using large eddy simulation (LES) and hybrid Reynolds-averaged Navier–Stokes-LES (RANS-LES) methods and contrasted with RANS modeling, which is more typically used in the design environment. The studied cases include low and high-pressure turbine cascades, real surface roughness effects, internal cooling ducts, trailing edge cut-backs, and labyrinth and rim seals. Evidence is presented that shows that LES and hybrid RANS-LES produces higher quality data than RANS/URANS for a wide range of flows. The higher level of physics that is resolved allows for greater flow physics insight, which is valuable for improving designs and refining lower order models. Turbine zones are categorized by flow type to assist in choosing the appropriate eddy resolving method and to estimate the computational cost.

Relevance: 20.00%

Publisher:

Abstract:

Statistical dialog systems (SDSs) are motivated by the need for a data-driven framework that reduces the cost of laboriously handcrafting complex dialog managers and that provides robustness against the errors created by speech recognizers operating in noisy environments. By including an explicit Bayesian model of uncertainty and by optimizing the policy via a reward-driven process, partially observable Markov decision processes (POMDPs) provide such a framework. However, exact model representation and optimization is computationally intractable. Hence, the practical application of POMDP-based systems requires efficient algorithms and carefully constructed approximations. This review article provides an overview of the current state of the art in the development of POMDP-based spoken dialog systems. © 1963-2012 IEEE.
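The explicit Bayesian model of uncertainty mentioned above is maintained as a belief state over hidden dialog states. A minimal sketch of that belief update, with a toy two-goal model and made-up transition and observation probabilities:

```python
import numpy as np

# Belief update at the heart of a POMDP dialog manager: the system
# never observes the user's true goal, only a noisy ASR observation,
# and maintains a distribution ("belief") over hidden states. The
# model sizes and probabilities below are toy assumptions.

def belief_update(b, a, o, T, Z):
    """b'(s') ∝ Z[a][s'][o] * sum_s T[a][s][s'] * b(s)."""
    predicted = b @ T[a]              # predict next-state distribution
    unnorm = Z[a][:, o] * predicted   # weight by observation likelihood
    return unnorm / unnorm.sum()      # renormalise

# Two hidden user goals, one "confirm" action, two observations.
T = {"confirm": np.array([[1.0, 0.0],   # goals persist during the turn
                          [0.0, 1.0]])}
Z = {"confirm": np.array([[0.8, 0.2],   # P(obs | goal): noisy ASR
                          [0.3, 0.7]])}

b = np.array([0.5, 0.5])                  # uniform prior over goals
b = belief_update(b, "confirm", 0, T, Z)  # evidence favouring goal 0
print(b)
```

Real systems face the intractability noted in the abstract because the state space (user goals × dialog histories) is enormous, which is what the approximations in the reviewed literature address.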

Relevance: 20.00%

Publisher:

Abstract:

We report an empirical study of n-gram posterior probability confidence measures for statistical machine translation (SMT). We first describe an efficient and practical algorithm for rapidly computing n-gram posterior probabilities from large translation word lattices. These probabilities are shown to be a good predictor of whether or not the n-gram is found in human reference translations, motivating their use as a confidence measure for SMT. Comprehensive n-gram precision and word coverage measurements are presented for a variety of different language pairs, domains and conditions. We analyze the effect on reference precision of using single or multiple references, and compare the precision of posteriors computed from k-best lists to those computed over the full evidence space of the lattice. We also demonstrate improved confidence by combining multiple lattices in a multi-source translation framework. © 2012 The Author(s).
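The n-gram posterior idea can be sketched over a k-best list (the paper's algorithm operates on full lattices, which is more involved): the posterior of an n-gram is the total posterior mass of the hypotheses containing it. Hypotheses and weights below are made up for illustration:

```python
from collections import defaultdict

# n-gram posterior probabilities from a k-best translation list:
# each n-gram accumulates the posterior mass of every hypothesis in
# which it appears (counted once per hypothesis).

def ngram_posteriors(kbest, n):
    """kbest: list of (tokens, posterior) pairs; returns {ngram: prob}."""
    post = defaultdict(float)
    for tokens, p in kbest:
        seen = {tuple(tokens[i:i + n])
                for i in range(len(tokens) - n + 1)}
        for g in seen:
            post[g] += p
    return dict(post)

kbest = [("the cat sat".split(), 0.6),
         ("the cat sits".split(), 0.3),
         ("a cat sat".split(), 0.1)]
probs = ngram_posteriors(kbest, 2)
print(probs[("the", "cat")])   # mass of the two hypotheses containing it
```

The paper's comparison of k-best-derived posteriors against lattice-derived ones asks how much this truncation of the evidence space costs in predictive quality.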

Relevance: 20.00%

Publisher:

Abstract:

When used correctly, Statistical Energy Analysis (SEA) can provide good predictions of high frequency vibration levels in built-up structures. Unfortunately, the assumptions that underlie SEA break down as the frequency of excitation is reduced, and the method does not yield accurate predictions at "medium" frequencies (and neither does the Finite Element Method, which is limited to low frequencies). A basic problem is that parts of the system have a short wavelength of deformation and meet the requirements of SEA, while other parts of the system do not - this is often referred to as the "mid-frequency" problem, and there is a broad class of mid-frequency vibration problems that are of great concern to industry. In this paper, a coupled deterministic-statistical approach referred to as the Hybrid Method (Shorter & Langley, 2004) is briefly described, and some results that demonstrate how the method overcomes the aforementioned difficulties are presented.
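For context, the SEA prediction the abstract refers to reduces, in the simplest two-subsystem case, to a linear power balance solved for subsystem energies. Loss factors and input powers below are illustrative assumptions:

```python
import numpy as np

# Two-subsystem SEA power balance solved for time-averaged energies:
#   P_i = omega * (eta_i * E_i + eta_ij * E_i - eta_ji * E_j).
# Loss factors and input powers are illustrative, not from the paper.

omega = 2 * np.pi * 1000.0      # rad/s, analysis band centred at 1 kHz
eta1, eta2 = 0.01, 0.02         # damping loss factors
eta12, eta21 = 0.005, 0.003     # coupling loss factors
P = np.array([1.0, 0.0])        # 1 W injected into subsystem 1 only

A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12,        eta2 + eta21]])
E = np.linalg.solve(A, P)       # subsystem energies (J)
print(E)
```

The "mid-frequency" problem arises when such energy-balance assumptions hold in some subsystems but not in others, which is exactly the regime the Hybrid Method couples a deterministic FE model to this statistical description.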

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a statistical approach to the electromagnetic analysis of a system that lies within a reverberant cavity that has random or uncertain properties. The need to solve Maxwell's equations within the cavity is avoided by employing a relation known as the diffuse field reciprocity principle, which leads directly to the ensemble mean squared response of the system; all that is required is the impedance matrix of the system associated with radiation into infinite space. The general theoretical approach is presented, and the analysis is then applied to a five-cable bundle in a reverberation room © 2013 EMC Europe Foundation.
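The bookkeeping behind the diffuse field reciprocity principle can be sketched numerically: the mean-square blocked force exerted by the reverberant field on the system follows from the reverberant energy, the modal density, and the imaginary part of the system's direct-field (radiation) impedance. The numbers below are placeholders, not cavity data:

```python
import math

# Diffuse field reciprocity: the spectral density of the blocked force
# that a reverberant field exerts on a system is
#   S_ff = (4 * E / (pi * omega * n)) * Im(D_dir),
# needing only the radiation impedance into infinite space, never a
# solution of Maxwell's equations inside the cavity. Values below are
# illustrative placeholders.

def blocked_force_spectrum(E_rev, omega, modal_density, im_D_dir):
    """Ensemble-mean blocked force spectral density."""
    return 4.0 * E_rev * im_D_dir / (math.pi * omega * modal_density)

S_ff = blocked_force_spectrum(
    E_rev=1e-3,                 # J, reverberant field energy
    omega=2 * math.pi * 1e9,    # rad/s, 1 GHz
    modal_density=1e-6,         # cavity modes per rad/s
    im_D_dir=0.5)               # Im of direct-field impedance term
print(f"{S_ff:.3e}")
```

This is the step that lets the paper drive the five-cable bundle model with only its free-space radiation impedance matrix.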

Relevance: 20.00%

Publisher:

Abstract:

Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
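Allowing for shape as a confound amounts, at each vertex, to adding the shape-mode scores as extra columns of the design matrix in the usual GLM. A synthetic sketch of the idea, with a misregistration effect that tracks a single shape mode:

```python
import numpy as np

# Sketch of including a shape mode as a confound in a vertex-wise GLM:
#   y = beta * covariate + gamma * shape_mode + noise.
# The shape-mode column absorbs variance caused by systematic
# misregistration, protecting the covariate estimate. Data are
# synthetic; the real pipeline fits this at every vertex of the surface.

rng = np.random.default_rng(1)
n = 268                             # cohort size, as in the abstract
covariate = rng.normal(size=n)      # covariate of interest, e.g. age
shape_mode = rng.normal(size=n)     # dominant shape-mode score

# Misregistration that varies with shape injects a spurious signal.
y = 0.5 * covariate + 0.8 * shape_mode + 0.1 * rng.normal(size=n)

X = np.column_stack([np.ones(n), covariate, shape_mode])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta[1])   # estimate of the true covariate effect (true value 0.5)
```

If `shape_mode` were correlated with the covariate and omitted from `X`, the misregistration signal would bias `beta[1]`, which is the false-inference mechanism the abstract warns about.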

Relevance: 20.00%

Publisher:

Abstract:

This paper is concerned with the development of efficient algorithms for propagating parametric uncertainty within the context of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) approach to the analysis of complex vibro-acoustic systems. This approach models the system as a combination of SEA subsystems and FE components; it is assumed that the FE components have fully deterministic properties, while the SEA subsystems have a high degree of randomness. The method has been recently generalised by allowing the FE components to possess parametric uncertainty, leading to two ensembles of uncertainty: a non-parametric one (SEA subsystems) and a parametric one (FE components). The SEA subsystems ensemble is dealt with analytically, while the effect of the additional FE components ensemble can be dealt with by Monte Carlo Simulations. However, this approach can be computationally intensive when applied to complex engineering systems having many uncertain parameters. Two different strategies are proposed: (i) the combination of the hybrid FE/SEA method with the First Order Reliability Method which allows the probability of the non-parametric ensemble average of a response variable exceeding a barrier to be calculated and (ii) the combination of the hybrid FE/SEA method with Laplace's method which allows the evaluation of the probability of a response variable exceeding a limit value. The proposed approaches are illustrated using two built-up plate systems with uncertain properties and the results are validated against direct integration, Monte Carlo simulations of the FE and of the hybrid FE/SEA models. © 2013 Elsevier Ltd.
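The brute-force baseline that FORM and Laplace's method aim to avoid is direct Monte Carlo estimation of the exceedance probability. A minimal sketch with a toy quadratic response and Gaussian parametric uncertainty (both assumptions, not the paper's plate systems):

```python
import numpy as np

# Monte Carlo estimate of P(response > barrier): sample the uncertain
# parameters, evaluate the response, count exceedances. This is the
# computationally intensive route the abstract's FORM and Laplace
# approximations are designed to replace. Response function and
# distributions are illustrative.

rng = np.random.default_rng(42)

def response(x):
    """Toy response of a system with two uncertain parameters."""
    return x[..., 0] ** 2 + 0.5 * x[..., 1]

barrier = 4.0
samples = rng.normal(size=(200_000, 2))   # parametric uncertainty
p_exceed = np.mean(response(samples) > barrier)
print(f"P(response > {barrier}) ~ {p_exceed:.4f}")
```

The cost issue is apparent from the sample count: resolving small failure probabilities needs many samples, and in the hybrid FE/SEA setting each sample is a full model evaluation, which motivates the two approximate strategies the paper proposes.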