61 results for Probability Metrics


Relevance: 20.00%

Publisher:

Abstract:

This paper explores the current state-of-the-art in performance indicators and use of probabilistic approaches used in climate change impact studies. It presents a critical review of recent publications in this field, focussing on (1) metrics for energy use for heating and cooling, emissions, overheating and high-level performance aspects, and (2) uptake of uncertainty and risk analysis. This is followed by a case study, which is used to explore some of the contextual issues around the broader uptake of climate change impact studies in practice. The work concludes that probabilistic predictions of the impact of climate change are feasible, but only based on strict and explicitly stated assumptions. © 2011 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

Climate change is expected to have significant impact on the future thermal performance of buildings. Building simulation and sensitivity analysis can be employed to predict these impacts, guiding interventions to adapt buildings to future conditions. This article explores the use of simulation to study the impact of climate change on a theoretical office building in the UK, employing a probabilistic approach. The work studies (1) appropriate performance metrics and underlying modelling assumptions, (2) sensitivity of computational results to identify key design parameters and (3) the impact of zonal resolution. The conclusions highlight the importance of assumptions in the field of electricity conversion factors, proper management of internal heat gains, and the need to use an appropriately detailed zonal resolution. © 2010 Elsevier B.V. All rights reserved.
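As a rough illustration of the probabilistic sensitivity analysis described in this abstract (not the paper's actual building model), the sketch below propagates uncertainty in a few hypothetical design parameters through a toy annual-energy surrogate and ranks them by standardized regression coefficients. The parameter names, ranges and the surrogate itself are assumptions made purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Hypothetical design/uncertainty parameters (illustrative ranges only).
u_value   = rng.uniform(0.15, 0.45, N)   # facade U-value, W/m^2K
gains     = rng.uniform(10.0, 30.0, N)   # internal heat gains, W/m^2
conv_fact = rng.uniform(0.4, 0.6, N)     # electricity conversion factor, kgCO2/kWh

# Toy surrogate for annual carbon emissions (NOT a building simulation).
heating   = 80.0 * u_value - 0.8 * gains   # kWh/m^2, reduced by internal gains
cooling   = 0.02 * gains ** 2              # kWh/m^2, increased by internal gains
emissions = conv_fact * (np.clip(heating, 0, None) + cooling)

# Standardized regression coefficients as a simple global sensitivity measure.
X  = np.column_stack([u_value, gains, conv_fact])
Xs = (X - X.mean(0)) / X.std(0)
ys = (emissions - emissions.mean()) / emissions.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(["U-value", "internal gains", "conversion factor"], src):
    print(f"{name:>18s}: SRC = {s:+.2f}")
```

The ranking of the standardized coefficients indicates which assumed parameters dominate the spread of the toy output, which is the kind of screening the abstract refers to when identifying key design parameters.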

Relevance: 20.00%

Publisher:

Abstract:

The vibro-acoustic response of built-up structures, consisting of stiff components with low modal density and flexible components with high modal density, is sensitive to small imperfections in the flexible components. In this paper, the uncertainty of the response is considered by modeling the low modal density master system as deterministic and the high modal density subsystems in a nonparametric stochastic way, i.e., carrying a diffuse wave field, and by subsequently computing the response probability density function. The master system's mean squared response amplitude follows a singular noncentral complex Wishart distribution conditional on the subsystem energies. For a single degree of freedom, this is equivalent to a chi-square or an exponential distribution, depending on the loading conditions. The subsystem energies follow approximately a chi-square distribution when their relative variance is smaller than unity. The results are validated by application to plate structures, and good agreement with Monte Carlo simulations is found. © 2012 Acoustical Society of America.
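As a minimal numerical illustration of the single-degree-of-freedom case mentioned in this abstract, the sketch below models the diffuse-field response at one master degree of freedom as a zero-mean circularly symmetric complex Gaussian variable and checks that its squared magnitude is exponentially distributed (a rescaled chi-square with two degrees of freedom). The chosen mean energy is an arbitrary assumption for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000
mean_energy = 2.5  # assumed conditional mean squared response amplitude

# Diffuse-field response at one DOF: zero-mean circular complex Gaussian.
y = (rng.normal(scale=np.sqrt(mean_energy / 2), size=n)
     + 1j * rng.normal(scale=np.sqrt(mean_energy / 2), size=n))
energy = np.abs(y) ** 2

# Squared magnitude should follow an exponential law with the same mean.
ks = stats.kstest(energy, stats.expon(scale=mean_energy).cdf)
print(f"sample mean = {energy.mean():.3f} (target {mean_energy})")
print(f"KS statistic vs exponential fit: {ks.statistic:.4f}")
```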

Relevance: 20.00%

Publisher:

Abstract:

Atmospheric effects can significantly degrade the reliability of free-space optical communications. One such effect is scintillation, which is caused by atmospheric turbulence and refers to random fluctuations in the irradiance and phase of the received laser beam. In this paper we investigate the use of multiple lasers and multiple apertures to mitigate scintillation. Since the scintillation process is slow, we adopt a block-fading channel model and study the outage probability under the assumptions of orthogonal pulse-position modulation and non-ideal photodetection. Assuming perfect receiver channel state information (CSI), we derive the signal-to-noise ratio (SNR) exponents for the cases when the scintillation is lognormal, exponential and gamma-gamma distributed, which cover a wide range of atmospheric turbulence conditions. Furthermore, when CSI is also available at the transmitter, we illustrate that very large gains in SNR are possible (in some cases larger than 15 dB) by adapting the transmitted power. Under a long-term power constraint, we outline fundamental design criteria via a simple expression that relates the required number of lasers and apertures for a given code rate and number of codeword blocks to completely remove system outages. Copyright © 2009 IEEE.
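As a rough illustration of the outage quantity discussed in this abstract (not the paper's PPM/photodetection analysis), the sketch below Monte Carlo estimates an outage probability for a single laser/aperture link under gamma-gamma scintillation, where the irradiance is the product of two independent unit-mean Gamma variates. The turbulence parameters, the outage threshold, and the assumption that instantaneous SNR scales with the squared irradiance are all illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Gamma-gamma irradiance: product of two independent unit-mean Gamma variates.
alpha, beta = 4.0, 2.0                            # illustrative turbulence parameters
I = rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)

# Illustrative outage rule: instantaneous SNR taken proportional to I^2
# (an assumption typical of intensity modulation with thermal-noise-limited detection).
snr0_db, threshold_db = 20.0, 10.0
snr = 10 ** (snr0_db / 10) * I ** 2
p_out = np.mean(10 * np.log10(snr) < threshold_db)
print(f"estimated outage probability: {p_out:.3e}")
```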

Relevance: 20.00%

Publisher:

Abstract:

We report an empirical study of n-gram posterior probability confidence measures for statistical machine translation (SMT). We first describe an efficient and practical algorithm for rapidly computing n-gram posterior probabilities from large translation word lattices. These probabilities are shown to be a good predictor of whether or not the n-gram is found in human reference translations, motivating their use as a confidence measure for SMT. Comprehensive n-gram precision and word coverage measurements are presented for a variety of different language pairs, domains and conditions. We analyze the effect on reference precision of using single or multiple references, and compare the precision of posteriors computed from k-best lists to those computed over the full evidence space of the lattice. We also demonstrate improved confidence by combining multiple lattices in a multi-source translation framework. © 2012 The Author(s).
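To make the notion of an n-gram posterior concrete, the sketch below computes posteriors from a small k-best list rather than a full word lattice (the efficient lattice algorithm is the paper's contribution and is not reproduced here): each n-gram's posterior is the sum of the normalized hypothesis posteriors of the translations that contain it. The toy hypotheses and log scores are invented for illustration.

```python
import math
from collections import defaultdict

def ngram_posteriors(kbest, n=2):
    """kbest: list of (hypothesis_tokens, log_score). Returns {n-gram: posterior}."""
    # Normalize hypothesis scores into posteriors (softmax over log scores).
    logs = [score for _, score in kbest]
    m = max(logs)
    weights = [math.exp(score - m) for score in logs]
    z = sum(weights)
    post = defaultdict(float)
    for (tokens, _), w in zip(kbest, weights):
        # Count each n-gram at most once per hypothesis.
        seen = {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}
        for gram in seen:
            post[gram] += w / z
    return dict(post)

# Toy 3-best list with invented log scores.
kbest = [
    ("the cat sat on the mat".split(), -1.0),
    ("the cat sits on the mat".split(), -1.7),
    ("a cat sat on a mat".split(), -2.5),
]
for gram, p in sorted(ngram_posteriors(kbest).items(), key=lambda kv: -kv[1]):
    print(f"{' '.join(gram):<12s} {p:.3f}")
```

Bigrams shared by all hypotheses get posterior 1.0, while bigrams unique to a low-scoring hypothesis get a small posterior; this is the signal the abstract describes as predictive of whether an n-gram appears in the reference translations.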

Relevance: 20.00%

Publisher:

Abstract:

A location- and scale-invariant predictor is constructed which exhibits good probability matching for extreme predictions outside the span of data drawn from a variety of (stationary) general distributions. It is constructed via the three-parameter (μ, σ, ξ) Generalized Pareto Distribution (GPD). The predictor is designed to provide exact probability matching for the GPD in both the extreme heavy-tailed limit and the extreme bounded-tail limit, whilst giving a good approximation to probability matching at all intermediate values of the tail parameter ξ. The predictor is valid even for small sample sizes N, as small as N = 3. The main purpose of this paper is to present the somewhat lengthy derivations, which draw heavily on the theory of hypergeometric functions, particularly the Lauricella functions. Whilst the construction is inspired by the Bayesian approach to the prediction problem, it considers the case of vague prior information about both parameters and model, and all derivations are undertaken using sampling theory.
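For reference, the distribution function of the three-parameter GPD referred to in this abstract is the standard one (not specific to this paper's predictor construction):

```latex
F(x;\mu,\sigma,\xi) \;=\;
\begin{cases}
1 - \bigl(1 + \xi\,(x-\mu)/\sigma\bigr)^{-1/\xi}, & \xi \neq 0,\\[0.5ex]
1 - \exp\bigl(-(x-\mu)/\sigma\bigr), & \xi = 0,
\end{cases}
\qquad \sigma > 0,
```

with support x ≥ μ for ξ ≥ 0 and μ ≤ x ≤ μ − σ/ξ for ξ < 0. Positive ξ gives a heavy (power-law) upper tail and negative ξ a bounded upper tail, which are the two limits in which the predictor is designed to match probability exactly.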