990 results for uncertainty evaluation


Relevance:

30.00%

Publisher:

Abstract:

The major objective of this research project was to investigate how Iowa fly ashes influenced the chemical durability of portland cement-based materials. Chemical durability has become an area of uncertainty because of the winter application of deicer salts (rock salts) that contain a significant amount of sulfate impurities. The sulfate durability testing program consisted of monitoring portland cement-fly ash paste, mortar and concrete test specimens that had been subjected to aqueous solutions containing various concentrations of salts (both sulfate and chloride). The paste and mortar specimens were monitored for length as a function of time. The concrete test specimens were monitored for length, relative dynamic modulus and mass as a function of time. The alkali-aggregate reactivity testing program consisted of monitoring the expansion of ASTM C311 mortar bar specimens that contained three different aggregates (Pyrex glass, Oreapolis and standard Ottawa sand). The results of the sulfate durability study indicated that the paste and concrete test specimens tended to exhibit surface spalling but only very slow expansive tendencies, suggesting that the permeability of the test specimens was controlling the rate of deterioration. Concrete specimens are still being monitored because the majority of the test specimens have expanded less than 0.05%, which makes it difficult to estimate the service life of the concrete test specimens or to quantify the performance of the different fly ashes that were used in the study. The results of the mortar bar studies indicated that the chemical composition of the various fly ashes did have an influence on their sulfate resistance. Typically, Clinton and Louisa fly ashes performed the best, followed by the Ottumwa, Neal 4 and then Council Bluffs fly ashes. Council Bluffs fly ash was the only fly ash that consistently reduced the sulfate resistance of the many different mortar specimens that were investigated during this study. None of the trends that were observed in the mortar bar studies have yet become evident in the concrete phase of this project. The results of the alkali-aggregate study indicated that the Oreapolis aggregate is not very sensitive to alkali attack. Two of the fly ashes, Council Bluffs and Ottumwa, tended to increase the expansion of mortar bar specimens that contained the Oreapolis aggregate. However, it was not clear whether the additional expansion was due to the alkali content, the periclase content or the cristobalite content of the fly ash, since all three of these factors have been found to influence the test results.
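
A minimal sketch of the expansion arithmetic behind these measurements, with hypothetical comparator readings and the 0.05% figure cited above used as a flag:

```python
def percent_expansion(initial_length_mm: float, current_length_mm: float) -> float:
    """Length change relative to the initial comparator reading, in percent."""
    return (current_length_mm - initial_length_mm) / initial_length_mm * 100.0

# Hypothetical mortar-bar comparator readings (mm) taken over time
readings = [250.000, 250.020, 250.055, 250.110]
initial = readings[0]
for i, length in enumerate(readings):
    exp = percent_expansion(initial, length)
    flag = "exceeds 0.05% criterion" if exp > 0.05 else "below 0.05%"
    print(f"reading {i}: {exp:.3f}% ({flag})")
```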

Relevance:

30.00%

Publisher:

Abstract:

At a time when disciplined inference and decision making under uncertainty are common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and, we think, rightly so, but they run against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense such objections misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory, and with the view according to which this theory is normative.
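
As a concrete illustration of the coherence the theory prescribes, a minimal sketch of Bayesian updating in odds form, the standard way evidential value is expressed in forensic science; all numbers are hypothetical:

```python
# Coherent updating: posterior odds = likelihood ratio * prior odds (Bayes' theorem in odds form)
def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    return likelihood_ratio * prior_odds

def odds_to_probability(odds: float) -> float:
    return odds / (1.0 + odds)

prior = 1 / 1000          # hypothetical prior odds on the proposition at issue
lr = 10_000               # hypothetical likelihood ratio for the forensic findings
post = posterior_odds(prior, lr)
print(f"posterior odds {post:.1f}, probability {odds_to_probability(post):.3f}")
```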

Relevance:

30.00%

Publisher:

Abstract:

The concept of energy gap(s) is useful for understanding the consequences of a small daily, weekly, or monthly positive energy balance and the inconspicuous shift in weight gain that ultimately leads to overweight and obesity. The energy gap is a dynamic concept: an initial positive energy gap incurred via an increase in energy intake (or a decrease in physical activity) is not constant, may fade out with time if the initial conditions are maintained, and depends on the 'efficiency' with which the readjustment of the energy imbalance occurs over time. The metabolic response to an energy imbalance and the magnitude of the energy gap(s) can be estimated by at least two methods: i) assessment by longitudinal overfeeding studies, imposing (by design) an initial positive energy imbalance; ii) retrospective assessment based on epidemiological surveys, whereby the accumulated endogenous energy storage per unit of time is calculated from the change in body weight and body composition. In order to illustrate the difficulty of accurately assessing an energy gap, we have used, as an illustrative example, a recent epidemiological study which tracked changes in total energy intake (estimated by gross food availability) and body weight over three decades in the US, combined with total energy expenditure predicted from body weight using doubly labelled water data. At the population level, the study attempted to attribute the energy gap entirely to increased food intake. Based on an estimate of the change in energy intake judged to be more reliable (i.e. from the same study population), together with calculations of simple energetic indices, our analysis suggests that conclusions about the fundamental causes of obesity development in a population (excess intake, low physical activity, or both) are clouded by a high level of uncertainty.
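
The retrospective assessment described in method (ii) reduces to simple arithmetic; a minimal sketch, assuming an aggregate energy density of roughly 30 MJ per kg of weight gained (an illustrative figure that depends on body composition):

```python
# Energy gap from weight change: accumulated energy storage per unit of time
ENERGY_DENSITY_MJ_PER_KG = 30.0  # illustrative aggregate; fat is ~39 MJ/kg, lean tissue far less

def daily_energy_gap_kj(weight_gain_kg: float, years: float) -> float:
    """Average daily positive energy balance implied by a given weight gain."""
    total_mj = weight_gain_kg * ENERGY_DENSITY_MJ_PER_KG
    return total_mj * 1000.0 / (years * 365.25)

# e.g. a 10 kg gain over 30 years implies only a small sustained daily imbalance
print(f"{daily_energy_gap_kj(10, 30):.0f} kJ/day")  # ~27 kJ/day, i.e. a few kcal/day
```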

Relevance:

30.00%

Publisher:

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
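
A minimal sketch of this restricted use of the method: input uncertainties propagated through a measurement model, with the sample mean and standard deviation as the estimators of interest. The model and numbers below are illustrative placeholders, not those of the 103Pd calibration:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 10**6                                  # number of Monte Carlo trials

# Illustrative measurement model: activity A = C / (eps * m)
counts = rng.normal(12000.0, 110.0, N)     # count rate with Gaussian uncertainty
eff = rng.normal(0.92, 0.01, N)            # detection efficiency
mass = rng.normal(0.1000, 0.0002, N)       # aliquot mass (g)

activity = counts / (eff * mass)

# Monte Carlo estimators of the expectation and standard deviation of the measurand
print(f"A = {activity.mean():.0f} +/- {activity.std(ddof=1):.0f} (k=1)")
```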

Relevance:

30.00%

Publisher:

Abstract:

Insomnia, common in the geriatric population, is typically treated with benzodiazepines, which can increase the risk of falls. Cognitive behavioural therapy (CBT) is a non-pharmacological intervention with equivalent efficacy and no side effects. In this thesis, the cost of benzodiazepines (BZD) is compared with that of CBT for the treatment of insomnia in an elderly population, with and without consideration of the additional cost generated by BZD-related falls. A decision tree model was constructed and applied from the health care system perspective over a one-year period. The probabilities of falls, emergency department visits, and hospitalization with and without hip fracture, together with cost and utility data, were obtained from a literature review. Cost-consequence, cost-utility and potential-savings analyses were performed. Probabilistic and deterministic sensitivity analyses accounted for uncertainty in the data estimates. BZD treatment costs 30% less than CBT when fall-related costs are not considered (CAD $231 vs CAD $335 per person per year). When fall-related costs are included, CBT is the less expensive option (an absolute saving of CAD $177 per person per year; CAD $1,357 with BZD vs CAD $1,180 with CBT). CBT dominated BZD use, with a mean saving of CAD $25,743 per QALY, because fewer falls were observed with CBT. The potential-savings analyses suggest that if CBT replaced BZD treatment, the direct annual savings for the treatment of insomnia would be CAD $441 million, with cumulative savings of CAD $112 billion over a five-year period. According to the sensitivity analysis, BZD treatment costs on average CAD $1,305 (SD $598; range: $245-$2,625) per person per year, compared with an average of CAD $1,129 (SD $514; range: $342-$2,526) per person per year for CBT. Current policies that reimburse pharmacological rather than non-pharmacological treatments for insomnia in older adults do not yield cost savings and are not ethically advisable from a health care system perspective.
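
A minimal sketch of the expected-cost roll-up performed by such a decision tree; the probabilities and costs below are hypothetical placeholders, not the thesis's inputs:

```python
# Expected annual cost per patient = treatment cost + P(fall) * expected cost of fall consequences
def expected_cost(treatment_cost: float, p_fall: float,
                  branches: list[tuple[float, float]]) -> float:
    """branches: (probability given a fall, cost) pairs, e.g. ED visit, hospitalisation."""
    fall_cost = sum(p * c for p, c in branches)
    return treatment_cost + p_fall * fall_cost

# Hypothetical inputs (CAD): same fall-consequence branches, different fall probabilities
consequences = [(0.40, 800.0), (0.10, 22000.0)]
bzd = expected_cost(231.0, 0.30, consequences)
cbt = expected_cost(335.0, 0.15, consequences)
print(f"BZD: ${bzd:.0f}/year, CBT: ${cbt:.0f}/year")
```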

Relevance:

30.00%

Publisher:

Abstract:

Water vapour continuum absorption is an important component of molecular absorption of radiation in the atmosphere. However, the uncertainty in the value of the continuum absorption can at present reach 100% in different spectral regions, leading to errors in flux calculations of up to 3-5 W/m2 in the global mean. This work uses line-by-line calculations to identify the best spectral intervals for experimental verification of the CKD water vapour continuum models in the currently least studied near-infrared spectral region. Possible sources of error in continuum retrieval taken into account in the simulation include the sensitivity of laboratory spectrometers and uncertainties in the spectral line parameters in HITRAN-2004 and the Schwenke-Partridge database. It is shown that a number of micro-windows in the near-IR can be used at present for laboratory detection of the water vapour continuum with an estimated accuracy of between 5 and 30%.
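
In micro-window measurements the continuum is typically retrieved as the residual between the measured optical depth and the computed line absorption; a minimal Beer-Lambert sketch (all coefficients hypothetical) that also shows why line-parameter uncertainties propagate directly into the retrieved continuum:

```python
import math

# Beer-Lambert: transmittance T = exp(-(k_lines + k_continuum) * L)
def retrieved_continuum(T_measured: float, k_lines: float, path_m: float) -> float:
    """Continuum coefficient left over after subtracting computed line absorption."""
    tau_total = -math.log(T_measured)
    return tau_total / path_m - k_lines   # any error in k_lines maps 1:1 into the result

# Hypothetical micro-window values: 30 m path, weak line absorption
k_cont = retrieved_continuum(T_measured=0.85, k_lines=1.0e-3, path_m=30.0)
print(f"retrieved continuum coefficient: {k_cont:.2e} m^-1")
```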

Relevance:

30.00%

Publisher:

Abstract:

Improvements in the resolution of satellite imagery have enabled the extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine an outline of a 2006 flood event on the River Dee, North Wales, UK, from a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation is recorded as the LiDAR DEM elevation at each point. Approximating the water surface as a plane running between the gauged upstream and downstream water elevations, the water surface elevations at points along the flood extent are compared to their 'expected' values. The pattern of errors between the two is roughly normally distributed; when plotted against coordinates, however, there is obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this t-test are selected on the basis of their error, and the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error. The identification of significant error (RMSE = 0.8 m) between approximate expected elevations and actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
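
A minimal sketch of the error statistics and the t-test comparison described above, with synthetic points standing in for the River Dee shoreline data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)

# Synthetic stand-ins for shoreline points: observed (LiDAR DEM) vs modelled elevations (m)
observed = rng.normal(10.0, 0.8, 120)     # spread chosen to echo the ~0.8 m RMSE reported
modelled = rng.normal(10.0, 0.3, 120)

errors = observed - modelled
rmse = float(np.sqrt(np.mean(errors**2)))

# One-sample t-test on the differences: is the mean error significantly non-zero?
t_stat, p_value = stats.ttest_1samp(errors, 0.0)
print(f"RMSE = {rmse:.2f} m, t = {t_stat:.2f}, p = {p_value:.3f}")
```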

Relevance:

30.00%

Publisher:

Abstract:

In this paper, Bayesian decision procedures previously proposed for dose-escalation studies in healthy volunteers are reviewed and evaluated. Modifications are made to the expression of the prior distribution in order to make the procedure simpler to implement, and a more relevant criterion for optimality is introduced. The results of an extensive simulation exercise to establish the properties of the procedure and to aid choice between designs are summarized, and the way in which readers can use simulation to choose a design for their own trials is described. The influence of the value of the within-subject correlation on the procedure is investigated, and the use of a simple prior to reflect uncertainty about the correlation is explored.
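
The simulation exercise referred to can be sketched generically: run many virtual trials under an assumed true dose-response and tabulate which dose each design selects. The escalation rule below is a deliberately simplistic placeholder, not the paper's Bayesian procedure:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
doses = np.array([10, 25, 50, 100, 200])   # hypothetical dose panel
true_mean = 0.02 * doses                   # assumed true dose-response
limit = 3.0                                # safety limit on the response

def simulate_trial(n_per_dose: int = 6, sigma: float = 0.5):
    """Escalate while the observed mean response stays under the limit; return last safe dose."""
    last_safe = None
    for dose, mu in zip(doses, true_mean):
        responses = rng.normal(mu, sigma, n_per_dose)
        if responses.mean() >= limit:
            break
        last_safe = dose
    return last_safe

# Operating characteristics: how often each dose is selected over many virtual trials
results = [simulate_trial() for _ in range(10_000)]
for d in doses:
    print(f"dose {d}: selected in {np.mean([r == d for r in results]):.1%} of trials")
```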

Relevance:

30.00%

Publisher:

Abstract:

Given the significance of forecasting in real estate investment decisions, this paper investigates forecast uncertainty and disagreement in real estate market forecasts, comparing the performance of real estate forecasters with that of non-real estate forecasters. Using the Investment Property Forum (IPF) quarterly survey of UK independent real estate forecasters and a similar survey of macro-economic and capital market forecasters, the forecasts are compared with actual performance to assess a number of forecasting issues in the UK over 1999-2004, including forecast error, bias and consensus. The results suggest that the forecasts of both groups are biased, less volatile than actual market returns, and inefficient, in that forecast errors tend to persist. The strongest finding is that forecasters display the characteristics associated with a consensus, indicating herding.
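
A minimal sketch of the survey metrics involved, bias, accuracy and cross-sectional disagreement, computed on synthetic panel data:

```python
import numpy as np

rng = np.random.default_rng(seed=11)

actual = rng.normal(8.0, 4.0, 24)               # synthetic quarterly returns (%), 1999-2004
# Synthetic panel: 20 forecasters herding around a smooth consensus path
consensus = 7.0 + 0.1 * np.arange(24)
forecasts = consensus + rng.normal(0.0, 0.8, (20, 24))

errors = forecasts - actual                      # per forecaster, per quarter
bias = errors.mean()                             # mean error across the panel
mae = np.abs(errors).mean()
disagreement = forecasts.std(axis=0).mean()      # average cross-sectional dispersion

# Low dispersion relative to forecast error is the signature of consensus/herding
print(f"bias = {bias:+.2f}pp, MAE = {mae:.2f}pp, disagreement = {disagreement:.2f}pp")
```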

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs of development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
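
A minimal sketch of the experiment's logic: the same residual valuation run as an aggregated and as a disaggregated Monte Carlo model, comparing output variance; all inputs are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(seed=5)
N = 100_000

# Aggregated model: residual = GDV - total cost, each a single uncertain input
gdv = rng.normal(10.0e6, 0.8e6, N)
total_cost = rng.normal(7.0e6, 0.6e6, N)
residual_agg = gdv - total_cost

# Disaggregated model: the same cost broken into components with their own variances
build = rng.normal(5.0e6, 0.5e6, N)
fees = rng.normal(0.8e6, 0.1e6, N)
finance = rng.normal(1.2e6, 0.3e6, N)
residual_dis = gdv - (build + fees + finance)

# Similar output distributions from models of different complexity illustrate equifinality
for name, r in [("aggregated", residual_agg), ("disaggregated", residual_dis)]:
    print(f"{name}: mean {r.mean()/1e6:.2f}m, sd {r.std()/1e6:.2f}m")
```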

Relevance:

30.00%

Publisher:

Abstract:

The evaluation of the quality and usefulness of climate modeling systems is dependent upon an assessment of both the limited predictability of the climate system and the uncertainties stemming from model formulation. In this study a methodology is presented that is suited to assess the performance of a regional climate model (RCM), based on its ability to represent the natural interannual variability on monthly and seasonal timescales. The methodology involves carrying out multiyear ensemble simulations (to assess the predictability bounds within which the model can be evaluated against observations) and multiyear sensitivity experiments using different model formulations (to assess the model uncertainty). As an example application, experiments driven by assimilated lateral boundary conditions and sea surface temperatures from the ECMWF Reanalysis Project (ERA-15, 1979–1993) were conducted. While the ensemble experiment demonstrates that the predictability of the regional climate varies strongly between different seasons and regions, being weakest during the summer and over continental regions, important sensitivities of the modeling system to parameterization choices are uncovered. In particular, compensating mechanisms related to the long-term representation of the water cycle are revealed, in which summer dry and hot conditions at the surface, resulting from insufficient evaporation, can persist despite insufficient net solar radiation (a result of unrealistic cloud-radiative feedbacks).
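
A minimal sketch of the ensemble-based predictability check described, with synthetic seasonal anomalies standing in for model output and observations (the five-member ensemble size is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n_members, n_years = 5, 15                      # e.g. a 5-member ensemble over 1979-1993

# Synthetic seasonal anomalies: members share the boundary-forced signal,
# and differ through internal variability
forced = rng.normal(0.0, 1.0, n_years)
ensemble = forced + rng.normal(0.0, 0.6, (n_members, n_years))
observed = forced + rng.normal(0.0, 0.6, n_years)

lo, hi = ensemble.min(axis=0), ensemble.max(axis=0)
inside = np.mean((observed >= lo) & (observed <= hi))

# Crude reproducibility measure: fraction of total variance common to the members
signal = ensemble.mean(axis=0).var()
total = ensemble.var()
print(f"observations within ensemble spread: {inside:.0%}; signal fraction: {signal/total:.2f}")
```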

Relevance:

30.00%

Publisher:

Abstract:

There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.

Relevance:

30.00%

Publisher:

Abstract:

In order to evaluate the future potential benefits of emission regulation on regional air quality while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCMs) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that adds to all other types of uncertainty in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry in the GCM-driven weather: a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.

Relevance:

30.00%

Publisher:

Abstract:

An urban energy and water balance model is presented which uses a small number of commonly measured meteorological variables and information about the surface cover. Rates of evaporation-interception are calculated for a single layer with multiple surface types (paved, buildings, coniferous trees and/or shrubs, deciduous trees and/or shrubs, irrigated grass, non-irrigated grass and water). Below each surface type, except water, there is a single soil layer. At each time step the moisture state of each surface is calculated. Horizontal water movements at the surface and in the soil are incorporated. Particular attention is given to the surface conductance used to model evaporation and to its parameters. The model is tested against direct flux measurements carried out over a number of years in Vancouver, Canada and Los Angeles, USA. At all measurement sites the model simulates the net all-wave radiation and the turbulent sensible and latent heat fluxes well (RMSE = 25–47 W m−2, 30–64 W m−2 and 20–56 W m−2, respectively). The model reproduces the diurnal cycle of the turbulent fluxes, but typically underestimates the latent heat flux and overestimates the sensible heat flux in the daytime. The model tracks measured surface wetness and simulates the variations in soil moisture content. It is able to respond correctly to short-term events as well as to annual changes. The largest uncertainty relates to the determination of the surface conductance. The model has the potential to be used for multiple applications: for example, to predict the effects of regulation on urban water use, to evaluate landscaping and planning scenarios, or to assess climate mitigation strategies.
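
Evaporation in models of this kind is commonly computed from a Penman-Monteith-type expression in which the surface conductance appears explicitly; a generic sketch of that relationship (not the exact formulation of the model presented):

```python
# Penman-Monteith latent heat flux (W m-2)
def latent_heat_flux(avail_energy: float, vpd_pa: float, ra: float, gs: float,
                     rho: float = 1.2, cp: float = 1004.0,
                     delta: float = 145.0, gamma: float = 66.0) -> float:
    """avail_energy: net all-wave radiation minus storage (W m-2); vpd_pa: vapour
    pressure deficit (Pa); ra: aerodynamic resistance (s m-1); gs: surface
    conductance (m s-1); delta, gamma: psychrometric terms (Pa K-1) near 20 C."""
    rs = 1.0 / gs                                  # surface resistance (s m-1)
    num = delta * avail_energy + rho * cp * vpd_pa / ra
    den = delta + gamma * (1.0 + rs / ra)
    return num / den

# Hypothetical midday values for a well-watered grass surface
qe = latent_heat_flux(avail_energy=400.0, vpd_pa=1200.0, ra=60.0, gs=0.01)
print(f"QE = {qe:.0f} W m-2")
```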