925 results for Data uncertainty
Abstract:
Researchers in ecology commonly use multivariate analyses (e.g. redundancy analysis, canonical correspondence analysis, Mantel correlation, multivariate analysis of variance) to interpret patterns in biological data and relate these patterns to environmental predictors. There has been, however, little recognition of the errors associated with biological data and of the influence these may have on predictions derived from ecological hypotheses. We present a permutational method that assesses the effects of taxonomic uncertainty on the multivariate analyses typically used with ecological data. The procedure is based on iterative randomizations that randomly re-assign unidentified species at each site to any of the species found in the remaining sites. After each re-assignment of species identities, the multivariate method in question is re-run and a parameter of interest is calculated. One can thus estimate a range of plausible values for the parameter of interest under different scenarios of re-assigned species identities. We demonstrate the use of our approach in the calculation of two parameters with an example involving tropical tree species from western Amazonia: (1) the Mantel correlation between compositional similarity and environmental distance between pairs of sites, and (2) the variance explained by environmental predictors in redundancy analysis (RDA). We also investigated the effects of increasing taxonomic uncertainty (i.e. the number of unidentified species), and of the taxonomic resolution at which morphospecies are determined (genus resolution, family resolution, or fully undetermined species), on the uncertainty range of these parameters. To do so, we performed simulations on a tree dataset from southern Mexico, randomly selecting a portion of the species in the dataset and classifying them as unidentified at each level of decreasing taxonomic resolution.
An analysis of covariance showed that both taxonomic uncertainty and taxonomic resolution significantly influence the uncertainty range of the resulting parameters. Increasing taxonomic uncertainty widens the uncertainty range of the parameters estimated by both the Mantel test and RDA. The effects of taxonomic resolution, however, are not as evident. The method presented in this study improves on traditional approaches to studying compositional change in ecological communities by accounting for some of the uncertainty inherent in biological data. We hope that this approach can be used routinely to estimate any parameter of interest obtained from compositional data tables when faced with taxonomic uncertainty.
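The randomization loop described in this abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the community matrix, the `unknown_cols` argument marking unidentified morphospecies, and the Bray-Curtis/Mantel helpers are all chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def bray_curtis(X):
    """Pairwise Bray-Curtis dissimilarity between the rows of a
    site-by-species abundance matrix."""
    n = X.shape[0]
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            denom = (X[i] + X[j]).sum()
            D[i, j] = np.abs(X[i] - X[j]).sum() / denom if denom else 0.0
    return D

def mantel_r(D1, D2):
    """Mantel statistic: Pearson correlation between the upper
    triangles of two distance matrices."""
    iu = np.triu_indices_from(D1, k=1)
    return np.corrcoef(D1[iu], D2[iu])[0, 1]

def reassignment_interval(X, unknown_cols, env_dist, n_iter=199):
    """At each iteration, merge every unidentified (morphospecies)
    column into a randomly chosen identified column, recompute the
    compositional distances and the Mantel statistic, and return the
    range of plausible values obtained."""
    known_cols = [c for c in range(X.shape[1]) if c not in unknown_cols]
    stats = []
    for _ in range(n_iter):
        Xr = X[:, known_cols].astype(float).copy()
        for c in unknown_cols:
            target = rng.integers(len(known_cols))  # random identity
            Xr[:, target] += X[:, c]
        stats.append(mantel_r(bray_curtis(Xr), env_dist))
    return min(stats), max(stats)
```

The same loop works for any statistic computed from the community matrix (e.g. the variance explained in an RDA): only the function called inside the loop changes.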
Abstract:
T activity in LiPb mock-up material irradiated in Frascati: measurement and MCNP results
Abstract:
PART I: Cross-section uncertainties under different neutron spectra. PART II: Processing uncertainty libraries
Abstract:
This work aims to present the main differences in nuclear data uncertainties among three nuclear data libraries: EAF-2007, EAF-2010 and SCALE-6.0, under different neutron spectra: LWR, ADS and DEMO (fusion).
Abstract:
Accurate prediction of spent nuclear fuel content is essential for its safe and optimized transportation, storage and management. This isotopic evolution can be predicted with powerful codes and methodologies throughout both irradiation and cooling periods. However, to establish a realistic confidence level in the predicted spent fuel isotopic content, it is desirable to determine how input uncertainties affect the isotopic prediction calculations and to quantify the resulting uncertainties.
Abstract:
The influence of applying European default traffic values to the making of a noise map was evaluated in a typical environment, Palma de Mallorca. To assess these default traffic values, a first model was created and compared with measured noise levels. Subsequently, a second traffic model, improving on the input data used for the first one, was created and validated against the observed deviations. Different methodologies for collecting higher-quality model input data were also examined, by analysing the resulting reduction in the uncertainty that road traffic noise emission introduces into the noise map.
Abstract:
This work aims to present the main differences in nuclear data uncertainties among three nuclear data libraries: EAF-2007, EAF-2010 and SCALE-6.0, under different neutron spectra: LWR, ADS and DEMO (fusion). To take the neutron spectrum into account, the uncertainty data are collapsed to one group. This is a simple way to see the differences among libraries for a given application, and it also makes the effect of the neutron spectrum on different applications visible. These comparisons are presented only for the (n,fission), (n,gamma) and (n,p) reactions of the main transuranic isotopes (234,235,236,238U, 237Np, 238,239,240,241Pu, 241,242m,243Am, 242,243,244,245,246,247,248Cm, 249Bk, 249,250,251,252Cf), but general comparisons among the libraries, taking all included isotopes into account, are also presented. Target accuracies for nuclear data uncertainties have been proposed in other works; here, these targets are compared with the uncertainties in the above libraries. The main results of these comparisons are that EAF-2010 has reduced its uncertainties relative to EAF-2007 for many isotopes for (n,gamma) and (n,fission), but not for (n,p); SCALE-6.0 gives lower uncertainties for (n,fission) reactions in ADS and PWR applications, but higher uncertainties for (n,p) reactions in all applications. For the (n,gamma) reaction, when SCALE-6.0 and EAF-2010 are compared, the number of isotopes with higher uncertainties is quite similar to the number with lower uncertainties. When the effect of the neutron spectra is analysed, the ADS neutron spectrum yields the highest uncertainties for the (n,gamma) and (n,fission) reactions in all libraries.
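The one-group collapse used above can be sketched in a few lines. This is a simplified illustration, not the libraries' processing chain: it ignores correlations between energy groups (a full collapse would use the covariance matrix), and the numbers in the usage line are purely illustrative.

```python
import numpy as np

def collapse_one_group(sigma_g, rel_unc_g, flux_g):
    """Collapse a multigroup cross section and its relative
    uncertainty to one group with flux (spectrum) weighting.
    Group-to-group correlations are ignored in this sketch."""
    sigma_g = np.asarray(sigma_g, float)
    rel_unc_g = np.asarray(rel_unc_g, float)
    flux_g = np.asarray(flux_g, float)
    w = flux_g / flux_g.sum()              # normalised spectrum weights
    sigma_c = np.sum(w * sigma_g)          # one-group cross section
    # absolute variance of the collapsed value, groups uncorrelated
    var_c = np.sum((w * sigma_g * rel_unc_g) ** 2)
    return sigma_c, np.sqrt(var_c) / sigma_c
```

Running the same reaction through an LWR-, ADS- or DEMO-shaped `flux_g` is what makes the collapsed uncertainties application-dependent, as the comparison in the abstract exploits.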
Abstract:
Cartography of uncertainties
Abstract:
A validation of the burn-up simulation system EVOLCODE 2.0 is presented here, based on experimental measurements of U and Pu isotopes and of some fission fragment production ratios after a burn-up of around 30 GWd/tU in a Pressurized Light Water Reactor (PWR). This work provides an in-depth analysis of the validation results, including the possible sources of the uncertainties. An uncertainty analysis based on the sensitivity methodology has also been performed, providing the uncertainties in the isotopic content propagated from the cross-section uncertainties. An improvement of the classical Sensitivity/Uncertainty (S/U) model has been developed to take into account the implicit dependence on the neutron flux normalization, that is, the effect of the constant power of the reactor. The improved S/U methodology, usually neglected in this kind of study, has proven to be an important contribution to explaining some simulation-experiment discrepancies. In general, the cross-section uncertainties for the most relevant actinides are an important contributor to the simulation uncertainties, of the same order of magnitude as, and sometimes even larger than, the experimental uncertainties and the experiment-simulation differences. Additionally, some hints for improving the JEFF-3.1.1 fission yield library and for correcting some errata in the experimental data are presented.
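The classical S/U propagation this abstract builds on is the "sandwich rule". Below is a minimal sketch under stated assumptions; the paper's improvement (the implicit constant-power flux-normalization term) is deliberately not included, and the numbers in the test are illustrative.

```python
import numpy as np

def relative_uncertainty(sensitivities, cov, value):
    """Classical sandwich rule: the variance of a computed isotope
    concentration N is s^T C s, where s holds the sensitivity
    coefficients dN/dsigma and C is the cross-section covariance
    matrix. Returns the relative uncertainty sqrt(var)/N."""
    s = np.asarray(sensitivities, float)
    C = np.asarray(cov, float)
    var = s @ C @ s
    return np.sqrt(var) / value
```

The improved methodology described in the abstract would effectively modify the sensitivity vector `s` to account for the flux renormalization imposed by constant reactor power; the quadratic form itself stays the same.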
Abstract:
Contrast sensitivity improves with the area of a sine-wave grating, but why? Here we assess this phenomenon against contemporary models involving spatial summation, probability summation, uncertainty, and stochastic noise. Using a two-interval forced-choice procedure, we measured contrast sensitivity for circular patches of sine-wave gratings with various diameters that were blocked or interleaved across trials to produce low and high extrinsic uncertainty, respectively. Summation curves were steep initially, becoming shallower thereafter. For the smaller stimuli, sensitivity was slightly worse for the interleaved design than for the blocked design. Neither area nor blocking affected the slope of the psychometric function. We derived model predictions for noisy mechanisms under extrinsic uncertainty that was either low or high. The contrast transducer was either linear (c^1.0) or nonlinear (c^2.0), and pooling was either linear or a MAX operation. There was either no intrinsic uncertainty, or it was fixed or proportional to stimulus size. Of these 10 canonical models, only the nonlinear transducer with linear pooling (the noisy energy model) described the main features of the data for both experimental designs. We also show how a cross-correlator can be modified to fit our results and provide a contemporary presentation of the relation between summation and the slope of the psychometric function.
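Why the winning model predicts shallower-than-linear summation can be shown in a few lines. This is a textbook-level sketch of the fourth-root law implied by a c^2 transducer with linear pooling and late noise, not the authors' fitted model; the mechanism count `n` and constant `k` are illustrative.

```python
import numpy as np

def threshold(n, k=1.0):
    """Noisy energy model sketch: each of n mechanisms covering the
    stimulus transduces contrast as c**2, responses are pooled
    linearly, and pooled noise grows as sqrt(n). Equating pooled
    signal (n * c**2) to pooled noise (sqrt(n)) gives a detection
    threshold falling as n**(-1/4)."""
    return k * np.asarray(n, float) ** -0.25

# slope of the summation curve on log-log axes: -1/4 everywhere
areas = np.array([1.0, 4.0, 16.0])
slopes = np.diff(np.log10(threshold(areas))) / np.diff(np.log10(areas))
```

A linear (c^1.0) transducer would instead give a slope of -1/2, steeper than the shallow segments observed, which is one reason the linear-pooling, linear-transducer variants fail.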
Abstract:
The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty in citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations, from which we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed; its structure was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving real-world applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work needed to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value citizen weather data brings to longstanding professional observing networks.
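The "learns and updates" step can be illustrated with the simplest conjugate case. This is a minimal sketch, assuming a normal prior on a station's calibration bias and normally distributed residuals against an interpolated reference; the actual system described above is considerably richer (radiation-dependent bias, Bayesian regression over neighbours).

```python
def update_bias(prior_mean, prior_var, residual, noise_var):
    """One conjugate normal-normal update of a station's calibration
    bias from a single residual (station reading minus interpolated
    reference temperature). Precisions add, and the posterior mean is
    the precision-weighted average of prior and observation."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mean = post_var * (prior_mean / prior_var + residual / noise_var)
    return post_mean, post_var
```

Feeding successive residuals through this update shrinks the posterior variance, which could also serve as the basis of the per-observation uncertainty estimate the abstract mentions.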
Abstract:
Groundwater systems of different densities are often modeled mathematically to understand and predict environmental behavior such as seawater intrusion or submarine groundwater discharge. Additional data collection may be justified if it cost-effectively helps reduce the uncertainty of a model's prediction. Collecting salinity as well as temperature data could help reduce predictive uncertainty in a variable-density model. However, before numerical models can be created, rigorous testing of the modeling code needs to be completed. This research documents the benchmark testing of a new modeling code, SEAWAT Version 4. The benchmark problems include various combinations of density-dependent flow resulting from variations in concentration and temperature. The verified code, SEAWAT, was then applied to two different hydrological analyses to explore the capacity of a variable-density model to guide data collection. The first analysis tested a linear method to guide data collection by quantifying the contribution of different data types and locations toward reducing predictive uncertainty in a nonlinear variable-density flow and transport model. The relative contributions of temperature and concentration measurements, at different locations within a simulated carbonate platform, for predicting movement of the saltwater interface were assessed. Results from the method showed that concentration data had greater worth than temperature data in reducing predictive uncertainty in this case. Results also indicated that a linear method could be used to quantify data worth in a nonlinear model. The second hydrological analysis used a model to identify the transient response of the salinity, temperature, age, and amount of submarine groundwater discharge to changes in tidal ocean stage, seasonal temperature variations, and different types of geology. The model was compared with multiple kinds of data to (1) calibrate and verify the model, and (2) explore the potential for the model to guide the collection of data using techniques such as electromagnetic resistivity, thermal imagery, and seepage meters. Results indicated that the model can give insight into submarine groundwater discharge and can be used to guide data collection.
Abstract:
A study was conducted to investigate current practices in decision-making under risk and uncertainty for infrastructure project investments. It was found that many countries, including the UK, France and Germany as well as Australia, use scenarios to investigate the effects of risk and uncertainty on project investments. Different alternative scenarios are mostly considered during the engineering economic cost-benefit analysis stage; for instance, the World Bank requires an analysis of risks in all project appraisals. Risk in economic evaluation needs to be addressed by calculating the sensitivity of the rate of return for a number of events. Risks and uncertainties in project developments arise from various sources of error, including data, model and forecasting errors. It was found that the most influential factors affecting risk and uncertainty result from forecasting errors; data errors and model errors have trivial effects. Many analysts argue that scenarios do not forecast what will happen but indicate only what can happen given the alternatives. It was suggested that probability distributions of the end-products of project appraisal, such as cost-benefit ratios that take forecasting errors into account, are feasible decision tools for economic evaluation. Political, social and environmental as well as economic and other related risk issues have been addressed and included in decision-making frameworks, such as multi-criteria decision-making frameworks, but no suggestion has been made on how to incorporate risk into the investment decision-making process.
Abstract:
Information uncertainty which is inherent in many real world applications brings more complexity to the visualisation problem. Despite the increasing number of research papers found in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualisation of information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: information uncertainty modelling, visualisation techniques, management of information uncertainty modelling, propagation and visualisation, and the uptake of uncertainty visualisation in application domains.
Abstract:
Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated that all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits, $948 per catheter, but there was a 62% probability of error in this conclusion. Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good-quality evidence in this area.
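The two decision quantities in this abstract can be illustrated with a small sketch: net monetary benefit values QALY gains at a willingness-to-pay threshold plus any cost saving, and the "probability of error" is the share of probabilistic draws in which the option with the best expected NMB is not actually best. The threshold and the draws below are illustrative, not the study's values.

```python
import numpy as np

def net_monetary_benefit(qaly_gain, cost_saving, wtp=50_000.0):
    """Incremental NMB: QALYs gained valued at a willingness-to-pay
    threshold (illustrative here), plus any cost saving relative to
    the comparator."""
    return wtp * qaly_gain + cost_saving

def probability_of_error(nmb_draws):
    """Share of simulation draws (rows) in which the alternative
    (column) with the highest mean NMB is not the best choice in
    that draw."""
    draws = np.asarray(nmb_draws, float)
    best = int(np.argmax(draws.mean(axis=0)))
    return float(np.mean(np.argmax(draws, axis=1) != best))
```

A 62% probability of error, as reported above, means the option that looks best on average still loses in most individual draws of the probabilistic sensitivity analysis, which is why the conclusion is flagged as highly uncertain.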