989 results for uncertainty estimation


Relevance: 30.00%

Publisher:

Abstract:

In many practical situations where spatial rainfall estimates are needed, rainfall occurs as a spatially intermittent phenomenon. An efficient geostatistical method for rainfall estimation in the case of intermittency has previously been published and comprises the estimation of two independent components: a binary random function for modeling the intermittency and a continuous random function that models the rainfall inside the rainy areas. The final rainfall estimates are obtained as the product of the estimates of these two random functions. However, the published approach does not contain a method for estimating uncertainties. The contribution of this paper is the presentation of the indicator maximum likelihood estimator, from which the local conditional distribution of the rainfall value at any location may be derived using an ensemble approach. From the conditional distribution, representations of uncertainty such as the estimation variance and confidence intervals can be obtained. An approximation to the variance can be calculated more simply by assuming that rainfall intensity is independent of location within the rainy area. The methodology has been validated using simulated and real rainfall data sets. The results of these case studies show good agreement between predicted uncertainties and measured errors obtained from the validation data.
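As a minimal numerical sketch of the product form and the independence-based variance approximation described above (the function and variable names are ours, and the identity treats the estimate as a Bernoulli indicator times an independent intensity, not the full geostatistical estimator):

```python
import numpy as np

def intermittent_rain_estimate(p_rain, y_mean, y_var):
    """Sketch of the product estimator for intermittent rainfall.

    p_rain : estimated probability of rain at the target location
             (from the binary indicator random function)
    y_mean : estimated rainfall inside the rainy area
             (from the continuous random function)
    y_var  : estimation variance of the continuous component
    """
    z_mean = p_rain * y_mean
    # If intensity is independent of where (and whether) it rains:
    # Var(I * Y) = p * Var(Y) + p * (1 - p) * E[Y]^2
    z_var = p_rain * y_var + p_rain * (1.0 - p_rain) * y_mean ** 2
    return z_mean, z_var

z, v = intermittent_rain_estimate(p_rain=0.6, y_mean=4.2, y_var=1.5)
print(f"estimate = {z:.2f} mm, approx. std = {np.sqrt(v):.2f} mm")
```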

Relevance: 30.00%

Publisher:

Abstract:

We introduce a modified conditional logit model that takes account of uncertainty associated with mis-reporting in revealed preference experiments estimating willingness-to-pay (WTP). Like Hausman et al. [Journal of Econometrics (1988) Vol. 87, pp. 239-269], our model captures the extent and direction of uncertainty by respondents. Using a Bayesian methodology, we apply our model to a choice modelling (CM) data set examining UK consumer preferences for non-pesticide food. We compare the results of our model with the Hausman model. WTP estimates are produced for different groups of consumers and we find that modified estimates of WTP, that take account of mis-reporting, are substantially revised downwards. We find a significant proportion of respondents mis-reporting in favour of the non-pesticide option. Finally, with this data set, Bayes factors suggest that our model is preferred to the Hausman model.
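The flavour of the mis-reporting correction can be sketched with a maximum-likelihood binary logit (the paper itself uses a Bayesian conditional logit estimated by MCMC; the symmetric flipping probability and all names below are our simplification):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, X, y):
    """Binary logit in which the observed response is the latent 'true'
    choice flipped with an estimated mis-reporting probability m."""
    beta, m_raw = params[:-1], params[-1]
    m = 1.0 / (1.0 + np.exp(-m_raw))            # mis-reporting probability in (0, 1)
    p_true = 1.0 / (1.0 + np.exp(-X @ beta))    # latent "true" choice probability
    p_obs = (1.0 - m) * p_true + m * (1.0 - p_true)
    return -np.sum(y * np.log(p_obs) + (1 - y) * np.log(1 - p_obs))

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (rng.random(500) < 1 / (1 + np.exp(-X @ np.array([1.0, -0.5])))).astype(float)
fit = minimize(neg_loglik, x0=np.zeros(3), args=(X, y))
print(fit.x)   # slope coefficients and the logit of the mis-reporting probability
```

In a WTP application, willingness-to-pay would then be recovered as a ratio of the estimated coefficients, with the mis-reporting parameter adjusting the choice probabilities that feed into that ratio.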

Relevance: 30.00%

Publisher:

Abstract:

Biologists frequently attempt to infer the character states at ancestral nodes of a phylogeny from the distribution of traits observed in contemporary organisms. Because phylogenies are normally inferences from data, it is desirable to account for the uncertainty in estimates of the tree and its branch lengths when making inferences about ancestral states or other comparative parameters. Here we present a general Bayesian approach for testing comparative hypotheses across statistically justified samples of phylogenies, focusing on the specific issue of reconstructing ancestral states. The method uses Markov chain Monte Carlo techniques for sampling phylogenetic trees and for investigating the parameters of a statistical model of trait evolution. We describe how to combine information about the uncertainty of the phylogeny with uncertainty in the estimate of the ancestral state. Our approach does not constrain the sample of trees only to those that contain the ancestral node or nodes of interest, and we show how to reconstruct ancestral states of uncertain nodes using a most-recent-common-ancestor approach. We illustrate the methods with data on ribonuclease evolution in the Artiodactyla. Software implementing the methods (BayesMultiState) is available from the authors.
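In schematic form (our notation, not the paper's), the reconstruction averages over the Markov chain Monte Carlo sample of N trees T_i and model parameters θ_i, reading off the state at the most recent common ancestor of the chosen taxon set in each sampled tree:

```latex
P\left(s_{\mathrm{MRCA}} = k \mid D\right) \;\approx\; \frac{1}{N}\sum_{i=1}^{N}
  P\left(s_{\mathrm{MRCA}(T_i)} = k \mid T_i, \theta_i, D\right)
```

so that uncertainty in the phylogeny, its branch lengths and the trait model all propagate into the reconstructed ancestral state.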

Relevance: 30.00%

Publisher:

Abstract:

Estimating snow mass at continental scales is difficult but important for understanding land-atmosphere interactions, biogeochemical cycles and Northern latitudes' hydrology. Remote sensing provides the only consistent global observations, but the uncertainty in measurements is poorly understood. Existing techniques for the remote sensing of snow mass are based on the Chang algorithm, which relates the absorption of Earth-emitted microwave radiation by a snow layer to the snow mass within the layer. The absorption also depends on other factors such as the snow grain size and density, which are assumed and fixed within the algorithm. We examine the assumptions, compare them to field measurements made at the NASA Cold Land Processes Experiment (CLPX) Colorado field site in 2002–3, and evaluate the consequences of deviation and variability for snow mass retrieval. The accuracy of the emission model used to devise the algorithm also affects retrieval accuracy, so we test it with the CLPX measurements of snow properties against SSM/I and AMSR-E satellite measurements.
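For concreteness, a Chang-type retrieval is essentially a linear function of the brightness-temperature difference between the ~19 GHz and ~37 GHz channels; the coefficient below is a commonly quoted value that bundles the assumed grain size and density, and is used here only for illustration:

```python
def chang_swe_mm(tb_19h, tb_37h, coeff=4.8):
    """Chang-type snow water equivalent retrieval (sketch).

    tb_19h, tb_37h : horizontally polarised brightness temperatures [K]
    coeff          : mm of SWE per K of spectral difference; its value
                     encodes the fixed grain-size and density assumptions.
    """
    return max(coeff * (tb_19h - tb_37h), 0.0)

print(chang_swe_mm(tb_19h=245.0, tb_37h=225.0))  # 96 mm SWE for a 20 K difference
```

Departures of the real grain size and density from the assumed values change the effective coefficient, which is one way the field-measurement comparison in this paper translates into retrieval uncertainty.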

Relevance: 30.00%

Publisher:

Abstract:

Valuation is often said to be “an art not a science”, but this relates to the techniques employed to calculate value, not to the underlying concept itself. Valuation is the process of estimating price in the market place. Yet such an estimation will be affected by uncertainties: uncertainty in the comparable information available, uncertainty in the current and future market conditions, and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into an uncertainty in the output figure, the valuation. The degree of the uncertainties will vary according to the level of market activity; the more active a market, the more credence will be given to the input information. In the UK, the Royal Institution of Chartered Surveyors (RICS) is currently considering ways in which the uncertainty of the output figure, the valuation, can be conveyed to the user of the valuation, but as yet no definitive view has been taken apart from a single Guidance Note (GN5, RICS 2003) stressing the importance of recognising uncertainty in valuation but not proffering any particular solution. One of the major problems is that valuation models (in the UK) are based upon comparable information and rely upon single inputs. They are not probability based, yet uncertainty is probability driven. In this paper, we discuss the issues underlying uncertainty in valuations and suggest a probability-based model (using Crystal Ball) to address the shortcomings of the current model.
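The kind of probability-based model we have in mind can be sketched in a few lines (here in Python rather than Crystal Ball, with invented input distributions): the single-point inputs of the conventional model are replaced by distributions, and the output becomes a distribution whose spread conveys the valuation uncertainty.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Illustrative input distributions; the figures are invented, not the paper's.
rent = rng.triangular(left=95_000, mode=100_000, right=110_000, size=n)  # market rent, GBP p.a.
ary = rng.normal(loc=0.065, scale=0.004, size=n)                         # all-risks yield

value = rent / ary                     # simple capitalisation of rent in perpetuity
lo, hi = np.percentile(value, [5, 95])
print(f"median value {np.median(value):,.0f}; 90% interval {lo:,.0f} to {hi:,.0f}")
```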

Relevance: 30.00%

Publisher:

Abstract:

We introduce an algorithm (called REDFITmc2) for spectrum estimation in the presence of timescale errors. It is based on the Lomb-Scargle periodogram for unevenly spaced time series, in combination with Welch's Overlapped Segment Averaging procedure, bootstrap bias correction and persistence estimation. The timescale errors are modelled parametrically and included in the simulations for determining (1) the upper levels of the spectrum of the red-noise AR(1) alternative and (2) the uncertainty of the frequency of a spectral peak. Application of REDFITmc2 to ice core and stalagmite records of palaeoclimate allowed a more realistic evaluation of spectral peaks than when ignoring this source of uncertainty. The results support qualitatively the intuition that stronger effects on the spectrum estimate (decreased detectability and increased frequency uncertainty) occur at higher frequencies. The surplus information brought by the REDFITmc2 algorithm is that those effects are quantified. Regarding timescale construction, not only the fixpoints, dating errors and the functional form of the age-depth model play a role; the joint distribution of all time points (serial correlation, stratigraphic order) also determines spectrum estimation.
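A stripped-down illustration of the timescale-error Monte Carlo (omitting the WOSA segmenting, bias correction and AR(1) red-noise test that REDFITmc2 also performs; the age-error model and all parameter values are placeholders):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1000, 300))                 # unevenly spaced ages
x = np.sin(2 * np.pi * t / 100) + rng.normal(0, 1, t.size)
freqs = 2 * np.pi * np.linspace(0.001, 0.05, 200)      # angular frequencies

n_sim, sigma_age = 500, 5.0                            # crude Gaussian age error
spectra = np.empty((n_sim, freqs.size))
for i in range(n_sim):
    t_pert = np.sort(t + rng.normal(0, sigma_age, t.size))   # re-sorted: stratigraphic order kept
    spectra[i] = lombscargle(t_pert, x - x.mean(), freqs)

median_spec = np.median(spectra, axis=0)
upper95 = np.percentile(spectra, 95, axis=0)   # spread of the estimate due to timescale uncertainty
```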

Relevance: 30.00%

Publisher:

Abstract:

Three wind gust estimation (WGE) methods implemented in the numerical weather prediction (NWP) model COSMO-CLM are evaluated with respect to their forecast quality using skill scores. Two methods estimate gusts locally from the mean wind speed and the turbulence state of the atmosphere, while the third considers the mixing-down of high momentum within the planetary boundary layer (WGE Brasseur). One hundred and fifty-eight windstorms from the last four decades are simulated and the results are compared with gust observations at 37 stations in Germany. Skill scores reveal that the local WGE methods show an overall better behaviour, whilst WGE Brasseur performs less well except for mountain regions. The WGE method introduced here, based on turbulent kinetic energy (TKE), permits a probabilistic interpretation using statistical characteristics of gusts at observational sites for an assessment of uncertainty. The WGE TKE formulation has the advantage of a ‘native’ interpretation of wind gusts as a result of the local appearance of TKE. The inclusion of a probabilistic WGE TKE approach in NWP models thus has several advantages over other methods, as it has the potential for an estimation of the uncertainties of gusts at observational sites.
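The flavour of a turbulence-based gust estimate can be sketched as the mean wind plus a multiple of the turbulent standard deviation derived from TKE; the isotropy assumption and the factor alpha below are illustrative and are not the COSMO-CLM coefficients:

```python
import numpy as np

def gust_tke(v_mean, tke, alpha=3.0):
    """Sketch of a TKE-based gust estimate.

    v_mean : mean (e.g. 10 m) wind speed [m/s]
    tke    : turbulent kinetic energy [m^2/s^2]
    alpha  : illustrative gust factor on the turbulent standard deviation
    """
    sigma_u = np.sqrt(2.0 * tke / 3.0)   # per-component std assuming isotropy
    return v_mean + alpha * sigma_u

print(gust_tke(v_mean=15.0, tke=4.0))    # m/s
```

Because the estimate is built from a distributional quantity, comparing the implied gust distribution with observed gust statistics at a station is what allows the probabilistic interpretation described above.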

Relevance: 30.00%

Publisher:

Abstract:

In numerical weather prediction, parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model, as the parameter values used in them cannot be measured directly and hence are often not well known, and the parameterisations themselves are also approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation (DA), such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential DA methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to pre-determined functional forms of missing physics or parameterisations that are based upon prior information. We applied the method to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. Furthermore, it is shown how the method depends on the quality of the DA results. The results indicate that this new method is a powerful tool in systematic model improvement.
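A toy version of the fitting step, with synthetic data standing in for the DA-diagnosed model error (all functional forms and values are illustrative): the error field is regressed onto pre-determined candidate forms, and the residual variance provides a consistency check on the estimated parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(size=1000)                 # model state at the diagnosed points
dudx = rng.normal(size=1000)              # its gradient
eta = 0.8 * u - 0.3 * dudx + rng.normal(0, 0.05, 1000)   # "diagnosed" model error

A = np.column_stack([u, dudx])            # candidate functional forms of missing physics
coef, *_ = np.linalg.lstsq(A, eta, rcond=None)
resid_var = np.var(eta - A @ coef)        # consistency check on the fit
print(coef, resid_var)
```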

Relevance: 30.00%

Publisher:

Abstract:

This study presents an approach to combining the uncertainties of hydrological model outputs predicted by a number of machine learning models. The machine learning based uncertainty prediction approach is very useful for estimating a hydrological model's uncertainty in a particular hydro-meteorological situation in real-time applications [1]. In this approach the hydrological model realizations from Monte Carlo simulations are used to build different machine learning uncertainty models to predict the uncertainty (quantiles of the pdf) of a deterministic output from the hydrological model. The uncertainty models are trained using antecedent precipitation and streamflows as inputs. The trained models are then employed to predict the model output uncertainty specific to the new input data. We used three machine learning models, namely artificial neural networks, model trees, and locally weighted regression, to predict output uncertainties. These three models produce similar verification results, which can be improved by merging their outputs dynamically. We propose an approach to form a committee of the three models to combine their outputs. The approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model in the Brue catchment in the UK and the Bagmati catchment in Nepal. The verification results show that the merged output is better than any individual model output. [1] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press, 2013.
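A compact sketch of the committee idea with scikit-learn stand-ins (a decision tree for the model tree and k-nearest neighbours for locally weighted regression); the data are synthetic and the inverse-error weights are our simple choice for the dynamic merge:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor

# Target: an uncertainty quantile (e.g. the 90% quantile of the Monte Carlo
# ensemble of simulated flows); inputs: antecedent precipitation and flow.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 4))                                  # synthetic predictors
q90 = np.abs(X @ np.array([0.5, 0.2, -0.1, 0.3])) + rng.normal(0, 0.1, 2000)

models = [MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
          DecisionTreeRegressor(max_depth=6, random_state=0),   # stand-in for a model tree
          KNeighborsRegressor(n_neighbors=25)]                  # stand-in for locally weighted regression

X_tr, y_tr = X[:1200], q90[:1200]
X_va, y_va = X[1200:1600], q90[1200:1600]
X_te = X[1600:]

preds_va = np.column_stack([m.fit(X_tr, y_tr).predict(X_va) for m in models])
w = 1.0 / np.mean((preds_va - y_va[:, None]) ** 2, axis=0)       # inverse-MSE committee weights
w /= w.sum()
merged = np.column_stack([m.predict(X_te) for m in models]) @ w  # merged uncertainty prediction
```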

Relevance: 30.00%

Publisher:

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through expert elicitation of quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, an index of the financial market that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
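A minimal likelihood sketch of such a mixture, with a gamma centre and a GPD for exceedances above an unknown threshold u (the paper treats u as a parameter with an expert-elicited prior and explores the posterior by MCMC; the gamma choice and all names here are illustrative):

```python
import numpy as np
from scipy import stats

def mixture_loglik(params, x):
    """Gamma body below the threshold u, GPD for exceedances above u."""
    a, scale, u, xi, sigma = params
    if min(a, scale, sigma) <= 0 or not (x.min() < u < x.max()):
        return -np.inf
    below, above = x[x <= u], x[x > u]
    ll = np.sum(stats.gamma.logpdf(below, a, scale=scale))
    ll += above.size * np.log(stats.gamma.sf(u, a, scale=scale))      # mass assigned to the tail
    ll += np.sum(stats.genpareto.logpdf(above - u, xi, scale=sigma))  # GPD for the exceedances
    return ll

x = stats.gamma.rvs(2.0, scale=1.0, size=5000, random_state=0)
print(mixture_loglik([2.0, 1.0, 3.0, 0.1, 1.0], x))   # evaluate at plausible parameter values
```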

Relevance: 30.00%

Publisher:

Abstract:

Increasing human demands on soil-derived ecosystem services require reliable data on global soil resources for sustainable development. The soil organic carbon (SOC) pool is a key indicator of soil quality as it affects essential biological, chemical and physical soil functions such as nutrient cycling, pesticide and water retention, and soil structure maintenance. However, information on the SOC pool and its temporal and spatial dynamics is unbalanced. Even in well-studied regions with a pronounced interest in environmental issues, information on soil carbon (C) is inconsistent. Several activities for the compilation of global soil C data are under way. However, different approaches to soil sampling and chemical analyses make even regional comparisons highly uncertain. Often, the procedures used so far have not allowed reliable estimation of the total SOC pool, partly because the available knowledge is focused on upper soil horizons that are not clearly defined, and the contribution of subsoil to SOC stocks has received less consideration. Even more difficult is quantifying SOC pool changes over time. SOC consists of variable amounts of labile and recalcitrant molecules of plant, microbial and animal origin that are often operationally defined. A comprehensively active soil expert community needs to agree on protocols for soil surveying and lab procedures towards reliable SOC pool estimates. Already established long-term ecological research sites, where SOC changes are quantified and the underlying mechanisms are investigated, are potentially the backbone for regional, national, and international SOC monitoring programs.

Relevance: 30.00%

Publisher:

Abstract:

We present in this paper a variational approach to accurately estimate the velocity field and its derivatives simultaneously, directly from PIV image sequences. Our method differs from other techniques presented in the literature in that the energy minimization used to estimate the particle motion depends on a second-order Taylor development of the flow. In this way, we are not only able to compute the motion vector field, but we also obtain an accurate estimation of its derivatives. Hence, we avoid the use of numerical schemes to compute the derivatives from the estimated flow, which usually lead to numerical amplification of the inherent uncertainty in the estimated flow. The performance of our approach is illustrated with the estimation of the motion vector field and the vorticity on both synthetic and real PIV datasets.
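Schematically (our notation, not the paper's, with the quadratic term understood per flow component), the displacement field around a point x_0 is expanded to second order and the expansion coefficients are estimated directly by the variational minimization, so the Jacobian, and hence the vorticity, come out of the optimization rather than from finite differences of the estimated flow:

```latex
\mathbf{w}(\mathbf{x}) \;\approx\; \mathbf{w}(\mathbf{x}_0)
  + J_{\mathbf{w}}(\mathbf{x}_0)\,(\mathbf{x}-\mathbf{x}_0)
  + \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_0)^{\top} \mathbf{H}_{\mathbf{w}}(\mathbf{x}_0)\,(\mathbf{x}-\mathbf{x}_0),
\qquad
\omega \;=\; \bigl(J_{\mathbf{w}}\bigr)_{21} - \bigl(J_{\mathbf{w}}\bigr)_{12}.
```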

Relevance: 30.00%

Publisher:

Abstract:

Submicroscopic changes in chromosomal DNA copy number dosage are common and have been implicated in many heritable diseases and cancers. Recent high-throughput technologies have a resolution that permits the detection of segmental changes in DNA copy number that span thousands of base pairs across the genome. Genome-wide association studies (GWAS) may simultaneously screen for copy number-phenotype and SNP-phenotype associations as part of the analytic strategy. However, genome-wide array analyses are particularly susceptible to batch effects, as the logistics of preparing DNA and processing thousands of arrays often involve multiple laboratories and technicians, or changes over calendar time to the reagents and laboratory equipment. Failure to adjust for batch effects can lead to incorrect inference and requires inefficient post-hoc quality control procedures that exclude regions associated with batch. Our work extends previous model-based approaches to copy number estimation by explicitly modeling batch effects and using shrinkage to improve locus-specific estimates of copy number uncertainty. Key features of this approach include the use of diallelic genotype calls from experimental data to estimate batch- and locus-specific parameters of background and signal without the requirement of training data. We illustrate these ideas using a study of bipolar disease and a study of chromosome 21 trisomy. The former has batch effects that dominate much of the observed variation in quantile-normalized intensities, while the latter illustrates the robustness of our approach to datasets where as many as 25% of the samples have altered copy number. Locus-specific estimates of copy number can be plotted on the copy-number scale to investigate mosaicism and guide the choice of appropriate downstream approaches for smoothing the copy number as a function of physical position. The software is open source and implemented in the R package CRLMM, available at Bioconductor (http://www.bioconductor.org).
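The batch-adjustment idea can be caricatured in a few lines: within each diallelic genotype stratum, batch-specific intensity means are shrunk toward the across-batch mean, with small batches pulled harder. This is a sketch of the shrinkage principle only, not the CRLMM implementation; all names and the prior_strength parameter are illustrative.

```python
import numpy as np

def shrink_batch_means(intensity, batch, genotype, prior_strength=10.0):
    """Shrink batch- and genotype-specific intensity means toward the
    across-batch mean (empirical-Bayes style, simplified sketch)."""
    out = {}
    for g in np.unique(genotype):
        in_g = genotype == g
        grand = intensity[in_g].mean()               # across-batch mean for this genotype
        for b in np.unique(batch):
            sel = in_g & (batch == b)
            n = sel.sum()
            batch_mean = intensity[sel].mean() if n else grand
            # small batches are pulled toward the grand mean
            out[(g, b)] = (n * batch_mean + prior_strength * grand) / (n + prior_strength)
    return out

rng = np.random.default_rng(0)
intensity = rng.normal(loc=2.0, scale=0.3, size=300)
batch = rng.integers(0, 3, size=300)
genotype = rng.choice(["AA", "AB", "BB"], size=300)
print(shrink_batch_means(intensity, batch, genotype)[("AA", 0)])
```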
