998 results for Modeling uncertainty


Relevance:

70.00%

Publisher:

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically through dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is performed by evaluating the top event probability at each Bayesian updating step via Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. To test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques because it allows prior belief to be incorporated and models the full uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
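
As a point of reference for the conjugate baseline the thesis compares against, the sketch below implements Poisson-Gamma Bayesian updating of primary-event failure rates followed by a Monte Carlo evaluation of a top-event probability. The two-event AND gate, priors, and failure records are invented illustrations, not the thesis's Texas City fault tree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gamma(alpha, beta) prior on a failure rate lambda (events per year).
# With Poisson-distributed failure counts, the posterior is conjugate:
#   alpha <- alpha + observed failures, beta <- beta + exposure time.
def update_gamma(alpha, beta, failures, exposure):
    return alpha + failures, beta + exposure

# Illustrative priors for two primary events (values are made up).
params = {"pump": (1.0, 2.0), "valve": (0.5, 2.0)}

# Hypothetical yearly records: (event, failures, exposure in years).
data = [("pump", 2, 1.0), ("valve", 0, 1.0), ("pump", 1, 1.0)]
for name, k, t in data:
    params[name] = update_gamma(*params[name], k, t)

# Monte Carlo top-event probability for a simple AND gate over a mission
# time T, with P(fail within T | lambda) = 1 - exp(-lambda * T) and
# rate samples drawn from the posterior Gamma distributions.
T, n = 1.0, 100_000
p_fail = {
    name: 1.0 - np.exp(-rng.gamma(a, 1.0 / b, n) * T)
    for name, (a, b) in params.items()
}
top = p_fail["pump"] * p_fail["valve"]  # independence assumed
print(f"P(top event): posterior mean {top.mean():.4f}, "
      f"95% credible interval {np.percentile(top, 2.5):.4f}-"
      f"{np.percentile(top, 97.5):.4f}")
```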

Relevance:

60.00%

Publisher:

Abstract:

We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded, stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
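
The abstract does not detail the combination rule, so the sketch below illustrates one standard way to combine simple predictors under squared error, exponentially weighted averaging; the experts and data are toy stand-ins, not the authors' procedure.

```python
import numpy as np

def ewa_combine(experts, y, eta=0.5):
    """Exponentially weighted average of expert predictions.

    experts: array (T, K), K expert predictions per step;
    y: array (T,), the observed sequence; eta: learning rate.
    Returns the combined online predictions.
    """
    T, K = experts.shape
    log_w = np.zeros(K)                        # log-weights, start uniform
    preds = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        preds[t] = w @ experts[t]              # weighted average prediction
        log_w -= eta * (experts[t] - y[t])**2  # penalize squared error
    return preds

# Toy example: two naive predictors for a noisy sine wave.
rng = np.random.default_rng(1)
y = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)
experts = np.stack([np.r_[0.0, y[:-1]],               # previous value
                    np.full(200, y.mean())], axis=1)  # constant (illustrative)
print("mean squared error:", np.mean((ewa_combine(experts, y) - y)**2))
```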

Relevance:

60.00%

Publisher:

Abstract:

This Master's thesis studies the modeling of uncertainty in investment calculations. Based on the literature, a process model was created with which a corporate investment or divestment decision can be made in a structured way. The model consists of four main phases, but its main emphasis is on the calculation methods. The created process model, and especially the calculation methods, were applied to a company example. The problem of modeling uncertainty is addressed both with traditional classical investment theories and with methods based on real options thinking. Methods based on real options theory make it possible to take future uncertainties and decision opportunities into account. Because the background assumptions of traditional real options theories conflict with real-world practice, the newest models in particular were studied. Paroc Group was chosen as the company example for the thesis; its corporate restructuring situation was examined from the perspective of its current owner, a bank. One of the central objectives of the thesis was to determine whether it is worthwhile for the bank to sell the company at the current market price or to wait for a better time to sell.
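
As a minimal illustration of the lattice machinery that real-options methods build on (not the thesis's model or data), the following sketch values an American-style option to sell an asset at a pre-agreed price on a Cox-Ross-Rubinstein binomial tree; all numbers are invented.

```python
import numpy as np

def american_put_crr(V0, K, r, sigma, T, steps):
    """Binomial (Cox-Ross-Rubinstein) value of an American put: the right,
    but not the obligation, to sell an asset worth V0 at a pre-agreed
    price K at any time before horizon T (years)."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral up-probability
    disc = np.exp(-r * dt)
    j = np.arange(steps + 1)
    V = V0 * u**j * d**(steps - j)          # terminal asset values
    value = np.maximum(K - V, 0.0)          # exercise payoff at maturity
    for _ in range(steps):                  # backward induction
        V = V[1:] / u                       # asset values one step earlier
        cont = disc * (p * value[1:] + (1 - p) * value[:-1])
        value = np.maximum(K - V, cont)     # exercise now vs. continue
    return value[0]

# Illustrative inputs only (not from the thesis).
print(f"value of waiting: {american_put_crr(100, 100, 0.03, 0.25, 2, 500):.2f}")
```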

Relevance:

60.00%

Publisher:

Abstract:

Different climate models, modeling methods, and carbon emission scenarios were used in this paper to evaluate the effects of future climate change on the geographical distribution of species of economic and cultural importance across the Cerrado biome. As several studies have shown, many uncertainties are still associated with such projections, although bioclimatic models remain a widely used and effective method for evaluating the consequences of climate change for biodiversity. In this article, about 90% of the uncertainty was found to be related to the choice of modeling method; regardless of these uncertainties, the results revealed that the studied species will lose about 78% of their geographic distribution in the Cerrado. Effective conservation of these species will require many further studies, but it is already apparent that climate change will strongly influence the distribution pattern of these species.
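
To illustrate the kind of uncertainty partitioning the article reports (most projection variance attributable to the modeling method), here is a sketch that attributes the variance of projected range sizes to each factor via a one-way sum-of-squares share. It is one crude attribution scheme, not necessarily the paper's analysis, and every number is fabricated.

```python
import numpy as np
import pandas as pd

# Hypothetical projected range sizes (km^2) for one species under every
# combination of climate model (GCM), niche-modeling method, and emission
# scenario; effect sizes are invented so that the method effect dominates.
rng = np.random.default_rng(2)
gcms, methods, scens = ["GCM_A", "GCM_B"], ["M1", "M2", "M3"], ["low", "high"]
rows = [(g, m, s,
         1000 + {"M1": 0, "M2": 400, "M3": -350}[m]    # method effect
              + {"GCM_A": 30, "GCM_B": -30}[g]         # climate-model effect
              + {"low": 20, "high": -20}[s]            # scenario effect
              + rng.normal(0, 25))                     # residual noise
        for g in gcms for m in methods for s in scens]
df = pd.DataFrame(rows, columns=["gcm", "method", "scenario", "range_km2"])

# Share of total variance explained by each factor's group means.
total_ss = ((df.range_km2 - df.range_km2.mean())**2).sum()
for factor in ["gcm", "method", "scenario"]:
    means = df.groupby(factor).range_km2.transform("mean")
    share = ((means - df.range_km2.mean())**2).sum() / total_ss
    print(f"{factor:9s} explains ~{100 * share:.0f}% of the variance")
```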

Relevance:

60.00%

Publisher:

Abstract:

Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at the global scale. As an indicator of flood hazard we examined changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Climate change does not increase flood hazard everywhere: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur more often than once in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at the local scale and in individual river basins there can be disagreement even on the sign of change, indicating a large modeling uncertainty that needs to be taken into account in local adaptation studies.
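
A common route to the flood-hazard indicator used here is a GEV fit to annual maxima of the 5-day mean flow; the sketch below shows that calculation on synthetic discharge data. The helper name and the data are illustrative stand-ins, not the paper's multi-model procedure.

```python
import numpy as np
from scipy import stats

def return_level_30y(daily_flow, days_per_year=365):
    """30-year return level of the annual maximum 5-day mean flow, via a
    GEV fit to annual maxima (one standard estimation route)."""
    # 5-day moving average of daily discharge.
    flow5 = np.convolve(daily_flow, np.ones(5) / 5, mode="valid")
    # Annual maxima of the 5-day means.
    n_years = len(flow5) // days_per_year
    ann_max = flow5[:n_years * days_per_year].reshape(n_years, -1).max(axis=1)
    # Fit a GEV and read off the level exceeded with probability 1/30 per year.
    shape, loc, scale = stats.genextreme.fit(ann_max)
    return stats.genextreme.ppf(1 - 1 / 30, shape, loc=loc, scale=scale)

# Synthetic 60 years of "daily discharge" purely to exercise the function.
rng = np.random.default_rng(3)
q = rng.gamma(2.0, 50.0, size=60 * 365)
print(f"30-y return level of 5-d peak flow: {return_level_30y(q):.1f} m^3/s")
```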

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Proton radiation therapy is gaining popularity because of the unique characteristics of its dose distribution, e.g., the high dose gradient at the distal end of the percentage-depth-dose curve (known as the Bragg peak). The high dose gradient offers the possibility of delivering a high dose to the target while still sparing critical organs distal to the target. However, the high dose gradient is a double-edged sword: a small shift of the highly conformal high-dose area can cause the target to be substantially under-dosed or the critical organs to be substantially over-dosed. Because of that, large margins are required in treatment planning to ensure adequate dose coverage of the target, which prevents us from realizing the full potential of proton beams. It is therefore critical to reduce uncertainties in proton radiation therapy. One major uncertainty in a proton treatment is the range uncertainty related to the estimation of the proton stopping power ratio (SPR) distribution inside the patient. The SPR distribution is required to account for tissue heterogeneities when calculating the dose distribution inside the patient. In current clinical practice, the SPR distribution is estimated from the patient's treatment planning computed tomography (CT) images based on a CT number-to-SPR calibration curve. An SPR derived from a single CT number carries large uncertainties in the presence of human tissue composition variations, which is the major drawback of the current SPR estimation method. We propose to solve this problem by using dual energy CT (DECT) and hypothesize that the range uncertainty can be reduced by a factor of two from the currently used value of 3.5%. A MATLAB program was developed to calculate the electron density ratio (EDR) and effective atomic number (EAN) from two CT measurements of the same object. An empirical relationship was discovered between the mean excitation energies and EANs of human body tissues. With the MATLAB program and the empirical relationship, a DECT-based method was successfully developed to derive SPRs for human body tissues (the DECT method). The DECT method is more robust against uncertainties in human tissue composition than the current single-CT-based method, because it incorporates both density and elemental composition information in the SPR estimation. Furthermore, we studied the practical limitations of the DECT method. We found that the accuracy of the DECT method using a conventional kV-kV x-ray pair is susceptible to CT number variations, which compromises its theoretical advantage. Our solution to this problem is to use a different x-ray pair for the DECT. The accuracy of the DECT method using different combinations of x-ray energies, i.e., the kV-kV, kV-MV and MV-MV pairs, was compared using the measured imaging uncertainties for each case. The kV-MV DECT was found to be the most robust against CT number variations. In addition, we studied how uncertainties propagate through the DECT calculation, and found general principles for selecting x-ray pairs that minimize the method's sensitivity to CT number variations. The uncertainties in SPRs estimated using the kV-MV DECT were analyzed further and compared to those of the stoichiometric method.
The uncertainties in SPR estimation can be divided into five categories according to their origins: the inherent uncertainty, the DECT modeling uncertainty, the CT imaging uncertainty, the uncertainty in the mean excitation energy, and the SPR variation with proton energy. Additionally, human body tissues were divided into three tissue groups: low-density (lung) tissues, soft tissues, and bone tissues. The uncertainties were estimated separately for each group because they differ among the groups. An estimate of the composite range uncertainty (2σ) was determined for three tumor sites (prostate, lung, and head-and-neck) by combining the uncertainty estimates of all three tissue groups, weighted by their proportions along a typical beam path for each treatment site. In conclusion, the DECT method holds theoretical advantages over the current single-CT-based method in estimating SPRs for human tissues. Using existing imaging techniques, the kV-MV DECT approach was capable of reducing the range uncertainty from the currently used value of 3.5% to 1.9%-2.3%, falling short of our original goal of reducing the range uncertainty by a factor of two. The dominant source of uncertainty in the kV-MV DECT was CT imaging, especially MV CT imaging. Further reducing the beam hardening effect, the impact of scatter, out-of-field objects, etc., would reduce the Hounsfield unit variations in CT imaging, so the kV-MV DECT still has the potential to reduce the range uncertainty further.
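
The final step of any SPR estimation, stoichiometric or DECT-based, is the Bethe formula relating SPR to the electron density ratio and mean excitation energy. The sketch below shows that step; the physical constants and water I-value are standard textbook values, but the per-tissue EDR and I numbers are illustrative, not the dissertation's measured data.

```python
import numpy as np

MEC2 = 0.511e6        # electron rest energy, eV
I_WATER = 75.0        # mean excitation energy of water, eV (a common value)

def spr_bethe(edr, i_ev, kinetic_mev=200.0):
    """Stopping power ratio relative to water from the Bethe formula,
    given the electron density ratio (EDR) and mean excitation energy I.
    Upstream of this step, the DECT method derives the EDR and effective
    atomic number from the two CT scans."""
    mpc2 = 938.272e6                       # proton rest energy, eV
    e = kinetic_mev * 1e6
    gamma = 1.0 + e / mpc2
    beta2 = 1.0 - 1.0 / gamma**2           # relativistic beta^2
    arg = 2.0 * MEC2 * beta2 / (1.0 - beta2)
    return edr * (np.log(arg / i_ev) - beta2) / (np.log(arg / I_WATER) - beta2)

# Illustrative tissue values (EDR, I in eV) -- not measured data.
for name, edr, i in [("lung", 0.26, 75.0), ("soft tissue", 1.04, 72.0),
                     ("bone", 1.70, 106.0)]:
    print(f"{name:12s} SPR ~ {spr_bethe(edr, i):.3f}")
```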

Relevance:

40.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

40.00%

Publisher:

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we shifted each coordinate by a random number drawn from a normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
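
The error treatment described above is straightforward to reproduce; the sketch below applies it, assuming occurrence coordinates in a projected, km-based system (the conversion from km to degrees that lon/lat records would need is omitted, and the toy points are invented).

```python
import numpy as np

def degrade_occurrences(xy_km, sd_km=5.0, seed=0):
    """Error treatment from the abstract: displace each occurrence point
    by independent Gaussian noise (mean 0, sd 5 km) in both coordinates.
    xy_km: array of shape (n, 2) in a projected, km-based coordinate
    system (an assumption made for this sketch)."""
    rng = np.random.default_rng(seed)
    return xy_km + rng.normal(0.0, sd_km, size=xy_km.shape)

# Control vs. error treatment for a toy set of occurrence points.
control = np.array([[512.3, 7300.1], [498.7, 7321.9], [505.0, 7310.4]])
error = degrade_occurrences(control)
shift = np.linalg.norm(error - control, axis=1)
print("per-point displacement (km):", np.round(shift, 2))
```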

Relevance:

40.00%

Publisher:

Abstract:

Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
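
A minimal version of the proposed workflow might look like the following sketch: FPCA computed by SVD on curves sharing a common grid, and an ordinary least-squares map from proxy scores to exact scores standing in for the paper's machine-learning step. The curves and "solvers" are synthetic placeholders.

```python
import numpy as np

def fpca(curves, n_comp=3):
    """Discretized functional PCA: center the curves and keep the leading
    components (on a common equally spaced grid this coincides with
    ordinary PCA via the SVD)."""
    mean = curves.mean(axis=0)
    U, s, Vt = np.linalg.svd(curves - mean, full_matrices=False)
    scores = U[:, :n_comp] * s[:n_comp]        # per-curve component scores
    return mean, Vt[:n_comp], scores

# Learning set: proxy and exact responses for the same realizations
# (synthetic curves here, purely illustrative).
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 100)
z = rng.normal(size=(50, 2))                               # latent factors
exact = 1 + z @ np.vstack([np.sin(2 * np.pi * t), t])      # "exact" solver
proxy = 0.9 * exact + 0.1 * rng.normal(size=exact.shape)   # biased, cheap proxy

m_p, basis_p, S_p = fpca(proxy)
m_e, basis_e, S_e = fpca(exact)

# Error model: map proxy scores to exact scores (least squares stands in
# for the paper's machine-learning step).
A, *_ = np.linalg.lstsq(np.c_[S_p, np.ones(len(S_p))], S_e, rcond=None)

# Predict the exact response of a realization from its proxy response alone.
new_proxy = proxy[:1]
s_new = (new_proxy - m_p) @ basis_p.T
pred_exact = m_e + (np.c_[s_new, [1.0]] @ A) @ basis_e
print("max abs error vs. exact curve:", np.abs(pred_exact - exact[:1]).max())
```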

Relevance:

40.00%

Publisher:

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
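
As a concrete toy instance of a hierarchical statistical model (not an example from the article), the sketch below fits a two-level normal model with a Gibbs sampler, treating the variances as known for brevity; all data are simulated.

```python
import numpy as np

# A minimal hierarchical normal model: observations at site j have mean
# theta_j, and the site means are themselves draws from N(mu, tau^2):
#   y_ij ~ N(theta_j, sigma^2),  theta_j ~ N(mu, tau^2),  mu ~ N(0, 100^2)
rng = np.random.default_rng(5)
sigma, tau, true_mu = 1.0, 2.0, 3.0
true_theta = rng.normal(true_mu, tau, size=8)               # 8 sites
y = [rng.normal(th, sigma, size=12) for th in true_theta]   # 12 obs/site

mu, theta, draws = 0.0, np.zeros(8), []
for it in range(5000):
    # theta_j | rest: precision-weighted blend of site data and mu.
    for j in range(8):
        prec = len(y[j]) / sigma**2 + 1 / tau**2
        mean = (y[j].sum() / sigma**2 + mu / tau**2) / prec
        theta[j] = rng.normal(mean, 1 / np.sqrt(prec))
    # mu | rest: conjugate normal update given the current theta_j.
    prec = len(theta) / tau**2 + 1 / 100**2
    mu = rng.normal(theta.sum() / tau**2 / prec, 1 / np.sqrt(prec))
    if it >= 1000:                                          # drop burn-in
        draws.append(mu)
print(f"posterior mean of mu ~ {np.mean(draws):.2f} (truth: {true_mu})")
```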

Relevance:

40.00%

Publisher:

Abstract:

We present a new approach for corpus-based speech enhancement that significantly improves over a method published by Xiao and Nickel in 2010. Corpus-based enhancement systems do not merely filter an incoming noisy signal, but resynthesize its speech content via an inventory of pre-recorded clean signals. The goal of the procedure is to perceptually improve the sound of speech signals in background noise. The proposed new method modifies Xiao's method in four significant ways. Firstly, it employs a Gaussian mixture model (GMM) instead of a vector quantizer in the phoneme recognition front-end. Secondly, the state decoding of the recognition stage is supported with an uncertainty modeling technique. With the GMM and the uncertainty modeling it is possible to eliminate the need for noise dependent system training. Thirdly, the post-processing of the original method via sinusoidal modeling is replaced with a powerful cepstral smoothing operation. And lastly, due to the improvements of these modifications, it is possible to extend the operational bandwidth of the procedure from 4 kHz to 8 kHz. The performance of the proposed method was evaluated across different noise types and different signal-to-noise ratios. The new method was able to significantly outperform traditional methods, including the one by Xiao and Nickel, in terms of PESQ scores and other objective quality measures. Results of subjective CMOS tests over a smaller set of test samples support our claims.
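
A minimal sketch of a GMM front-end of the kind described, with per-class mixtures and soft frame posteriors replacing a vector quantizer's hard codebook assignments; the "MFCC" features and phoneme classes are synthetic placeholders, not real speech data or the authors' system.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)
classes = ["aa", "iy", "s"]
# Fake 13-dimensional MFCC-like training frames for each class.
train = {c: rng.normal(loc=i * 3.0, scale=1.0, size=(500, 13))
         for i, c in enumerate(classes)}

# One diagonal-covariance GMM per phoneme class.
gmms = {c: GaussianMixture(n_components=4, covariance_type="diag",
                           random_state=0).fit(X)
        for c, X in train.items()}

def class_posteriors(frames):
    """Per-frame posterior over phoneme classes (uniform class prior)."""
    loglik = np.stack([g.score_samples(frames) for g in gmms.values()],
                      axis=1)
    loglik -= loglik.max(axis=1, keepdims=True)   # numerical stability
    post = np.exp(loglik)
    return post / post.sum(axis=1, keepdims=True)

test = rng.normal(loc=3.0, scale=1.0, size=(5, 13))  # frames near class "iy"
print(np.round(class_posteriors(test), 3))           # columns: aa, iy, s
```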

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a general methodology for estimating and incorporating uncertainty in the controller and forward models for noisy nonlinear control problems. Conditional distribution modeling in a neural network context is used to estimate the uncertainty around the prediction of neural network outputs. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localize the possible control solutions to consider. A nonlinear multivariable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis.
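
One simple way to realize conditional distribution modeling with a neural network is to predict a mean and log-variance per input and train with the Gaussian negative log-likelihood. The sketch below uses a single-Gaussian head on toy data; the paper's strongly non-Gaussian case would call for a mixture density head instead, and nothing here reproduces the authors' system.

```python
import torch
import torch.nn as nn

class HeteroscedasticNet(nn.Module):
    """Predicts a mean and a log-variance, so the predicted variance
    quantifies the model's uncertainty about each output."""
    def __init__(self, d_in=1, d_hidden=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(d_in, d_hidden), nn.Tanh())
        self.mean = nn.Linear(d_hidden, 1)
        self.log_var = nn.Linear(d_hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean(h), self.log_var(h)

def gaussian_nll(mean, log_var, y):
    return 0.5 * (log_var + (y - mean)**2 / log_var.exp()).mean()

# Toy data whose noise level grows with x, so the variance head has
# something to learn.
torch.manual_seed(0)
x = torch.rand(512, 1) * 4
y = torch.sin(x) + torch.randn_like(x) * 0.05 * (1 + x)

net = HeteroscedasticNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    mean, log_var = net(x)
    gaussian_nll(mean, log_var, y).backward()
    opt.step()

with torch.no_grad():
    m, lv = net(torch.tensor([[0.5], [3.5]]))
    print("predicted sigma at x=0.5 and x=3.5:",
          lv.mul(0.5).exp().squeeze().tolist())
```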

Relevance:

40.00%

Publisher:

Abstract:

Dynamically adaptive systems (DASs) are intended to monitor the execution environment and then dynamically adapt their behavior in response to changing environmental conditions. The uncertainty of the execution environment is a major motivation for dynamic adaptation; it is impossible to know at development time all of the possible combinations of environmental conditions that will be encountered. To date, the work performed in requirements engineering for a DAS includes requirements monitoring and reasoning about the correctness of adaptations, where the DAS requirements are assumed to exist. This paper introduces a goal-based modeling approach to develop the requirements for a DAS, while explicitly factoring uncertainty into the process and resulting requirements. We introduce a variation of threat modeling to identify sources of uncertainty and demonstrate how the RELAX specification language can be used to specify more flexible requirements within a goal model to handle the uncertainty.