945 results for initialization uncertainty
Abstract:
OpenMI is a widely used standard allowing exchange of data between integrated models, which has mostly been applied to dynamic, deterministic models. Within the FP7 UncertWeb project we are developing mechanisms and tools to support the management of uncertainty in environmental models. In this paper we explore the integration of the UncertWeb framework with OpenMI, to assess the issues that arise when propagating uncertainty in OpenMI model compositions, and the degree of integration possible with UncertWeb tools. In particular we develop an uncertainty-enabled model for a simple Lotka-Volterra system with an interface conforming to the OpenMI standard, exploring uncertainty in the initial predator and prey levels, and in the parameters of the model equations. We use the Elicitator tool developed within UncertWeb to identify the initial condition uncertainties, and show how these can be integrated, using UncertML, with simple Monte Carlo propagation mechanisms. The mediators we develop for OpenMI models are generic and produce standard Web services that expose the OpenMI models to a Web-based framework. We discuss what further work is needed to allow a more complete system to be developed and show how this might be used in practice.
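As a rough illustration of the propagation step this abstract describes, the sketch below runs a plain Monte Carlo over a Lotka-Volterra system with uncertain initial levels and parameters. The distributions, their parameters and the integration settings are invented for illustration; none of the UncertWeb, OpenMI or Elicitator interfaces are reproduced.

```python
# Minimal Monte Carlo uncertainty propagation for a Lotka-Volterra
# system (illustrative sketch only; all distributions are assumptions).
import numpy as np

rng = np.random.default_rng(42)

def lotka_volterra(prey0, pred0, alpha, beta, delta, gamma,
                   t_end=20.0, dt=0.01):
    """Integrate dx/dt = alpha*x - beta*x*y, dy/dt = delta*x*y - gamma*y
    with a simple explicit Euler scheme; returns the prey series."""
    n = int(t_end / dt)
    x, y = prey0, pred0
    xs = np.empty(n)
    for i in range(n):
        dx = (alpha * x - beta * x * y) * dt
        dy = (delta * x * y - gamma * y) * dt
        x, y = max(x + dx, 0.0), max(y + dy, 0.0)  # keep populations >= 0
        xs[i] = x
    return xs

# Hypothetical elicited uncertainties: log-normal initial levels,
# normal equation parameters (means and spreads are made up).
n_runs = 500
prey_final = np.empty(n_runs)
for k in range(n_runs):
    prey0 = rng.lognormal(mean=np.log(10.0), sigma=0.1)
    pred0 = rng.lognormal(mean=np.log(5.0), sigma=0.1)
    alpha, beta = rng.normal(1.1, 0.05), rng.normal(0.4, 0.02)
    delta, gamma = rng.normal(0.1, 0.005), rng.normal(0.4, 0.02)
    prey_final[k] = lotka_volterra(prey0, pred0, alpha, beta, delta, gamma)[-1]

# Summarise the propagated uncertainty in the final prey level.
lo, med, hi = np.percentile(prey_final, [2.5, 50, 97.5])
print(f"final prey level: median {med:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```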
Abstract:
Error and uncertainty in remotely sensed data come from several sources, and can be increased or mitigated by the processing to which that data is subjected (e.g. resampling, atmospheric correction). Historically the effects of such uncertainty have only been considered overall and evaluated in a confusion matrix, which becomes high-level metadata and so is commonly ignored. However, some of the sources of uncertainty can be explicitly identified and modelled, and their effects (which often vary across space and time) visualized. Others can be considered overall, but their spatial effects can still be visualized. This process of visualization is of particular value for users who need to assess the importance of data uncertainty for their own practical applications. This paper describes a Java-based toolkit, which uses interactive and linked views to enable visualization of data uncertainty by a variety of means. This allows users to consider error and uncertainty as integral elements of image data, to be viewed and explored, rather than as labels or indices attached to the data.
Abstract:
Portfolio analysis has existed, perhaps, for as long as people have thought about making rational decisions on the use of limited resources. However, the emergence of portfolio analysis as a discipline can be dated quite precisely to the publication of Harry Markowitz's pioneering work (Markowitz H., Portfolio Selection) in 1952. The model proposed in that work, simple enough in essence, captured the basic features of the financial market from the investor's point of view and supplied the investor with a tool for developing rational investment decisions. The central problem in Markowitz's theory is the choice of a portfolio, that is, of a set of operations. In evaluating both individual operations and portfolios of them, two major factors are considered: the profitability and the risk of the operations and of their portfolios, with risk receiving a quantitative estimate. An essential element of the theory is its treatment of the mutual correlations between the profitabilities of operations. Accounting for these correlations allows effective diversification of the portfolio, leading to an essential reduction in portfolio risk compared with the risk of the individual operations included in it. Finally, the quantitative characterization of the basic investment properties allows the problem of choosing an optimal portfolio to be defined and solved as a quadratic optimization problem.
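The quadratic-optimization core of the Markowitz model mentioned above can be illustrated with the closed-form global minimum-variance portfolio, w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The expected returns and covariance matrix below are hypothetical values chosen purely for illustration.

```python
# Minimal Markowitz-style sketch: the fully-invested global
# minimum-variance portfolio in closed form (invented inputs).
import numpy as np

mu = np.array([0.08, 0.10, 0.12])   # expected returns (hypothetical)
sigma = np.array([                  # return covariance (hypothetical)
    [0.040, 0.006, 0.002],
    [0.006, 0.090, 0.015],
    [0.002, 0.015, 0.160],
])

ones = np.ones(len(mu))
inv = np.linalg.inv(sigma)
w = inv @ ones / (ones @ inv @ ones)   # weights summing to 1

port_return = w @ mu
port_risk = np.sqrt(w @ sigma @ w)     # portfolio standard deviation
print("weights:", np.round(w, 3))
print(f"expected return {port_return:.1%}, risk {port_risk:.1%}")
```

With these inputs the portfolio standard deviation comes out near 16%, below the 20% of the least volatile single asset, which is exactly the diversification effect the abstract emphasises.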
Abstract:
The so-called "Plural Uncertainty Model" is considered, in which the statistical, maxmin, interval and fuzzy models of uncertainty are embedded. For the last case, the external and internal contradictions of the theory are investigated, and a modified definition of fuzzy sets is proposed to overcome the difficulties of the classical variant of fuzzy subsets due to L. Zadeh. The general variants of logit and probit regression are models of the modified fuzzy sets. Within this modification of the theory it becomes possible to speak about observations. The concept of a "situation" is proposed within the modified fuzzy theory, and the classification problem is considered. A classification algorithm for situations is proposed, as an analogue of the statistical maximum likelihood method (MLM). An example concerning the possible observation of a distribution drawn from a collection of distributions is considered.
Abstract:
The question of forming an aim-oriented description of the object domain of a decision support process is outlined. Two main problems of estimating and evaluating data and knowledge uncertainty in decision support systems, the direct and the inverse, are formulated. Three conditions serving as formalized criteria for the aim-oriented construction of the input, internal and output spaces of a decision support system are proposed. Definitions of apparent and hidden data uncertainties on a measuring scale are given.
Abstract:
In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. A surprise measures how observed data affects the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise to self-adaptive systems using Bayesian dynamic decision networks.
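A minimal sketch of the surprise measure defined above, assuming Beta beliefs over a failure rate updated by binomial runtime observations; surprise is taken as the KL divergence from prior to posterior belief. The prior, the observation window and the adaptation threshold are all invented, and the paper's Bayesian dynamic decision networks are not reproduced.

```python
# Bayesian surprise sketch: KL divergence between posterior and prior
# Beta beliefs over a failure rate (toy numbers, not the paper's model).
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a2, b2):
    """KL(Beta(a1, b1) || Beta(a2, b2)) in nats."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 + b2 - a1 - b1) * digamma(a1 + b1))

# Prior belief: failures are rare (Beta(1, 19), mean 5%).
a, b = 1.0, 19.0

# Runtime observation window: 5 failures in 10 requests.
failures, successes = 5, 5
a_post, b_post = a + failures, b + successes

surprise = kl_beta(a_post, b_post, a, b)   # KL(posterior || prior)
print(f"surprise = {surprise:.2f} nats")
if surprise > 1.0:                         # hypothetical adaptation threshold
    print("large divergence: flag anomaly or trigger adaptation")
```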
Abstract:
Often it is commercial, not technological, factors which hinder the adoption of potentially valuable innovations. In energy policy, much attention is given to analysing and incentivising consumer demand for renewable energy, but new technologies may also need new supply markets, to provide products and services to build, operate and maintain the innovative technology. This paper addresses the impact of supply constraints on the long-term viability of sustainability related innovations, using the case of bioenergy from organic waste. Uncertainties in the pricing and availability of feedstock (i.e. waste) may generate market deadlock and deter potential investors. We draw on prior research to conceptualise the problem, and identify what steps might be taken to address it. We propose a research agenda aimed at purchasing and supply scholars and centred on the need to understand better the interplay between market evolution and supply uncertainty and 'market shaping' - how stakeholders can legitimately influence supply market evolution - to support the adoption of sustainability related innovation.
Abstract:
In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems that can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty in the outputs of a neural network can be obtained from the statistical properties of the network. More generally, multicomponent distributions can be modelled by a mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled way to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multi-variable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of control signals as well as processes with hysteresis.
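The selection step described above can be caricatured as follows: candidate controls are drawn from a Gaussian mixture standing in for a trained mixture density network over a multi-valued inverse (here y = u², whose inverse has two branches), then weighted by forward-model fit. Every quantity here is a toy assumption, not the paper's controller.

```python
# Toy sketch: sample candidate controls from a mixture "inverse model",
# then weight by how well a forward model maps them to the target.
import numpy as np

rng = np.random.default_rng(0)

def forward_plant(u):
    """Stand-in forward model: a multi-valued plant, y = u**2."""
    return u ** 2

# Hand-fixed inverse belief p(u | y_target = 4): two branches near
# +2 and -2, as a trained mixture density network would learn.
y_target = 4.0
means = np.array([2.0, -2.0])
stds = np.array([0.3, 0.3])
weights = np.array([0.5, 0.5])

# Draw candidate controls from the mixture.
n = 200
comp = rng.choice(len(weights), size=n, p=weights)
candidates = rng.normal(means[comp], stds[comp])

# Importance-style weighting: penalise candidates whose predicted
# output misses the target (Gaussian error model, hypothetical sigma).
errs = forward_plant(candidates) - y_target
w = np.exp(-0.5 * (errs / 0.5) ** 2)
u_best = candidates[np.argmax(w)]
print(f"chosen control {u_best:.3f} -> predicted output "
      f"{forward_plant(u_best):.3f}")
```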
Abstract:
Modern advances in technology have led to more complex manufacturing processes whose success centres on the ability to control these processes with a very high level of accuracy. Plant complexity inevitably leads to poor models that exhibit a high degree of parametric or functional uncertainty. The situation becomes even more complex if the plant to be controlled is characterised by a multivalued function or even if it exhibits a number of modes of behaviour during its operation. Since an intelligent controller is expected to operate and guarantee the best performance where complexity and uncertainty coexist and interact, control engineers and theorists have recently developed new control techniques under the framework of intelligent control to enhance the performance of the controller for more complex and uncertain plants. These techniques are based on incorporating model uncertainty. The newly developed control algorithms for incorporating model uncertainty are proven to give more accurate control results under uncertain conditions. In this paper, we survey some approaches that appear to be promising for enhancing the performance of intelligent control systems in the face of higher levels of complexity and uncertainty.
Abstract:
This study examined the relations between anxiety and the individual characteristics of sensory sensitivity (SS) and intolerance of uncertainty (IU) in mothers of children with ASD. The mothers of 50 children completed the Hospital Anxiety and Depression Scale, the Highly Sensitive Person Scale and the IU Scale. Anxiety was associated with both SS and IU, and IU was also associated with SS. Mediation analyses showed direct effects between anxiety and both IU and SS, but a significant indirect effect was found only in the model in which IU mediated between SS and anxiety. This is the first study to characterize the nature of the IU and SS interrelation in predicting levels of anxiety.
Abstract:
This paper describes a method of uncertainty evaluation for axisymmetric measurement machines which is compliant with the GUM and PUMA methodologies. Specialized measuring machines for the inspection of axisymmetric components enable the measurement of properties such as roundness (radial runout), axial runout and coning. These machines typically consist of a rotary table and a number of contact measurement probes located on slideways. Sources of uncertainty include the probe calibration process, probe repeatability, probe alignment, geometric errors in the rotary table, the dimensional stability of the structure holding the probes, and form errors in the reference hemisphere which is used to calibrate the system. The generic method is described, and an evaluation of an industrial machine is presented as a worked example. Type A uncertainties were obtained from a repeatability study of the probe calibration process, a repeatability study of the actual measurement process, a system stability test and an elastic deformation test. Type B uncertainties were obtained from calibration certificates and estimates. Expanded uncertainties, at 95% confidence, were then calculated for the measurement of radial runout (1.2 µm with a plunger probe or 1.7 µm with a lever probe), axial runout (1.2 µm with a plunger probe or 1.5 µm with a lever probe), and coning/swash (0.44 arc seconds with a plunger probe or 0.60 arc seconds with a lever probe).
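The combination of Type A and Type B contributions like those listed above follows the usual GUM recipe: root-sum-of-squares of the standard uncertainties, expanded with a coverage factor k = 2 for roughly 95% confidence. The sketch below uses invented component values, not the paper's actual budget.

```python
# GUM-style combination sketch with invented component values.
import math

# Type A: standard uncertainties from repeatability studies (um).
u_probe_cal = 0.31     # probe calibration repeatability
u_measure = 0.22       # measurement process repeatability
u_stability = 0.15     # system stability test

# Type B: from calibration certificates and estimates (um).
u_hemisphere = 0.25    # reference hemisphere form error
u_alignment = 0.18     # probe alignment estimate

components = [u_probe_cal, u_measure, u_stability, u_hemisphere, u_alignment]
u_combined = math.sqrt(sum(u ** 2 for u in components))  # quadrature sum
U_expanded = 2.0 * u_combined                            # k = 2, ~95%

print(f"combined standard uncertainty: {u_combined:.2f} um")
print(f"expanded uncertainty (k=2):    {U_expanded:.2f} um")
```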
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve the right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
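A sketch of the hybrid idea, assuming a balanced crossed Gage R&R design: ANOVA mean squares yield repeatability and reproducibility variance components, and their root sum becomes one standard-uncertainty input to a wider GUM-style budget. The synthetic data and effect sizes are invented for illustration.

```python
# Balanced two-way Gage R&R ANOVA sketch feeding an uncertainty budget.
import numpy as np

rng = np.random.default_rng(7)
p, o, r = 10, 3, 3                      # parts, operators, replicates
part_fx = rng.normal(0, 2.0, size=p)    # true part-to-part variation
oper_fx = rng.normal(0, 0.3, size=o)    # operator (reproducibility) bias
y = (part_fx[:, None, None] + oper_fx[None, :, None]
     + rng.normal(0, 0.2, size=(p, o, r)))   # repeatability noise

m = y.mean()
yp = y.mean(axis=(1, 2))                # per-part means
yo = y.mean(axis=(0, 2))                # per-operator means
ypo = y.mean(axis=2)                    # per-cell means

# ANOVA mean squares for a balanced crossed design.
ms_oper = p * r * ((yo - m) ** 2).sum() / (o - 1)
ms_int = (r * ((ypo - yp[:, None] - yo[None, :] + m) ** 2).sum()
          / ((p - 1) * (o - 1)))
ms_err = ((y - ypo[:, :, None]) ** 2).sum() / (p * o * (r - 1))

# Variance components (clamped at zero, as usual).
var_repeat = ms_err
var_int = max(0.0, (ms_int - ms_err) / r)
var_oper = max(0.0, (ms_oper - ms_int) / (p * r))
u_grr = np.sqrt(var_repeat + var_oper + var_int)
print(f"u(gage R&R) = {u_grr:.3f}  (one input to the wider budget)")
```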
Abstract:
As machine tools continue to become increasingly repeatable and accurate, high-precision manufacturers may be tempted to consider how they might utilise machine tools as measurement systems. In this paper, we have explored this paradigm by attempting to repurpose state-of-the-art coordinate measuring machine Uncertainty Evaluating Software (UES) for a machine tool application. We performed live measurements on all the systems in question. Our findings have highlighted some gaps with UES when applied to machine tools, and we have attempted to identify the sources of variation which have led to discrepancies. Implications of this research include requirements to evolve the algorithms within the UES if it is to be adapted for on-machine measurement, improve the robustness of the input parameters, and most importantly, clarify expectations.
Abstract:
The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving real-world applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work that needs to be done to bring it to an operational setting. Such a system should allow applications to leverage the additional value citizen weather data brings to long-standing professional observing networks.
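As a sketch of the per-station bias learning described above, assume the bias is a single additive offset with a Gaussian prior, updated conjugately against temperatures interpolated from professional neighbours; every numeric value here is illustrative, and the paper's fuller radiation-bias and regression model is not reproduced.

```python
# Conjugate Gaussian update of a station's calibration offset against
# interpolated reference temperatures (toy numbers throughout).
import numpy as np

rng = np.random.default_rng(1)

# Prior belief about the station's calibration bias (deg C).
mu, var = 0.0, 1.0 ** 2
obs_var = 0.5 ** 2          # assumed noise of a single comparison

true_bias = 0.8             # unknown in reality; used only to simulate
for day in range(30):
    reference = 15.0 + rng.normal(0, 3)         # interpolated "truth"
    citizen = reference + true_bias + rng.normal(0, 0.5)
    residual = citizen - reference
    # Normal-normal conjugate update of the bias estimate.
    var_new = 1.0 / (1.0 / var + 1.0 / obs_var)
    mu = var_new * (mu / var + residual / obs_var)
    var = var_new

print(f"learnt bias: {mu:.2f} +/- {np.sqrt(var):.2f} C")

# A corrected observation then carries its own uncertainty estimate.
raw = 18.3
corrected = raw - mu
u_corrected = np.sqrt(var + obs_var)
print(f"corrected obs: {corrected:.2f} +/- {u_corrected:.2f} C")
```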