974 results for Orion DBMS, Database, Uncertainty, Uncertain values, Benchmark
Abstract:
A description of the radiation emitted by impurities from within a plasma is crucial if spectral line intensities are to be used in detailed studies, such as the analysis of impurity transport. The simplest and most direct check that can be made on measurements of line intensities is to analyse their ratios with other lines from the same ion. This avoids uncertainties in determining the volume of the emitting plasma and the absolute sensitivity calibration of the spectrometer and, in some cases, even the need for accurate measurements of parameters such as electron density. Consistency is required between the measured line intensity ratios and the theoretical values. The expected consistency has not been found for radiation emitted from the JET scrape-off layer (e.g. Lawson et al 2009a JINST 4 P04013), meaning that the description of the spectral line intensities of impurity emission from the plasma edge is incomplete. In order to gain further understanding of the discrepancies, an analysis has been carried out for emission from the JET divertor plasma, which is reported in this paper. Carbon was the main low-Z intrinsic impurity in JET and an analysis of spectral line intensity ratios has been made for the C IV radiation emitted from the JET divertor. In this case, agreement is found between the measured and theoretical ratios to a very high accuracy, namely to within the experimental uncertainty of approximately ±10%. This confirms that the description of the line intensities for the present observations is complete. For some elements and ionization stages, an analysis of line intensity ratios can lead to the determination of parameters such as the electron temperature of the emitting plasma region and estimates of the contribution of recombination to the electron energy level populations. This applies to C IV and, to show the value and possibilities of the spectral measurements, these parameters have been calculated for a database of Ohmic and additionally heated phases of a large number of pulses. The importance of dielectronic, radiative and charge-exchange recombination as well as ionization has been investigated. In addition, the development of T_e throughout two example discharges is illustrated. The presented results indicate a number of areas for further investigation.
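Much of the analysis above rests on comparing measured line ratios against theory within a quoted ±10% uncertainty. A minimal sketch of the first-order error propagation for such a ratio (the intensity values below are hypothetical, purely for illustration):

```python
import math

def ratio_with_uncertainty(i1, sigma1, i2, sigma2):
    """Ratio of two line intensities with first-order error propagation.

    For r = i1 / i2 with independent uncertainties, the fractional
    uncertainties add in quadrature:
        sigma_r / r = sqrt((sigma1 / i1)**2 + (sigma2 / i2)**2)
    """
    r = i1 / i2
    sigma_r = r * math.sqrt((sigma1 / i1) ** 2 + (sigma2 / i2) ** 2)
    return r, sigma_r

# Hypothetical intensities (arbitrary units), each known to about 7%:
r, s = ratio_with_uncertainty(1.00, 0.07, 0.80, 0.056)
```

Calibration factors common to both lines cancel in the ratio, which is why only the uncorrelated per-line uncertainties (about ±10% combined in this example) remain.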
Abstract:
An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from elicitation of knowledge about parameters to decision. It proposes an elicitation methodology where the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, results provided by these techniques, often in terms of probability intervals, may be too complex to interpret for a decision-maker and we, therefore, propose to compute a unique indicator of the likelihood of risk, called the confidence index. It explicitly accounts for the decision-maker’s attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
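As a toy illustration of the hybrid propagation idea, where Monte Carlo handles the aleatory part and interval analysis the epistemic part, the following sketch bounds an event probability. The model y = x + e, the epistemic interval [1.0, 1.5], and the threshold are all invented for the example:

```python
import random

def hybrid_propagation(n=10_000, threshold=3.0, seed=42):
    """Hybrid Monte Carlo / interval propagation (illustrative sketch).

    The aleatory input X ~ Normal(0, 1) is sampled; an epistemic
    parameter e is known only to lie in [1.0, 1.5].  Because y = x + e
    is monotone in e, evaluating at the interval endpoints bounds each
    realisation, yielding lower/upper probabilities for y > threshold.
    """
    rng = random.Random(seed)
    e_lo, e_hi = 1.0, 1.5          # epistemic interval (assumed)
    low = high = 0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)    # aleatory sample
        if x + e_lo > threshold:   # event holds even at the lower endpoint
            low += 1
        if x + e_hi > threshold:   # event possible at the upper endpoint
            high += 1
    return low / n, high / n       # [P_lower, P_upper] for the event

p_lo, p_hi = hybrid_propagation()
```

The gap between the two probabilities is exactly the ambiguity due to epistemic uncertainty that the confidence index is designed to summarise for the decision-maker.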
Abstract:
Correctly modelling and reasoning with uncertain information from heterogeneous sources in large-scale systems is critical when the reliability is unknown and we still want to derive adequate conclusions. To this end, context-dependent merging strategies have been proposed in the literature. In this paper we investigate how one such context-dependent merging strategy (originally defined for possibility theory), called largely partially maximal consistent subsets (LPMCS), can be adapted to Dempster-Shafer (DS) theory. We identify those measures for the degree of uncertainty and internal conflict that are available in DS theory and show how they can be used for guiding LPMCS merging. A simplified real-world power distribution scenario illustrates our framework. We also briefly discuss how our approach can be incorporated into a multi-agent programming language, thus leading to better plan selection and decision making.
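For readers unfamiliar with DS theory, a minimal implementation of Dempster's rule of combination shows where the conflict mass K, one of the uncertainty measures such context-dependent merging strategies can exploit, comes from. The sensor masses below are hypothetical:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset focal elements to mass.  Mass
    falling on empty intersections is the conflict K; it is renormalised
    away here, but a merging strategy such as LPMCS may instead use a
    high K as a signal to fall back to a more cautious operator.
    """
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}, conflict

# Two sensors reporting on a fault in {a, b, c} (hypothetical masses):
m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("a"): 0.5, frozenset("bc"): 0.5}
fused, k = dempster_combine(m1, m2)
```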
Abstract:
This paper presents a multi-agent system approach to address the difficulties encountered in traditional SCADA systems deployed in critical environments such as electrical power generation, transmission and distribution. The approach models uncertainty and combines multiple sources of uncertain information to deliver robust plan selection. We examine the approach in the context of a simplified power supply/demand scenario using a residential grid-connected solar system and consider the challenges of modelling and reasoning with uncertain sensor information in this environment. We discuss examples of plans and actions required for sensing, establish and discuss the effect of uncertainty on such systems, and investigate different uncertainty theories and how they can fuse uncertain information from multiple sources for effective decision making in such a complex system.
Abstract:
The BDI architecture, where agents are modelled based on their beliefs, desires and intentions, provides a practical approach to develop large scale systems. However, it is not well suited to model complex Supervisory Control And Data Acquisition (SCADA) systems pervaded by uncertainty. In this paper we address this issue by extending the operational semantics of Can(Plan) into Can(Plan)+. We start by modelling the beliefs of an agent as a set of epistemic states where each state, possibly using a different representation, models part of the agent's beliefs. These epistemic states are stratified to make them commensurable and to reason about the uncertain beliefs of the agent. The syntax and semantics of a BDI agent are extended accordingly and we identify fragments with computationally efficient semantics. Finally, we examine how primitive actions are affected by uncertainty and we define an appropriate form of lookahead planning.
Abstract:
The assimilation of discrete higher-fidelity data points with model predictions can be used to reduce the uncertainty of the model input parameters that generate accurate predictions. The problem investigated here involves the prediction of limit-cycle oscillations using a High-Dimensional Harmonic Balance (HDHB) method. The efficiency of the HDHB method is exploited to enable calibration of structural input parameters using a Bayesian inference technique. Markov-chain Monte Carlo is employed to sample the posterior distributions. Parameter estimation is carried out on both a pitch/plunge aerofoil and a Goland wing configuration. In both cases significant refinement was achieved in the distribution of possible structural parameters, allowing better predictions of their true deterministic values.
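The Markov-chain Monte Carlo step can be illustrated with a bare-bones random-walk Metropolis sampler. The "data" and the Gaussian likelihood below are invented stand-ins for the HDHB-based likelihood over structural parameters used in the paper:

```python
import math
import random

def metropolis(logpost, theta0, n=5000, step=0.5, seed=1):
    """Random-walk Metropolis sampler (a minimal stand-in for the
    MCMC machinery used to sample the posterior distributions)."""
    rng = random.Random(seed)
    theta, lp = theta0, logpost(theta0)
    samples = []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)   # symmetric proposal
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

# Hypothetical calibration: noisy observations y_i = theta + noise with
# a flat prior, so the posterior concentrates near the data mean.
data = [2.1, 1.9, 2.2, 2.0, 1.8]
def logpost(theta):
    # Gaussian likelihood with sigma = 0.1 (assumed for the example).
    return -0.5 * sum((y - theta) ** 2 for y in data) / 0.1 ** 2

samples = metropolis(logpost, theta0=0.0, n=20000)
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

Discarding the first 5000 samples as burn-in, the retained chain summarises the refined distribution of the parameter, mirroring the "significant refinement" reported above.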
Abstract:
Uncertainty profiles are used to study the effects of contention within cloud and service-based environments. An uncertainty profile provides a qualitative description of an environment whose quality of service (QoS) may fluctuate unpredictably. Uncertain environments are modelled by strategic games with two agents; a daemon is used to represent overload and high resource contention; an angel is used to represent an idealised resource allocation situation with no underlying contention. Assessments of uncertainty profiles are useful in two ways: firstly, they provide a broad understanding of how environmental stress can affect an application’s performance (and reliability); secondly, they allow the effects of introducing redundancy into a computation to be assessed.
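A minimal sketch of the angel-daemon idea: a 2x2 zero-sum game in which the angel picks rows (resource-allocation choices) and the daemon picks columns (stress patterns). The payoff entries are hypothetical performance scores:

```python
def value_2x2(m):
    """Value of a 2x2 zero-sum game (row player maximises).

    Checks for a pure saddle point first; otherwise uses the
    closed-form mixed-strategy value (a*d - b*c) / (a + d - b - c).
    """
    (a, b), (c, d) = m
    row_mins = [min(a, b), min(c, d)]
    col_maxs = [max(a, c), max(b, d)]
    if max(row_mins) == min(col_maxs):   # pure saddle point exists
        return max(row_mins)
    return (a * d - b * c) / (a + d - b - c)

# Hypothetical assessment: no saddle point, so both players mix and the
# game value sits between the best guaranteed pure outcomes.
v = value_2x2([[3.0, 1.0], [0.0, 2.0]])
```

The game value gives a single pessimism-aware performance figure for the profile, which is what makes such assessments comparable across configurations (e.g. with and without redundancy).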
Abstract:
A framework for assessing the robustness of long-duration repetitive orchestrations in uncertain evolving environments is proposed. The model assumes that service-based evaluation environments are stable over short time-frames only; over longer periods service-based environments evolve as demand fluctuates and contention for shared resources varies. The behaviour of a short-duration orchestration E in a stable environment is assessed by an uncertainty profile U and a corresponding zero-sum angel-daemon game Γ(U) [2]. Here the angel-daemon approach is extended to assess evolving environments by means of a subfamily of stochastic games. These games are called strategy oblivious because their transition probabilities are strategy independent. It is shown that the value of a strategy oblivious stochastic game is well defined and that it can be computed by solving a linear system. Finally, the proposed stochastic framework is used to assess the evolution of the Gabrmn IT system.
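The claim that the value of a strategy-oblivious stochastic game reduces to a linear system can be sketched directly: with strategy-independent transitions P, a discount factor gamma, and per-state stage values r (each the value of that state's one-shot angel-daemon game), the value vector satisfies (I - gamma*P)v = r. All numbers below are invented:

```python
def stationary_value(P, r, gamma=0.9):
    """Value of a strategy-oblivious stochastic game (illustrative).

    Since transitions are strategy independent, the per-state value v
    satisfies v = r + gamma * P v, i.e. (I - gamma*P) v = r, solved
    here by Gaussian elimination with partial pivoting.
    """
    n = len(r)
    # Augmented matrix for (I - gamma*P) v = r.
    A = [[(1.0 if i == j else 0.0) - gamma * P[i][j] for j in range(n)]
         + [r[i]] for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        for k in range(col + 1, n):
            f = A[k][col] / A[col][col]
            for j in range(col, n + 1):
                A[k][j] -= f * A[col][j]
    v = [0.0] * n
    for i in reversed(range(n)):              # back substitution
        v[i] = (A[i][n] - sum(A[i][j] * v[j]
                              for j in range(i + 1, n))) / A[i][i]
    return v

# Two environment states with hypothetical stage values and transitions:
P = [[0.7, 0.3], [0.4, 0.6]]
r = [1.0, 2.0]
v = stationary_value(P, r, gamma=0.9)
```

For gamma < 1 and stochastic P the matrix I - gamma*P is invertible, which is one way to see that the game value is well defined.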
Abstract:
The newly updated inventory of palaeoecological research in Latin America offers an important overview of sites available for multi-proxy and multi-site purposes. From the collected literature supporting this inventory, we collected all available age model metadata to create a chronological database of 5116 control points (e.g. 14C, tephra, fission track, OSL, 210Pb) from 1097 pollen records. Based on this literature review, we present a summary of chronological dating and reporting in the Neotropics. Difficulties and recommendations for chronology reporting are discussed. Furthermore, for 234 pollen records in northwest South America, a classification system for age uncertainties is implemented based on chronologies generated with updated calibration curves. With these outcomes, age models are produced for those sites without an existing chronology, alternative age models are provided for researchers interested in comparing the effects of different calibration curves and age–depth modelling software, and the importance of uncertainty assessments of chronologies is highlighted. Sample resolution and temporal uncertainty of ages are discussed for different time windows, focusing on events relevant for research on centennial- to millennial-scale climate variability. All age models and developed R scripts are publicly available through figshare, including a manual to use the scripts.
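As a deliberately simple stand-in for the age-depth modelling described above (real chronologies would use Bayesian or classical tools such as Bacon or clam), linear interpolation between dated control points already conveys how ages and their uncertainties are assigned to undated depths. The core data below are hypothetical:

```python
def age_at_depth(control_points, depth):
    """Linear-interpolation age-depth model with uncertainty bounds.

    control_points: (depth, age, age_error) tuples, e.g. calibrated
    14C or 210Pb dates.  Between two controls, both the age and its
    error are interpolated linearly -- a toy stand-in for the Bayesian
    age-depth software discussed in the text.
    """
    pts = sorted(control_points)
    for (d0, a0, e0), (d1, a1, e1) in zip(pts, pts[1:]):
        if d0 <= depth <= d1:
            t = (depth - d0) / (d1 - d0)
            return a0 + t * (a1 - a0), e0 + t * (e1 - e0)
    raise ValueError("depth outside the dated interval")

# Hypothetical core: depth (cm), calibrated age (cal yr BP), 1-sigma error.
controls = [(0.0, -60.0, 5.0), (50.0, 1200.0, 40.0), (120.0, 4300.0, 90.0)]
age, err = age_at_depth(controls, 85.0)
```

The interpolated error term is exactly the kind of per-sample temporal uncertainty that matters when aligning records to centennial-scale events.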
Abstract:
In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in the energy market prices is the main cause of high volatility of profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. Also, an appropriate risk measure is considered. The proposed approach is applied to a realistic case study, based on a wind farm in Portugal. Finally, conclusions are duly drawn.
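A toy version of the scenario-based decision problem: each scenario is a plausible realization with an occurrence probability, and the producer chooses the offer that maximises expected profit. The prices, wind outputs, and imbalance penalty below are all invented:

```python
from itertools import product

def best_offer(price_scen, wind_scen, penalty=1.5, offers=None):
    """Scenario-based offer selection (illustrative sketch).

    Each (price, wind) pair forms one scenario; profit is revenue on
    delivered energy minus a penalty on any shortfall against the
    offer.  A grid search over candidate offers maximises expected
    profit -- a tiny stand-in for the stochastic program in the paper.
    """
    offers = offers or [w for _, w in wind_scen]
    def expected_profit(q):
        total = 0.0
        for (pp, price), (pw, wind) in product(price_scen, wind_scen):
            delivered = min(q, wind)
            shortfall = max(0.0, q - wind)
            total += pp * pw * (price * delivered
                                - penalty * price * shortfall)
        return total
    return max(offers, key=expected_profit)

# Hypothetical scenarios: (probability, value) pairs.
prices = [(0.5, 50.0), (0.5, 70.0)]   # EUR/MWh
winds  = [(0.6, 10.0), (0.4, 6.0)]    # MWh produced
q_star = best_offer(prices, winds, penalty=2.0)
```

With a stiff imbalance penalty the optimal offer hedges toward the low-wind scenario, illustrating how the risk attitude enters the decision.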
Abstract:
In life cycle impact assessment (LCIA) models, the sorption of the ionic fraction of dissociating organic chemicals is not adequately modeled because conventional non-polar partitioning models are applied. Therefore, high uncertainties are expected when modeling the mobility, as well as the bioavailability for uptake by exposed biota and degradation, of dissociating organic chemicals. Alternative regressions that account for the ionized fraction of a molecule to estimate fate parameters were applied to the USEtox model. The most sensitive model parameters in the estimation of ecotoxicological characterization factors (CFs) of micropollutants were evaluated by Monte Carlo analysis in both the default USEtox model and the alternative approach. Negligible differences in CF values and 95% confidence limits between the two approaches were estimated for direct emissions to the freshwater compartment; however, the default USEtox model overestimates the CFs and 95% confidence limits of basic compounds by up to three and four orders of magnitude, respectively, relative to the alternative approach for emissions to the agricultural soil compartment. For three emission scenarios, LCIA results show that the default USEtox model overestimates freshwater ecotoxicity impacts for the emission scenarios to agricultural soil by one order of magnitude, and larger confidence limits were estimated, relative to the alternative approach.
Abstract:
The problem of uncertainty propagation in composite laminate structures is studied. An approach based on the optimal design of composite structures to achieve a target reliability level is proposed. Using the Uniform Design Method (UDM), a set of design points is generated over a design domain centred at mean values of random variables, aimed at studying the space variability. The most critical Tsai number, the structural reliability index and the sensitivities are obtained for each UDM design point, using the maximum load obtained from optimal design search. Using the UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on supervised evolutionary learning. Finally, using the developed ANN a Monte Carlo simulation procedure is implemented and the variability of the structural response based on global sensitivity analysis (GSA) is studied. The GSA is based on the first order Sobol indices and relative sensitivities. An appropriate GSA algorithm aiming to obtain Sobol indices is proposed. The most important sources of uncertainty are identified.
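A sketch of how first-order Sobol indices can be estimated by Monte Carlo with a pick-freeze scheme, here on a simple additive test function rather than the ANN surrogate used in the paper; uniform inputs on [0, 1] are assumed:

```python
import random

def sobol_first_order(f, n_vars, n=20000, seed=7):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices.

    For input i, a second evaluation reuses coordinate i from sample A
    and redraws the others from an independent sample B; then
    S_i ~= (E[Y_A * Y_AB] - E[Y_A]^2) / Var(Y_A).
    """
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_vars)] for _ in range(n)]
    B = [[rng.random() for _ in range(n_vars)] for _ in range(n)]
    yA = [f(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(n_vars):
        # Freeze coordinate i from A, redraw the rest from B.
        yAB = [f(B[k][:i] + [A[k][i]] + B[k][i + 1:]) for k in range(n)]
        s_i = (sum(yA[k] * yAB[k] for k in range(n)) / n - mean ** 2) / var
        indices.append(s_i)
    return indices

# Additive test model y = x0 + 0.5*x1: analytically S0 = 0.8, S1 = 0.2.
S = sobol_first_order(lambda x: x[0] + 0.5 * x[1], n_vars=2)
```

Ranking inputs by S_i is what identifies "the most important sources of uncertainty" in a global sensitivity analysis of this kind.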