983 results for Probabilistic analysis


Relevance: 30.00%

Publisher:

Abstract:

Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright © 2003 Royal Meteorological Society.
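A minimal sketch of the reliability dimension described above, using hypothetical forecast and realization arrays (numpy; not the study's data or code): the slope of observed relative frequencies against nominal probability levels should be close to 1 for a reliable probabilistic forecast.

```python
# Minimal sketch (not the study's code): slope of a reliability plot.
# For each probability level p, count how often the realization fell at or
# below the forecast p-quantile; a reliable forecast gives a slope near 1.
import numpy as np

def reliability_slope(forecast_quantiles, realizations, levels):
    """forecast_quantiles: (n_events, n_levels) forecast quantiles at `levels`;
    realizations: (n_events,) observed outcomes."""
    freqs = [(realizations <= forecast_quantiles[:, j]).mean()
             for j in range(len(levels))]
    slope, _intercept = np.polyfit(levels, freqs, 1)   # least-squares fit
    return slope

# Hypothetical example: 200 events, forecast distributions from 500 analogue draws.
rng = np.random.default_rng(0)
levels = np.linspace(0.1, 0.9, 9)
realizations = rng.normal(size=200)
forecast_quantiles = np.quantile(rng.normal(size=(200, 500)), levels, axis=1).T
print(reliability_slope(forecast_quantiles, realizations, levels))
```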

Relevance: 30.00%

Publisher:

Abstract:

This journal provides immediate open access to its content on the principle that making research freely available to the public supports a greater global exchange of knowledge.

Relevance: 30.00%

Publisher:

Abstract:

International Scientific Forum, ISF 2013, 12-14 December 2013, Tirana.

Relevance: 30.00%

Publisher:

Abstract:

3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.

Relevance: 30.00%

Publisher:

Abstract:

3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.

Relevance: 30.00%

Publisher:

Abstract:

Modern real-time systems, with a more flexible and adaptive nature, demand approaches for timeliness evaluation based on probabilistic measures of meeting deadlines. In this context, simulation can emerge as an adequate solution to understand and analyze the timing behaviour of actual systems. However, care must be taken with the obtained outputs; otherwise the results may lack credibility. It is particularly important to note that we are more interested in values from the tail of a probability distribution (near worst-case probabilities) than in deriving confidence on mean values. We approach this subject by considering the random nature of simulation output data. We start by discussing well-known approaches for estimating distributions from simulation output, and the confidence which can be placed in their mean values. This is the basis for a discussion on the applicability of such approaches to derive confidence in the tail of distributions, where the worst case is expected to lie.
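As an illustration of the tail-focused view argued for above, the following sketch (an assumption-laden example, not the paper's method) derives a distribution-free confidence interval for a high quantile of simulated response times from order statistics, rather than a confidence interval on the mean.

```python
# Minimal sketch: an approximate, distribution-free confidence interval for a
# tail quantile of simulated response times, based on order statistics and the
# binomial distribution. This targets the tail (near worst-case), not the mean.
import numpy as np
from scipy.stats import binom

def quantile_ci(samples, q=0.99, conf=0.95):
    x = np.sort(samples)
    n = len(x)
    # Approximate order-statistic indices bracketing the true q-quantile.
    lo = int(binom.ppf((1 - conf) / 2, n, q))
    hi = int(binom.ppf(1 - (1 - conf) / 2, n, q))
    lo, hi = max(lo, 0), min(hi, n - 1)
    return x[lo], x[hi]

# Hypothetical output of 10,000 independent simulation replications.
rng = np.random.default_rng(1)
response_times = rng.gamma(shape=2.0, scale=1.5, size=10_000)
print(quantile_ci(response_times, q=0.99, conf=0.95))
```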

Relevance: 30.00%

Publisher:

Abstract:

Real-time scheduling usually considers worst-case values for the parameters of task (or message stream) sets, in order to provide safe schedulability tests for hard real-time systems. However, worst-case conditions introduce a level of pessimism that is often inadequate for a certain class of (soft) real-time systems. In this paper we provide an approach for computing the stochastic response time of tasks whose inter-arrival times are described by discrete probability distribution functions, instead of minimum inter-arrival time (MIT) values.
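One building block of such stochastic response-time analysis is the convolution of discrete distributions; the sketch below (hypothetical PMFs, not the paper's algorithm) shows how two discrete PMFs combine into the distribution of their sum, from which exceedance probabilities can be read.

```python
# Minimal building-block sketch (not the paper's full analysis): convolving
# discrete probability mass functions, as used to combine per-job
# execution/inter-arrival distributions into a distribution of accumulated demand.
from collections import defaultdict

def convolve(pmf_a, pmf_b):
    """pmf_a, pmf_b: dicts mapping value -> probability. Returns the PMF of the sum."""
    out = defaultdict(float)
    for va, pa in pmf_a.items():
        for vb, pb in pmf_b.items():
            out[va + vb] += pa * pb
    return dict(out)

# Hypothetical PMFs: a job's execution time and the interference from a
# higher-priority job (time units are arbitrary).
exec_time = {2: 0.7, 4: 0.3}
interference = {0: 0.5, 3: 0.5}
demand = convolve(exec_time, interference)
print(demand)                                       # {2: 0.35, 5: 0.35, 4: 0.15, 7: 0.15}
print(sum(p for v, p in demand.items() if v > 5))   # P(demand exceeds 5) = 0.15
```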

Relevance: 30.00%

Publisher:

Abstract:

Most distributed generation and smart grid research works are dedicated to studies of network operation parameters, reliability, etc. However, many of these works rely on traditional test systems, for instance the IEEE test systems. This paper proposes voltage magnitude and reliability studies in the presence of fault conditions, considering realistic conditions found in countries like Brazil. The methodology considers a hybrid method of fuzzy sets and Monte Carlo simulation, based on fuzzy-probabilistic models, and a remedial action algorithm based on optimal power flow. To illustrate the application of the proposed method, the paper includes a case study that considers a real 12-bus sub-transmission network.
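The hybrid fuzzy/Monte Carlo idea can be illustrated with a deliberately tiny example (a hypothetical two-feeder system, not the 12-bus case study): line outages are sampled probabilistically while the load is a fuzzy triangular number evaluated at an alpha-cut, yielding interval bounds on an expected-curtailment index.

```python
# Minimal sketch of the hybrid fuzzy-probabilistic idea (hypothetical numbers,
# no optimal power flow): Monte Carlo sampling of feeder outages combined with
# an alpha-cut of a fuzzy (triangular) load.
import numpy as np

rng = np.random.default_rng(2)
n_trials = 50_000
line_fail_prob = np.array([0.02, 0.05])      # outage probability of two feeders
feeder_capacity = np.array([60.0, 50.0])     # MW

def expected_curtailment(load_mw):
    up = rng.random((n_trials, 2)) > line_fail_prob   # True if feeder available
    available = (up * feeder_capacity).sum(axis=1)
    return np.maximum(load_mw - available, 0.0).mean()

# Fuzzy triangular load (70, 80, 90) MW evaluated at the alpha = 0.5 cut;
# the curtailment index is monotone in load, so the endpoints bound the result.
alpha, low, mode, high = 0.5, 70.0, 80.0, 90.0
load_interval = (low + alpha * (mode - low), high - alpha * (high - mode))
print([expected_curtailment(l) for l in load_interval])  # [lower, upper] bound
```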

Relevance: 30.00%

Publisher:

Abstract:

In developed countries, civil infrastructures are one of the most significant investments of governments, corporations, and individuals. Among these, transportation infrastructures, including highways, bridges, airports, and ports, are of huge importance, both economic and social. Most developed countries have built a fairly complete network of highways to fit their needs. As a result, the required investment in building new highways has diminished during the last decade, and should be further reduced in the following years. On the other hand, significant structural deterioration has been detected in transportation networks, and a huge investment is necessary to keep these infrastructures safe and serviceable. Due to the significant importance of bridges in the serviceability of highway networks, maintenance of these structures plays a major role. In this paper, recent progress in probabilistic maintenance and optimization strategies for deteriorating civil infrastructures, with emphasis on bridges, is summarized. A novel model including interaction between structural safety analysis, through the safety index, and visual inspections and non-destructive tests, through the condition index, is presented. Single-objective optimization techniques leading to maintenance strategies associated with minimum expected cumulative cost and acceptable levels of condition and safety are presented. Furthermore, multi-objective optimization is used to simultaneously consider several performance indicators such as safety, condition, and cumulative cost. Realistic examples of the application of some of these techniques and strategies are also presented.
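A minimal sketch of the single-objective strategy selection described above, with purely assumed deterioration, restoration, and cost figures (and deterministic deterioration in place of the probabilistic model, for brevity): enumerate candidate intervention plans, discard those that let the safety index drop below the target, and keep the cheapest discounted plan.

```python
# Minimal sketch (hypothetical figures, not the paper's model): choose the
# maintenance plan with minimum discounted cumulative cost subject to keeping
# the safety index above a minimum acceptable level over the planning horizon.
import itertools

horizon = 30                      # years
deterioration = 0.08              # assumed safety-index loss per year
restore = 1.5                     # assumed safety-index gain per intervention
beta0, beta_min = 4.0, 2.5        # initial and minimum acceptable safety index
cost_action, discount = 100.0, 0.04

def plan_cost(plan):
    beta, cost = beta0, 0.0
    for year in range(1, horizon + 1):
        beta -= deterioration
        if year in plan:
            beta += restore
            cost += cost_action / (1 + discount) ** year
        if beta < beta_min:
            return None           # violates the safety constraint
    return cost

feasible = []
for k in range(4):                                     # 0 to 3 interventions
    for plan in itertools.combinations((5, 10, 15, 20, 25), k):
        cost = plan_cost(plan)
        if cost is not None:
            feasible.append((cost, plan))
print(min(feasible))              # cheapest plan meeting the safety target
```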

Relevance: 30.00%

Publisher:

Abstract:

Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: High-grade gliomas are aggressive, incurable tumors characterized by extensive diffuse invasion of the normal brain parenchyma. Novel therapies at best prolong survival; their costs are formidable and benefit is marginal. Economic restrictions thus require knowledge of the cost-effectiveness of treatments. Here, we show the cost-effectiveness of enhanced resections in malignant glioma surgery using a well-characterized tool for intraoperative tumor visualization, 5-aminolevulinic acid (5-ALA). OBJECTIVE: To evaluate the cost-effectiveness of 5-ALA fluorescence-guided neurosurgery compared with white-light surgery in adult patients with newly diagnosed high-grade glioma, adopting the perspective of the Portuguese National Health Service. METHODS: We used a Markov model (cohort simulation). Transition probabilities were estimated with the use of data from 1 randomized clinical trial and 1 noninterventional prospective study. Utility values and resource use were obtained from published literature and expert opinion. Unit costs were taken from official Portuguese reimbursement lists (2012 values). The health outcomes considered were quality-adjusted life-years, life-years, and progression-free life-years. Extensive 1-way and probabilistic sensitivity analyses were performed. RESULTS: The incremental cost-effectiveness ratios are below €10 000 in all evaluated outcomes, being around €9100 per quality-adjusted life-year gained, €6700 per life-year gained, and €8800 per progression-free life-year gained. The probability of 5-ALA fluorescence-guided surgery cost-effectiveness at a threshold of €20 000 is 96.0% for quality-adjusted life-year, 99.6% for life-year, and 98.8% for progression-free life-year. CONCLUSION: 5-ALA fluorescence-guided surgery appears to be cost-effective in newly diagnosed high-grade gliomas compared with white-light surgery. This example demonstrates cost-effectiveness analyses for malignant glioma surgery to be feasible on the basis of existing data.
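A minimal sketch of a Markov cohort simulation and ICER calculation of this kind, using placeholder transition probabilities, utilities, and costs (deliberately not the study's estimates):

```python
# Minimal sketch of a Markov cohort model and ICER computation. All numbers
# below are placeholders for illustration, NOT the study's estimates.
import numpy as np

def run_cohort(P, utilities, cycle_costs, cycles=120):
    """P: row-stochastic transition matrix over (stable, progression, death);
    returns (total cost, QALYs) per patient for monthly cycles."""
    state = np.array([1.0, 0.0, 0.0])        # cohort starts progression-free
    cost = qaly = 0.0
    for _ in range(cycles):
        cost += state @ cycle_costs
        qaly += (state @ utilities) / 12.0    # utilities are per year
        state = state @ P
    return cost, qaly

# Hypothetical inputs: the intervention slows progression but costs more per cycle.
P_std = np.array([[0.90, 0.07, 0.03], [0.00, 0.85, 0.15], [0.0, 0.0, 1.0]])
P_new = np.array([[0.93, 0.05, 0.02], [0.00, 0.85, 0.15], [0.0, 0.0, 1.0]])
util = np.array([0.80, 0.55, 0.0])
c_std, c_new = np.array([500.0, 900.0, 0.0]), np.array([650.0, 900.0, 0.0])

cost0, qaly0 = run_cohort(P_std, util, c_std)
cost1, qaly1 = run_cohort(P_new, util, c_new)
print("ICER (EUR per QALY):", (cost1 - cost0) / (qaly1 - qaly0))
```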

Relevance: 30.00%

Publisher:

Abstract:

This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Specially designed algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach in order to cope with this problem and provide functionality in real time.
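The combination can be illustrated on a toy constraint (not the paper's solver): branch-and-prune classifies interval boxes as consistent, inconsistent, or boundary, and Monte Carlo sampling inside the boundary boxes estimates the remaining probability mass under a uniform prior.

```python
# Minimal sketch of the idea on a toy constraint x^2 + y^2 <= 1 over the box
# [0,2] x [0,2]: branch-and-prune on interval boxes, then Monte Carlo
# integration inside the undecided (boundary) boxes. True probability ~ pi/16.
import random

def prune(box, eps=1e-2, inside=None, boundary=None):
    if inside is None:
        inside, boundary = [], []
    (xl, xu), (yl, yu) = box
    lo, hi = xl**2 + yl**2, xu**2 + yu**2        # interval bounds (x, y >= 0)
    if lo > 1.0:
        return inside, boundary                   # provably inconsistent: discard
    if hi <= 1.0:
        inside.append(box)                        # provably consistent: keep whole box
    elif max(xu - xl, yu - yl) < eps:
        boundary.append(box)                      # undecided and small enough
    else:                                         # branch on the widest variable
        if xu - xl >= yu - yl:
            m = (xl + xu) / 2
            prune(((xl, m), (yl, yu)), eps, inside, boundary)
            prune(((m, xu), (yl, yu)), eps, inside, boundary)
        else:
            m = (yl + yu) / 2
            prune(((xl, xu), (yl, m)), eps, inside, boundary)
            prune(((xl, xu), (m, yu)), eps, inside, boundary)
    return inside, boundary

def area(box):
    (xl, xu), (yl, yu) = box
    return (xu - xl) * (yu - yl)

inside, boundary = prune(((0.0, 2.0), (0.0, 2.0)))
mc = sum(area(b) * sum(random.uniform(*b[0])**2 + random.uniform(*b[1])**2 <= 1.0
                       for _ in range(200)) / 200 for b in boundary)
print((sum(map(area, inside)) + mc) / 4.0)        # ~ pi/16 = 0.196 (domain area 4)
```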

Relevance: 30.00%

Publisher:

Abstract:

The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests, so as to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that allows information from different sources to be combined and used for inference in a decision process. In this scope, the present work presents the implementation of a probabilistic framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework allowing the structural assessment of existing timber elements, with the possibility of inference and updating of their mechanical properties through Bayesian methods. The framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
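The Bayesian updating step can be sketched with a conjugate normal model and illustrative numbers (not the case-study data): a prior for a member's mean modulus of elasticity, e.g. from the assigned strength class, is updated with indirect estimates derived from NDT readings.

```python
# Minimal sketch of the Bayesian-updating step (illustrative numbers only):
# conjugate normal updating of a timber member's mean modulus of elasticity
# (MOE) using indirect estimates obtained from NDT-to-MOE correlations.
import numpy as np

mu0, sigma0 = 10.0, 2.0      # prior mean and sd of the member's mean MOE (GPa)
sigma_m = 1.5                # scatter of the NDT-based indirect estimates (GPa)
x = np.array([11.2, 10.6, 11.8, 10.9])   # hypothetical NDT-derived MOE estimates

n = len(x)
post_var = 1.0 / (1.0 / sigma0**2 + n / sigma_m**2)
post_mean = post_var * (mu0 / sigma0**2 + x.sum() / sigma_m**2)
print(f"posterior MOE: {post_mean:.2f} GPa (sd {post_var**0.5:.2f})")
# The posterior then feeds the structural verification (e.g. a reliability check).
```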

Relevance: 30.00%

Publisher:

Abstract:

A novel framework for the probabilistic structural assessment of existing structures, which combines model identification and reliability assessment procedures and considers different sources of uncertainty in an objective way, is presented in this paper. A short description of structural assessment applications provided in the literature is initially given. Then, the developed model identification procedure, supported by a robust optimization algorithm, is presented. Special attention is given to both experimental and numerical errors, which are considered in the algorithm's convergence criterion. An updated numerical model is obtained from this process. The reliability assessment procedure, which considers a probabilistic model for the structure under analysis, is then introduced, incorporating the results of the model identification procedure. The developed model is then updated, as new data are acquired, through a Bayesian inference algorithm that explicitly addresses statistical uncertainty. Finally, the developed framework is validated with a set of reinforced concrete beams, which were loaded up to failure in the laboratory.
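The reliability assessment step can be sketched as a Monte Carlo estimate of the failure probability for a simple limit state g = R − S, with hypothetical distributions standing in for the updated probabilistic model (these are not the beam-test results):

```python
# Minimal sketch (hypothetical variables, not the beam tests above): Monte Carlo
# estimate of the failure probability and reliability index for g = R - S,
# where R is the (updated) resistance and S the load effect.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 1_000_000
R = rng.lognormal(mean=np.log(180.0), sigma=0.10, size=n)   # resistance (kN.m)
S = rng.normal(loc=110.0, scale=15.0, size=n)                # load effect (kN.m)

pf = np.mean(R - S <= 0.0)
beta = -norm.ppf(pf) if pf > 0 else float("inf")
print(f"P(failure) ~ {pf:.2e}, reliability index beta ~ {beta:.2f}")
```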

Relevance: 30.00%

Publisher:

Abstract:

The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM standards 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repeated analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
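For context on the interpretative framework mentioned above, a minimal score-based likelihood-ratio illustration follows, with entirely hypothetical score distributions (this is not part of the calibration study itself):

```python
# Minimal illustration of the likelihood-ratio idea: LR = p(similarity score |
# same ink) / p(similarity score | different inks), with hypothetical score
# distributions that would, in practice, be estimated from an ink reference database.
from scipy.stats import norm

same_ink = norm(loc=0.92, scale=0.04)    # scores between profiles of the same ink
diff_ink = norm(loc=0.70, scale=0.10)    # scores between profiles of different inks

observed_score = 0.88
lr = same_ink.pdf(observed_score) / diff_ink.pdf(observed_score)
print(f"LR = {lr:.1f}")   # LR > 1 supports the same-source proposition
```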