996 results for probabilistic risk


Relevance: 30.00%

Abstract:

In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs, but again at the expense of increasing total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors or by failure probability constraints, but depends on the actual structural configuration. (c) 2011 Elsevier Ltd. All rights reserved.
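
As a concrete illustration of the risk optimization objective described above, the sketch below minimizes a total expected cost (manufacturing cost plus failure probability times failure cost) over a single design variable. The load distribution, resistance model and cost figures are invented for illustration and are not taken from the paper.

```python
# Minimal sketch of a risk optimization (RO) objective, with assumed numbers.
from scipy.optimize import minimize_scalar
from scipy.stats import norm

C_FAIL = 1.0e6        # assumed monetary consequence of failure
UNIT_COST = 50.0      # assumed manufacturing cost per unit of design variable d
LOAD_MEAN, LOAD_STD = 100.0, 20.0   # assumed normally distributed load
RESIST_PER_UNIT = 1.2               # assumed deterministic resistance per unit of d

def failure_probability(d):
    # P[load exceeds resistance(d)] under the normal load model
    return norm.sf(RESIST_PER_UNIT * d, loc=LOAD_MEAN, scale=LOAD_STD)

def total_expected_cost(d):
    # RO objective: manufacturing cost plus expected cost of failure
    return UNIT_COST * d + C_FAIL * failure_probability(d)

ro = minimize_scalar(total_expected_cost, bounds=(50.0, 300.0), method="bounded")
print(f"RO optimum d = {ro.x:.1f}, failure probability = {failure_probability(ro.x):.2e}")
```

An RBDO-style run would instead minimize the manufacturing cost subject to failure_probability(d) not exceeding a prescribed target, which is exactly the kind of constraint the paper shows is not equivalent to the RO optimum in general.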

Relevance: 30.00%

Abstract:

This work focuses on the study of saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly aleatory behaviour, which should be considered for optimal management of the territory and its water resources. Given the uncertainty of the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological problem parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique, based on Polynomial Chaos Expansion, that provides an accurate description of the model response without a large computational burden. When the assumptions of classical analytical models are not respected, which often occurs in applications to real case studies, as in the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
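
A minimal sketch of the surrogate idea mentioned above: a Polynomial Chaos Expansion fitted by least squares to a toy interface-position model with two uncertain inputs. The response function, input distributions and polynomial degree are assumptions made for illustration, not the model used in this work.

```python
# Minimal PCE surrogate sketch (illustrative only), probabilists' Hermite basis.
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)

def toe_position(k, w):
    # hypothetical response: seawater toe position versus conductivity k and recharge w
    return 200.0 / (1.0 + np.exp(-(0.8 * k - 0.5 * w)))

# standard-normal "germ" variables mapped to assumed lognormal physical inputs
xi = rng.standard_normal((500, 2))
y = toe_position(np.exp(0.3 * xi[:, 0]), np.exp(0.2 * xi[:, 1]))

def basis(xi):
    # tensor-product Hermite polynomials up to total degree 2
    cols = []
    for i in range(3):
        for j in range(3 - i):
            ci = np.zeros(i + 1); ci[-1] = 1.0
            cj = np.zeros(j + 1); cj[-1] = 1.0
            cols.append(hermeval(xi[:, 0], ci) * hermeval(xi[:, 1], cj))
    return np.column_stack(cols)

coeffs, *_ = np.linalg.lstsq(basis(xi), y, rcond=None)
print("PCE coefficients:", np.round(coeffs, 3))
print("surrogate mean toe position:", round(float(coeffs[0]), 2))
```

Once fitted, the cheap surrogate can be evaluated thousands of times to estimate sensitivity indices without rerunning the full model.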

Relevance: 30.00%

Abstract:

This work aims to evaluate the reliability of river levee systems by calculating the probability of failure of individual levee stretches under different loads, using probabilistic methods based on fragility curves obtained through Monte Carlo simulation. Overtopping and piping are considered as failure mechanisms (since these are the most frequent), and the major levee system of the Po River is analysed, with a primary focus on the section between Piacenza and Cremona, in the lower-middle area of the Padana Plain. The novelty of this approach is to check the reliability of individual embankment stretches, not just a single cross-section, while taking into account the variability of the levee system geometry from one stretch to another. For each levee stretch analysed, this work also considers a probability distribution of the load variables involved in the definition of the fragility curves, which reflects the differences in the topography and morphology of the riverbed along the analysed reach of the levee system. A classification is proposed, for both failure mechanisms, to give an indication of the reliability of the levee system based on the information obtained from the fragility curve analysis. To accomplish this, a hydraulic model has been developed in which a 500-year flood is simulated to determine the residual hazard of failure for each levee stretch at the corresponding water depth, and the results are then compared with the proposed classifications. This work also aims to act as an interface between Applied Geology and Environmental Hydraulic Engineering, two fields whose close collaboration is needed to improve the estimation of hydraulic risk.
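
As a minimal illustration of the fragility-curve idea, the sketch below estimates by Monte Carlo the conditional probability of overtopping of a single levee stretch as a function of the river water level. The crest-elevation distribution is invented for illustration and is not taken from the Po River data.

```python
# Minimal Monte Carlo fragility curve for the overtopping mechanism (assumed numbers).
import numpy as np

rng = np.random.default_rng(1)
N = 20_000
crest_elev = rng.normal(10.0, 0.3, N)   # assumed uncertain crest elevation [m a.s.l.]

water_levels = np.linspace(8.0, 11.0, 7)
for level in water_levels:
    pf = (level > crest_elev).mean()    # fraction of samples where the level exceeds the crest
    print(f"h = {level:5.2f} m -> P(overtopping) = {pf:.3f}")
```

A piping fragility curve is built the same way, with the failure condition replaced by a piping criterion (for example a critical hydraulic gradient) evaluated on the sampled geometry.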

Relevance: 30.00%

Abstract:

The selection of metrics for ecosystem restoration programs is critical for improving the quality of monitoring programs and characterizing project success. Moreover, it is often very difficult to balance the importance of multiple ecological, social, and economic metrics. The metric selection process is complex and must simultaneously take into account monitoring data, environmental models, socio-economic considerations, and stakeholder interests. We propose multicriteria decision analysis (MCDA) methods, broadly defined, for the selection of optimal sets of metrics to enhance the evaluation of ecosystem restoration alternatives. Two MCDA methods, multiattribute utility analysis (MAUT) and probabilistic multicriteria acceptability analysis (ProMAA), are applied and compared for a hypothetical case study of a river restoration involving multiple stakeholders. Overall, MCDA results in a systematic, unbiased, and transparent solution informing the evaluation of restoration alternatives. The two methods provide comparable results in terms of selected metrics. However, because ProMAA can consider probability distributions for the weights and for the utility values of metrics under each criterion, it is suggested as the best option when data uncertainty is high. Despite the increase in complexity of the metric selection process, MCDA improves upon the current ad-hoc decision practice based on consultations with stakeholders and experts, and encourages quantitative aggregation of data and judgement, increasing the transparency of decision making in restoration projects. We believe that MCDA can enhance the overall sustainability of ecosystem restoration by serving both ecological and societal needs.
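
A minimal sketch of the additive aggregation behind a MAUT-style ranking: weights and normalized utilities are combined linearly to score candidate metric sets. The weights and scores below are invented for illustration and are not those of the case study.

```python
# Minimal additive multiattribute utility aggregation (assumed weights and utilities).
import numpy as np

criteria_weights = np.array([0.5, 0.3, 0.2])     # ecological, social, economic
# rows: candidate metric sets A, B, C; columns: normalized utility per criterion (0..1)
utilities = np.array([
    [0.9, 0.4, 0.6],
    [0.7, 0.8, 0.5],
    [0.5, 0.6, 0.9],
])
scores = utilities @ criteria_weights
print("overall utilities:", scores, "-> preferred set:", "ABC"[int(scores.argmax())])
```

ProMAA, as described above, replaces the fixed weights and utilities with probability distributions and reports acceptability indices instead of a single deterministic ranking.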

Relevance: 30.00%

Abstract:

Probabilistic climate data have become available for the first time through the UK Climate Projections 2009, so that the risk of change in tree growth can be quantified. We assess drought risk spatially and temporally using drought probabilities and tree species vulnerabilities across Britain. We assessed the drought impact on the potential yield class of three major tree species (Picea sitchensis, Pinus sylvestris, and Quercus robur), which presently cover around 59% (400,700 ha) of state-managed forests, across lowland and upland sites. Here we show that drought impacts result mostly in reduced tree growth over the next 80 years under the B1, A1B and A1FI IPCC emissions scenarios. We found a maximum reduction of 94% but also a maximum increase of 56% in potential stand yield class in the 2080s relative to the baseline climate (1961-1990). Furthermore, potential production over the national forest estate for all three species in the 2080s may decrease due to drought by 42% in the lowlands and 32% in the uplands in comparison to the baseline climate. Our results reveal that potential tree growth and forest production on the national forest estate in Britain are likely to decline, and indicate where and when adaptation measures are required. Moreover, this paper demonstrates the value of probabilistic climate projections for an important economic and environmental sector.
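
The value of probabilistic projections highlighted above lies in turning a single growth estimate into a probability statement. The sketch below, with an entirely invented ensemble, shows the kind of calculation involved: the chance that the potential yield class falls below its baseline value under one scenario.

```python
# Minimal sketch (made-up ensemble): probability of yield class decline by the 2080s.
import numpy as np

rng = np.random.default_rng(3)
baseline_yield_class = 14.0                        # assumed baseline yield class [m3/ha/yr]
projected = rng.normal(loc=11.0, scale=3.0, size=10_000)   # hypothetical probabilistic projection

p_decline = (projected < baseline_yield_class).mean()
median_change = np.median(projected) - baseline_yield_class
print(f"P(decline) = {p_decline:.2f}, median change = {median_change:+.1f} m3/ha/yr")
```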

Relevance: 30.00%

Abstract:

A Probabilistic Safety Assessment (PSA) is being developed for a steam-methane reforming hydrogen production plant linked to a High-Temperature Gas-Cooled Nuclear Reactor (HTGR). This work is based on the Japan Atomic Energy Research Institute's (JAERI) High Temperature Test Reactor (HTTR) prototype in Japan. This study has two major objectives: to calculate the risk to onsite and offsite individuals, and to calculate the frequency of different types of damage to the complex. A simplified HAZOP study was performed to identify initiating events, based on existing studies. The initiating events presented here are a methane pipe break, a helium pipe break, and a PPWC heat exchanger pipe break. Generic data were used for the fault tree analysis and the initiating event frequencies. SAPHIRE was used for the PSA analysis. The results show that the average frequency of an accident at this complex is 2.5E-06, divided among the various end states. The dominant sequences result in graphite oxidation, which does not pose a health risk to the population. The dominant sequences that could affect the population are those that result in a methane explosion and occur at 6.6E-8/year, while the other sequences are much less frequent. The health risk arises only if there are people in the vicinity who could be affected by the explosion. This analysis also demonstrates that an accident in one of the plants has little effect on the other. This is true given the design-basis distance between the plants, the fact that the reactor is underground, as well as other safety characteristics of the HTGR. Sensitivity studies are being performed in order to determine where additional and improved data are needed.
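
The sequence frequencies quoted above come from the usual PSA bookkeeping: an initiating-event frequency is multiplied down an event tree by conditional failure probabilities taken from fault trees. The sketch below shows that arithmetic with invented numbers, not the study's data.

```python
# Minimal event-tree quantification sketch (illustrative numbers only).
methane_pipe_break = 1.0e-3        # assumed initiating-event frequency [1/yr]
p_isolation_fails = 1.0e-2         # assumed conditional probability from a fault tree
p_ignition_given_release = 5.0e-3  # assumed conditional probability of ignition

f_release = methane_pipe_break * p_isolation_fails
f_explosion = f_release * p_ignition_given_release
print(f"release frequency   : {f_release:.2e} /yr")
print(f"explosion frequency : {f_explosion:.2e} /yr")
```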

Relevance: 30.00%

Abstract:

The design of a nuclear power plant has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent and to limit the consequences of any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, and it incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular the accidents, that are considered plausible have been taken into account, and that the monitoring systems and the engineered safety and safeguard systems will be capable of ensuring the safety goals. Probabilistic safety analysis, on the other hand, tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective in the comprehensive assessment of the measures needed to prevent accidents with small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) has demanded a more extensive use of risk assessment techniques, with a significant need to further extend the scope and quality of PSA. This is where the Theory of Stimulated Dynamics (TSD) intervenes, as it is the mathematical foundation of the Integrated Safety Assessment (ISA) methodology developed by the Modelling and Simulation (MOSI) branch of the CSN (Consejo de Seguridad Nuclear). This methodology attempts to extend classical PSA by including accident dynamic analysis, an assessment of the damage associated with the transients, and a computation of the damage frequency. The application of the ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS supports accident dynamic analysis through simulation of nuclear accident sequences and operating procedures; furthermore, it includes probabilistic quantification of fault trees and sequences, and the integration and statistical treatment of risk metrics. SCAIS relies on an intensive use of code coupling techniques to join thermal-hydraulic analysis, severe accident and probability calculation codes. The integration of accident simulation, and thus of complex nuclear plant models, into the risk assessment process is what makes the methodology so powerful, yet at the cost of an enormous increase in complexity. As that complexity is concentrated in the accident simulation codes, the question arises of whether it is possible to reduce the number of required simulations, and this is the focus of the present work. This document presents work on the investigation of more efficient techniques applied to the risk assessment process within the ISA methodology; such techniques have the primary goal of decreasing the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task; because of time limitations, the scope of the work had to be reduced.
Therefore, some assumptions were made in order to work with simplified scenarios best suited for an initial approximation to the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of TSD theory, which are at the core of the risk assessment process. Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques, with full detail of their mathematical background and procedures. Later, the test case is described and the results from applying the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
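
To make the target quantity concrete, the sketch below estimates a damage probability by brute-force Monte Carlo, coupling a cheap stand-in for a deterministic transient with randomly sampled actuation times. It is a generic illustration with invented numbers, not the technique developed in this work, whose whole point is to need far fewer such simulations.

```python
# Minimal brute-force damage probability estimate (generic illustration, assumed numbers).
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
DAMAGE_TEMP = 650.0                 # assumed damage threshold [deg C]

def peak_temperature(actuation_time):
    # hypothetical deterministic response: later protection actuation, higher peak
    return 400.0 + 3.5 * actuation_time

actuation_times = rng.exponential(scale=40.0, size=N)   # assumed stochastic delay [s]
damage = peak_temperature(actuation_times) > DAMAGE_TEMP
print(f"estimated damage probability: {damage.mean():.4f} "
      f"(std. error {damage.std(ddof=1) / np.sqrt(N):.4f})")
```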

Relevance: 30.00%

Abstract:

Expert knowledge is used to assign probabilities to events in many risk analysis models. However, experts sometimes find it hard to provide specific values for these probabilities, preferring to express them in vague or imprecise terms that are mapped using a previously defined fuzzy number scale. The rigidity of these scales introduces bias in the probability elicitation process and does not allow experts to adequately express their probabilistic judgments. We present an interactive method for extracting from experts a fuzzy number that represents their probabilistic judgment for a given event, along with a quality measure of that judgment, useful in a final information filtering and sensitivity analysis process.
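
As a minimal illustration of the kind of object being elicited, the sketch below represents an expert's imprecise probability judgment as a triangular fuzzy number and collapses it to a crisp value by the centroid. The bounds are hypothetical, and the paper's interactive method and quality measure are not reproduced here.

```python
# Minimal triangular fuzzy probability sketch (hypothetical elicited bounds).
def triangular_membership(x, low, mode, high):
    # membership degree of probability value x in the elicited fuzzy number
    if x <= low or x >= high:
        return 0.0
    return (x - low) / (mode - low) if x <= mode else (high - x) / (high - mode)

low, mode, high = 0.02, 0.05, 0.12   # hypothetical elicited bounds for P(event)

for p in (0.03, 0.05, 0.10):
    print(f"membership of p = {p:.2f}: {triangular_membership(p, low, mode, high):.2f}")

crisp = (low + mode + high) / 3.0    # centroid of a triangular fuzzy number
print("crisp probability estimate:", round(crisp, 4))
```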

Relevance: 30.00%

Abstract:

Flash floods are of major relevance in natural disaster management in the Mediterranean region. In many cases, the damaging effects of flash floods can be mitigated by adequate management of flood control reservoirs. This requires the development of suitable models for optimal reservoir operation. A probabilistic methodology for calibrating the parameters of a reservoir flood control model (RFCM) that takes into account the stochastic variability of flood events is presented. This study addresses the crucial problem of operating reservoirs during flood events, considering downstream river damages and dam failure risk as conflicting operation criteria. These two criteria are aggregated into a single objective of total expected damages from both the maximum released flows and the stored volumes (overall risk index). For each selected parameter set, the RFCM is run under a wide range of hydrologic loads (determined through Monte Carlo simulation). The optimal parameter set is obtained through the overall risk index (balanced solution) and then compared with other solutions on the Pareto front. The proposed methodology is applied to three different reservoirs in the southeast of Spain. The results show that the balanced solution offers a good compromise between the two main objectives of reservoir flood control management.
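
A minimal sketch of the calibration loop described above: each candidate parameter of a toy release rule is evaluated over Monte Carlo flood events, and the value minimizing an overall risk index (downstream damage plus dam failure risk) is retained. The rule, damage curves and flood statistics are invented and much simpler than the RFCM.

```python
# Minimal sketch of probabilistic calibration of a flood control rule (toy model).
import numpy as np

rng = np.random.default_rng(2)
peak_inflows = rng.gumbel(loc=800.0, scale=250.0, size=5_000)   # assumed flood peaks [m3/s]

def overall_risk(release_fraction, inflow):
    released = release_fraction * inflow                          # toy operating rule
    stored = (1.0 - release_fraction) * inflow * 3600 * 6 / 1e6   # rough stored volume [hm3]
    downstream_damage = np.maximum(released - 900.0, 0.0) * 1e3   # assumed damage curve
    dam_failure_risk = np.maximum(stored - 15.0, 0.0) * 5e4       # assumed dam risk curve
    return downstream_damage + dam_failure_risk

candidates = np.linspace(0.3, 0.9, 13)
risk_index = [overall_risk(rf, peak_inflows).mean() for rf in candidates]
best = candidates[int(np.argmin(risk_index))]
print("balanced release fraction:", round(best, 2))
```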

Relevance: 30.00%

Abstract:

The Spanish seismic code applies directly to building construction, but it recommends specific studies in the case of important public works such as large dams or bridges. For this reason, and to establish specific criteria in its field of activity, the Dirección General de Obras Hidráulicas of the Spanish Ministerio de Obras Públicas y Urbanismo commissioned from us a seismotectonic and seismic risk study applicable to Spain, materialized in a series of maps of immediate and direct use. In this paper we explain the methodology pursued to obtain these maps. It required, firstly, investigations aimed at improving the seismic information corresponding to the historical or pre-instrumental period, which allowed more precise quantifications. Secondly, these data were processed by probabilistic methods, using the intensity as the fundamental parameter, and the corresponding intensity maps have been developed. Finally, maps of seismic accelerations have also been compiled.
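
One standard quantity behind such probabilistic intensity maps is the chance of exceeding a mapped intensity at least once during a structure's life, obtained from the annual exceedance rate under a Poisson assumption. The rate and design life below are illustrative, not values from the study.

```python
# Minimal Poisson exceedance sketch (illustrative annual rate and design life).
import math

annual_rate = 1.0 / 475.0     # assumed annual exceedance rate of the mapped intensity
design_life = 100             # assumed design life of a dam [years]

p_exceedance = 1.0 - math.exp(-annual_rate * design_life)
print(f"P(at least one exceedance in {design_life} yr) = {p_exceedance:.3f}")
```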

Relevance: 30.00%

Abstract:

IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address the respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses.

Relevance: 30.00%

Abstract:

In this paper, the main steps necessary to evaluate the seismic risk at a site are discussed. Several examples from the authors' practical experience are reported, and a systematic procedure to study the seismic risk at a dam site is also shown. The characteristics of the available Spanish seismic information, mainly historical and non-instrumental seismic records, are discussed. Different types of seismic and geologic techniques to investigate the area under the dam are described. Finally, a probabilistic method to obtain the design earthquake from the given seismic intensities is summarized.
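
As a minimal illustration of the last step, the sketch below reads a design intensity off a table of annual exceedance rates by requiring a target return period. The rates and the target are invented for illustration, not results of the paper.

```python
# Minimal design-earthquake selection sketch (hypothetical hazard numbers).
annual_exceedance = {          # intensity (MSK) -> assumed annual exceedance rate
    6: 1 / 50, 7: 1 / 200, 8: 1 / 900, 9: 1 / 4000,
}
target_return_period = 1000    # assumed design criterion [years]

design_intensity = min(i for i, rate in annual_exceedance.items()
                       if 1.0 / rate >= target_return_period)
print("design intensity:", design_intensity)
```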

Relevance: 30.00%

Abstract:

For non-negative random variables with finite means we introduce an analogue of the equilibrium residual-lifetime distribution based on the quantile function. This allows us to construct new distributions with support (0, 1) and to obtain a new quantile-based version of the probabilistic generalization of Taylor's theorem. Similarly, for pairs of stochastically ordered random variables we obtain a new quantile-based form of the probabilistic mean value theorem. The latter involves a distribution that generalizes the Lorenz curve. We investigate the special case of proportional quantile functions and apply the given results to various models based on classes of distributions and on measures of risk theory. Motivated by some stochastic comparisons, we also introduce the "expected reversed proportional shortfall order" and a new characterization of random lifetimes involving the reversed hazard rate function.
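
For context, these are the standard (non-quantile) objects that the paper transposes to the quantile-function setting; the paper's new quantile-based definitions are not reproduced here.

```latex
% Classical equilibrium residual-lifetime distribution of a non-negative X with mean \mu:
F_e(x) = \frac{1}{\mu}\int_0^x \bar F(t)\,dt, \qquad \bar F(t) = 1 - F(t), \quad \mu = E[X].
% Classical probabilistic mean value theorem: if X \le_{st} Y with finite means E[X] < E[Y],
% there exists a random variable Z with density (\bar F_Y(t) - \bar F_X(t)) / (E[Y] - E[X]) such that
E[g(Y)] - E[g(X)] = \bigl(E[Y] - E[X]\bigr)\, E[g'(Z)]
% for every differentiable g for which these expectations exist.
```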

Relevance: 30.00%

Abstract:

Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: how can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and because they are consistent and replicable. Rational choice of an action from among various alternatives, defined as a choice that makes preferred consequences more likely, requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of the expected value of information (VOI), which shows the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most of the variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
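
The link to the expected value of information mentioned above can be made concrete with a small decision table: the value of resolving the causal uncertainty before acting is the gain in expected net benefit over committing now to the best current action. The payoffs and probabilities below are invented for illustration.

```python
# Minimal expected value of perfect information (EVPI) sketch (invented numbers).
import numpy as np

# rows: actions (ban, restrict, no action); columns: states (hazard real, hazard not real)
net_benefit = np.array([
    [-10.0, -10.0],   # ban: costly either way
    [ -4.0,  -5.0],   # restrict: moderate cost either way
    [-30.0,   0.0],   # no action: severe loss only if the hazard is real
])
p_states = np.array([0.2, 0.8])    # assumed current belief over the two states

best_now = (net_benefit @ p_states).max()               # act under current uncertainty
best_with_info = net_benefit.max(axis=0) @ p_states     # learn the state first, then act
print("EVPI =", round(best_with_info - best_now, 2))
```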

Relevance: 30.00%

Abstract:

How can empirical evidence of adverse effects from exposure to noxious agents, which is often incomplete and uncertain, be used most appropriately to protect human health? We examine several important questions on the best uses of empirical evidence in regulatory risk management decision-making raised by the US Environmental Protection Agency (EPA)'s science policy concerning uncertainty and variability in human health risk assessment. In our view, the US EPA (and other agencies that have adopted similar views of risk management) can often improve decision-making by decreasing reliance on default values and assumptions, particularly when causation is uncertain. This can be achieved by more fully exploiting decision-theoretic methods and criteria that explicitly account for uncertain, possibly conflicting scientific beliefs and that can be fully studied by advocates and adversaries of a policy choice in administrative decision-making involving risk assessment. The substitution of decision-theoretic frameworks for default assumption-driven policies also allows stakeholder attitudes toward risk to be incorporated into policy debates, so that the public and risk managers can more explicitly identify the roles of risk aversion or other attitudes toward risk and uncertainty in policy recommendations. Decision theory provides a sound scientific way to account explicitly for new knowledge and its effects on eventual policy choices. Although these improvements can complicate regulatory analyses, simplifying default assumptions can create substantial costs to society and can prematurely cut off consideration of new scientific insights (e.g., possible beneficial health effects from exposure to sufficiently low 'hormetic' doses of some agents). In many cases, the administrative burden of applying decision-analytic methods is likely to be more than offset by the improved effectiveness of regulations in achieving desired goals. Because many foreign jurisdictions adopt US EPA reasoning and methods of risk analysis, it may be especially valuable to incorporate decision-theoretic principles that transcend local differences among jurisdictions.