971 results for Process uncertainty


Relevance:

30.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is used to estimate a product's environmental impact. Using LCA during the earlier stages of design may produce erroneous results, since the information available on the product's life cycle is typically incomplete at these stages. The resulting uncertainty must be accounted for in the decision-making process. This paper proposes a method for estimating the environmental impact of a product's life cycle, and the associated degree of uncertainty of that impact, using information generated during the design process. Total impact is estimated by aggregating the impacts of the individual processes in the product's life cycle. Uncertainty estimation is based on assessing the mismatch between the information required and the information available about the product life cycle in each uncertainty category, and on integrating these assessments. The method is evaluated using pre-defined scenarios with varying uncertainty. DOI: 10.1115/1.4002163
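The aggregation idea in this abstract can be sketched in a few lines. This is a minimal illustration, not the paper's actual scheme: the impact values, the [0, 1] information-mismatch scores, and the impact-share weighting of per-process uncertainty are all assumptions made here for the example.

```python
# Illustrative sketch: aggregate per-process life cycle impacts together with
# an information-mismatch uncertainty score in [0, 1] per process.
# The weighting rule below (uncertainty weighted by impact share) is an
# assumption for this example, not the method from the paper.

def aggregate_lca(processes):
    """processes: list of (impact, uncertainty) pairs, uncertainty in [0, 1]."""
    total_impact = sum(imp for imp, _ in processes)
    # Weight each process's uncertainty by its share of the total impact,
    # so that large contributors dominate the overall uncertainty estimate.
    total_uncertainty = sum(imp / total_impact * unc for imp, unc in processes)
    return total_impact, total_uncertainty

impact, uncertainty = aggregate_lca([(10.0, 0.2), (30.0, 0.5)])
```

With these toy numbers the total impact is 40.0 and the weighted uncertainty 0.425, i.e. closer to the score of the dominant process.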


Earthquake early warning (EEW) systems have developed rapidly over the past decade. The Japan Meteorological Agency (JMA) had an EEW system operating during the 2011 M9 Tohoku earthquake in Japan, and this increased awareness of EEW systems around the world. While longer-term earthquake prediction still faces many challenges before it becomes practical, the availability of shorter-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system uses the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit the scope for human intervention in activating mitigation actions, and they must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theories from economics, to improve different aspects of EEW operation, including extending it to engineering applications.

Existing EEW systems are often based on a deterministic approach. They frequently assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm, built on an existing deterministic model, that extends the EEW system to the case of concurrent events, which are often observed during the aftershock sequence following a large earthquake.

To overcome the challenges of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications. A cost-benefit model that can capture the uncertainties in the EEW information and in the decision process is used. This approach, called Performance-Based Earthquake Early Warning, is based on the PEER Performance-Based Earthquake Engineering method. The use of surrogate models is suggested to improve computational efficiency. New models are also proposed to incorporate the influence of lead time into the cost-benefit analysis. For example, a value-of-information model is used to quantify the potential value of delaying the activation of a mitigation action for a possible reduction in the uncertainty of the EEW information at the next update. Two practical examples, evacuation alerts and elevator control, are studied to illustrate the ePAD framework. Potential advanced EEW applications, such as multiple-action decisions and the synergy of EEW with structural health monitoring systems, are also discussed.
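The cost-benefit core of such a decision framework can be illustrated with a deliberately simplified expected-cost rule. Everything here (the probabilities, the loss and action-cost numbers, and the function names) is invented for illustration; the actual ePAD framework is considerably richer, including lead-time and value-of-information effects.

```python
# Hedged sketch of an expected-cost decision rule in the spirit of a
# cost-benefit EEW framework. All parameters are illustrative assumptions.

def expected_cost(act, p_shaking, loss_if_shaking, action_cost):
    """Expected cost of activating mitigation vs. doing nothing."""
    if act:
        return action_cost               # acting always costs, but avoids the loss
    return p_shaking * loss_if_shaking   # not acting risks the unmitigated loss

def should_act(p_shaking, loss_if_shaking, action_cost):
    """Activate the mitigation action when it has the lower expected cost."""
    return (expected_cost(True, p_shaking, loss_if_shaking, action_cost)
            < expected_cost(False, p_shaking, loss_if_shaking, action_cost))

act_now = should_act(p_shaking=0.3, loss_if_shaking=100.0, action_cost=10.0)
```

With a 30% chance of strong shaking and a loss ten times the action cost, the rule activates the action; at a 5% chance it does not.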


Housing stock models can be useful tools in helping to assess the environmental and socio-economic impacts of retrofits to residential buildings; however, existing housing stock models cannot quantify the uncertainties that arise in the modelling process from various sources, which limits the role they can play in supporting decision makers. This paper examines the different sources of uncertainty involved in housing stock models and proposes a framework for handling them. The framework integrates probabilistic sensitivity analysis with a Bayesian calibration process in order to quantify uncertain parameters more accurately. The proposed framework is tested on a case study building, and suggestions are made on how to extend the framework to retrofit analysis at an urban scale. © 2011 Elsevier Ltd.
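The Bayesian calibration step mentioned above can be sketched with a toy grid approximation. The building model here (annual energy use = heat-loss coefficient × degree days), the measurement, and the noise level are all assumptions for the example, not the paper's framework.

```python
import numpy as np

# Toy grid-approximation Bayesian calibration: infer a heat-loss coefficient U
# from one noisy measured annual energy use, assuming energy = U * degree_days.
# All numbers below are illustrative assumptions.
degree_days, measured, sigma = 2000.0, 5.1e4, 2e3

U_grid = np.linspace(10.0, 40.0, 301)   # candidate values of the uncertain parameter
prior = np.ones_like(U_grid)            # flat prior over the grid
predicted = U_grid * degree_days        # model prediction for each candidate
# Gaussian measurement likelihood.
likelihood = np.exp(-0.5 * ((measured - predicted) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()            # normalise to a probability mass
U_mean = float((U_grid * posterior).sum())
```

The posterior concentrates around measured / degree_days = 25.5, with a spread reflecting the measurement noise; a probabilistic sensitivity analysis would then sample from this posterior rather than fix U at one value.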


Design work involves uncertainty that arises from, and influences, the progressive development of solutions. This paper analyses the influences of evolving uncertainty levels on the design process. We focus on uncertainties associated with choosing the values of design parameters, and do not consider in detail the issues that arise when parameters must first be identified. Aspects of uncertainty and its evolution are discussed, and a new task-based model is introduced to describe process behaviour in terms of changing uncertainty levels. The model is applied to study two process configuration problems based on aircraft wing design: one using an analytical solution and one using Monte-Carlo simulation. The applications show that modelling uncertainty levels during design can help assess management policies, such as how many concepts should be considered during design and to what level of accuracy. © 2011 Springer-Verlag.
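A toy version of the Monte-Carlo flavour of such a task-based model might look like the following. The uncertainty-reduction range per task and the target level are invented here, not the paper's wing-design parameters.

```python
import random

# Toy Monte-Carlo sketch of uncertainty evolution during a design process:
# each completed task multiplies the current uncertainty level by a random
# reduction factor, and we count tasks until a target accuracy is reached.
# The factor range uniform(0.5, 0.9) is an illustrative assumption.

def tasks_to_target(start=1.0, target=0.1, seed=0):
    rng = random.Random(seed)
    level, tasks = start, 0
    while level > target:
        level *= rng.uniform(0.5, 0.9)  # each task removes 10-50% of uncertainty
        tasks += 1
    return tasks

n_tasks = tasks_to_target()
```

Repeating this over many seeds gives a distribution of process durations, which is the kind of quantity a manager could use to decide how many concepts to carry and to what accuracy.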


Coupled hydrology and water quality models are an important tool in the understanding and management of surface water and watershed areas. Such problems are generally subject to substantial uncertainty in parameters, process understanding, and data. Component models, drawing on different data, concepts, and structures, are affected differently by each of these uncertain elements. This paper proposes a framework in which the response of component models to their respective uncertain elements can be quantified and assessed, using a hydrological model and a water quality model as two exemplars. The resulting assessments can be used to identify model coupling strategies that permit more appropriate use and calibration of the individual models, and a better overall coupled model response. One key finding was that an approximate balance of water quality and hydrological model responses can be obtained using both the QUAL2E and Mike11 water quality models. The balance point, however, does not support a particularly narrow surface response (or stringent calibration criteria) with respect to the water quality calibration data, at least in the case examined here. Additionally, it is clear from the results presented that the structural source of uncertainty is at least as significant as parameter-based uncertainties in areal models. © 2012 John Wiley & Sons, Ltd.


Over the last 30 years, Fuzzy Logic (FL) has emerged as a method that either complements or challenges stochastic methods, the traditional approach to modelling uncertainty. But the circumstances under which FL or stochastic methods should be used remain disputed: the areas of application of statistical and FL methods overlap, and opinions differ as to when each method should be used. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity, using pharmaceutical high purity water (HPW) utility systems as the example. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical case where extreme events (for example, peaks in demand) or day-to-day variation, rather than average values, are of interest. Average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. Furthermore, the results indicate that the stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems, and by extension some process systems, because extreme events or the modelling of day-to-day variation are important in capacity extension projects. Other reasons why stochastic HPW models are preferred to FL HPW models include:

1. The computer code for stochastic models is typically less complex than that for FL models, reducing code maintenance and validation issues.

2. In many respects FL models are similar to deterministic models. The need for an FL model over a deterministic model is therefore questionable for the industrial-scale HPW systems presented here (as well as other similar systems), since the latter requires simpler models.

3. An FL model may be difficult to "sell" to an end-user, as its results represent "approximate reasoning", a definition of which is, however, lacking.

4. Stochastic models may be applied to other systems with relatively minor modifications, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not. This is because the FL and stochastic model philosophies of an HPW system are fundamentally different: the stochastic model treats schedule and volume uncertainties as random phenomena described by statistical distributions based on estimated or historical data, while the FL model simulates schedule uncertainties based on estimated operator behaviour (e.g. the operators' tiredness and working schedule). In a municipal drinking water distribution system, the notion of "operator" breaks down.

5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW model does not account for dispensed-volume uncertainty, as there appears to be no reasonable way to represent it with FL, whereas the stochastic model includes volume uncertainty.
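The stochastic approach favoured above can be illustrated with a minimal Monte-Carlo demand simulation. The event counts, volume distribution, and capacity figure below are all invented for the sketch; a real HPW model would draw its schedule and volume distributions from estimated or historical data, as the thesis describes.

```python
import random

# Hedged sketch of a stochastic utility-demand model: simulate daily HPW
# draws with random schedule (event count) and volume noise, and estimate
# the probability that peak daily demand exceeds the available capacity.
# All distribution parameters are illustrative assumptions.

def p_peak_exceeds(capacity, n_days=10000, seed=1):
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_days):
        events = rng.randint(5, 15)                       # uncertain schedule
        demand = sum(rng.gauss(100.0, 15.0)               # uncertain volumes (L)
                     for _ in range(events))
        if demand > capacity:
            exceed += 1
    return exceed / n_days

p_exceed = p_peak_exceeds(capacity=1600.0)
```

Because it simulates whole daily demand trajectories, this kind of model answers exactly the extreme-event question (how often does demand exceed spare capacity?) that, per the thesis, average-value methods cannot.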


In many real-world situations, we make decisions in the presence of multiple, often conflicting and non-commensurate objectives. The process of optimizing systematically and simultaneously over a set of objective functions is known as multi-objective optimization. In multi-objective optimization, we have a (possibly exponentially large) set of decisions, and each decision has a set of alternatives. Each alternative depends on the state of the world and is evaluated with respect to a number of criteria. In this thesis, we consider decision-making problems in two scenarios. In the first, the current state of the world, under which the decisions are to be made, is known in advance. In the second, the current state of the world is unknown at the time of making decisions. For decision making under certainty, we consider the framework of multi-objective constraint optimization and focus on extending the algorithms that solve these models to the case where there are additional trade-offs. We focus especially on branch-and-bound algorithms that use a mini-buckets algorithm for generating the upper bound at each node of the search tree (in the context of maximizing the values of objectives). Since the size of the guiding upper-bound sets can become very large during the search, we introduce efficient methods for reducing these sets while still maintaining the upper-bound property. We define a formalism for imprecise trade-offs, which allows the decision maker, during the elicitation stage, to specify a preference for one multi-objective utility vector over another, and we use such preferences to infer other preferences. The induced preference relation is then used to eliminate dominated utility vectors during the computation. For testing dominance between multi-objective utility vectors, we present three different approaches.
The first is based on linear programming; the second uses a distance-based algorithm (which measures the distance between a point and a convex cone); the third makes use of matrix multiplication, which results in much faster dominance checks with respect to the preference relation induced by the trade-offs. Furthermore, we show that our trade-offs approach, which is based on a preference inference technique, can also be given an alternative semantics based on the well-known Multi-Attribute Utility Theory. Our comprehensive experimental results on common multi-objective constraint optimization benchmarks demonstrate that the proposed enhancements allow the algorithms to scale up to much larger problems than before. For decision-making problems under uncertainty, we describe multi-objective influence diagrams, based on a set of p objectives, where utility values are vectors in R^p and are typically only partially ordered. These can be solved by a variable elimination algorithm, leading to a set of maximal values of expected utility. If the Pareto ordering is used, this set can often be prohibitively large. We consider approximate representations of the Pareto set based on ϵ-coverings, allowing much larger problems to be solved. In addition, we define a method for incorporating user trade-offs, which also greatly improves efficiency.
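The Pareto ordering at the heart of this abstract can be made concrete in a few lines. This is only the textbook dominance definition (maximisation convention), not the thesis's trade-off-enhanced algorithms or its fast matrix-multiplication dominance check.

```python
# Textbook Pareto dominance and a naive Pareto-front filter (maximisation).
# Utility vectors are plain tuples; this is an illustration, not the thesis's
# branch-and-bound or epsilon-covering machinery.

def dominates(u, v):
    """u Pareto-dominates v if u >= v componentwise and u != v."""
    return all(a >= b for a, b in zip(u, v)) and u != v

def pareto_front(vectors):
    """Keep exactly the vectors not dominated by any other vector."""
    return [u for u in vectors if not any(dominates(v, u) for v in vectors)]

front = pareto_front([(1, 3), (2, 2), (3, 1), (2, 1), (1, 1)])
```

Here (2, 1) and (1, 1) are eliminated as dominated; the remaining three vectors are mutually incomparable, which is precisely why undominated sets can grow large and motivate ϵ-coverings and user trade-offs.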


A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB) milling. This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised on the basis of Reduced Order Models (ROMs), which are generated using results from a mathematical model of the Focused Ion Beam process together with Design of Experiments (DoE) methods. Two ion beam sources, Argon and Gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluations of process performance take into account the uncertainties and variations of the process variables and are used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task is then to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
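The ROM-from-DoE idea can be sketched as a small response-surface fit. The process variable, the response data, and the quadratic model form are invented for the illustration; the paper's FIB models are of course derived from its own physics-based simulations.

```python
import numpy as np

# Toy reduced-order model: fit a quadratic response surface relating a single
# process variable (beam current) to an observed response (milled depth) from
# a small design-of-experiments sample. All data are illustrative assumptions.
current = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # DoE points (nA)
depth   = np.array([0.9, 3.8, 9.2, 16.1, 24.8])  # noisy observed response (nm)

# Least-squares fit of depth ~ a*current**2 + b*current + c.
A = np.vstack([current**2, current, np.ones_like(current)]).T
a, b, c = np.linalg.lstsq(A, depth, rcond=None)[0]

# The cheap surrogate can now predict (and be swept for optimisation) at
# untried process settings.
pred = a * 2.5**2 + b * 2.5 + c
```

Once fitted, such a surrogate is what makes uncertainty propagation and optimisation over the process variables computationally affordable.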


In this paper we describe how an evidential reasoner can be used as a component of risk assessment for engineering projects, using a direct way of reasoning. Guan & Bell (1991) introduced this method, using mass functions to express rule strengths. Mass functions are also used to express data strengths. The data and rule strengths are combined to obtain a mass distribution for each rule; this is the first half of our reasoning process. Then we combine the prior mass and the evidence from the different rules; this is the second half of the reasoning process. Finally, belief intervals are calculated to help identify the risks. We apply our evidential reasoner to an engineering project, and the results demonstrate the feasibility and applicability of this system in this environment.
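The mass-combination step this abstract relies on can be sketched with Dempster's rule over a two-element frame. The frame, the mass assignments, and the numbers are toy choices for the illustration, not the paper's engineering-project data.

```python
# Sketch of Dempster's rule of combination over the frame {risk, no_risk}.
# Subsets of the frame are frozensets; the mass numbers are illustrative.

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to the empty set
    # Normalise by the non-conflicting mass (Dempster's rule).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

R, N = frozenset({"risk"}), frozenset({"no_risk"})
theta = R | N                                # the whole frame (ignorance)
m1 = {R: 0.6, theta: 0.4}                    # e.g. a rule's mass distribution
m2 = {R: 0.5, N: 0.2, theta: 0.3}            # e.g. evidence from another rule
m = combine(m1, m2)

# Belief and plausibility of "risk" bound its belief interval [bel, pl].
bel = sum(v for k, v in m.items() if k <= R)
pl  = sum(v for k, v in m.items() if k & R)
```

The resulting interval [bel, pl] is exactly the kind of belief interval the paper computes to help identify risks: the gap between the two numbers reflects remaining ignorance.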


Margins are used in radiotherapy to assist in the calculation of planning target volumes. These margins can be determined by analysing the geometric uncertainties inherent in the radiotherapy planning and delivery process. An important part of this process is the study of electronic portal images collected throughout the course of treatment. Set-up uncertainties were determined for prostate radiotherapy treatments at our previous site and at the new purpose-built centre, with margins determined using a number of different methods. In addition, the potential effect of reducing the action level for changing a patient set-up from 5 mm to 3 mm, based on off-line bony anatomy-based portal image analysis, was studied. Margins generated using different methodologies were comparable. It was found that set-up errors were reduced following relocation to the new centre. Although a significant increase in the number of corrections to a patient's set-up was predicted if the action level were reduced from 5 mm to 3 mm, minimal reduction in patient set-up uncertainties would be seen as a consequence. Prescriptive geometric uncertainty analysis not only supports the calculation and justification of the margins used clinically to generate planning target volumes, but may also be used to monitor trends in clinical practice or to audit changes introduced by new equipment, technology or practice. Simulations on existing data showed that a 3 mm rather than a 5 mm action level during off-line, bony anatomy-based portal imaging would have had minimal benefit for the patients studied in this work.
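The abstract does not name its margin methodologies, but one widely cited recipe of this type (van Herk's 2.5Σ + 0.7σ formula, where Σ is the systematic and σ the random set-up error standard deviation) shows how measured set-up uncertainties translate into a CTV-to-PTV margin. The error values below are illustrative, not the centre's data.

```python
# Widely used CTV-to-PTV margin recipe (van Herk): margin = 2.5*Sigma + 0.7*sigma,
# with Sigma the systematic and sigma the random set-up error SD, in mm.
# The input values here are illustrative assumptions.

def ctv_to_ptv_margin(systematic_sd_mm, random_sd_mm):
    return 2.5 * systematic_sd_mm + 0.7 * random_sd_mm

margin_mm = ctv_to_ptv_margin(2.0, 3.0)
```

Because the systematic term carries a 2.5 weight against 0.7 for the random term, reducing systematic errors (e.g. through relocation or tighter set-up protocols) shrinks margins far more effectively than reducing random day-to-day variation.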


The paper focuses on the development of an aircraft design optimization methodology that models uncertainty and sensitivity analysis in the tradeoff between manufacturing cost, structural requirements, and aircraft direct operating cost. Specifically, rather than only looking at manufacturing cost, direct operating cost is also considered in terms of the impact of weight on fuel burn, in addition to the acquisition cost to be borne by the operator. Ultimately, there is a tradeoff between driving design according to minimal weight and driving it according to reduced manufacturing cost. The analysis of cost is facilitated with a genetic-causal cost-modeling methodology, and the structural analysis is driven by numerical expressions of appropriate failure modes that use ESDU International reference data. However, a key contribution of the paper is to investigate the modeling of uncertainty and to perform a sensitivity analysis to investigate the robustness of the optimization methodology. Stochastic distributions are used to characterize manufacturing cost distributions, and Monte Carlo analysis is performed in modeling the impact of uncertainty on the cost modeling. The results are then used in a sensitivity analysis that incorporates the optimization methodology. In addition to investigating manufacturing cost variance, the sensitivity of the optimization to fuel burn cost and structural loading is also investigated. It is found that the consideration of manufacturing cost does make an impact and results in a different optimal design configuration from that delivered by the minimal-weight method. However, it was shown that at lower applied loads there is a threshold fuel burn cost at which the optimization process needs to reduce weight, and this threshold decreases with increasing load.
The new optimal solution results in lower direct operating cost, with predicted savings of $640/m² of fuselage skin over the life, relating to a rough order-of-magnitude direct operating cost saving of $500,000 for the fuselage alone of a small regional jet. Moreover, the uncertainty analysis found that the principle was not sensitive to cost variance, although the margins do change.
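The weight-versus-manufacturing-cost tradeoff and its fuel-cost threshold can be illustrated with a deliberately simple Monte-Carlo sketch. All numbers, the two candidate designs, and the additive cost model are invented for the illustration; they are not the paper's genetic-causal cost model or its data.

```python
import random

# Illustrative Monte-Carlo tradeoff sketch: compare a minimal-weight design
# against a reduced-manufacturing-cost (heavier) design on expected direct
# operating cost (DOC), sampling manufacturing-cost uncertainty, at two fuel
# cost levels. All parameters are invented assumptions.

def expected_doc(mfg_mean, mfg_sd, weight, fuel_cost_per_kg, n=20000, seed=2):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        mfg = rng.gauss(mfg_mean, mfg_sd)         # uncertain manufacturing cost
        total += mfg + fuel_cost_per_kg * weight  # DOC = manufacture + fuel burn
    return total / n

# Cheap fuel: the reduced-cost (heavier) design wins on expected DOC.
light_lo = expected_doc(120.0, 15.0, weight=90.0, fuel_cost_per_kg=1.0)
cheap_lo = expected_doc(100.0, 15.0, weight=100.0, fuel_cost_per_kg=1.0)
# Expensive fuel: past a threshold fuel cost, minimal weight wins instead.
light_hi = expected_doc(120.0, 15.0, weight=90.0, fuel_cost_per_kg=3.0)
cheap_hi = expected_doc(100.0, 15.0, weight=100.0, fuel_cost_per_kg=3.0)
```

Note that increasing the manufacturing-cost standard deviation shifts neither expected value, which mirrors the paper's finding that the design principle is insensitive to cost variance even though the margins change.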


This paper studies a dynamic pricing problem faced by a retailer with limited inventory who is uncertain about the demand rate model and aims to maximize expected discounted revenue over an infinite time horizon. The retailer doubts his demand model, which is generated from historical data, and views it as an approximation. Uncertainty in the demand rate model is represented by a notion of generalized relative entropy process, and the robust pricing problem is formulated as a two-player zero-sum stochastic differential game. The pricing policy is obtained through the Hamilton-Jacobi-Isaacs (HJI) equation. The existence and uniqueness of the solution of the HJI equation are shown, and a verification theorem is proved to show that the solution of the HJI equation is indeed the value function of the pricing problem. The results are illustrated by an example with an exponential nominal demand rate.


An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where the aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from the elicitation of knowledge about parameters through to decision. It proposes an elicitation methodology in which the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, the results provided by these techniques, often in terms of probability intervals, may be too complex for a decision-maker to interpret, and we therefore propose to compute a single indicator of the likelihood of risk, called a confidence index. It explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
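The blend of Monte Carlo simulation (for the aleatory part) and interval analysis (for the epistemic part) can be sketched in its simplest form. The model below (z = x + b, with x a standard normal and b an epistemic interval) and all its numbers are assumptions made for the illustration, not the paper's methodology.

```python
import random

# Hedged sketch of hybrid aleatory/epistemic propagation: the aleatory
# variable x is sampled (Monte Carlo); the epistemic parameter b is only
# known to lie in [b_lo, b_hi]. Each sample then yields an interval for
# z = x + b, and the output is a probability interval for P(z > threshold).
# The model and all numbers are illustrative assumptions.

def hybrid_propagate(b_lo, b_hi, threshold, n=10000, seed=3):
    rng = random.Random(seed)
    lo_count = hi_count = 0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)        # aleatory component, sampled
        if x + b_lo > threshold:       # lower envelope of the event
            lo_count += 1
        if x + b_hi > threshold:       # upper envelope of the event
            hi_count += 1
    return lo_count / n, hi_count / n

p_lo, p_hi = hybrid_propagate(0.0, 1.0, threshold=2.0)
```

The gap between p_lo and p_hi is due purely to the epistemic interval on b; it is this kind of probability interval that the paper then condenses into a single confidence index reflecting the decision-maker's attitude to ambiguity.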