971 results for Process uncertainty

Relevance: 30.00%

Abstract:

Enterprises need continuous product development activities to remain competitive in the marketplace. Their product development process (PDP) must manage stakeholders' needs - technical, financial, legal, and environmental aspects, customer requirements, corporate strategy, etc. - making it a multidisciplinary and strategic issue. An approach is taken that uses real options to support decision-making at the phases of the PDP. The real option valuation method is often presented as an alternative to the conventional net present value (NPV) approach. It is based on the same principles as financial options: the right, with no obligation, to buy or sell financial assets (mostly stocks) at a predetermined price. In PDP, a multi-period approach takes into account flexibility such as being able to postpone prototyping and design decisions while waiting for more information about technologies, customer acceptance, funding, etc. In the present article, the state of the art of real options theory is surveyed and a model for using real options in PDP is proposed, so that financial aspects can be properly considered at each project phase of product development. The conclusion is that such a model can make the decision processes within PDP more robust.
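To make the contrast with NPV concrete, here is a minimal binomial-lattice sketch of the option to defer an investment (a generic textbook construction, not the model proposed in the abstract; all figures are hypothetical):

```python
import math

def defer_option_value(v0, invest, r, sigma, periods, dt=1.0):
    """Value the option to defer an investment with a binomial lattice.

    v0: present value of project cash flows; invest: investment cost;
    r: risk-free rate per period; sigma: volatility of project value.
    """
    u = math.exp(sigma * math.sqrt(dt))      # up factor
    d = 1.0 / u                              # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral probability
    disc = math.exp(-r * dt)

    # terminal payoffs: invest only if project value exceeds the cost
    values = [max(v0 * u**j * d**(periods - j) - invest, 0.0)
              for j in range(periods + 1)]
    # roll back, keeping the choice between investing now and waiting
    for step in range(periods - 1, -1, -1):
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      v0 * u**j * d**(step - j) - invest)
                  for j in range(step + 1)]
    return values[0]

npv = 100.0 - 95.0                       # static NPV: invest immediately
opt = defer_option_value(100.0, 95.0, 0.05, 0.30, periods=3)
print(npv, round(opt, 2))                # flexibility makes the option worth more
```

The option value exceeds the static NPV precisely because waiting lets the decision-maker avoid investing in unfavorable states, which is the flexibility argument the abstract applies to PDP phases.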

Relevance: 30.00%

Abstract:

Despite the recovery in intraregional trade over the past three years, intra-group trade, that is trade within the Southern Common Market (MERCOSUR), the Andean Community (CAN) and the Central American Common Market (CACM), remains much weaker than that observed within similar groups in other regions of the world. This weakness is due essentially to the serious lack of complementarity in the process of eliminating tariff barriers (see chapter 3 of Latin America and the Caribbean in the World Economy 2004: Trends 2005, and the study on regional integration entitled "América Latina y El Caribe: La integración regional en la hora de las definiciones", which is due to be published shortly and which updates basic information for the year 2005). The reasons include (a) weak institutional capacities; (b) the lack of macroeconomic coordination; (c) inadequate infrastructure; and (d) the lack of depth in integration-related trade disciplines. This edition of the Bulletin reviews the mechanisms for dispute settlement within MERCOSUR, the Andean Community and CACM with a view to drawing conclusions on the extent to which they are used. In order to reform such mechanisms, consideration should be given to the creation of a single dispute settlement mechanism which would replicate the procedures and regulations of the World Trade Organization (WTO).

Relevance: 30.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Abstract:

This paper proposes a fuzzy goal programming (FGP) model for a real aggregate production-planning problem, applied to a Brazilian sugar and ethanol milling company. The FGP model depicts the comprehensive production process of sugar, ethanol, molasses and derivatives, and considers the uncertainties involved in ethanol and sugar production. Decisions related to the agricultural and logistics phases were considered on a weekly planning horizon covering the whole harvesting season and the periods between harvests. The research provided interesting results about decisions in the agricultural stages of cutting, loading and transportation involving sugarcane suppliers and, especially, in milling decisions, where the choice of production process includes storage and logistics distribution.
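The core mechanism of fuzzy goal programming can be sketched in a few lines: each imprecise goal gets a membership function, and a plan is scored by its worst-satisfied goal (max-min aggregation). The plans, targets and tolerances below are hypothetical, not taken from the paper's model:

```python
def membership(value, target, tolerance):
    """Linear membership: 1 at the target, falling to 0 at target ± tolerance."""
    return max(0.0, 1.0 - abs(value - target) / tolerance)

# hypothetical weekly plans: (tonnes of sugar, m3 of ethanol) from a shared cane supply
plans = {
    "A": (520.0, 380.0),
    "B": (480.0, 420.0),
    "C": (505.0, 400.0),
}
goals = {"sugar": (500.0, 50.0), "ethanol": (400.0, 40.0)}  # (target, tolerance)

def satisfaction(plan):
    sugar, ethanol = plan
    # max-min aggregation: the plan is only as good as its worst-met goal
    return min(membership(sugar, *goals["sugar"]),
               membership(ethanol, *goals["ethanol"]))

best = max(plans, key=lambda name: satisfaction(plans[name]))
print(best, round(satisfaction(plans[best]), 2))
```

A full FGP model would embed these memberships as constraints in a linear program and maximize the minimum membership; the enumeration above just illustrates the aggregation principle.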

Relevance: 30.00%

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
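A tiny simulation can make the hierarchical idea tangible: a process stage (site abundances scattered around a regional mean) sits below a data stage (noisy observations per site), and with known variances the conditional site estimate shrinks the raw site average toward the regional mean. This is a generic two-stage Gaussian sketch with made-up numbers, not an example from the article:

```python
import random

random.seed(42)

# hypothetical two-stage hierarchy: site abundances vary around a regional mean
# (process error), and each site is observed with measurement error
mu, tau, sigma = 50.0, 5.0, 2.0      # regional mean, process sd, measurement sd
n_sites, n_obs = 8, 10

true_site = [random.gauss(mu, tau) for _ in range(n_sites)]
obs = [[random.gauss(s, sigma) for _ in range(n_obs)] for s in true_site]

def shrunk_mean(y, mu, tau, sigma):
    """Posterior mean of a site effect: precision-weighted compromise between
    the raw site average and the regional mean."""
    ybar, n = sum(y) / len(y), len(y)
    w = (n / sigma**2) / (n / sigma**2 + 1 / tau**2)
    return w * ybar + (1 - w) * mu

estimates = [shrunk_mean(y, mu, tau, sigma) for y in obs]
for truth, est in zip(true_site, estimates):
    print(round(truth, 1), round(est, 1))
```

In real analyses the variances themselves get priors and the model is fit by MCMC or similar machinery, but the shrinkage behavior shown here is the essential benefit of the hierarchy.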

Relevance: 30.00%

Abstract:

In this thesis we address a collection of network design problems which are strongly motivated by applications from telecommunications, logistics and bioinformatics. In most cases we justify the need to take into account uncertainty in some of the problem parameters, and different robust optimization models are used to hedge against it. Mixed integer linear programming formulations along with sophisticated algorithmic frameworks are designed, implemented and rigorously assessed for the majority of the studied problems. The obtained results yield the following observations: (i) relevant real problems can be effectively represented as (discrete) optimization problems within the framework of network design; (ii) uncertainty can be appropriately incorporated into the decision process if a suitable robust optimization model is considered; (iii) optimal, or nearly optimal, solutions can be obtained for large instances if a tailored algorithm that exploits the structure of the problem is designed; (iv) a systematic and rigorous experimental analysis makes it possible to understand both the characteristics of the obtained (robust) solutions and the behavior of the proposed algorithm.
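To illustrate how robust optimization changes a network design decision, here is a budgeted-uncertainty sketch in the spirit of Bertsimas-Sim: each arc has a nominal cost and a maximum deviation, and a path is charged its nominal cost plus its gamma largest deviations. The graph and numbers are hypothetical, not instances from the thesis:

```python
def worst_case_cost(path_arcs, gamma):
    """Budgeted-uncertainty cost: nominal cost plus the gamma largest deviations."""
    nominal = sum(c for c, _ in path_arcs)
    deviations = sorted((d for _, d in path_arcs), reverse=True)
    return nominal + sum(deviations[:gamma])

# hypothetical arcs as (nominal cost, max deviation); three s-t path candidates
paths = {
    "P1": [(4, 3), (4, 3)],          # cheapest nominally, but very uncertain
    "P2": [(4, 1), (5, 1)],          # slightly dearer, far more stable
    "P3": [(3, 0), (7, 4)],
}

gamma = 1  # at most one arc attains its worst-case deviation
robust = min(paths, key=lambda p: worst_case_cost(paths[p], gamma))
nominal = min(paths, key=lambda p: sum(c for c, _ in paths[p]))
print(nominal, robust)   # the nominal and robust choices differ
```

The point observation (ii) makes shows up immediately: the path that wins under nominal costs is no longer optimal once a single adversarial deviation is budgeted for.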

Relevance: 30.00%

Abstract:

Several methods based on Kriging have recently been proposed for calculating a probability of failure involving costly-to-evaluate functions. A closely related problem is to estimate the set of inputs leading to a response exceeding a given threshold. Now, estimating such a level set—and not solely its volume—and quantifying uncertainties on it are not straightforward. Here we use notions from random set theory to obtain an estimate of the level set, together with a quantification of estimation uncertainty. We give explicit formulae in the Gaussian process set-up and provide a consistency result. We then illustrate how space-filling versus adaptive design strategies may sequentially reduce level set estimation uncertainty.
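The basic quantity behind such level-set estimates is the pointwise excursion probability under the Gaussian predictive distribution; a plug-in estimate keeps the inputs more likely in than out, and the product p(1-p) flags where membership is most ambiguous. This is a simplified sketch with hypothetical GP predictions, not the random-set estimator of the paper:

```python
import math

def excursion_prob(mean, sd, threshold):
    """P(f(x) > T) under a Gaussian predictive distribution N(mean, sd^2)."""
    return 0.5 * (1.0 - math.erf((threshold - mean) / (sd * math.sqrt(2.0))))

# hypothetical GP posterior (mean, sd) on a grid of inputs
grid = [(-1.0, 0.2), (0.4, 0.5), (1.5, 0.3), (2.2, 0.8)]
threshold = 1.0

probs = [excursion_prob(m, s, threshold) for m, s in grid]
# plug-in level-set estimate: keep points more likely in than out
level_set = [i for i, p in enumerate(probs) if p >= 0.5]
# pointwise uncertainty p(1-p) is largest where membership is ambiguous
ambiguity = [p * (1 - p) for p in probs]
print(level_set, [round(p, 2) for p in probs])
```

Adaptive designs of the kind the paper compares would add evaluations where this ambiguity is high, shrinking the uncertain boundary region.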

Relevance: 30.00%

Abstract:

In the context of expensive numerical experiments, a promising solution for alleviating the computational costs consists of using partially converged simulations instead of exact solutions. The gain in computational time is at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of the adequate approximation of the error due to partial convergence, which is correlated in both design variables and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that reflects accurately the actual structure of the error. Practical solutions are proposed for solving parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
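The key modeling move, a covariance defined jointly over design variables and computational time with an error component that shrinks as runs converge, can be sketched as follows. This generic additive kernel is an illustration of the idea, not the paper's actual covariance; all parameters are hypothetical:

```python
import math

def kernel(x1, t1, x2, t2, theta_x=1.0, theta_t=5.0, s2_exact=1.0, s2_err=0.5):
    """Covariance in the joint (design, computational time) space: a stationary
    part for the exact response, plus a nonstationary error part whose
    magnitude decays as the simulations converge (larger t1, t2)."""
    k_design = s2_exact * math.exp(-(x1 - x2) ** 2 / (2 * theta_x ** 2))
    k_error = (s2_err * math.exp(-(t1 + t2) / theta_t)
               * math.exp(-(x1 - x2) ** 2 / (2 * theta_x ** 2)))
    return k_design + k_error

# the predictive variance at a point decays with computational time t
v_short = kernel(0.0, 1.0, 0.0, 1.0)   # barely converged run
v_long = kernel(0.0, 20.0, 0.0, 20.0)  # well-converged run
print(round(v_short, 3), round(v_long, 3))
```

Because the error term depends on t1 + t2 rather than |t1 - t2| alone, the kernel is nonstationary in time, which is what lets the model trust long runs more than short ones.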

Relevance: 30.00%

Abstract:

Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties.
In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr⁻¹. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, due to an intrinsic limitation of LPX in accounting for seasonality in floodplain extent, the model failed to reproduce the full dynamics in CH4 emissions, but we proposed solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.

Relevance: 30.00%

Abstract:

Stepwise uncertainty reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity, and the fact that they deliver a single point at each iteration. In this article we introduce several multipoint sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also manage to drastically reduce the computational cost of these strategies through the use of closed-form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
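A crude stand-in for a multipoint criterion is to score each candidate by how ambiguous its excursion membership is and take the q most ambiguous points as the parallel batch. This greedy heuristic is far simpler than the paper's SUR criteria (which account for interactions within the batch), and the GP predictions below are hypothetical:

```python
import math

def excursion_prob(mean, sd, threshold):
    """P(f(x) > T) under a Gaussian predictive distribution N(mean, sd^2)."""
    return 0.5 * (1.0 - math.erf((threshold - mean) / (sd * math.sqrt(2.0))))

# hypothetical GP posterior (mean, sd) at candidate evaluation points
candidates = {
    0: (0.2, 0.6),
    1: (0.9, 0.4),
    2: (1.6, 0.5),
    3: (2.5, 0.2),
}
threshold = 1.0

def ambiguity(i):
    p = excursion_prob(*candidates[i], threshold)
    return p * (1.0 - p)   # highest where excursion membership is least certain

# pick a batch of q points to evaluate in parallel, most ambiguous first
q = 2
batch = sorted(candidates, key=ambiguity, reverse=True)[:q]
print(batch)
```

A genuine SUR criterion would instead minimize the expected residual uncertainty after evaluating the whole batch jointly, which is exactly the computation the paper's closed-form formulas accelerate.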

Relevance: 30.00%

Abstract:

The paper addresses the question of which factors drive the formation of policy preferences when there are remaining uncertainties about the causes and effects of the problem at stake. To answer this question we examine policy preferences for reducing aquatic micropollutants, a specific case of water protection policy, across different actor groups (e.g. state, science, target groups). Here, we contrast two types of policy preferences: (a) preventive or source-directed policies, which mitigate pollution in order to avoid contact with water; and (b) reactive or end-of-pipe policies, which filter water already contaminated by pollutants. In a second step, we analyze the drivers of actors' policy preferences by focusing on three sets of explanations: participation, affectedness and international collaborations. The analysis of our survey data, qualitative interviews and regression analysis of the Swiss political elite show that participation in the policy-making process leads to knowledge exchange and reduces uncertainties about the policy problem, which promotes preferences for preventive policies. Likewise, actors who are affected by the consequences of micropollutants, such as consumer or environmental associations, opt for anticipatory policies. Interestingly, we find that uncertainties about the effectiveness of preventive policies can promote preferences for end-of-pipe policies. While preventive measures often rely on (uncertain) behavioral changes by target groups, reactive policies are more reliable when it comes to fulfilling defined policy goals. Finally, we find that in a transboundary water management context, actors with international collaborations prefer policies that produce immediate and reliable outcomes.

Relevance: 30.00%

Abstract:

Applying Theoretical Constructs to Address Medical Uncertainty

Situations involving medical reasoning usually include some level of medical uncertainty. Despite the identification of shared decision-making (SDM) as an effective technique, it has been observed that the likelihood of physicians and patients engaging in shared decision-making is lower in the situations where it is most needed, specifically in circumstances of medical uncertainty. Having identified shared decision-making as an effective, yet often neglected, approach to resolving a lack of information exchange in situations involving medical uncertainty, the next step is to determine the way(s) in which SDM can be integrated and the supplemental processes that may facilitate its integration. SDM involves unique types of communication and relationships between patients and physicians. Therefore, it is necessary to further understand and incorporate human behavioral elements, in particular behavioral intent, in order to successfully identify and realize the potential benefits of SDM. This paper discusses the background and potential interaction between the theories of shared decision-making, medical uncertainty, and behavioral intent.

Identifying Shared Decision-Making Elements in Medical Encounters Dealing with Uncertainty

A recent summary of the state of medical knowledge in the U.S. reported that nearly half (47%) of all treatments were of unknown effectiveness, and an additional 7% involved an uncertain tradeoff between benefits and harms. Shared decision-making (SDM) was identified as an effective technique for managing uncertainty when two or more parties were involved. In order to understand which of the elements of SDM are used most frequently and effectively, it is necessary to identify these key elements and understand how they relate to each other and to the SDM process. The elements identified through the course of the present research were selected from basic principles of the SDM model and the "Data, Information, Knowledge, Wisdom" (DIKW) hierarchy. The goal of this ethnographic research was to identify which common elements of shared decision-making patients are most often observed applying in the medical encounter. The results of the present study facilitated the understanding of which elements patients were more likely to exhibit during a primary care medical encounter, as well as determining variables of interest leading to more successful shared decision-making practices between patients and their physicians.

Understanding Behavioral Intent to Participate in Shared Decision-Making in Medically Uncertain Situations

Objective: This article describes the process undertaken to identify and validate behavioral and normative beliefs and the behavioral intent of men between the ages of 45 and 70 with regard to participating in shared decision-making in medically uncertain situations. It also discusses the preliminary results of these processes and explores potential future uses of this information, which may facilitate greater understanding, efficiency and effectiveness of doctor-patient consultations.
Design: Qualitative study using deductive content analysis.
Setting: Individual semi-structured patient interviews were conducted until data saturation was reached. Researchers read the transcripts and developed a list of codes.
Subjects: 25 subjects drawn from the Philadelphia community.
Measurements: Qualitative indicators were developed to measure respondents' experiences and beliefs related to behavioral intent to participate in shared decision-making during medical uncertainty. Subjects were also asked to complete the Krantz Health Opinion Survey as a method of triangulation.
Results: Several factors were repeatedly described by respondents as essential to participating in shared decision-making under medical uncertainty: past experience with medical uncertainty, the individual's personality, and the relationship between the patient and his physician.
Conclusions: The findings of this study led to the development of a category framework that helps explain an individual's needs and motivational factors in their intent to participate in shared decision-making. The three main categories are (1) the individual's representation of medical uncertainty, (2) how the individual copes with medical uncertainty, and (3) the individual's behavioral intent to seek information and participate in shared decision-making in medically uncertain situations.

Relevance: 30.00%

Abstract:

This article proposes a multi-agent system (MAS) architecture for network diagnosis under uncertainty. Network diagnosis is divided into two inference processes: hypothesis generation and hypothesis confirmation. The first process is distributed among several agents based on a multiply sectioned Bayesian network (MSBN), while the second is carried out by agents using semantic reasoning. A diagnosis ontology has been defined in order to combine both inference processes. To drive the deliberation process, dynamic data about the influence of observations are collected during diagnosis. In order to achieve quick and reliable diagnoses, this influence is used to choose the best action to perform. The approach has been evaluated in a P2P video streaming scenario; improvements in computation and diagnosis time are highlighted as conclusions.
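At its core, hypothesis generation of this kind is a Bayesian update over candidate faults given an observation, after which the most probable hypothesis is passed on for confirmation. The fault names, priors and likelihoods below are invented for illustration; the paper's MSBN distributes this computation across agents rather than doing it in one table:

```python
# hypothetical fault hypotheses for a P2P streaming diagnosis, with priors
priors = {"peer_churn": 0.5, "link_congestion": 0.3, "server_overload": 0.2}
# likelihood of observing "high jitter" under each hypothesis
likelihood = {"peer_churn": 0.2, "link_congestion": 0.8, "server_overload": 0.4}

def posterior(priors, likelihood):
    """Bayes update over hypotheses once the observation is seen."""
    unnorm = {h: priors[h] * likelihood[h] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

post = posterior(priors, likelihood)
# hand the most probable explanation to the confirmation stage
best = max(post, key=post.get)
print(best, {h: round(p, 2) for h, p in post.items()})
```

The "influence of observations" the abstract mentions corresponds to how much each possible observation would shift such a posterior, which is what makes it a useful guide for choosing the next diagnostic action.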

Relevance: 30.00%

Abstract:

The verification of compliance with a design specification in manufacturing requires metrological instruments to check whether the magnitude associated with the specification lies within the tolerance range. Such instruments, and their use during the measurement process, carry a measurement uncertainty whose value must be related to the tolerance being verified. Most papers that deal jointly with tolerances and measurement uncertainty focus on establishing an uncertainty-tolerance relationship without paying much attention to the impact from the standpoint of process cost. This paper analyzes the cost of measurement uncertainty, considering uncertainty as a productive factor in the process outcome. Starting from a cost-tolerance model associated with the process, the measurement uncertainty is quantified in terms of cost and its impact on the process is analyzed.
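One common way uncertainty turns into cost is guard-banding: acceptance limits are tightened by the measurement uncertainty U, so some conforming parts near the limits are rejected, and that extra rejection is a cost attributable to U. The sketch below uses a centred normal process and hypothetical numbers; it is an illustration of the mechanism, not the paper's cost-tolerance model:

```python
import math

def accept_fraction(tol_half, u, sigma):
    """Fraction of parts accepted when acceptance limits are tightened from
    ±tol_half to ±(tol_half - u); the process is N(0, sigma^2) and centred."""
    a = tol_half - u
    return math.erf(a / (sigma * math.sqrt(2.0)))

tol_half, sigma = 0.05, 0.02          # mm; hypothetical tolerance and process sd
cost_per_reject = 8.0                 # hypothetical scrap cost per part

for u in (0.0, 0.005, 0.015):
    rejected = 1.0 - accept_fraction(tol_half, u, sigma)
    print(f"U={u:.3f} mm  reject rate={rejected:.3%}  "
          f"cost/part={rejected * cost_per_reject:.3f}")
```

Running the loop shows the reject rate, and hence the per-part cost, growing with U, which is the sense in which measurement uncertainty acts as a productive factor in the process outcome.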

Relevance: 30.00%

Abstract:

When users face a problem needing a product, service, or action to solve it, selecting the best alternative can be a difficult task due to uncertainty about their quality. This is especially the case in domains where users have no expertise, for example in Software Engineering. Multiple criteria decision making (MCDM) methods help make better decisions when facing the complex problem of selecting the best solution among a group of alternatives that can be compared according to different conflicting criteria. In MCDM problems, alternatives represent concrete products, services or actions that will help in achieving a goal, while criteria represent the characteristics of these alternatives that are important for making a decision.
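The alternatives-versus-criteria structure can be shown with the simplest MCDM method, additive weighting: score each alternative as the weighted sum of its criterion values and rank. The libraries, criteria and weights below are hypothetical, chosen only to echo the Software Engineering example:

```python
# hypothetical selection of a software library under conflicting criteria
# (all criteria oriented so that higher is better; cost becomes "affordability")
alternatives = {
    "LibA": {"performance": 8.0, "documentation": 6.0, "affordability": 9.0},
    "LibB": {"performance": 9.0, "documentation": 8.0, "affordability": 5.0},
    "LibC": {"performance": 6.0, "documentation": 9.0, "affordability": 8.0},
}
weights = {"performance": 0.5, "documentation": 0.3, "affordability": 0.2}

def score(alt):
    # simple additive weighting; richer MCDM methods (AHP, TOPSIS, ELECTRE)
    # refine how criteria are normalised and preferences modelled
    return sum(weights[c] * v for c, v in alt.items())

ranking = sorted(alternatives, key=lambda a: score(alternatives[a]), reverse=True)
print(ranking, [round(score(alternatives[a]), 2) for a in ranking])
```

Note how the ranking is driven entirely by the weights: shifting weight from performance to affordability would promote a different library, which is precisely the conflict between criteria that MCDM methods are designed to manage.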