921 results for Uncertainty Avoidance
Abstract:
A fuzzy rule-based system was developed in this study, resulting in an index that indicates the level of uncertainty in commercial transactions between cassava growers and their dealers. The system was grounded in the Transaction Cost Economics approach and built from input variables describing information sharing between grower and dealer on “Demand/purchase Forecasting”, “Production Forecasting” and “Production Innovation”. The output variable is the level of uncertainty in the transaction between seller and buyer, which may serve as a tool for detecting inefficiencies. Evidence from 27 cassava growers registered in the Regional Development Offices of Tupã and Assis, São Paulo, Brazil, and 48 of their dealers supported the development of the system. The mathematical model indicated that 55% of the growers present a Very High level of uncertainty, 33% present a Medium or High level, and the others present a Low or Very Low level. From the model, simulations of external interventions can be run in order to reduce the degree of uncertainty and thus lower transaction costs.
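For illustration, a minimal Mamdani-style sketch of how such a fuzzy uncertainty index can be computed is given below; the 0-10 information-sharing scores, membership functions and rules are hypothetical stand-ins, not the rule base developed in the study.

# Minimal sketch of a Mamdani-style fuzzy uncertainty index (illustrative only).
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function peaking at b.
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def level(score):
    # Fuzzify a hypothetical 0-10 information-sharing score into low / medium / high.
    return {"low": tri(score, -0.1, 0, 5), "med": tri(score, 0, 5, 10), "high": tri(score, 5, 10, 10.1)}

def uncertainty_index(demand, production, innovation):
    d, p, i = level(demand), level(production), level(innovation)
    u = np.linspace(0, 10, 501)                        # output universe (0 = very low, 10 = very high)
    out = {"very_low": tri(u, -0.1, 0, 2.5), "low": tri(u, 0, 2.5, 5),
           "medium": tri(u, 2.5, 5, 7.5), "high": tri(u, 5, 7.5, 10),
           "very_high": tri(u, 7.5, 10, 10.1)}
    # Illustrative rules: more information sharing -> less transaction uncertainty.
    rules = [
        (min(d["high"], p["high"], i["high"]), "very_low"),
        (min(d["high"], p["high"]), "low"),
        (max(d["med"], p["med"], i["med"]), "medium"),
        (max(d["low"], p["low"]), "high"),
        (min(d["low"], p["low"], i["low"]), "very_high"),
    ]
    agg = np.zeros_like(u)
    for strength, label in rules:                      # Mamdani min-implication, max-aggregation
        agg = np.maximum(agg, np.minimum(strength, out[label]))
    return float((u * agg).sum() / (agg.sum() + 1e-9)) # centroid defuzzification

print(uncertainty_index(demand=2, production=3, innovation=1))  # poor sharing -> high index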
Abstract:
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
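The hierarchical decomposition referred to above is conventionally written in bracket notation as a product of a data model, a process model and a parameter model (a generic statement, not a model fitted in the article), with y the data, z the latent ecological process and θ = (θ_obs, θ_proc) the parameters:

\[
[\mathbf{y}, \mathbf{z}, \boldsymbol{\theta}]
  = \underbrace{[\mathbf{y} \mid \mathbf{z}, \boldsymbol{\theta}_{\mathrm{obs}}]}_{\text{data model}}
    \,\underbrace{[\mathbf{z} \mid \boldsymbol{\theta}_{\mathrm{proc}}]}_{\text{process model}}
    \,\underbrace{[\boldsymbol{\theta}]}_{\text{parameter model}},
\qquad
[\mathbf{z}, \boldsymbol{\theta} \mid \mathbf{y}]
  \propto [\mathbf{y} \mid \mathbf{z}, \boldsymbol{\theta}_{\mathrm{obs}}]\,
          [\mathbf{z} \mid \boldsymbol{\theta}_{\mathrm{proc}}]\,[\boldsymbol{\theta}].
\]

Each level carries its own uncertainty (measurement, process, parameters), which is how multiple sources of uncertainty enter the inference coherently.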
Abstract:
Analytical methods accounting for imperfect detection are often used to facilitate reliable inference in population and community ecology. We contend that similar approaches are needed in disease ecology because these complicated systems are inherently difficult to observe without error. For example, wildlife disease studies often assign individuals, populations, or spatial units to states (e.g., susceptible, infected, post-infected), but the uncertainty associated with these state assignments remains largely ignored or unaccounted for. We demonstrate how recent developments incorporating observation error through repeated sampling extend quite naturally to hierarchical spatial models of disease effects, prevalence, and dynamics in natural systems. A highly pathogenic strain of avian influenza virus in migratory waterfowl and a pathogenic fungus recently implicated in the global loss of amphibian biodiversity are used as motivating examples. Both show that relatively simple modifications to study designs can greatly improve our understanding of complex spatio-temporal disease dynamics by rigorously accounting for uncertainty at each level of the hierarchy.
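As a minimal sketch of the repeated-sampling idea, the site-level likelihood of a single-season occupancy-style model shows how an all-zero detection history can be explained either by true absence of infection or by infection that was missed on every sample; the values of ψ, p and the detection histories below are hypothetical.

# psi = probability the unit is truly infected, p = per-sample detection probability.
import numpy as np

def site_likelihood(detections, psi, p):
    """Likelihood of a 0/1 detection history from repeated samples of one unit."""
    y = np.asarray(detections)
    infected_part = psi * np.prod(p**y * (1 - p)**(1 - y))   # infected and sampled J times
    uninfected_part = (1 - psi) if y.sum() == 0 else 0.0      # true absence explains only all-zero histories
    return infected_part + uninfected_part

# Repeated sampling separates "not infected" from "infected but missed":
print(site_likelihood([0, 0, 0], psi=0.4, p=0.5))  # all-zero history
print(site_likelihood([0, 1, 0], psi=0.4, p=0.5))  # one detection in three samples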
Abstract:
Categorical data cannot be interpolated directly because they are outcomes of discrete random variables. Thus, the categories are transformed into indicator functions that can be handled by interpolation methods, and the interpolated indicator values are then back-transformed to the original categories. However, aspects such as the variability and uncertainty of interpolated values of categorical data have never been considered. In this paper we show that the interpolation variance can be used to map an uncertainty zone around boundaries between categories. Moreover, it is shown that the interpolation variance is a component of the total variance of the categorical variables, as measured by the coefficient of unalikeability.
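A small illustrative sketch of the indicator transformation and the two uncertainty summaries mentioned above, using inverse-distance weights rather than kriging and made-up sample data:

import numpy as np

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # sample locations (hypothetical)
categories = np.array(["sand", "sand", "clay", "silt"])               # observed classes
labels = np.unique(categories)
indicators = (categories[:, None] == labels[None, :]).astype(float)   # one-hot indicator transform

def interpolate(x0, power=2.0):
    d = np.linalg.norm(coords - x0, axis=1)
    w = 1.0 / (d**power + 1e-12)
    w /= w.sum()                                   # weights sum to one
    p = w @ indicators                             # interpolated indicator values ~ class probabilities
    interp_var = w @ (indicators - p) ** 2         # per-class interpolation variance
    unalikeability = 1.0 - np.sum(p**2)            # coefficient of unalikeability of the estimate
    return dict(zip(labels, p)), dict(zip(labels, interp_var)), unalikeability

probs, ivar, u = interpolate(np.array([0.5, 0.5]))
print(probs, ivar, u)   # high values near class boundaries flag an uncertainty zone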
Abstract:
Obtaining ecotoxicological data on pesticides in tropical regions is imperative for performing more realistic risk analyses, and avoidance tests have been proposed as a useful, fast and cost-effective tool. The present study therefore aimed to evaluate the avoidance behavior of Eisenia andrei exposed to a formulated product, Vertimec® 18 EC (a.i. abamectin), in tests performed on a reference tropical artificial soil (TAS), to derive ecotoxicological data under tropical conditions, and on a natural soil (NS), simulating crop field conditions. The TAS tests used an adaptation of the substrate recommended by the OECD and ISO protocols, with coconut fiber residue as the source of organic matter. Pesticide concentrations in the TAS tests ranged from 0 to 7 mg abamectin/kg (dry weight, d.w.). In the NS tests, earthworms were exposed to samples of soil sprayed in situ with 0.9 L of Vertimec® 18 EC/ha (RD), twice this dosage (2RD), and distilled water (control), respectively, and to 2RD:control dilutions (12.5, 25, 50 and 75%). All tests were performed at 25 ± 2 °C, to simulate tropical conditions, with a 12 h light:12 h dark photoperiod. The organisms avoided contaminated TAS, with EC50,48h = 3.918 mg/kg soil d.w., LOEC = 1.75 mg/kg soil d.w. and NOEC = 0.85 mg/kg soil d.w. No significant avoidance response occurred in any NS test; abamectin concentrations in NS were considerably lower than the EC50,48h and LOEC determined in the TAS tests. The results help to overcome the lack of ecotoxicological data on pesticides under tropical conditions, but more tests with different soil invertebrates are needed to improve pesticide risk analysis.
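For readers unfamiliar with the endpoint, the sketch below shows how a net avoidance response and an EC50 are typically derived from a two-compartment avoidance test; the counts and the log-logistic dose-response model are illustrative assumptions, not the study's data or fitting procedure.

import numpy as np
from scipy.optimize import curve_fit

conc = np.array([0.5, 1.0, 2.0, 4.0, 7.0])            # mg abamectin / kg soil d.w. (hypothetical)
in_control = np.array([6.0, 7.0, 8.0, 9.0, 9.5])      # mean worms found in the control section (of N = 10)
N = 10.0
avoidance = (in_control - (N - in_control)) / N * 100.0   # net avoidance response (%)

def log_logistic(c, ec50, slope):
    # Simple log-logistic dose-response for the avoidance percentage.
    return 100.0 / (1.0 + (ec50 / c) ** slope)

(ec50, slope), _ = curve_fit(log_logistic, conc, avoidance, p0=[2.0, 1.0], bounds=(0, np.inf))
print(f"EC50 ~ {ec50:.2f} mg/kg d.w., slope ~ {slope:.2f}")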
Abstract:
The major goal of this research was the development and implementation of a control system able to avoid collisions during flight for a mini quadrotor helicopter, based only on its embedded sensors and without changing the environment. It is important to highlight that design aspects must be considered carefully in order to overcome hardware limitations and achieve control simplification. The controllers of a UAV (Unmanned Aerial Vehicle) deal with highly unstable dynamics and strong axis coupling. Furthermore, any additional embedded sensor increases the robot's total weight and therefore decreases its operating time, so the best balance between embedded electronics and operating time is sought. This paper focuses not only on the development and implementation of a collision avoidance controller for a mini robotic helicopter using only its embedded sensors, but also on the mathematical model that was essential during the controller development phases. Based on this model, we developed a MATLAB/Simulink simulation tool that was fundamental for setting the controllers' parameters. This tool allowed us to simulate and improve the OS4 controllers in different modeled environments and to test different approaches. The controllers were then embedded in the real robot, and the results proved to be robust and feasible. In addition, the controller has the advantage of being compatible with the path planners we are currently developing.
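The sketch below illustrates the general idea of a reactive collision-avoidance law (a potential-field repulsion term added to a PD position controller) on a 2-D point-mass proxy; it is not the OS4 quadrotor model or the controller developed in the paper, and all gains and geometry are arbitrary assumptions.

import numpy as np

dt, mass = 0.02, 0.5
kp, kd = 2.0, 1.5                       # PD gains toward the goal
k_rep, d_safe = 5.0, 2.0                # repulsion gain and activation distance (m)
goal = np.array([5.0, 0.0])
obstacle = np.array([2.5, 0.3])

pos, vel = np.array([0.0, 0.0]), np.array([0.0, 0.0])
min_dist = np.inf
for _ in range(1500):                   # 30 s of simulated flight
    f_goal = kp * (goal - pos) - kd * vel
    d_vec = pos - obstacle
    d = np.linalg.norm(d_vec)
    min_dist = min(min_dist, d)
    f_rep = np.zeros(2)
    if d < d_safe:                      # repel only when the "range sensor" sees the obstacle
        f_rep = k_rep * (1.0 / d - 1.0 / d_safe) / d**2 * (d_vec / d)
    acc = (f_goal + f_rep) / mass
    vel = vel + acc * dt
    pos = pos + vel * dt

print("final position:", pos, "closest approach to obstacle:", round(min_dist, 2))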
Abstract:
Large areas of Amazonian evergreen forest experience seasonal droughts extending for three or more months, yet show maximum rates of photosynthesis and evapotranspiration during dry intervals. This apparent resilience is belied by the disproportionate mortality of large trees in manipulations that reduce wet season rainfall, occurring after 2-3 years of treatment. The goal of this study is to characterize the mechanisms that produce these contrasting ecosystem responses. A mechanistic model is developed based on the ecohydrological framework of the TIN (Triangulated Irregular Network)-based Real Time Integrated Basin Simulator + Vegetation Generator for Interactive Evolution (tRIBS+VEGGIE). The model is used to test the roles of deep roots and soil capillary flux in providing water to the forest during the dry season. Also examined is the importance of "root niche separation," in which roots of overstory trees extend to depth, where during the dry season they use water stored from wet season precipitation, while roots of understory trees are concentrated in shallow layers that access dry season precipitation directly. Observational data from the Tapajós National Forest, Brazil, were used as meteorological forcing and provided comprehensive observational constraints on the model. Results strongly suggest that deep roots with root niche separation adaptations explain both the observed resilience during seasonal drought and the vulnerability of canopy-dominant trees to extended deficits of wet season rainfall. These mechanisms appear to provide an adaptive strategy that enhances productivity of the largest trees in the face of their disproportionate heat loads and water demand in the dry season. A sensitivity analysis exploring how wet season rainfall affects the stability of the rainforest system is presented. Citation: Ivanov, V. Y., L. R. Hutyra, S. C. Wofsy, J. W. Munger, S. R. Saleska, R. C. de Oliveira Jr., and P. B. de Camargo (2012), Root niche separation can explain avoidance of seasonal drought stress and vulnerability of overstory trees to extended drought in a mature Amazonian forest, Water Resour. Res., 48, W12507, doi:10.1029/2012WR011972.
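The root niche separation mechanism can be caricatured with a toy two-bucket water balance (all storages, demands and rainfall values below are made up, and this is in no way a substitute for tRIBS+VEGGIE): the understory draws on a shallow store refilled by current rainfall, while the overstory draws on a deep store recharged by wet-season drainage.

import numpy as np

rain = np.array([300, 300, 280, 250, 150, 50, 20, 15, 20, 60, 150, 250])  # mm/month (hypothetical)
demand_over, demand_under = 100.0, 40.0     # monthly transpiration demand (mm)
shallow_cap, deep_cap = 80.0, 600.0         # storage capacities (mm)
shallow, deep = 40.0, 400.0                 # initial storages (mm)

for month, p in enumerate(rain, start=1):
    infiltration = min(p, shallow_cap - shallow)   # rain first refills the shallow layer
    shallow += infiltration
    drainage = p - infiltration                    # excess percolates to the deep store
    deep = min(deep + drainage, deep_cap)
    t_under = min(demand_under, shallow); shallow -= t_under   # understory taps shallow water
    t_over = min(demand_over, deep); deep -= t_over            # overstory taps the deep store
    print(f"month {month:2d}: overstory T = {t_over:5.1f} mm, deep store = {deep:5.1f} mm")

In this caricature the overstory keeps transpiring through the dry months only as long as wet-season drainage has filled the deep store, which is the qualitative behavior the abstract attributes to deep roots with niche separation.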
Abstract:
In this paper, the effects of uncertainty and expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-based Design Optimization (RBDO) has emerged as an alternative to properly model the safety-under-uncertainty part of the problem. With RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results are dependent on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct. Hence, a general equivalence between the formulations cannot be established. Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but will depend on the actual structural configuration.
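Schematically, and in generic notation rather than the paper's exact symbols (d: design variables; C_man and C_op: manufacturing/construction and operation/maintenance costs; λ: safety factor; P_f: failure probability; c_{f,i}: monetary consequence of failure mode i), the three formulations can be contrasted as:

\begin{align*}
\text{DDO:}\;  & \min_{\mathbf{d}} \; C_{\text{man}}(\mathbf{d})
               && \text{s.t. } \lambda(\mathbf{d}) \ge \lambda_{\min} \\
\text{RBDO:}\; & \min_{\mathbf{d}} \; C_{\text{man}}(\mathbf{d})
               && \text{s.t. } P_f(\mathbf{d}) \le P_f^{\text{target}} \\
\text{RO:}\;   & \min_{\mathbf{d}} \; C_{\text{tot}}(\mathbf{d})
                  = C_{\text{man}}(\mathbf{d}) + C_{\text{op}}(\mathbf{d})
                  + \sum_i P_{f,i}(\mathbf{d})\, c_{f,i}
\end{align*}

Only the RO objective carries the expected costs of failure, which is why its optimum generally does not coincide with DDO or RBDO solutions even when the latter use the RO-optimal safety factor or failure probability as constraints.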
Abstract:
Doctoral Program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería (Intelligent Systems and Numerical Applications in Engineering)
Abstract:
In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of the results. In every area concerned with noise measurement many standards are available, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard concerning the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations. The standard requires appropriate methods of analysis to be used for estimating the uncertainty of measurement. From this point of view, for a testing laboratory performing sound power measurement according to specific ISO standards and European Directives, the estimation of measurement uncertainty is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that depend on the number of microphones arranged on the surface, their spatial positions and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that affect this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
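As context, the basic computation behind such measurements energy-averages the sound pressure levels over the microphone array and adds the measurement-surface term (the background-noise and environmental corrections K1 and K2 of ISO 3744 are omitted here, and the levels below are hypothetical):

import numpy as np

spl = np.array([78.2, 79.1, 77.5, 80.3, 78.8, 79.6, 77.9, 78.4])  # dB at each microphone (hypothetical)
surface_area = 2 * np.pi * 1.0**2          # hemispherical measurement surface of radius 1 m, in m^2

lp_mean = 10 * np.log10(np.mean(10 ** (spl / 10)))     # surface-averaged sound pressure level (dB)
lw = lp_mean + 10 * np.log10(surface_area / 1.0)        # sound power level (dB), reference area S0 = 1 m^2
spread = np.std(spl, ddof=1)                            # dispersion across microphone positions

print(f"Lp_mean = {lp_mean:.1f} dB, Lw = {lw:.1f} dB, std over array = {spread:.2f} dB")

The spread across microphone positions is one ingredient of the array-dependent standard deviation the abstract refers to; the systematic error component is what makes comparisons between different arrays delicate.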
Abstract:
The hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that flooding, and water in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damages can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river behavior; this is only a means for reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since in the literature there is often confusion on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction according to its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast? Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must be able to correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and also to assess the probability of the time of flooding.
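A minimal sketch of the last point: given an ensemble of water-level forecasts (here a synthetic stand-in, not output of any real forecasting chain), the decision-relevant quantities are the probability of exceeding the flooding threshold within the time needed to deploy the intervention and the timing of that exceedance.

import numpy as np

rng = np.random.default_rng(0)
n_members, n_hours = 200, 48
threshold = 6.0                                  # flooding level (m), hypothetical
horizon = 18                                     # hours needed to implement the intervention

# Stand-in ensemble: a deterministic forecast plus accumulated noise per member.
base = 4.0 + 1.5 * np.sin(np.linspace(0, np.pi, n_hours))
ensemble = base + np.cumsum(rng.normal(0, 0.08, size=(n_members, n_hours)), axis=1)

exceed = ensemble[:, :horizon] >= threshold
exceed_any = exceed.any(axis=1)
print(f"P(flooding within {horizon} h) = {exceed_any.mean():.2f}")
if exceed_any.any():
    first_hit = exceed.argmax(axis=1)[exceed_any]     # first exceedance hour per flooding member
    print("median flooding time among flooding members:", np.median(first_hit), "h")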
Abstract:
Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast the real-time current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows a substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
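A deliberately simplified, non-Bayesian caricature of the data fusion idea (ordinary least squares on synthetic data, without the first-difference structure, spatially varying coefficients or full posterior of the actual downscaler):

import numpy as np

rng = np.random.default_rng(1)
model_out = rng.uniform(30, 80, size=200)                     # synthetic gridded 8-h ozone output (ppb)
obs = 5.0 + 0.85 * model_out + rng.normal(0, 4.0, size=200)   # synthetic collocated station observations

X = np.column_stack([np.ones_like(model_out), model_out])
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)                # additive and multiplicative bias
resid_sd = np.std(obs - X @ beta, ddof=2)                     # rough predictive spread

new_forecast = 65.0                                           # next model output at a station (ppb), hypothetical
corrected = beta[0] + beta[1] * new_forecast
print(f"raw model: {new_forecast:.1f} ppb, downscaled: {corrected:.1f} ± {2*resid_sd:.1f} ppb (approx. 95%)")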
Abstract:
In this thesis we address a collection of Network Design problems that are strongly motivated by applications from Telecommunications, Logistics and Bioinformatics. In most cases we justify the need to take uncertainty in some of the problem parameters into account, and different Robust Optimization models are used to hedge against it. Mixed integer linear programming formulations, along with sophisticated algorithmic frameworks, are designed, implemented and rigorously assessed for the majority of the studied problems. The obtained results yield the following observations: (i) relevant real problems can be effectively represented as (discrete) optimization problems within the framework of network design; (ii) uncertainty can be appropriately incorporated into the decision process if a suitable robust optimization model is considered; (iii) optimal, or nearly optimal, solutions can be obtained for large instances if a tailored algorithm that exploits the structure of the problem is designed; (iv) a systematic and rigorous experimental analysis allows one to understand both the characteristics of the obtained (robust) solutions and the behavior of the proposed algorithm.
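As a small illustration of observation (ii), the sketch below checks a single capacity constraint of a candidate design under budgeted (Bertsimas-Sim style) uncertainty, in which at most Gamma coefficients deviate from their nominal values simultaneously; all numbers are hypothetical.

import numpy as np

a_nom = np.array([4.0, 3.0, 5.0, 2.0, 6.0])    # nominal coefficients (e.g. demands), hypothetical
a_dev = np.array([1.0, 0.5, 2.0, 0.5, 1.5])    # maximum deviations
b = 14.0                                        # right-hand side (e.g. installed capacity)
x = np.array([1, 0, 1, 1, 0])                   # candidate design (which demands are routed here)

def robust_lhs(x, gamma):
    nominal = a_nom @ x
    worst = np.sort(a_dev * np.abs(x))[::-1][:gamma].sum()   # Gamma largest possible deviations
    return nominal + worst

for gamma in range(len(x) + 1):
    lhs = robust_lhs(x, gamma)
    print(f"Gamma = {gamma}: worst-case LHS = {lhs:.1f}, feasible = {lhs <= b}")

Sweeping Gamma from 0 (nominal problem) to the number of uncertain coefficients shows the usual trade-off: more protection against deviations eventually renders the candidate design infeasible, which is the price of robustness the models in the thesis are designed to control.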