940 results for Calibration uncertainty
Abstract:
A fuzzy rule-based system was developed in this study, resulting in an index that indicates the level of uncertainty in commercial transactions between cassava growers and their dealers. The system was grounded in the Transaction Cost Economics approach and built from input variables describing information sharing between grower and dealer on "Demand/Purchase Forecasting", "Production Forecasting" and "Production Innovation". The output variable is the level of uncertainty in the transaction between seller and buyer, which may serve as a mechanism for detecting inefficiencies. Evidence from 27 cassava growers registered in the Regional Development Offices of Tupa and Assis, São Paulo, Brazil, and 48 of their dealers supported the development of the system. The mathematical model indicated that 55% of the growers present a Very High level of uncertainty and 33% present a Medium or High level; the others present a Low or Very Low level of uncertainty. From the model, simulations of external interventions can be run in order to reduce the level of uncertainty and thus lower transaction costs.
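The abstract does not give the actual membership functions or rule base, so the following is a minimal, hypothetical Mamdani-style sketch of how such an uncertainty index can be computed. It uses only two illustrative rules (little information sharing → high uncertainty, much sharing → low uncertainty); every function name and parameter value here is an assumption for illustration, not the authors' system.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def uncertainty_index(demand, production, innovation):
    """Inputs in [0, 10]: degree of information sharing on demand/purchase
    forecasting, production forecasting and production innovation.
    Output in [0, 10]: level of transaction uncertainty."""
    # Fuzzify each input into LOW / HIGH sharing.
    low = [tri(v, -5, 0, 5) for v in (demand, production, innovation)]
    high = [tri(v, 5, 10, 15) for v in (demand, production, innovation)]
    # Rule firing strengths (min = fuzzy AND):
    # R1: all sharing LOW  -> uncertainty HIGH (output peak at 10)
    # R2: all sharing HIGH -> uncertainty LOW  (output peak at 0)
    fire_high = min(low)
    fire_low = min(high)
    total = fire_high + fire_low
    if total == 0.0:
        return 5.0  # no rule fires: neutral output
    # Height defuzzification: weighted average of the output peaks.
    return (10.0 * fire_high + 0.0 * fire_low) / total
```

With no information sharing on any variable the index reaches Very High (10); with full sharing it drops to Very Low (0), mirroring the qualitative levels reported in the abstract.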
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
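As a deliberately tiny illustration of the parameter-model / process-model / data-model hierarchy the authors describe, the sketch below simulates a latent ecological state observed with measurement error and recovers it by normal-normal shrinkage. All distributions and values are hypothetical, not taken from the article.

```python
import numpy as np

def hierarchical_sim(n_sites=200, seed=0):
    """Three-level hierarchy: parameters -> latent process -> noisy data."""
    rng = np.random.default_rng(seed)
    mu, tau, sigma = 5.0, 1.0, 0.5       # parameter model (fixed here)
    z = rng.normal(mu, tau, n_sites)     # process model: true latent state
    y = rng.normal(z, sigma)             # data model: measurement error
    return z, y

def shrinkage_estimate(y, mu, tau, sigma):
    """Posterior mean of z given y (normal-normal conjugacy): a
    precision-weighted blend of the observation and the process mean."""
    w = tau ** 2 / (tau ** 2 + sigma ** 2)
    return w * y + (1 - w) * mu
```

The point of the hierarchy is visible in `shrinkage_estimate`: the latent state is estimated by partially pooling each noisy observation toward the process mean, with weights set by the relative magnitudes of process and measurement uncertainty.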
Abstract:
Analytical methods accounting for imperfect detection are often used to facilitate reliable inference in population and community ecology. We contend that similar approaches are needed in disease ecology because these complicated systems are inherently difficult to observe without error. For example, wildlife disease studies often assign individuals, populations, or spatial units to states (e.g., susceptible, infected, post-infected), but the uncertainty associated with these state assignments remains largely ignored or unaccounted for. We demonstrate how recent developments incorporating observation error through repeated sampling extend quite naturally to hierarchical spatial models of disease effects, prevalence, and dynamics in natural systems. A highly pathogenic strain of avian influenza virus in migratory waterfowl and a pathogenic fungus recently implicated in the global loss of amphibian biodiversity are used as motivating examples. Both show that relatively simple modifications to study designs can greatly improve our understanding of complex spatio-temporal disease dynamics by rigorously accounting for uncertainty at each level of the hierarchy.
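The imperfect-detection point can be made concrete with a single-season occupancy-style calculation (a generic textbook construction, not the authors' actual models): with repeated visits, a site that never tests positive may still be infected but missed. Parameter names below are illustrative.

```python
from math import comb

def detection_likelihood(psi, p, detections, n_visits):
    """Likelihood of a site's detection history: psi = P(occupied/infected),
    p = P(detect | occupied); 'detections' positives out of n_visits."""
    if detections > 0:
        return (psi * comb(n_visits, detections)
                * p ** detections * (1 - p) ** (n_visits - detections))
    # All-negative history: occupied but always missed, or truly unoccupied.
    return psi * (1 - p) ** n_visits + (1 - psi)

def prob_occupied_given_no_detection(psi, p, n_visits):
    """Posterior probability the site is occupied despite n_visits negatives."""
    missed = psi * (1 - p) ** n_visits
    return missed / (missed + (1 - psi))
```

For example, with psi = p = 0.5 and two visits, a site with no detections still has a 20% chance of being infected — exactly the state-assignment uncertainty that repeated sampling is designed to quantify.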
Abstract:
Categorical data cannot be interpolated directly because they are outcomes of discrete random variables. Instead, the categories are transformed into indicator variables that can be handled by interpolation methods, and the interpolated indicator values are then back-transformed to the original categories. However, aspects such as the variability and uncertainty of interpolated categorical values have never been considered. In this paper we show that the interpolation variance can be used to map an uncertainty zone around the boundaries between categories. Moreover, it is shown that the interpolation variance is a component of the total variance of the categorical variables, as measured by the coefficient of unalikeability.
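A minimal sketch of the indicator workflow described above — coding categories as indicators, interpolating them, and measuring uncertainty with the coefficient of unalikeability. For self-containment it uses inverse-distance weighting as the interpolator; the paper's geostatistical machinery, variogram modeling, and exact variance decomposition are not reproduced here.

```python
import numpy as np

def indicator_transform(labels, categories):
    """One-hot indicator coding of a categorical variable."""
    return np.array([[1.0 if lab == c else 0.0 for c in categories]
                     for lab in labels])

def idw_probabilities(x0, xs, indicators, power=2.0):
    """Inverse-distance interpolation of indicators at x0 ->
    estimated category probabilities (non-negative, summing to 1)."""
    d = np.linalg.norm(xs - x0, axis=1)
    if np.any(d == 0):           # x0 coincides with a data point
        return indicators[np.argmin(d)]
    w = d ** -power
    w /= w.sum()
    return w @ indicators

def unalikeability(p):
    """Coefficient of unalikeability: probability that two independent
    draws fall in different categories. It peaks near boundaries
    between categories, where probabilities are most mixed."""
    p = np.asarray(p)
    return 1.0 - np.sum(p ** 2)
```

Halfway between an "A" point and a "B" point the interpolated probabilities are (0.5, 0.5) and unalikeability is maximal (0.5), which is the uncertainty-zone behavior the abstract describes; at a data point it collapses to 0.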
Abstract:
Objective: To compare two methods of respiratory inductive plethysmography (RIP) calibration in three different positions. Methods: We evaluated 28 healthy subjects (18 women and 10 men) with a mean age of 25.4 ± 3.9 years. For all subjects, isovolume maneuver calibration (ISOCAL) and qualitative diagnostic calibration (QDC) were used in the orthostatic, sitting, and supine positions. To evaluate the agreement between the two calibration methods, we used ANOVA and Bland-Altman plots. Results: The values of the constant of proportionality (K) differed significantly between ISOCAL and QDC in all three positions: 1.6 ± 0.5 vs. 2.0 ± 1.2 in the supine position, 2.5 ± 0.8 vs. 0.6 ± 0.3 in the sitting position, and 2.0 ± 0.8 vs. 0.6 ± 0.3 in the orthostatic position (p < 0.05 for all). Conclusions: Our results suggest that QDC is an inaccurate method for the calibration of RIP. The K values obtained with ISOCAL indicate that RIP should be recalibrated for each position evaluated.
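The Bland-Altman comparison used above is a standard agreement analysis: bias (mean difference between methods) plus 95% limits of agreement. A generic sketch follows; the data values are synthetic, not the study's measurements.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods.
    Returns (bias, (lower_limit, upper_limit))."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b                       # paired differences
    bias = diff.mean()                 # systematic offset between methods
    sd = diff.std(ddof=1)              # spread of the disagreement
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A large bias (as between the ISOCAL and QDC K values reported above) or wide limits of agreement indicates the two calibration methods cannot be used interchangeably.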
Abstract:
The leaf area index (LAI) is a key characteristic of forest ecosystems. Estimations of LAI from satellite images generally rely on spectral vegetation indices (SVIs) or radiative transfer model (RTM) inversions. We have developed a new and precise method suitable for practical application, consisting of building a species-specific SVI that is best suited to both sensor and vegetation characteristics. Such an SVI requires calibration on a large number of representative vegetation conditions. We developed a two-step approach: (1) estimation of LAI on a subset of satellite data through RTM inversion; and (2) calibration of a vegetation index on these estimated LAI values. We applied this methodology to Eucalyptus plantations, which have highly variable LAI in time and space. Previous results showed that an RTM inversion of Moderate Resolution Imaging Spectroradiometer (MODIS) near-infrared and red reflectance allowed good retrieval performance (R² = 0.80, RMSE = 0.41), but was computationally difficult. Here, the RTM results were used to calibrate a dedicated vegetation index (called "EucVI") which gave similar LAI retrieval results in a simpler way. The R² of the regression between measured and EucVI-simulated LAI values on a validation dataset was 0.68, and the RMSE was 0.49. The additional use of stand age and day of year in the SVI equation slightly increased the performance of the index (R² = 0.77 and RMSE = 0.41). This simple index opens the way to an easily applicable retrieval of Eucalyptus LAI from MODIS data, which could be used operationally.
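Step (2) of the approach — regressing RTM-inverted LAI on an index plus stand age and day of year — can be sketched as an ordinary least-squares fit. The actual EucVI functional form is not given in the abstract; the NDVI-based linear model below is an assumed stand-in to show the calibration mechanics.

```python
import numpy as np

def fit_vegetation_index(nir, red, age, doy, lai_rtm):
    """Calibrate an empirical LAI ~ f(NDVI, stand age, day of year)
    regression against RTM-inverted LAI values (OLS).
    Returns coefficients [intercept, ndvi, age, doy]."""
    ndvi = (nir - red) / (nir + red)
    X = np.column_stack([np.ones_like(ndvi), ndvi, age, doy])
    coef, *_ = np.linalg.lstsq(X, lai_rtm, rcond=None)
    return coef

def rmse(pred, obs):
    """Root mean square error, as used to validate the index."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))
```

Once the coefficients are fitted on the RTM subset, applying the index to new MODIS reflectances is a single dot product — the computational simplification the abstract emphasizes relative to full RTM inversion.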
Abstract:
In this paper, the effects of uncertainty and of the expected costs of failure on optimum structural design are investigated by comparing three distinct formulations of structural optimization problems. Deterministic Design Optimization (DDO) allows one to find the shape or configuration of a structure that is optimum in terms of mechanics, but the formulation grossly neglects parameter uncertainty and its effects on structural safety. Reliability-Based Design Optimization (RBDO) has emerged as an alternative that properly models the safety-under-uncertainty part of the problem: with RBDO, one can ensure that a minimum (and measurable) level of safety is achieved by the optimum structure. However, results depend on the failure probabilities used as constraints in the analysis. Risk Optimization (RO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the monetary consequences of failure, as well as the costs associated with construction, operation and maintenance. RO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RO solution is found first, and its optimum results are used as constraints in DDO and RBDO. Results show that even when optimum safety coefficients are used as constraints in DDO, the formulation leads to configurations that respect these design constraints and reduce manufacturing costs, but increase total expected costs (including the expected costs of failure). When the (optimum) system failure probability is used as a constraint in RBDO, this solution also reduces manufacturing costs while increasing total expected costs. This happens when the costs associated with different failure modes are distinct; hence, a general equivalence between the formulations cannot be established.
Optimum structural design considering expected costs of failure cannot be controlled solely by safety factors nor by failure probability constraints, but depends on the actual structural configuration.
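The risk-optimization trade-off can be illustrated on a toy one-dimensional member (not one of the paper's examples, and with made-up cost and load figures): manufacturing cost grows with cross-section area while expected failure cost shrinks, so total expected cost has an interior minimum that neither a safety factor nor a failure-probability constraint alone pins down.

```python
import math

def failure_probability(area, load_mean=10.0, load_sd=2.0, strength=5.0):
    """Pf = P(load > capacity) for capacity = strength * area and a
    normally distributed load (illustrative values)."""
    z = (strength * area - load_mean) / load_sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - Phi(z)

def total_expected_cost(area, unit_cost=1.0, cost_of_failure=1000.0):
    """RO objective: manufacturing cost + expected (monetized) failure cost."""
    return unit_cost * area + cost_of_failure * failure_probability(area)

def risk_optimum(areas):
    """Grid search for the area minimizing total expected cost."""
    return min(areas, key=total_expected_cost)
```

A DDO solution would pick the cheapest area satisfying a fixed safety factor; the RO optimum instead shifts with `cost_of_failure`, reproducing the abstract's point that expected failure costs, not safety coefficients alone, control the economic optimum.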
Abstract:
The continental margin of southeast Brazil is elevated. Onshore Tertiary basins and Late Cretaceous/Paleogene intrusions are good evidence for post-breakup tectono-magmatic activity. To constrain the impact of post-rift reactivation on the geological history of the area, we carried out a new thermochronological study. Apatite fission track ages range from 60.7 ± 1.9 Ma to 129.3 ± 4.3 Ma, mean track lengths from 11.41 ± 0.23 µm to 14.31 ± 0.24 µm, and a subset of the (U-Th)/He ages ranges from 45.1 ± 1.5 to 122.4 ± 2.5 Ma. Results of inverse thermal history modeling generally support the conclusions from an earlier study for a Late Cretaceous phase of cooling. Around the onshore Taubate Basin, for a limited number of samples, the first detectable period of cooling occurred during the Early Tertiary. The inferred thermal histories for many samples also imply subsequent reheating followed by Neogene cooling. Given the uncertainty of the inversion results, we performed deterministic forward modeling to assess the range of possibilities for this Tertiary part of the thermal history. The evidence for reheating seems to be robust around the Taubate Basin, but elsewhere the data cannot discriminate between this and a less complex thermal history. However, forward modeling results and geological information support the conclusion that the whole area underwent cooling during the Neogene. The synchronicity of the cooling phases with Andean tectonics and with those in NE Brazil leads us to infer a plate-wide compressional stress that reactivated inherited structures. The present-day topographic relief of the margin reflects a contribution from post-breakup reactivation and uplift.
Abstract:
The objective of this work was to parameterize and evaluate the DSSAT/Canegro model for five Brazilian sugarcane varieties. The parameterization was based on biometric and growth data for the varieties CTC 4, CTC 7, CTC 20, RB 86-7515 and RB 83-5486, obtained at five Brazilian sites. A local sensitivity analysis was performed for the main parameters. The model was parameterized using the generalized likelihood uncertainty estimation (GLUE) technique. The predictions were evaluated with the following statistical indicators: the coefficient of determination (R²), Willmott's D index, and the root mean square error (RMSE). The CTC varieties showed D indices between 0.870 and 0.944 for leaf area index, stalk height, tillering and sucrose content. The variety RB 83-5486 showed similar results for sucrose content and stalk fresh mass, while the variety RB 86-7515 showed values between 0.665 and 0.873 for the evaluated variables.
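GLUE, the calibration technique used above, scores Monte Carlo parameter samples with an informal likelihood, keeps the "behavioral" sets above a threshold, and forms likelihood-weighted estimates. The sketch below is a generic one-parameter version with an assumed Gaussian-style likelihood measure, not the DSSAT/Canegro setup.

```python
import numpy as np

def glue(model, obs, prior_samples, sigma=1.0, threshold=0.1):
    """GLUE sketch: score each sampled parameter set against observations,
    retain 'behavioral' sets, return them with a weighted estimate.
    model(p) -> simulated series; obs -> observed series."""
    obs = np.asarray(obs, dtype=float)
    prior_samples = np.asarray(prior_samples, dtype=float)
    # Informal likelihood: higher when simulated output is closer to obs.
    sse = np.array([np.sum((model(p) - obs) ** 2) for p in prior_samples])
    like = np.exp(-sse / (2.0 * sigma ** 2))
    # Behavioral sets: likelihood above a fraction of the best score.
    keep = like >= threshold * like.max()
    w = like[keep] / like[keep].sum()
    return prior_samples[keep], w @ prior_samples[keep]
```

The spread of the retained behavioral sets is GLUE's uncertainty band; the weighted mean serves as the calibrated parameter value.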
Abstract:
Doctoral Program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería
Abstract:
This work presents the calibration and validation of an air quality finite element model applied to emissions from a thermal power plant located in Gran Canaria. The calibration is performed using genetic algorithms. To calibrate and validate the model, the authors use empirical measurements of pollutant concentrations from four stations located near the power plant; an hourly record per station over three days is available. Measurements from three stations are used for calibration, while validation uses measurements from the remaining station…
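Genetic-algorithm calibration of this kind searches a parameter space for the set that minimizes model-versus-station error. The abstract gives no algorithmic details, so the following is a generic real-coded GA sketch (tournament-free elitist selection, blend crossover, Gaussian mutation) with made-up hyperparameters; the fitness function would, in the paper's setting, be the mismatch against the three calibration stations.

```python
import random

def genetic_calibration(fitness, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded GA: keep the best half each generation, breed
    children by blending two elite parents plus Gaussian mutation, and
    clip to bounds. 'fitness' is minimized."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness)[: pop_size // 2]   # elitist selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2.0 + rng.gauss(0.0, 0.1 * (hi - lo))
                     for x, y, (lo, hi) in zip(a, b, bounds)]
            child = [min(max(v, lo), hi) for v, (lo, hi) in zip(child, bounds)]
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)
```

Because the elite half is carried over unchanged, the best parameter set found so far is never lost, which is what makes even this crude GA converge on smooth calibration objectives.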