989 results for uncertainty estimation
Abstract:
Aerodynamic balances are employed in wind tunnels to estimate the forces and moments acting on the model under test. This paper proposes a methodology for the assessment of uncertainty in the calibration of an internal multi-component aerodynamic balance. In order to obtain a suitable model to provide aerodynamic loads from the balance sensor responses, a calibration is performed prior to the tests by applying known weights to the balance. A multivariate polynomial fitting by the least squares method is used to interpolate the calibration data points. The uncertainties of both the applied loads and the readings of the sensors are considered in the regression. The data reduction includes the estimation of the calibration coefficients, the predicted values of the load components and their corresponding uncertainties, as well as the goodness of fit.
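As a rough illustration of the multivariate least-squares calibration step described above, the sketch below fits a second-order polynomial mapping sensor responses to one load component and propagates the fit residual variance to a predicted load. The balance layout, synthetic data, and variable names are assumptions for illustration, not the paper's calibration design.

```python
import numpy as np

# Illustrative sketch: fit a second-order multivariate polynomial that maps
# balance sensor readings to one load component, using least squares.
# Variable names and the synthetic data are assumptions, not from the paper.

rng = np.random.default_rng(0)

n_points = 60          # calibration points (applied known weights)
n_sensors = 3          # balance sensor channels

R = rng.uniform(-1.0, 1.0, size=(n_points, n_sensors))   # sensor responses
true_coeffs = rng.normal(size=10)

def design_matrix(R):
    """Second-order polynomial terms: 1, r_i, r_i*r_j (i <= j)."""
    cols = [np.ones(len(R))]
    cols += [R[:, i] for i in range(R.shape[1])]
    for i in range(R.shape[1]):
        for j in range(i, R.shape[1]):
            cols.append(R[:, i] * R[:, j])
    return np.column_stack(cols)

X = design_matrix(R)
F = X @ true_coeffs + rng.normal(scale=0.01, size=n_points)  # applied load + noise

# Ordinary least-squares estimate of the calibration coefficients
beta, _, _, _ = np.linalg.lstsq(X, F, rcond=None)

# Coefficient covariance (propagates reading scatter to the coefficients)
dof = n_points - X.shape[1]
sigma2 = np.sum((F - X @ beta) ** 2) / dof
cov_beta = sigma2 * np.linalg.inv(X.T @ X)

# Predicted load and its standard uncertainty at a new sensor reading
r_new = design_matrix(rng.uniform(-1, 1, size=(1, n_sensors)))
f_pred = (r_new @ beta).item()
u_pred = np.sqrt(r_new @ cov_beta @ r_new.T).item()
print(f"predicted load = {f_pred:.4f} +/- {u_pred:.4f} (1 sigma)")
```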
Abstract:
Both in industry and research, the quality control of micrometric manufactured parts is based on the measurement of parameters whose traceability is sometimes difficult to guarantee. For some of these parts, confocal microscopy is well suited to characterizing a measurand both qualitatively and quantitatively. Confocal microscopy allows the acquisition of 2D and 3D images that are easily manipulated. Nowadays, this equipment is manufactured by many different brands, each claiming a resolution that may not match its real performance. The Laser Center (Technical University of Madrid) uses a confocal microscope to verify the dimensions of micro-machined features in its own research projects. The present study aims to confirm that the magnitudes obtained are true and reliable. To achieve this, a methodology for confocal microscope calibration is proposed, together with an experimental phase in which the equipment is dimensionally evaluated at four different standard positions, with its seven magnifications and the six objective lenses it currently has, along the x–y and z axes. From the results, the uncertainty is estimated, along with an analysis of the effect of the different magnifications for each objective lens.
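A minimal sketch of the kind of effect analysis and uncertainty estimate mentioned above might look like the following: a one-way ANOVA testing whether magnification influences the measured length of a standard, plus a type A standard uncertainty from repeats. The measurements, magnification labels, and nominal length are assumptions; real data would come from the confocal microscope.

```python
import numpy as np
from scipy import stats

# Sketch: one-way ANOVA for the effect of magnification on the measured
# length of a dimensional standard, with synthetic repeat measurements.

rng = np.random.default_rng(3)
nominal = 100.0   # nominal standard length, micrometres (assumed)

# Repeated measurements of the same standard at three magnifications (assumed)
mag_10x = nominal + rng.normal(0.00, 0.05, size=10)
mag_20x = nominal + rng.normal(0.02, 0.05, size=10)
mag_50x = nominal + rng.normal(0.05, 0.05, size=10)

f_stat, p_value = stats.f_oneway(mag_10x, mag_20x, mag_50x)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Type A standard uncertainty of the mean at one magnification
u_a = mag_10x.std(ddof=1) / np.sqrt(mag_10x.size)
print(f"mean at 10x = {mag_10x.mean():.3f} um, u_A = {u_a:.3f} um")
```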
Abstract:
Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Quantification of model prediction uncertainty is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based model, the integrated catchment model of phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km²), a River Wye tributary on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total) data, for a limited number of reaches, are used to initially assess uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as land-use-type fractional areas) can achieve higher model fits than a previously expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash-Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets which simultaneously describe all the observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there are not sufficient data to support their many parameters.
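The GLUE procedure referred to above can be sketched as follows: Monte Carlo parameter sampling, a likelihood measure (here Nash-Sutcliffe efficiency with the 0.3 behavioural threshold mentioned in the abstract), and likelihood-weighted 5%/95% prediction bounds. The toy one-parameter "model" and synthetic observations are assumptions, not the INCA-P setup.

```python
import numpy as np

# Minimal GLUE sketch with a toy simulator, a Nash-Sutcliffe likelihood and a
# behavioural threshold of 0.3 (not the INCA-P model).

rng = np.random.default_rng(1)

t = np.arange(100)
obs = 5.0 + 2.0 * np.sin(t / 10.0) + rng.normal(scale=0.3, size=t.size)

def model(theta):
    """Toy simulator standing in for the process-based model."""
    return 5.0 + theta * np.sin(t / 10.0)

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

n_samples = 5000
thetas = rng.uniform(0.0, 4.0, size=n_samples)       # prior sampling
sims = np.array([model(th) for th in thetas])
ns = np.array([nash_sutcliffe(s, obs) for s in sims])

behavioural = ns > 0.3                                # behavioural threshold
weights = ns[behavioural]
weights = weights / weights.sum()                     # likelihood weights

# Likelihood-weighted 5% / 95% prediction bounds at each time step
bounds = np.empty((t.size, 2))
for i in range(t.size):
    order = np.argsort(sims[behavioural, i])
    cdf = np.cumsum(weights[order])
    vals = sims[behavioural, i][order]
    bounds[i, 0] = vals[np.searchsorted(cdf, 0.05)]
    bounds[i, 1] = vals[np.searchsorted(cdf, 0.95)]

coverage = np.mean((obs >= bounds[:, 0]) & (obs <= bounds[:, 1]))
print(f"behavioural sets: {behavioural.sum()}, coverage of bounds: {coverage:.2f}")
```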
Abstract:
Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. Implications: A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they can be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
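A minimal sketch of the idea of sensitivity-based uncertainty bands follows: a sector's emissions are projected from a driving factor, the factor's growth rate is perturbed, and the envelope of the resulting trajectories is taken as a nonstatistical uncertainty band. The sector, the numbers, and the perturbation sizes are assumptions, not figures from the paper.

```python
import numpy as np

# Emission projection as activity x emission factor; uncertainty band from
# perturbing the driving factor's growth rate (all values illustrative).

years = np.arange(2010, 2021)
base_activity = 100.0        # activity level in 2010 (arbitrary units)
emission_factor = 0.8        # kt pollutant per activity unit
growth_central = 0.02        # central annual growth of the driving factor
growth_perturbations = [-0.01, 0.0, +0.01]   # sensitivity cases

def project(growth):
    activity = base_activity * (1.0 + growth) ** (years - years[0])
    return activity * emission_factor

trajectories = np.array([project(growth_central + p) for p in growth_perturbations])

lower = trajectories.min(axis=0)   # lower edge of the band
upper = trajectories.max(axis=0)   # upper edge of the band

for y, lo, hi in zip(years, lower, upper):
    print(f"{y}: {lo:.1f} - {hi:.1f} kt")
```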
Abstract:
The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design-of-experiment aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to investigate the significant parameters and provide more reliable sensitivity analysis for improving parameterization during hydrological modeling. In the case study, better calibration results were achieved, along with an improved sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: the sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and parameter solution (ParaSol) methods. The results showed that SUFI-2 performed better than the other two methods based on the calibration and uncertainty analysis results. The proposed evaluation scheme demonstrated that it is capable of selecting the most suitable uncertainty method for a case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control equifinality. The results showed that the SMC-CUA method provided better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and controlled parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response-based statistical evaluation method (RESEM) was proposed for estimating uncertainty propagation effects and providing long-term predictions of hydrological responses under changing climatic conditions. Using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total uncertainty propagated from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two sources can be compared and quantified. The feasibility of all the methods was tested using hypothetical and real-world case studies. The proposed methods can also be integrated as a hydrological modeling system to better support hydrological studies under changing climatic conditions. The results from the proposed integrated hydrological modeling system can be used as scientific references for decision makers to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
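When comparing uncertainty analysis methods such as SUFI-2, GLUE and ParaSol, two simple band-quality indices are commonly reported (notably in SUFI-2 applications): the p-factor (fraction of observations enclosed by the prediction band) and the r-factor (average band width relative to the standard deviation of the observations). The sketch below computes both; the synthetic observations and bands are assumptions, not the study's results.

```python
import numpy as np

# p-factor / r-factor sketch for comparing prediction bands from two
# hypothetical uncertainty analysis methods (synthetic data).

def p_factor(obs, lower, upper):
    return np.mean((obs >= lower) & (obs <= upper))

def r_factor(obs, lower, upper):
    return np.mean(upper - lower) / np.std(obs)

rng = np.random.default_rng(2)
obs = 10 + rng.normal(scale=2.0, size=200)

# Hypothetical bands produced by two different uncertainty methods
band_a = (obs - 3.0 + rng.normal(scale=0.5, size=obs.size), obs + 3.0)
band_b = (obs - 1.0, obs + 1.0 + rng.normal(scale=0.5, size=obs.size))

for name, (lo, hi) in {"method A": band_a, "method B": band_b}.items():
    print(f"{name}: p-factor={p_factor(obs, lo, hi):.2f}, "
          f"r-factor={r_factor(obs, lo, hi):.2f}")
```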
Abstract:
A method to quantify lycopene and β-carotene in freeze-dried tomato pulp by high-performance liquid chromatography (HPLC) was validated according to the criteria of selectivity, sensitivity, precision and accuracy, and the measurement uncertainty was estimated from the data obtained in the validation. The validated method is selective for the analysis and showed good precision and accuracy. The detection limits for lycopene and β-carotene were 4.2 and 0.23 mg 100 g⁻¹, respectively. With the expanded uncertainty (k = 2), the result for lycopene was 104 ± 21 mg 100 g⁻¹ and for β-carotene 6.4 ± 1.5 mg 100 g⁻¹.
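The reporting convention used above (an expanded uncertainty with coverage factor k = 2) is typically obtained by combining relative standard uncertainty contributions in quadrature and multiplying by two. The sketch below shows that generic combination; the contribution names and values are illustrative assumptions, not the budget from the validation study.

```python
import math

# Generic uncertainty-budget sketch: combine relative standard uncertainties
# in quadrature and report an expanded uncertainty with k = 2.

result = 104.0                     # measured lycopene, mg 100 g^-1 (from abstract)

relative_contributions = {         # relative standard uncertainties (assumed)
    "repeatability": 0.05,
    "calibration_curve": 0.06,
    "recovery": 0.04,
    "volume_and_mass": 0.02,
}

u_rel = math.sqrt(sum(u ** 2 for u in relative_contributions.values()))
u_combined = u_rel * result        # combined standard uncertainty
U = 2.0 * u_combined               # expanded uncertainty, k = 2

print(f"result = {result:.0f} +/- {U:.0f} mg 100 g^-1 (k = 2)")
```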
Abstract:
The DSSAT/CANEGRO model was parameterized and its predictions evaluated using data from five sugarcane (Saccharum spp.) experiments conducted in southern Brazil. The data used are from two of the most important Brazilian cultivars. Some parameters whose values were either directly measured or considered to be well known were not adjusted. Ten of the 20 parameters were optimized with a generalized likelihood uncertainty estimation (GLUE) algorithm using the leave-one-out cross-validation technique. Model predictions were evaluated against measured data of leaf area index (LAI), stalk and aerial dry mass, sucrose content, and soil water content, using bias, root mean squared error (RMSE), modeling efficiency (Eff), correlation coefficient, and agreement index. With the parameterization reported here, the Decision Support System for Agrotechnology Transfer (DSSAT)/CANEGRO model simulated the sugarcane crop in southern Brazil well. The soil water content predictions were better for the rainfed (mean RMSE = 0.122 mm) than for the irrigated treatment (mean RMSE = 0.214 mm). Predictions were best for aerial dry mass (Eff = 0.850), followed by stalk dry mass (Eff = 0.765) and then sucrose mass (Eff = 0.170). The number of green leaves showed the worst fit (Eff = -2.300). The cross-validation technique permits using multiple datasets that would have limited use if used independently, because of the heterogeneity of measures and measurement strategies.
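The evaluation statistics named above can be computed as in the sketch below, taking modeling efficiency as Nash-Sutcliffe efficiency and the agreement index as Willmott's d; the exact formulations used in the paper may differ, and the simulated/observed arrays are placeholders.

```python
import numpy as np

# Model evaluation statistics: bias, RMSE, modelling efficiency (Nash-Sutcliffe)
# and Willmott's index of agreement (placeholder data).

def bias(sim, obs):
    return np.mean(sim - obs)

def rmse(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

def efficiency(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def willmott_d(sim, obs):
    num = np.sum((sim - obs) ** 2)
    den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

obs = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4])   # e.g. stalk dry mass (assumed)
sim = np.array([1.0, 2.8, 3.0, 5.1, 5.6, 7.9])

for name, fn in [("bias", bias), ("RMSE", rmse),
                 ("Eff", efficiency), ("d", willmott_d)]:
    print(f"{name}: {fn(sim, obs):.3f}")
```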
Abstract:
The objective of this work was to parameterize and evaluate the DSSAT/Canegro model for five Brazilian sugarcane varieties. The parameterization was carried out using biometric and growth data for the varieties CTC 4, CTC 7, CTC 20, RB 86-7515, and RB 83-5486, obtained at five Brazilian locations. A local sensitivity analysis was performed for the main parameters. The model was parameterized using the generalized likelihood uncertainty estimation (GLUE) technique. To evaluate the predictions, the coefficient of determination (R²), Willmott's D index, and the root mean squared error (RMSE) were used as statistical indicators. The CTC varieties showed D-index values between 0.870 and 0.944 for leaf area index, stalk height, tillering, and sucrose content. The variety RB 83-5486 showed similar results for sucrose content and stalk fresh mass, while the variety RB 86-7515 showed values between 0.665 and 0.873 for the evaluated variables.
Abstract:
The paper deals with the development and application of a methodology for automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of the mapping is controlled by the analysis of the raw data and of the residuals using variography. Maps of the probability of exceeding a given decision level and "thick" isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
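A GRNN is essentially Nadaraya-Watson kernel regression with a Gaussian kernel, so the automatic tuning described above can be sketched as a leave-one-out cross-validation search over the single bandwidth of the isotropic case. The 2-D "contamination" data below are synthetic placeholders, not the case-study data.

```python
import numpy as np

# Isotropic GRNN (Gaussian kernel regression) with automatic bandwidth tuning
# by leave-one-out cross-validation (synthetic spatial data).

rng = np.random.default_rng(4)
X = rng.uniform(0, 10, size=(200, 2))                    # sampling locations
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

def grnn_predict(X_train, y_train, X_query, sigma):
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / (w.sum(axis=1) + 1e-12)

def loo_mse(X_train, y_train, sigma):
    d2 = ((X_train[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)                             # leave each point out
    pred = (w @ y_train) / (w.sum(axis=1) + 1e-12)
    return np.mean((pred - y_train) ** 2)

sigmas = np.logspace(-0.5, 1.0, 30)
errors = [loo_mse(X, y, s) for s in sigmas]
best = sigmas[int(np.argmin(errors))]
print(f"tuned bandwidth: {best:.3f}")

grid = np.array([[2.0, 3.0], [7.5, 8.0]])
print("predictions at query points:", grnn_predict(X, y, grid, best))
```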
Abstract:
The implementation of a quality assurance program in chemical analytical laboratories, which can help demonstrate the quality of their results, is an issue of great concern. As a consequence, it is mandatory to give an estimate of the confidence that can be placed on the obtained results. A useful measure of this confidence is the measurement uncertainty and, nowadays, a result without the corresponding uncertainty statement cannot be considered reliable. This paper presents a summary of the most important mechanisms for the evaluation and reporting of measurement uncertainty. To illustrate these principles, the estimation of the measurement uncertainty associated with the preparation of a uranium elemental reference solution at 2.4 mg kg⁻¹ from the corresponding certified reference material (in this example, at 1003 mg kg⁻¹) is described.
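A gravimetric-dilution uncertainty budget of the kind implied above can be sketched as follows, using c_dil = c_stock · m_aliquot / m_total and combining relative standard uncertainties in quadrature. The certified value and target concentration follow the abstract; the masses and all uncertainty figures are illustrative assumptions.

```python
import math

# Uncertainty of preparing a diluted reference solution from a certified stock
# by gravimetric dilution (illustrative values).

c_stock = 1003.0        # mg/kg, certified reference material (from abstract)
u_c_stock = 1.5         # mg/kg, standard uncertainty (assumed)

m_aliquot = 0.2394      # g of stock taken (assumed)
u_m_aliquot = 0.0002    # g, balance uncertainty (assumed)

m_total = 100.05        # g of final solution (assumed)
u_m_total = 0.0005      # g (assumed)

c_dil = c_stock * m_aliquot / m_total

# Relative standard uncertainties combined in quadrature (independent inputs)
u_rel = math.sqrt((u_c_stock / c_stock) ** 2 +
                  (u_m_aliquot / m_aliquot) ** 2 +
                  (u_m_total / m_total) ** 2)
U = 2.0 * u_rel * c_dil   # expanded uncertainty, k = 2

print(f"c_dil = {c_dil:.3f} +/- {U:.3f} mg/kg (k = 2)")
```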
Abstract:
The measurement uncertainty is useful to estimate the confidence in analytical results. Nowadays, a result without the corresponding uncertainty statement cannot be considered reliable, yet the scientific literature still lacks examples of measurement uncertainty estimates. This paper presents a practical and reliable description of the measurement uncertainty estimation for the analytical determination of ethyl carbamate in cachaça by GC-IDMS. The isotope dilution (ID) technique associated with GC-MS was used to improve the accuracy. The estimated uncertainty corresponds to about 10% of the ethyl carbamate mass fraction, (115 ± 11) ng/g, which is consistent with the ppb level of the measurement.
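One common way to apply isotope dilution in practice is to quantify the analyte from the ratio of its signal to that of an isotopically labelled internal standard, via a calibration line of signal ratio versus mass ratio; the sketch below shows that generic scheme. All numbers are assumed for illustration and are not the study's data or its exact IDMS equation.

```python
import numpy as np

# Generic isotope-dilution quantification sketch: calibration of signal ratio
# vs. mass ratio (analyte / labelled standard), then sample quantification.

# Calibration points (assumed)
mass_ratio = np.array([0.2, 0.5, 1.0, 1.5, 2.0])
signal_ratio = np.array([0.21, 0.52, 0.99, 1.51, 2.03])

slope, intercept = np.polyfit(mass_ratio, signal_ratio, 1)

# Sample: measured signal ratio, labelled spike added, sample mass taken (assumed)
r_sample = 0.48
m_spike_ng = 250.0        # ng of labelled ethyl carbamate added
m_sample_g = 1.0          # g of cachaça taken

analyte_ng = (r_sample - intercept) / slope * m_spike_ng
mass_fraction = analyte_ng / m_sample_g      # ng/g

print(f"ethyl carbamate: {mass_fraction:.0f} ng/g")
```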
Abstract:
The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (generalized likelihood uncertainty estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data for constraining the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in the stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggested that there were retention processes or losses, either in peatland/wetland areas or in the river, which were not included in the INCA-N model. The results of the study suggested that soft data are a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based both on runoff and/or nutrient concentration data and on the qualitative knowledge of experimentalists.
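The soft-data idea described above can be sketched as an extra acceptance criterion inside a GLUE loop: a parameter set is retained only if it fits the stream nitrate series acceptably and keeps annual N leaching, plant uptake and mineralization inside expert-given ranges. The toy simulator, parameter meanings and all ranges below are assumptions for illustration, not the INCA-N setup.

```python
import numpy as np

# GLUE with "soft data" constraints: hard criterion on fit to stream nitrate,
# soft criteria on annual fluxes (toy model, illustrative ranges).

rng = np.random.default_rng(5)
obs_nitrate = 2.0 + 0.5 * np.sin(np.arange(365) / 58.0)

def simulate(uptake_rate, mineralization_rate):
    """Toy stand-in for a catchment N model; returns stream series and fluxes."""
    stream = (2.0 + mineralization_rate - uptake_rate
              + 0.5 * np.sin(np.arange(365) / 58.0))
    leaching = 10.0 + 20.0 * (mineralization_rate - uptake_rate)  # kg N ha-1 yr-1
    uptake = 60.0 + 100.0 * uptake_rate
    mineralization = 50.0 + 100.0 * mineralization_rate
    return stream, leaching, uptake, mineralization

def ns_eff(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

soft_ranges = {"leaching": (5.0, 20.0), "uptake": (80.0, 150.0),
               "mineralization": (60.0, 120.0)}

accepted = []
for _ in range(5000):
    u, m = rng.uniform(0.0, 1.0, size=2)
    stream, leach, upt, mins = simulate(u, m)
    hard_ok = ns_eff(stream, obs_nitrate) > 0.5
    soft_ok = (soft_ranges["leaching"][0] <= leach <= soft_ranges["leaching"][1]
               and soft_ranges["uptake"][0] <= upt <= soft_ranges["uptake"][1]
               and soft_ranges["mineralization"][0] <= mins <= soft_ranges["mineralization"][1])
    if hard_ok and soft_ok:
        accepted.append((u, m))

print(f"accepted parameter sets: {len(accepted)} of 5000")
```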
Abstract:
The aim of this study was, within a sensitivity analysis framework, to determine whether additional model complexity gives a better capability to model the hydrology and nitrogen dynamics of a small Mediterranean forested catchment, or whether the additional parameters cause over-fitting. Three nitrogen models of varying hydrological complexity were considered. For each model, general sensitivity analysis (GSA) and generalized likelihood uncertainty estimation (GLUE) were applied, each based on 100,000 Monte Carlo simulations. The results highlighted the most complex structure as the most appropriate, providing the best representation of the non-linear patterns observed in the flow and streamwater nitrate concentrations between 1999 and 2002. Its 5% and 95% GLUE bounds, obtained using a multi-objective approach, provide the narrowest band for streamwater nitrogen, which suggests increased model robustness, though all models exhibit periods of inconsistently good and poor fits between simulated outcomes and observed data. The results confirm the importance of the riparian zone in controlling the short-term (daily) streamwater nitrogen dynamics in this catchment, but not the overall flux of nitrogen from the catchment. It was also shown that as the complexity of a hydrological model increases, over-parameterisation occurs, but the converse is true for a water quality model, where additional process representation leads to additional acceptable model simulations. Water quality data help constrain the hydrological representation in process-based models. Increased complexity was therefore justifiable for modelling river-system hydrochemistry.
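The GSA step referred to above is often implemented as a regional sensitivity analysis: the Monte Carlo runs are split into behavioural and non-behavioural groups by an objective-function threshold, and each parameter's two marginal distributions are compared, for example with a Kolmogorov-Smirnov statistic, where a large statistic flags a sensitive parameter. The toy model, threshold and parameter names below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

# Regional (generalised) sensitivity analysis sketch: behavioural vs.
# non-behavioural parameter distributions compared with a KS statistic.

rng = np.random.default_rng(6)
n_runs = 100_000

# Two parameters sampled uniformly; only the first truly drives the objective
p1 = rng.uniform(0, 1, n_runs)
p2 = rng.uniform(0, 1, n_runs)
objective = 1.0 - (p1 - 0.6) ** 2 + rng.normal(scale=0.05, size=n_runs)

behavioural = objective > 0.95   # behavioural threshold (assumed)

for name, p in [("p1", p1), ("p2", p2)]:
    ks = stats.ks_2samp(p[behavioural], p[~behavioural]).statistic
    print(f"{name}: KS statistic = {ks:.3f}")
```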