974 results for total uncertainty measurement
Abstract:
Soils are an important component of the biogeochemical carbon cycle, storing about four times more carbon than plant biomass and nearly three times more than the atmosphere. Moreover, the carbon content is directly related to the water retention capacity and fertility of the soil, among other properties. Thus, soil carbon quantification under field conditions is an important challenge related to the carbon cycle and global climate change. Nowadays, Laser-Induced Breakdown Spectroscopy (LIBS) can be used for qualitative elemental analyses without previous sample treatment, and the results are obtained quickly. New optical technologies have made portable LIBS systems possible, and the great expectation now is the development of methods that enable quantitative measurements with LIBS. The goal of this work is to calibrate a portable LIBS system to carry out quantitative measurements of carbon in whole tropical soil samples. For this, six samples from the Brazilian Cerrado region (Argisol) were used. Tropical soils have large amounts of iron in their composition, so the carbon line at 247.86 nm suffers strong interference from this element (iron lines at 247.86 and 247.95 nm). For this reason, the carbon line at 193.03 nm was used in this work. Using statistical analysis methods such as simple linear regression, multivariate linear regression and cross-validation, it was possible to obtain correlation coefficients higher than 0.91. These results show the great potential of portable LIBS systems for quantitative carbon measurements in tropical soils. (C) 2008 Elsevier B.V. All rights reserved.
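The univariate calibration step described above can be sketched with ordinary least squares: fit the reference carbon contents against the C 193.03 nm line intensity and report the correlation coefficient. The intensities and carbon contents below are invented for illustration; they are not the paper's Cerrado data.

```python
# Minimal sketch of a univariate LIBS calibration curve (assumed data).

def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, r)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx            # slope of the calibration curve
    a = my - b * mx          # intercept
    r = sxy / (sxx * syy) ** 0.5  # Pearson correlation coefficient
    return a, b, r

# Hypothetical C I 193.03 nm intensities (a.u.) vs. carbon content (g/kg):
intensity = [120.0, 150.0, 185.0, 210.0, 260.0, 300.0]
carbon = [5.1, 6.8, 8.0, 9.4, 11.9, 13.5]
a, b, r = linear_fit(intensity, carbon)
```

In a real calibration the fit would be repeated under leave-one-out cross-validation to check predictive power, as the abstract mentions.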
Abstract:
To check the effectiveness of campaigns preventing drug abuse or to detect local effects of efforts against drug trafficking, it is beneficial to know the consumed amounts of substances at high spatial and temporal resolution. The analysis of drugs of abuse in wastewater (WW) has the potential to provide this information. In this study, the reliability of WW drug consumption estimates is assessed and a novel method is presented to calculate the total uncertainty in observed WW cocaine (COC) and benzoylecgonine (BE) loads. Specifically, uncertainties resulting from discharge measurements, chemical analysis and the applied sampling scheme were addressed, and three approaches are presented. These consist of (i) a generic model-based procedure to investigate the influence of the sampling scheme on the uncertainty of observed or expected drug loads, (ii) a comparative analysis of two analytical methods (high performance liquid chromatography-tandem mass spectrometry and gas chromatography-mass spectrometry), including an extended cross-validation by influent profiling over several days, and (iii) monitoring of COC and BE concentrations in the WW of the largest Swiss sewage treatment plants. In addition, the COC and BE loads observed in the sewage treatment plant of the city of Berne were used to back-calculate the COC consumption. The estimated mean daily consumed amount was 107 ± 21 g of pure COC, corresponding to 321 g of street-grade COC.
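The back-calculation step mentioned above converts an observed BE load into an estimate of consumed pure cocaine via the molar mass ratio and an excretion factor. A minimal sketch follows; the excretion fraction and the daily BE load are assumed placeholder values, not the paper's parameters, while the molar masses are standard.

```python
# Sketch of back-calculating cocaine consumption from a BE wastewater load.

MW_COC = 303.35        # g/mol, cocaine
MW_BE = 289.33         # g/mol, benzoylecgonine
EXCRETED_AS_BE = 0.35  # assumed fraction of a COC dose excreted as BE

def cocaine_consumed(be_load_g_per_day):
    """Estimated pure COC consumed (g/day) from a daily BE load (g/day)."""
    return be_load_g_per_day * (MW_COC / MW_BE) / EXCRETED_AS_BE

estimate = cocaine_consumed(36.0)  # hypothetical daily BE load in grams
```

The uncertainty of such an estimate inherits the discharge, analytical and sampling uncertainties discussed in the abstract, plus the spread of the excretion factor itself.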
Abstract:
A portable Fourier transform spectrometer (FTS), model EM27/SUN, was deployed onboard the research vessel Polarstern to measure the column-averaged dry-air mole fractions of carbon dioxide (XCO2) and methane (XCH4) by means of direct sunlight absorption spectrometry. We report on technical developments as well as data calibration and reduction measures required to achieve the targeted accuracy of fractions of a percent in retrieved XCO2 and XCH4 while operating the instrument under field conditions onboard the moving platform during a 6-week cruise on the Atlantic from Cape Town (South Africa, 34° S, 18° E; 5 March 2014) to Bremerhaven (Germany, 54° N, 19° E; 14 April 2014). We demonstrate that our solar tracker typically achieved a tracking precision of better than 0.05° toward the center of the sun throughout the ship cruise, which facilitates accurate XCO2 and XCH4 retrievals even under harsh ambient wind conditions. We define several quality filters that screen spectra, e.g., when the field of view was partially obstructed by ship structures or when the lines of sight crossed the ship exhaust plume. The measurements in clean oceanic air can be used to characterize a spurious air-mass dependency. After the campaign, deployment of the spectrometer alongside the TCCON (Total Carbon Column Observing Network) instrument at Karlsruhe, Germany, allowed for determining a calibration factor that makes the entire campaign record traceable to World Meteorological Organization (WMO) standards. Comparisons to observations of the GOSAT satellite and concentration fields modeled by the European Centre for Medium-Range Weather Forecasts (ECMWF) Copernicus Atmosphere Monitoring Service (CAMS) demonstrate that the observational setup is well suited to provide validation opportunities above the ocean and along interhemispheric transects.
Abstract:
High precision measurements of the differential cross sections for π⁰ photoproduction at forward angles for two nuclei, ¹²C and ²⁰⁸Pb, have been performed for incident photon energies of 4.9-5.5 GeV to extract the π⁰ → γγ decay width. The experiment was done at Jefferson Lab using the Hall B photon tagger and a high-resolution multichannel calorimeter. The π⁰ → γγ decay width was extracted by fitting the measured cross sections using recently updated theoretical models for the process. The resulting value for the decay width is Γ(π⁰ → γγ) = 7.82 ± 0.14(stat) ± 0.17(syst) eV. With the 2.8% total uncertainty, this result is a factor of 2.5 more precise than the current Particle Data Group average of this fundamental quantity, and it is consistent with current theoretical predictions.
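The quoted 2.8% total uncertainty follows from adding the statistical and systematic uncertainties in quadrature and dividing by the central value:

```python
# Quadrature combination of the quoted stat and syst uncertainties.
gamma = 7.82    # eV, central value of the decay width
u_stat = 0.14   # eV, statistical uncertainty
u_syst = 0.17   # eV, systematic uncertainty

u_total = (u_stat ** 2 + u_syst ** 2) ** 0.5  # absolute total uncertainty
rel_total = u_total / gamma                    # relative total, about 2.8 %
```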
Abstract:
Master's final project submitted for the degree of Master in Chemical and Biological Engineering
Abstract:
Calculation of the uncertainty of results represents the new paradigm in the area of measurement quality in laboratories. The ISO Guide to the Expression of Uncertainty in Measurement asks the analyst to give a parameter that characterizes the range of values that could reasonably be attributed to the result of the measurement. In practice, the uncertainty of the analytical result may arise from many possible sources: sampling, sample preparation, matrix effects, equipment, standards and reference materials, among others. This paper suggests a procedure for calculating the uncertainty components of an analytical result due to sample preparation (uncertainty of weighings and volumetric equipment) and the instrument analytical signal (calibration uncertainty). A numerical example is carefully explained based on measurements obtained for cadmium determination by flame atomic absorption spectrophotometry. The results obtained for the components of the total uncertainty showed that the main contribution to the uncertainty of the analytical result came from the calibration procedure.
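The combination step described above can be sketched in a GUM-style way: for a mass fraction w = c·V/m, the relative standard uncertainties of the calibration (c), the volumetric flask (V) and the balance (m) add in quadrature. All input values below are hypothetical, chosen only to mimic the situation where the calibration term dominates.

```python
# GUM-style combined standard uncertainty for w = c * V / m (assumed values).

def combined_relative_u(components):
    """Quadrature sum of independent relative standard uncertainties."""
    return sum(u ** 2 for u in components.values()) ** 0.5

c, u_c = 0.50, 0.015       # mg/L and its uncertainty, from the calibration curve
V, u_V = 0.025, 0.00002    # L, volumetric flask
m, u_m = 0.5000, 0.0002    # g, analytical balance

w = c * V / m * 1000       # cadmium mass fraction in mg/kg
rel = {
    "calibration": u_c / c,   # 3.0 %  (dominant component)
    "volume": u_V / V,        # 0.08 %
    "mass": u_m / m,          # 0.04 %
}
u_w = w * combined_relative_u(rel)  # combined standard uncertainty of w
```

With these numbers the calibration term dwarfs the preparation terms, which is the qualitative conclusion the paper reports for its cadmium example.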
Abstract:
We establish a methodology for calculating uncertainties in sea surface temperature (SST) estimates from coefficient-based satellite retrievals. The uncertainty estimates are derived independently of in-situ data. This enables validation of both the retrieved SSTs and their uncertainty estimates using in-situ data records. The total uncertainty budget comprises a number of components, arising from uncorrelated (e.g., noise), locally systematic (e.g., atmospheric), large-scale systematic and sampling effects (for gridded products). Distinguishing these components matters when propagating uncertainty across spatio-temporal scales. We apply the method to SST data retrieved from the Advanced Along-Track Scanning Radiometer (AATSR) and validate the results for two different SST retrieval algorithms, both at a per-pixel level and for gridded data. We find good agreement between our estimated uncertainties and validation data. This approach to calculating uncertainties in SST retrievals has wider application to data from other instruments and to the retrieval of other geophysical variables.
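The reason the component split matters for gridded products can be sketched with a simple propagation rule: averaging n pixels reduces an uncorrelated (noise) component by 1/√n, while a component that is systematic across the cell does not average down. The uncertainty magnitudes below are illustrative assumptions, not AATSR values.

```python
# Propagation of uncorrelated vs. systematic uncertainty into a grid-cell mean.

def gridded_uncertainty(u_random, u_systematic, n_pixels):
    """Uncertainty of an n-pixel mean, assuming independent per-pixel noise
    and a fully correlated systematic component."""
    return ((u_random ** 2) / n_pixels + u_systematic ** 2) ** 0.5

u_noise = 0.15  # K per pixel, uncorrelated (assumed)
u_sys = 0.10    # K, locally systematic (assumed)

single = gridded_uncertainty(u_noise, u_sys, 1)    # one pixel
cell = gridded_uncertainty(u_noise, u_sys, 100)    # 100-pixel cell mean
```

For the 100-pixel cell the noise term is nearly negligible and the systematic term dominates, which is why a single lumped uncertainty number cannot be propagated correctly across scales.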
Abstract:
Recent works (Evelpidou et al., 2012) suggest that the modern tidal notch is disappearing worldwide due to sea level rise over the last century. In order to assess this hypothesis, we measured modern tidal notches at several sites along the Mediterranean coasts. We report observations on tidal notches cut along carbonate coasts at 73 sites in Italy, France, Croatia, Montenegro, Greece, Malta and Spain, plus additional observations carried out outside the Mediterranean. At each site, we measured notch width and depth, and we described the characteristics of the biological rim at the base of the notch. We correlated these parameters with wave energy, tide gauge datasets and rock lithology. Our results suggest that considering 'the development of tidal notches the consequence of midlittoral bioerosion' (as done in Evelpidou et al., 2012) is a simplification that can lead to misleading results, such as stating that notches are disappearing. Important roles in notch formation can also be played by wave action, the rate of karst dissolution, salt weathering, and wetting and drying cycles. Notch formation can, of course, also be augmented and favoured by bioerosion, which can, in particular cases, be the main process of notch formation and development. Our dataset shows that notches are carved by an ensemble of processes rather than by a single one, both today and in the past, and that it is difficult, if not impossible, to disentangle them and establish which one prevails. We therefore show that tidal notches are still forming, challenging the hypothesis that sea level rise has drowned them.
Abstract:
The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where the additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design of experiment aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to investigate the significant parameters and provide a more reliable sensitivity analysis for improving parameterization during hydrological modeling. In the case study, better calibration results were achieved, along with an advanced sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: the sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and Parameter Solution (ParaSol) methods. The results showed that SUFI-2 performed better than the other two methods based on calibration and uncertainty analysis results. The proposed evaluation scheme proved capable of selecting the most suitable uncertainty method for case studies. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control the phenomenon of equifinality.
The results showed that the SMC-CUA method was able to provide better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and to control parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response based statistical evaluation method (RESEM) was proposed for estimating the propagated effects of uncertainty and providing long-term predictions of hydrological responses under changing climatic conditions. Using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total uncertainty propagated from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two uncertainty sources can be compared and quantified. The feasibility of all the methods has been tested using hypothetical and real-world case studies. The proposed methods can also be integrated into a hydrological modeling system to better support hydrological studies under changing climatic conditions. The results from the proposed integrated hydrological modeling system can serve as scientific references for decision makers, helping to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
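The GLUE method referenced in the abstract rests on a simple idea: sample many parameter sets, keep the "behavioural" ones whose likelihood measure exceeds a threshold, and weight predictions by the normalised likelihood. A toy sketch follows; the model, data, threshold and likelihood choice are synthetic stand-ins, not the thesis's hydrological setup.

```python
# Toy GLUE sketch: behavioural sampling with a Nash-Sutcliffe likelihood.
import random

random.seed(1)
obs = [2.0, 4.0, 6.0, 8.0]      # synthetic "observed" discharges
inputs = [1.0, 2.0, 3.0, 4.0]   # corresponding model inputs

def model(theta, x):
    return theta * x            # trivially simple stand-in model

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood measure."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

samples = [random.uniform(0.5, 3.5) for _ in range(2000)]
behavioural = []
for theta in samples:
    sim = [model(theta, x) for x in inputs]
    like = nse(sim, obs)
    if like > 0.5:              # behavioural threshold (assumed)
        behavioural.append((theta, like))

total_like = sum(l for _, l in behavioural)
theta_mean = sum(t * l for t, l in behavioural) / total_like
```

The spread of the behavioural set is what GLUE reports as parameter uncertainty; the equifinality problem the thesis targets is visible here as the whole range of theta values that pass the threshold.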
Abstract:
In this paper we introduce a formation control loop that maximizes the performance of the cooperative perception of a tracked target by a team of mobile robots, while maintaining the team in formation with a dynamically adjustable geometry that is a function of the quality of the target perception by the team. In the formation control loop, the controller module is a distributed non-linear model predictive controller, and the estimator module fuses local estimates of the target state obtained by a particle filter at each robot. The two modules and their integration are described in detail, including a real-time database associated with a wireless communication protocol that facilitates the exchange of state data while reducing collisions among team members. Simulation and real-robot results for indoor and outdoor teams of different robots are presented. The results highlight how our method successfully enables a team of homogeneous robots to minimize the total uncertainty of the cooperative estimate of the tracked target while complying with performance criteria such as keeping a pre-set distance between the teammates and the target and avoiding collisions with teammates and/or surrounding obstacles.
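The local estimator each robot runs can be sketched as a minimal 1-D particle filter: weight particles by the likelihood of a noisy observation, resample, and add a little diffusion so the cloud does not collapse. The measurement model, noise levels and target position below are placeholders, not the paper's setup.

```python
# Minimal 1-D particle filter for target position estimation (assumed model).
import math
import random

random.seed(0)
true_pos = 3.0
particles = [random.uniform(0.0, 10.0) for _ in range(1000)]

def likelihood(z, p, sigma=0.5):
    """Gaussian likelihood of observation z given particle position p."""
    return math.exp(-0.5 * ((z - p) / sigma) ** 2)

for _ in range(10):                        # repeated measurement updates
    z = random.gauss(true_pos, 0.5)        # noisy observation of the target
    weights = [likelihood(z, p) for p in particles]
    # resample particles in proportion to their weights
    particles = random.choices(particles, weights=weights, k=len(particles))
    # small diffusion step so the particle set keeps some spread
    particles = [p + random.gauss(0.0, 0.05) for p in particles]

estimate = sum(particles) / len(particles)  # fused point estimate
```

In the cooperative scheme, each robot would exchange such local estimates through the shared real-time database, and the team fuses them to shrink the total uncertainty on the target state.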
Abstract:
The hardness test, and more specifically the Vickers micro hardness test, is one of the most widely used mechanical tests, whether in industry, in teaching, or in materials science research and product development. In the vast majority of cases, this test is mainly used for the characterization or manufacturing quality control of metallic materials. It is a test that is relatively simple and quick to perform, with results that are comparable and relatable to other physical quantities of material properties. However, as it is a test method in which human intervention plays an important role, namely in measuring the indentation produced by mechanical penetration through an optical system, it exhibits some resulting weaknesses, such as the dependence on the training and visual acuity of the technicians and the visual fatigue phenomena that affect results over the course of a work shift; these phenomena affect the repeatability and reproducibility of the test results. CINFU has a Vickers micro hardness tester whose tests depend on a trained technician and thus present all the weaknesses already mentioned, which made it eligible for the study and application of an alternative solution. Thus, this dissertation presents the development of an alternative to the conventional optical method for measuring Vickers micro hardness.
Using National Instruments LabVIEW programming together with its computer vision tools (NI Vision), the program begins by asking the technician to select the camera coupled to the micro hardness tester for digital image acquisition and the test method (test force); the program then processes the image (applying filters to remove background noise from the original image); next, guided by the operator, the region of interest (ROI) is selected, and the vertices of the indentation and the lengths of the resulting diagonals are identified automatically, concluding, after their acceptance, with the calculation of the resulting micro hardness. Certified hardness reference blocks (CRMs) were used to validate the results, which were satisfactory, with a high level of accuracy obtained in the measurements. Finally, an Excel spreadsheet was developed to determine the uncertainty associated with the Vickers micro hardness measurements. The two possible methodologies, the conventional optical method and the computer vision tools, were then compared, and the proposed solution yielded good results.
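The hardness number computed from the two measured indentation diagonals follows the standard Vickers relation HV = 1.8544 · F / d², with the test force F in kgf and d the mean diagonal in mm. A minimal sketch with illustrative values (not the dissertation's measurements):

```python
# Standard Vickers hardness from the test force and the two diagonals.

def vickers_hardness(force_kgf, d1_mm, d2_mm):
    """HV = 1.8544 * F / d^2, with d the mean of the two diagonals (mm)."""
    d = (d1_mm + d2_mm) / 2.0
    return 1.8544 * force_kgf / d ** 2

# e.g. an HV0.5 test (0.5 kgf) with diagonals of about 43 µm:
hv = vickers_hardness(0.5, 0.0431, 0.0429)
```

The uncertainty budget in the Excel spreadsheet would then propagate the force and diagonal-measurement uncertainties through this same formula.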
Abstract:
Background: Several studies have shown that treatment with HMG-CoA reductase inhibitors (statins) can reduce coronary heart disease (CHD) rates. However, the cost effectiveness of statin treatment in the primary prevention of CHD has not been fully established. Objective: To estimate the costs of CHD prevention using statins in Switzerland according to different guidelines, over a 10-year period. Methods: The overall 10-year costs, costs of one CHD death averted, and costs of 1 year without CHD were computed for the European Society of Cardiology (ESC), the International Atherosclerosis Society (IAS), and the US Adult Treatment Panel III (ATP-III) guidelines. Sensitivity analysis was performed by varying the number of CHD events prevented and the costs of treatment. Results: Using an inflation rate of medical costs of 3%, a single yearly consultation, a single total cholesterol measurement per year, and a generic statin, the overall 10-year costs of the ESC, IAS, and ATP-III strategies were 2.2, 3.4, and 4.1 billion Swiss francs (SwF [SwF1 = $US0.97]). In this scenario, the average cost for 1 year of life gained was SwF352, SwF421, and SwF485 thousand, respectively, and it was always higher in women than in men. In men, the average cost for 1 year of life without CHD was SwF30.7, SwF42.5, and SwF51.9 thousand for the ESC, IAS, and ATP-III strategies, respectively, and decreased with age. Statin drug costs represented between 45% and 68% of the overall preventive cost. Changing the cost of statins, inflation rates, or the number of fatal and non-fatal cases of CHD averted showed the ESC guidelines to be the most cost effective. Conclusion: The cost of CHD prevention using statins depends on the guidelines used. The ESC guidelines appear to yield the lowest costs per year of life gained free of CHD.
Abstract:
The present work describes a fast gas chromatography/negative-ion chemical ionization tandem mass spectrometric assay (Fast GC/NICI-MS/MS) for the analysis of tetrahydrocannabinol (THC), 11-hydroxy-tetrahydrocannabinol (THC-OH) and 11-nor-9-carboxy-tetrahydrocannabinol (THC-COOH) in whole blood. The cannabinoids were extracted from 500 microL of whole blood by a simple liquid-liquid extraction (LLE) and then derivatized using trifluoroacetic anhydride (TFAA) and hexafluoro-2-propanol (HFIP) as fluorinated agents. Mass spectrometric detection of the analytes was performed in the selected reaction-monitoring mode on a triple quadrupole instrument after negative-ion chemical ionization. The assay was found to be linear in the concentration range of 0.5-20 ng/mL for THC and THC-OH, and of 2.5-100 ng/mL for THC-COOH. Repeatability and intermediate precision were found to be less than 12% for all concentrations tested. Under standard chromatographic conditions, the run cycle time would have been 15 min. By using fast separation conditions, the assay analysis time has been reduced to 5 min, without compromising the chromatographic resolution. Finally, a simple approach for estimating the measurement uncertainty is presented.