990 results for uncertainty evaluation
Abstract:
Pressurized re-entrant (or 4π) ionization chambers (ICs) connected to current-measuring electronics are used for activity measurements of photon-emitting radionuclides and some beta emitters in the fields of metrology and nuclear medicine. As a secondary method, these instruments need to be calibrated with appropriate activity standards from primary or direct standardization. The use of these instruments over 50 years has been well described in numerous publications, such as the Monographie BIPM-4 and the special issue of Metrologia on radionuclide metrology (Ratel 2007 Metrologia 44 S7-16; Schrader 1997 Activity Measurements With Ionization Chambers (Monographie BIPM-4); Schrader 2007 Metrologia 44 S53-66; Cox et al 2007 Measurement Modelling of the International Reference System (SIR) for Gamma-Emitting Radionuclides (Monographie BIPM-7)). The first part of the present work describes the principles of activity measurements, calibrations and impurity corrections using pressurized ionization chambers; the second part presents the uncertainty analysis, illustrated with example uncertainty budgets from a routine source calibration as well as from an international reference system (SIR) measurement.
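The activity determination described above reduces to dividing a net chamber current by a nuclide-specific calibration factor, with a multiplicative correction when impurities also contribute current. A minimal sketch, assuming a simple linear current/efficiency model; all figures and the impurity label are invented for illustration, not taken from the paper:

```python
# Hypothetical ionization-chamber activity calculation with an impurity
# correction; calibration figures below are illustrative only.

def activity_from_current(i_net_pA, efficiency_pA_per_MBq, impurities=None):
    """Convert a background-corrected IC current to activity (MBq).

    i_net_pA              -- net ionization current (pA)
    efficiency_pA_per_MBq -- calibration factor for the main radionuclide
    impurities            -- {nuclide: (activity_fraction, relative_efficiency)}
    """
    correction = 1.0
    for frac, rel_eff in (impurities or {}).values():
        # Each impurity adds current in proportion to its activity fraction
        # and its chamber efficiency relative to the main nuclide.
        correction += frac * rel_eff
    return i_net_pA / (efficiency_pA_per_MBq * correction)

# Illustrative numbers: 150 pA measured, a 1% impurity at 2.3x efficiency.
a_MBq = activity_from_current(150.0, 1.42, {"impurity": (0.01, 2.3)})
print(f"Activity: {a_MBq:.1f} MBq")
```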
Abstract:
This paper describes a method of uncertainty evaluation for axisymmetric measurement machines which is compliant with the GUM and PUMA methodologies. Specialized measuring machines for the inspection of axisymmetric components enable the measurement of properties such as roundness (radial runout), axial runout and coning. These machines typically consist of a rotary table and a number of contact measurement probes located on slideways. Sources of uncertainty include the probe calibration process, probe repeatability, probe alignment, geometric errors in the rotary table, the dimensional stability of the structure holding the probes and form errors in the reference hemisphere used to calibrate the system. The generic method is described, and an evaluation of an industrial machine is presented as a worked example. Type A uncertainties were obtained from a repeatability study of the probe calibration process, a repeatability study of the actual measurement process, a system stability test and an elastic deformation test. Type B uncertainties were obtained from calibration certificates and estimates. Expanded uncertainties, at 95% confidence, were then calculated for the measurement of radial runout (1.2 µm with a plunger probe or 1.7 µm with a lever probe), axial runout (1.2 µm with a plunger probe or 1.5 µm with a lever probe) and coning/swash (0.44 arc seconds with a plunger probe or 0.60 arc seconds with a lever probe).
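Since the budget combines Type A and Type B standard uncertainties into an expanded uncertainty at 95% confidence, the arithmetic is the usual GUM root-sum-of-squares with a coverage factor k = 2. A minimal sketch with placeholder contributions (the paper's actual budget entries are not reproduced here):

```python
# Combine Type A and Type B standard uncertainties into an expanded
# uncertainty (k = 2 for ~95% coverage), per the GUM; values are placeholders.
import math

type_a = [0.35, 0.20]        # e.g. probe-calibration and measurement repeatability (um)
type_b = [0.40, 0.15, 0.10]  # e.g. certificate values and estimates (um)

u_c = math.sqrt(sum(u**2 for u in type_a + type_b))  # combined standard uncertainty
U = 2.0 * u_c                                        # expanded uncertainty, k = 2
print(f"u_c = {u_c:.2f} um, U(95%) = {U:.2f} um")
```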
Abstract:
The paper discusses the evaluation of the uncertainty of a multivariate quantity using the Law of Propagation of Uncertainty defined in the Guide to the Expression of Uncertainty in Measurement (GUM) and a Monte Carlo method according to the GUM's Supplement 2. The quantity analysed is the electrical impedance, which is not a scalar but a complex quantity. The measuring method used allows the impedance and its uncertainty to be evaluated in different ways, and the corresponding results are presented, compared and discussed. For comparison purposes, results for the impedance uncertainty obtained using the NIST Uncertainty Machine are also presented.
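GUM Supplement 2 treats a complex quantity as bivariate: input distributions are propagated through the measurement model by Monte Carlo, and the output is summarized by a mean and a 2x2 covariance of its real and imaginary parts. A hedged sketch, assuming a simple Z = V/I model with illustrative distributions (not the paper's actual circuit):

```python
# GUM Supplement 2 style Monte Carlo for a complex quantity, Z = V/I;
# the distributions and values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Assumed model: complex voltage and current phasors with independent
# Gaussian uncertainties on their real and imaginary parts.
V = rng.normal(10.0, 0.01, N) + 1j * rng.normal(0.5, 0.01, N)
I = rng.normal(0.10, 1e-4, N) + 1j * rng.normal(0.002, 1e-4, N)

Z = V / I
mean = Z.mean()
cov = np.cov(Z.real, Z.imag)  # 2x2 covariance of (Re Z, Im Z)
print(f"Z = {mean.real:.3f} + {mean.imag:.3f}j ohm")
print("covariance of (Re Z, Im Z):\n", cov)
```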
Abstract:
The main goals of the present work are the evaluation of the influence of several variables and test parameters on the melt flow index (MFI) of thermoplastics, and the determination of the uncertainty associated with the measurements. To evaluate the influence of test parameters on the measurement of MFI, the design of experiments (DOE) approach has been used. The uncertainty has been calculated using the "bottom-up" approach given in the Guide to the Expression of Uncertainty in Measurement (GUM). Since an analytical expression relating the output response (MFI) to the input parameters does not exist, it has been necessary to build mathematical models by fitting the experimental observations of the response variable against each input parameter. Subsequently, the uncertainty associated with the measurement of MFI has been determined by applying the law of propagation of uncertainty to the uncertainty values of the input parameters. Finally, the activation energy (Ea) of the melt flow at around 200 °C and the respective uncertainty have also been determined.
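As a worked illustration of the last step, an activation energy can be obtained from MFI values at two temperatures via an Arrhenius relation, with the law of propagation of uncertainty applied to the MFI uncertainties. The numbers below are invented; the paper's fitted models and data are not reproduced:

```python
# Hypothetical Ea estimate from MFI at two temperatures, assuming an
# Arrhenius relation MFI ~ exp(-Ea / (R*T)), with LPU on the MFI values.
import math

R = 8.314                      # J/(mol*K)
T1, T2 = 463.15, 473.15        # K (190 C and 200 C)
mfi1, mfi2 = 5.2, 7.9          # g/10 min, hypothetical means
u1, u2 = 0.10, 0.12            # standard uncertainties of mfi1, mfi2

# From the ratio of two Arrhenius points:
# ln(mfi2/mfi1) = (Ea/R) * (1/T1 - 1/T2)
k = R / (1.0 / T1 - 1.0 / T2)
Ea = k * math.log(mfi2 / mfi1)

# Law of propagation of uncertainty: |d(Ea)/d(mfi_i)| = k / mfi_i
u_Ea = k * math.hypot(u1 / mfi1, u2 / mfi2)
print(f"Ea = {Ea/1000:.1f} kJ/mol, u(Ea) = {u_Ea/1000:.1f} kJ/mol")
```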
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve the right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
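The core MSA ingredient, an ANOVA-based Gage R&R study, decomposes measurement variation into repeatability and reproducibility components that can then feed a GUM-style budget. A rough sketch on a synthetic crossed study (parts x operators x trials), ignoring the part-operator interaction term for brevity:

```python
# Variance components from a crossed Gage R&R study via ANOVA mean squares;
# the layout and data are synthetic, and the interaction term is omitted.
import numpy as np

rng = np.random.default_rng(1)
p, o, r = 5, 3, 2  # parts, operators, trials
true_part = rng.normal(0, 1.0, p)[:, None, None]
op_bias = rng.normal(0, 0.2, o)[None, :, None]
data = 10 + true_part + op_bias + rng.normal(0, 0.1, (p, o, r))

grand = data.mean()
ms_part = o * r * ((data.mean(axis=(1, 2)) - grand) ** 2).sum() / (p - 1)
ms_oper = p * r * ((data.mean(axis=(0, 2)) - grand) ** 2).sum() / (o - 1)
cell_means = data.mean(axis=2)
ms_rep = ((data - cell_means[:, :, None]) ** 2).sum() / (p * o * (r - 1))

var_repeat = ms_rep                                 # equipment variation
var_oper = max((ms_oper - ms_rep) / (p * r), 0.0)   # appraiser variation
var_part = max((ms_part - ms_rep) / (o * r), 0.0)   # part-to-part variation
print(f"u(repeatability)   = {np.sqrt(var_repeat):.3f}")
print(f"u(reproducibility) = {np.sqrt(var_oper):.3f}")
print(f"part-to-part sd    = {np.sqrt(var_part):.3f}")
```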
Abstract:
The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where the additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design of experiment aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to investigate the significant parameters and provide more reliable sensitivity analysis for improving parameterization during hydrological modeling. In the case study, it achieved better calibration results along with an improved sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: the sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and parameter solution (ParaSol) methods. The results showed that SUFI-2 performed better than the other two methods based on the calibration and uncertainty analysis results, and the proposed evaluation scheme proved capable of selecting the most suitable uncertainty method for a given case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control equifinality. The results showed that SMC-CUA provided better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and controlled parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response based statistical evaluation method (RESEM) was proposed for estimating the propagated effects of uncertainty and providing long-term predictions of hydrological responses under changing climatic conditions. Using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total propagated uncertainty from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two sources can be compared and quantified. The feasibility of all the methods has been tested using hypothetical and real-world case studies. The proposed methods can also be integrated as a hydrological modeling system to better support hydrological studies under changing climatic conditions, and the results from the proposed integrated system can serve as scientific references for decision makers seeking to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
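Of the methods compared, GLUE is the most compact to illustrate: sample parameter sets, score each simulation with a likelihood measure, keep the "behavioral" sets above a threshold and derive prediction bounds from them. A toy sketch with a stand-in model and synthetic data (not the hydrological model evaluated in the research):

```python
# GLUE-style uncertainty bounds on a toy recession model; the model, data
# and behavioral threshold are all invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100)
observed = 5 + 3 * np.exp(-t / 30) + rng.normal(0, 0.2, t.size)  # synthetic "flow"

def model(k, a):
    # Toy recession curve standing in for a real hydrological simulator.
    return 5 + a * np.exp(-t / k)

samples = rng.uniform([10, 1], [60, 5], size=(5000, 2))  # (k, a) draws
sims = np.array([model(k, a) for k, a in samples])
resid = sims - observed
nse = 1 - (resid ** 2).sum(axis=1) / ((observed - observed.mean()) ** 2).sum()

behavioral = nse > 0.7  # assumed behavioral threshold
# Proper GLUE weights quantiles by likelihood; plain percentiles keep it short.
lo, hi = np.percentile(sims[behavioral], [2.5, 97.5], axis=0)
print(f"{behavioral.sum()} behavioral sets; band width at t=0: {(hi - lo)[0]:.2f}")
```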
Abstract:
Laser trackers have been widely used in many industries to meet increasingly high accuracy requirements. In laser tracker measurement, it is complex and difficult to perform an accurate error analysis and uncertainty evaluation. This paper first reviews the working principle of single-beam laser trackers and the state of the art of the key technologies from both industrial and academic efforts, followed by a comprehensive analysis of the uncertainty sources. A generic laser tracker modelling method is then formulated and the framework of a virtual laser tracking system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization and uncertainty evaluation. The completed VLS should take all the uncertainty sources affecting coordinate measurement into consideration and establish an uncertainty model which behaves in an identical way to the real system.
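One building block of such a virtual tracker is propagating assumed instrument uncertainties (range and two angles) through the spherical-to-Cartesian measurement model. A hedged Monte Carlo sketch with illustrative standard uncertainties, not the error model proposed in the paper:

```python
# Monte Carlo propagation through a laser tracker's spherical-to-Cartesian
# model; the standard uncertainties below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

r = rng.normal(5.000, 5e-6, N)            # range (m), interferometer
az = rng.normal(np.radians(30), 1e-5, N)  # azimuth (rad), angle encoder
el = rng.normal(np.radians(10), 1e-5, N)  # elevation (rad), angle encoder

x = r * np.cos(el) * np.cos(az)
y = r * np.cos(el) * np.sin(az)
z = r * np.sin(el)

for name, c in (("x", x), ("y", y), ("z", z)):
    print(f"u({name}) = {c.std() * 1e6:.1f} um")
```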
Abstract:
This Master's thesis presents a survey of the deterministic safety analysis methods currently in use. Deterministic safety analyses are used to assess the safety of nuclear power plants in different operating states, and the plant's safety systems are dimensioned on the basis of the results of these analyses. Deterministic safety analyses can be prepared using either a conservative or a statistical method. The conservative method aims to model the situation under consideration in such a way that, with high confidence, the actual behaviour of the plant is milder than the analysis result; the uncertainties of the analysis are accounted for through conservative assumptions. The statistical method is based on the best-estimate approach, i.e. on modelling the plant behaviour as realistically as possible, and the uncertainties of the analysis are determined systematically by means of statistical mathematics. The thesis places emphasis on the uncertainty evaluation methods used for determining the uncertainties of statistical analyses. In the computational part of the thesis, the methods used for preparing deterministic safety analyses are compared through the calculation of a thermal-hydraulic safety analysis example. The accident considered is a loss-of-coolant accident caused by a pipe break in the primary coolant circuit at the Olkiluoto 3 plant unit. On the basis of the calculated example case, the statistical and conservative methods can be regarded as alternative ways of preparing a safety analysis: both analyses produced acceptable and mutually comparable results of the same order of magnitude.
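The statistical (best-estimate) route typically sizes its random code-run sample with an order-statistics argument such as Wilks' formula; the abstract does not state which recipe the thesis used, so the sketch below only illustrates the standard one-sided 95%/95% calculation:

```python
# Order-statistics (Wilks) sample sizing commonly used in statistical safety
# analysis: the smallest N such that the maximum of N random code runs bounds
# the 95th percentile with 95% confidence, i.e. 1 - 0.95**N >= 0.95.
# This is a standard tool, not necessarily the thesis's exact method.

def wilks_n(coverage=0.95, confidence=0.95):
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_n())  # 59 runs for the classic one-sided 95%/95% statement
```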
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator, where the major difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, poses a functional challenge to system codes in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments, and the validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
Abstract:
Wood-based bioprocesses are among the most promising fields of interest in the circular economy, and expanding the use of wood raw material in sustainable industrial processes is acknowledged on both a global and a regional scale. This thesis concerns the application of a capillary zone electrophoresis (CZE) method with the aim of monitoring wood-based bioprocesses. The range of detectable carbohydrate compounds is expanded to furfural and polydatin in aquatic matrices. The experimental portion was conducted on a laboratory scale with samples imitating process samples. The thesis presents a novel strategy for uncertainty evaluation via in-house validation, with a focus on the uncertainty factors of the CZE method. Since CZE equipment is sensitive to ambient conditions, proper validation is essential for robust application. The thesis thus introduces a tool for process monitoring of modern bioprocesses. It is concluded that the applied CZE method provides additional results for the analysed samples and that the profiling approach is suitable for detecting changes in process samples. The CZE method shows significant potential in process monitoring because of its capability of simultaneously detecting clusters of carbohydrate-related compounds; the clusters can be used as summary terms indicating process variation and drift.
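The thesis's in-house validation strategy is not detailed in the abstract, so the sketch below shows only a generic top-down combination (in the style of Nordtest TR 537) of within-lab reproducibility with a bias term from recovery experiments; all values are assumed:

```python
# Generic top-down uncertainty from in-house validation data: combine
# within-lab reproducibility with a bias estimate; all inputs assumed.
import math

u_rw = 0.032        # relative within-lab reproducibility (assumed)
bias = 0.015        # mean relative bias vs. spiked samples (assumed)
u_bias_est = 0.010  # uncertainty of that bias estimate (assumed)

u_bias = math.hypot(bias, u_bias_est)
u_c = math.hypot(u_rw, u_bias)
print(f"relative expanded uncertainty U (k=2): {2 * u_c * 100:.1f} %")
```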
Abstract:
This work concerns the development of an automatic multi-objective calibrator for the SWMM (Storm Water Management Model) and the evaluation of some sources of uncertainty present in the calibration process, aiming at a satisfactory representation of the rainfall-runoff transformation. The code was written in C and applies the concepts of the multi-objective optimization method NSGA-II (Non-Dominated Sorting Genetic Algorithm) with controlled elitism, in addition to using the SWMM source code to determine the simulated flows. In parallel, a visual interface was also created to make the calibrator easier to use. The calibrator was tested on three different systems: a hypothetical system provided in the SWMM installation package; a small real system, called La Terraza, located in Sierra Vista, Arizona (USA); and a larger system, the Córrego do Gregório watershed, located in São Carlos (SP), Brazil. The results indicate that the calibrator generally performs satisfactorily, but is strongly dependent on the quality of the field observations and on the input parameters chosen by the user. The importance of the choice of the events used in calibration, of setting adequate bounds on the decision variables, of the choice of objective functions and, above all, of the quality and representativeness of the rainfall and streamflow monitoring data was demonstrated. It is concluded that these tests contribute to a deeper understanding of the processes involved in modelling and calibration, enabling advances in the reliability of the modelling results.
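The calibrator itself is written in C; the sketch below is only a Python rendering of the core NSGA-II ingredient it applies, non-dominated sorting of candidate parameter sets under two minimized objectives (hypothetical ones here, e.g. peak-flow and volume errors):

```python
# Non-dominated sorting, the heart of NSGA-II, on stand-in objective values;
# a simple O(n^2)-per-front version for illustration.
import numpy as np

def dominates(a, b):
    # a dominates b if it is no worse in all objectives and better in one.
    return np.all(a <= b) and np.any(a < b)

def nondominated_fronts(F):
    """F: (n, m) objective matrix, all objectives minimized."""
    remaining = list(range(len(F)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

rng = np.random.default_rng(4)
objectives = rng.random((20, 2))           # stand-in objective values
print(nondominated_fronts(objectives)[0])  # indices of the Pareto front
```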
Abstract:
In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision is realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of embedded metrology tooling for in-tooling measurement and active tooling compensation; and 5) development of software for the planning and control of integrated metrology networks, based on quality control with uncertainty evaluation, together with control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large volume measurement process models within an integrated dimensional variation management (IDVM) system.
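Challenge 2, thermal compensation of measurements, amounts to scaling an observed length back to the 20 °C reference temperature using the part's expansion coefficient. A small sketch with an assumed aluminium coefficient, not a method from the paper:

```python
# Thermal compensation of a length measurement back to 20 C; the material
# and its expansion coefficient are assumed for illustration.
ALPHA_AL = 23e-6  # 1/K, aluminium (assumed part material)

def compensate_length(l_obs_m, part_temp_c, alpha=ALPHA_AL, ref_c=20.0):
    """Scale an observed length to the 20 C reference temperature."""
    return l_obs_m / (1.0 + alpha * (part_temp_c - ref_c))

l20 = compensate_length(5.000123, 26.5)
print(f"length at 20 C: {l20:.6f} m")
# A 0.5 K uncertainty in part temperature alone contributes roughly
# alpha * 0.5 * L, about 58 um on a 5 m part: why challenge 2 matters.
```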
Abstract:
Modern hydrological challenges, whether in forecasting or related to climate change, are forcing the exploration of new modelling approaches to address current shortcomings and improve uncertainty evaluation. The approach considered in this thesis is the multimodel (MM). The innovation lies in the construction of the multimodel presented in this study: rather than calibrating models individually and using their combination, a collective calibration is performed on the average of the 12 selected lumped conceptual models. One of the challenges raised by this novel approach is the large number of parameters (82), which complicates calibration and use, in addition to raising potential equifinality problems. The solution proposed in this thesis is a sensitivity analysis that fixes the less influential parameters and thereby reduces the total number of parameters to calibrate. An optimization procedure with calibration and validation then makes it possible to evaluate the performance of the multimodel and of its reduced version, as well as to improve its understanding. The sensitivity analysis is carried out with the Morris method, which yields a 51-parameter version of the MM (MM51) that performs just as well as the original 82-parameter MM while reducing the potential equifinality problems. The calibration and validation results of the MM under the split-sample test (SST) are compared with the 12 individually calibrated models. This analysis shows that the individual models composing the MM perform worse than when calibrated independently. This drop in individual performance, necessary to obtain good overall MM performance, is accompanied by an increase in the diversity of the MM model outputs, which is particularly required for hydrological applications that call for uncertainty evaluation. All these results lead to a better understanding of the multimodel and to its optimization, which facilitates not only its calibration but also its potential use in an operational context.
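The Morris method used for screening estimates an elementary effect for each parameter along one-at-a-time trajectories and ranks parameters by the mean absolute effect (mu*); low-ranked parameters are candidates to fix, as done for MM51. A toy sketch with a three-parameter stand-in for the 82-parameter multimodel:

```python
# Morris elementary-effects screening on a toy function; the model, ranges
# and trajectory count are invented for illustration.
import numpy as np

rng = np.random.default_rng(5)

def model(x):
    # Toy stand-in: x[0] is influential, x[2] barely matters.
    return 2 * x[0] + x[1] ** 2 + 0.05 * x[2]

k, r, delta = 3, 20, 0.1  # parameters, trajectories, step size
effects = np.zeros((r, k))
for t in range(r):
    x = rng.random(k)
    y0 = model(x)
    for i in rng.permutation(k):  # perturb each parameter once, random order
        x[i] += delta
        y1 = model(x)
        effects[t, i] = (y1 - y0) / delta
        y0 = y1

mu_star = np.abs(effects).mean(axis=0)  # Morris mu*: mean absolute effect
print("mu* per parameter:", np.round(mu_star, 2))
# Parameters with small mu* (here x[2]) would be fixed, as in MM51.
```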
Abstract:
Doctoral Thesis for PhD degree in Industrial and Systems Engineering
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions about the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and analyze the likelihood of the different specifications, showing that the existence of plausible alternative representations of the economy forces the policymaker to confront the problem of model uncertainty. I then use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
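The decision logic can be made concrete in a few lines: evaluate each candidate rule's loss under every model, then compare rules by the posterior-weighted (Bayesian) average and by the worst case (minimax). All losses and weights below are invented, purely to illustrate the two criteria:

```python
# Comparing policy rules under model uncertainty: Bayesian model averaging
# vs. a minimax criterion; all numbers are hypothetical placeholders.
losses = {  # loss of each rule under each candidate model (assumed)
    "taylor":   {"solow": 1.10, "blanchard_gali": 1.40, "hamilton": 1.25},
    "inertial": {"solow": 1.05, "blanchard_gali": 1.15, "hamilton": 1.10},
}
posterior = {"solow": 0.3, "blanchard_gali": 0.4, "hamilton": 0.3}  # assumed

for rule, by_model in losses.items():
    bayes = sum(posterior[m] * l for m, l in by_model.items())
    worst = max(by_model.values())
    print(f"{rule:9s} Bayes loss = {bayes:.3f}, minimax loss = {worst:.3f}")
# With these made-up numbers the inertial (persistent) rule wins under both
# criteria, echoing the paper's qualitative finding, but only as illustration.
```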