990 results for uncertainty evaluation


Abstract:

The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of this thesis is a distributed measurement system to be installed in medium-voltage power networks, together with the methods developed to analyze the data acquired by the measurement system and to monitor power quality.

Chapter 2 illustrates the increasing interest in power quality in electrical systems, reporting the international research activity on the problem and the relevant standards and guidelines that have been issued. The quality of the voltage provided by utilities, and influenced by customers at the various points of a network, emerged as a concern only in recent years, in particular as a consequence of energy market liberalization. The concept of quality of the delivered energy has traditionally been associated mostly with its continuity, so reliability was the main characteristic to be ensured for power systems. Nowadays, the number and duration of interruptions are the "quality indicators" commonly perceived by most customers; for this reason, a short section is also dedicated to network reliability and its regulation. In this context it should be noted that although the measurement system developed during the research activity belongs to the field of power quality evaluation, the information registered in real time by its remote stations can also be used to improve system reliability.

Given the vast range of power-quality-degrading phenomena that can occur in distribution networks, the study focuses on electromagnetic transients affecting line voltages. Its outcome is the design and realization of a distributed measurement system that continuously monitors the phase signals at different points of a network, detects transients superposed on the fundamental steady-state component, and registers the time of occurrence of such events. The resulting data set is used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system and must be detected before the protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowing the location of a fault allows the energy manager to minimize both the area of the network to be disconnected for protection purposes and the time spent by technical staff to repair the abnormal condition and/or the damage.

The part of the thesis presenting the results of this study is structured as follows. Chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation; it then presents the state of the art of methods to detect and locate faults in distribution networks, and finally focuses on the particular technique adopted for this purpose in the thesis and the methods developed from it. Chapter 4 reports the configurations of the distribution networks on which the fault location method has been applied by means of simulations, together with the results obtained case by case; in this way the performance of the location procedure is tested first under ideal and then under realistic operating conditions. Chapter 5 presents the measurement system designed to implement the transient detection and fault location method: the hardware belonging to the measurement chain of every acquisition channel in the remote stations is described; the global measurement system is then characterized by considering the non-ideal aspects of each device that contribute to the final combined uncertainty on the estimated fault position in the network under test; and this parameter is finally computed, according to the Guide to the Expression of Uncertainty in Measurement, by means of a numerical procedure. The last chapter describes a device designed and realized during the PhD activity to replace the commercial capacitive voltage divider in the conditioning block of the measurement chain. This study aimed at providing an alternative transducer with equivalent performance and lower cost, significantly reducing the economic impact of the investment in the whole measurement system and making the method much more feasible to apply.
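
The abstract does not spell out the location algorithm, but a minimal sketch of the general idea, a two-terminal traveling-wave location with a GUM-style Monte Carlo uncertainty evaluation, might look as follows. All numerical values (line length, propagation speed, arrival times, timing and speed uncertainties) are illustrative assumptions, not the thesis data.

```python
import numpy as np

# Minimal sketch (not the thesis implementation): two-terminal traveling-wave
# fault location with a GUM-style Monte Carlo uncertainty evaluation.
# Assumed quantities: line length L, wave propagation speed v, and the
# arrival times tA, tB of the transient front at the two remote stations.

rng = np.random.default_rng(0)

L = 10_000.0                 # line length between stations A and B [m] (assumed)
v = 2.9e8                    # propagation speed of the transient [m/s] (assumed)
tA, tB = 10.3e-6, 24.1e-6    # registered arrival times [s] (assumed)

# Deterministic estimate: distance of the fault from station A.
x_hat = 0.5 * (L + v * (tA - tB))

# Monte Carlo propagation of the main uncertainty sources (assumed std devs):
N = 100_000
tA_s = rng.normal(tA, 100e-9, N)    # time-tagging uncertainty, 100 ns
tB_s = rng.normal(tB, 100e-9, N)
v_s = rng.normal(v, 0.01 * v, N)    # 1 % uncertainty on propagation speed
x_s = 0.5 * (L + v_s * (tA_s - tB_s))

print(f"x = {x_hat:.1f} m, u(x) = {x_s.std():.1f} m "
      f"(95 % interval: {np.percentile(x_s, 2.5):.1f}..{np.percentile(x_s, 97.5):.1f} m)")
```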

Abstract:

In the context of a testing laboratory, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of those results. Many standards are available in every area concerned with noise measurement, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard on the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations. The standard requires appropriate methods of analysis to be used for estimating the uncertainty of measurement. From this point of view, for a testing laboratory performing sound power measurements according to specific ISO standards and European Directives, the evaluation of uncertainties is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that are peculiar to the number of microphones disposed on the surface, their spatial positions and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that afflict this kind of measurement. In contrast with the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
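
As context for the discussion above, a simplified sketch of the ISO 3744 surface method is given below (the environmental and background corrections K1, K2 are omitted, and the SPL values and measurement surface are made up); it shows how two different microphone arrays over the same source can yield slightly different sound power levels.

```python
import numpy as np

# Minimal sketch of the ISO 3744 surface method (simplified: background and
# environmental corrections K1, K2 omitted). Not the thesis code; the SPL
# values and measurement-surface area below are made-up examples.

def sound_power_level(spl_db, surface_area_m2, s0=1.0):
    """Sound power level from SPLs measured over an enveloping surface.

    spl_db          : sound pressure levels at the microphone positions [dB]
    surface_area_m2 : area S of the measurement surface [m^2]
    s0              : reference area, 1 m^2
    """
    spl = np.asarray(spl_db, dtype=float)
    # Energy (not arithmetic) average of the microphone SPLs.
    lp_mean = 10.0 * np.log10(np.mean(10.0 ** (spl / 10.0)))
    return lp_mean + 10.0 * np.log10(surface_area_m2 / s0)

# Same source, two hypothetical arrays (10 vs 20 microphones) over a
# hemisphere of radius 2 m (S = 2*pi*r^2 ~ 25.1 m^2):
S = 2.0 * np.pi * 2.0**2
rng = np.random.default_rng(1)
array10 = 75.0 + rng.normal(0.0, 1.5, 10)   # dB, assumed spatial scatter
array20 = 75.0 + rng.normal(0.0, 1.5, 20)

lw10, lw20 = sound_power_level(array10, S), sound_power_level(array20, S)
print(f"Lw(10 mics) = {lw10:.2f} dB, Lw(20 mics) = {lw20:.2f} dB, "
      f"difference = {abs(lw10 - lw20):.2f} dB")
```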

Abstract:

Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that flooding, and water in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river's behavior: that is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to define clearly what is meant by uncertainty, since the literature is often confused on this issue. The first objective of this thesis is therefore to clarify this concept, starting from a key question: should the choice of the intervention strategy be based on an evaluation of the model prediction's ability to represent reality, or on an evaluation of what will actually happen given the information provided by the model forecast? Once this idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. It is therefore necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and also to assess the probability distribution of the flooding time.
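
As an illustration of the last requirement, the sketch below estimates the flooding probability within a decision horizon, and the distribution of the flooding time, from a synthetic forecast ensemble. It is a toy stand-in, not the thesis tool; the threshold, horizon and ensemble statistics are assumptions.

```python
import numpy as np

# Minimal sketch (not the thesis tool): estimating the probability of flooding
# within a given time horizon from an ensemble of forecasted water-level
# trajectories. The synthetic ensemble below stands in for the combined
# output of several deterministic forecasts.

rng = np.random.default_rng(2)
n_members, n_hours = 500, 48
threshold = 4.0                      # flooding level [m] (assumed)
horizon = 24                         # decision horizon [h] (assumed)

# Synthetic ensemble: rising limb plus member-to-member uncertainty.
t = np.arange(n_hours)
levels = (2.0 + 0.06 * t) + rng.normal(0.0, 0.5, (n_members, 1)) \
         + np.cumsum(rng.normal(0.0, 0.05, (n_members, n_hours)), axis=1)

# P(flooding within the horizon): fraction of members whose maximum level
# over the first `horizon` hours exceeds the threshold.
p_flood = np.mean(levels[:, :horizon].max(axis=1) > threshold)

# Distribution of the flooding time among the members that do flood.
exceed = levels[:, :horizon] > threshold
flooding_members = exceed.any(axis=1)
first_time = exceed[flooding_members].argmax(axis=1)
print(f"P(flood within {horizon} h) = {p_flood:.2f}; "
      f"median flooding time = {np.median(first_time):.0f} h")
```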

Abstract:

With continuous improvements in brachytherapy source designs and techniques, a method of 3D dosimetry for treatment dose verification would help ensure accurate patient radiotherapy treatment. This study first aimed to evaluate the 3D dose distributions of the low-dose-rate (LDR) Amersham 6711 Oncoseed™ using PRESAGE® dosimeters, in order to establish PRESAGE® as a suitable brachytherapy dosimeter. The new AgX100 125I seed model (Theragenics Corporation) was then characterized using PRESAGE® following the TG-43 protocol. PRESAGE® dosimeters are solid, polyurethane-based 3D dosimeters doped with radiochromic leuco dyes that produce a linear optical density response to radiation dose. For this project, the radiochromic response in PRESAGE® was captured using optical-CT scanning (632 nm) and the final 3D dose matrix was reconstructed using MATLAB. An Amersham 6711 seed with an air-kerma strength of approximately 9 U was used to irradiate two dosimeters to 2 Gy and 11 Gy at 1 cm, to evaluate dose rates in the r = 1 cm to r = 5 cm region. The dosimetry parameters were compared with the values published in the updated AAPM Report No. 51 (TG-43U1). An AgX100 seed with an air-kerma strength of about 6 U was used to irradiate two dosimeters to 3.6 Gy and 12.5 Gy at 1 cm. The dosimetry parameters for the AgX100 were compared with values from previous Monte Carlo and experimental studies. In general, the measured dose-rate constant, anisotropy function and radial dose function for the Amersham 6711 agreed with consensus values to better than 5% in the r = 1 to r = 3 cm region. The dose rates and radial dose functions measured for the AgX100 agreed with the MCNPX and TLD-measured values within 3% in the r = 1 to r = 3 cm region. The anisotropy function measured in PRESAGE® showed relative differences of up to 9% from the MCNPX-calculated values. It was determined that the post-irradiation optical density change over several days was non-linear in different dose regions, so the dose values in the r = 4 to r = 5 cm region had higher uncertainty due to this effect. This study demonstrated that, within a radial distance of 3 cm, brachytherapy dosimetry in PRESAGE® can be accurate to within 5% as long as irradiation times are within 48 hours.
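
For reference, the TG-43 formalism followed by the study computes the dose rate as Ddot(r, θ) = S_K · Λ · [G_L(r, θ)/G_L(r0, θ0)] · g(r) · F(r, θ). The sketch below implements the line-source geometry function exactly, but the dose-rate constant Λ and the functions g(r) and F(r, θ) are illustrative placeholders rather than the measured or consensus values.

```python
import numpy as np

# Minimal sketch of the AAPM TG-43 2D dose-rate formalism:
#   Ddot(r, theta) = Sk * Lambda * [G(r,theta)/G(r0,theta0)] * g(r) * F(r,theta)
# The line-source geometry function is implemented exactly; Lambda, g(r) and
# F(r,theta) below are illustrative placeholders, not the measured values.

L = 0.3                         # active source length [cm] (typical for 125I seeds)
r0, theta0 = 1.0, np.pi / 2.0   # TG-43 reference point: 1 cm, 90 degrees

def G_line(r, theta):
    """Line-source geometry function G_L(r, theta) [cm^-2]."""
    if np.isclose(theta, 0.0):
        return 1.0 / (r**2 - L**2 / 4.0)
    # beta: angle subtended by the source ends at the calculation point
    x1, x2 = r * np.cos(theta) - L / 2.0, r * np.cos(theta) + L / 2.0
    y = r * np.sin(theta)
    beta = np.arctan2(x2, y) - np.arctan2(x1, y)
    return beta / (L * y)

def dose_rate(r, theta, Sk=9.0, Lam=0.965,
              g=lambda r: np.exp(-0.12 * (r - 1.0)),          # placeholder g(r)
              F=lambda r, th: 1.0 - 0.1 * abs(np.cos(th))):   # placeholder F
    """Dose rate [cGy/h] at (r [cm], theta [rad]); Sk in U, Lam in cGy/(h*U)."""
    return Sk * Lam * (G_line(r, theta) / G_line(r0, theta0)) * g(r) * F(r, theta)

print(f"Ddot(1 cm, 90 deg) = {dose_rate(1.0, np.pi / 2):.3f} cGy/h")
print(f"Ddot(3 cm, 90 deg) = {dose_rate(3.0, np.pi / 2):.3f} cGy/h")
```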

Abstract:

The selection of metrics for ecosystem restoration programs is critical for improving the quality of monitoring programs and characterizing project success. Moreover, it is often very difficult to balance the importance of multiple ecological, social and economic metrics. The metric selection process is complex and must simultaneously take into account monitoring data, environmental models, socio-economic considerations and stakeholder interests. We propose multicriteria decision analysis (MCDA) methods, broadly defined, for selecting optimal sets of metrics to enhance the evaluation of ecosystem restoration alternatives. Two MCDA methods, multiattribute utility analysis (MAUT) and probabilistic multicriteria acceptability analysis (ProMAA), are applied and compared for a hypothetical case study of a river restoration involving multiple stakeholders. Overall, MCDA results in a systematic, unbiased and transparent solution that informs the evaluation of restoration alternatives. The two methods provide comparable results in terms of the selected metrics. However, because ProMAA can consider probability distributions for the weights and utility values of the metrics for each criterion, it is suggested as the best option when data uncertainty is high. Despite the added complexity in the metric selection process, MCDA improves upon the current ad hoc practice of consulting stakeholders and experts, and encourages quantitative aggregation of data and judgement, increasing the transparency of decision making in restoration projects. We believe that MCDA can enhance the overall sustainability of ecosystems by addressing both ecological and societal needs.
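
A minimal sketch of the two aggregation ideas compared above, assuming hypothetical criteria, weights and scores: an additive MAUT ranking, followed by a ProMAA-style acceptability computation in which the weights are treated as uncertain.

```python
import numpy as np

# Minimal sketch of the additive-utility aggregation underlying MAUT
# (hypothetical metrics and weights, not those of the case study):
#   U(alternative) = sum_i w_i * u_i(x_i),  with sum_i w_i = 1.

criteria = ["ecological", "social", "economic"]
weights = np.array([0.5, 0.3, 0.2])          # assumed stakeholder weights

# Single-attribute utilities in [0, 1] of three candidate metric sets.
scores = np.array([[0.9, 0.4, 0.6],          # metric set A
                   [0.7, 0.8, 0.5],          # metric set B
                   [0.5, 0.6, 0.9]])         # metric set C

utilities = scores @ weights
for name, u in zip("ABC", utilities):
    print(f"metric set {name}: U = {u:.3f}")
print(f"selected: metric set {'ABC'[np.argmax(utilities)]}")

# ProMAA-style twist (sketch): treat the weights as uncertain and compute how
# often each alternative ranks first under weight uncertainty.
rng = np.random.default_rng(3)
w_samples = rng.dirichlet(alpha=20 * weights, size=10_000)  # centered on `weights`
winners = np.argmax(w_samples @ scores.T, axis=1)
acceptability = np.bincount(winners, minlength=3) / len(winners)
print("rank-1 acceptability (A, B, C):", np.round(acceptability, 3))
```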

Abstract:

This paper presents a new methodology for building parametric models that estimate global solar irradiation adjusted to specific on-site characteristics, based on the evaluation of variable importance. Variables highly correlated with solar irradiation at a site are implemented in the model, and therefore different models may be proposed under different climates. The methodology is applied to a case study in the La Rioja region (northern Spain). A new model is proposed and evaluated for stability and accuracy against a review of twenty-two existing parametric models based on temperatures and rainfall, at seventeen meteorological stations in La Rioja. The model evaluation methodology is based on bootstrapping, which achieves a high level of confidence in model calibration and validation from short time series (in this case five years, from 2007 to 2011). The proposed model improves on the estimates of the other twenty-two models, with an average mean absolute error (MAE) of 2.195 MJ/m²·day and an average confidence interval width (95% CI, n = 100) of 0.261 MJ/m²·day. 41.65% of the daily residuals in the case of SIAR, and 20.12% in that of SOS Rioja, fall within the uncertainty tolerance of the pyranometers of the two networks (10% and 5%, respectively). Relative differences between measured and estimated irradiation on an annual cumulative basis are below 4.82%. The proposed model may therefore be useful for estimating annual sums of global solar irradiation, with insignificant differences from pyranometer measurements.
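
A minimal sketch of the bootstrap evaluation idea, on synthetic data rather than the La Rioja series: a simple temperature-based parametric model is recalibrated on resampled days and validated out-of-bag, yielding an MAE with a confidence interval.

```python
import numpy as np

# Minimal sketch of the bootstrap evaluation described above (synthetic data,
# not the La Rioja series): calibrate a simple temperature-based parametric
# model on resampled days and report the spread of the validation MAE.

rng = np.random.default_rng(4)

# Synthetic daily data: temperature range dT [degC], irradiation H [MJ/m2 day].
n_days = 5 * 365
dT = rng.uniform(4.0, 18.0, n_days)
H_true = 0.16 * np.sqrt(dT) * 30.0          # Hargreaves-like dependence (assumed)
H_obs = H_true + rng.normal(0.0, 1.5, n_days)

def fit_and_mae(idx_fit, idx_val):
    # Fit H = a * sqrt(dT) by least squares, then score on validation days.
    a = np.sum(H_obs[idx_fit] * np.sqrt(dT[idx_fit])) / np.sum(dT[idx_fit])
    return np.mean(np.abs(H_obs[idx_val] - a * np.sqrt(dT[idx_val])))

maes = []
for _ in range(100):                         # n = 100 bootstrap replicates
    idx = rng.integers(0, n_days, n_days)    # resample days with replacement
    oob = np.setdiff1d(np.arange(n_days), idx)   # out-of-bag validation set
    maes.append(fit_and_mae(idx, oob))

lo, hi = np.percentile(maes, [2.5, 97.5])
print(f"MAE = {np.mean(maes):.3f} MJ/m2 day, 95% CI width = {hi - lo:.3f}")
```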

Abstract:

The assessment of uncertainty levels on the design and safety parameters of the innovative European Sodium Fast Reactor (ESFR) is mandatory. Among the relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified here. In addition, the nuclear reaction data for which an improvement would clearly benefit the design accuracy are identified. This work has been performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.
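
Such quantification typically relies on first-order sensitivity/uncertainty propagation (the "sandwich rule"), as implemented in tools of the SCALE suite; a three-parameter toy sketch, with assumed sensitivities and covariances rather than ESFR values, is given below.

```python
import numpy as np

# Minimal sketch of the "sandwich rule" used in sensitivity/uncertainty
# analysis (illustrative 3-parameter example, not the ESFR model):
#   var(R) = S^T C S, with S the sensitivity vector of response R to the
#   nuclear data and C their relative covariance matrix.

# Relative sensitivities dR/R per dp/p of, say, a reactivity coefficient to
# three cross sections (assumed values).
S = np.array([0.8, -0.3, 0.15])

# Relative covariance of the three cross sections (assumed: 2%, 5% and 10%
# standard deviations, with a mild correlation between the first two).
sd = np.array([0.02, 0.05, 0.10])
corr = np.array([[1.0, 0.3, 0.0],
                 [0.3, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
C = np.outer(sd, sd) * corr

rel_var = S @ C @ S
print(f"relative uncertainty on the response: {np.sqrt(rel_var):.2%}")

# Per-parameter contributions to the variance (diagonal terms only); large
# contributors flag the data whose improvement benefits the design most.
print("diagonal contributions:", np.round(S**2 * sd**2 / rel_var, 3))
```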

Abstract:

A participatory modelling process has been conducted in two areas of the Guadiana river basin (the upper and middle sub-basins), in Spain, with the aim of providing support for decision making in water management. The area has a semi-arid climate in which irrigated agriculture plays a key role in the economic development of the region and accounts for around 90% of water use. Following the guidelines of the European Water Framework Directive, we promote stakeholder involvement in water management with the aim of achieving an improved understanding of the water system and of encouraging the exchange of knowledge and views between stakeholders, in order to help build a shared vision of the system. At the same time, the resulting models, which integrate the different sectors and views, provide some insight into the impacts that different management options and possible future scenarios could have. The methodology is based on a Bayesian network combined with an economic model and, in the middle Guadiana sub-basin, with a crop model. The resulting integrated modelling framework is used to simulate possible water policy, market and climate scenarios, and to find out the impacts of those scenarios on farm income and on the environment. At the end of the modelling process, an evaluation questionnaire was filled in by participants in both sub-basins. The results show that stakeholders find this type of process very helpful for improving their understanding of the system, for understanding each other's views and for reducing conflict where it exists. In addition, they found the model an extremely useful tool to support management. The graphical interface, the quantitative output and the explicit representation of uncertainty helped stakeholders better understand the implications of the scenarios tested. Finally, the combination of different types of models was also found very useful, as it allowed specific aspects of the water management problems to be explored in detail.
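
A minimal sketch of the kind of discrete Bayesian network such frameworks are built on, with a made-up structure and made-up probability tables: a policy node and a climate node influence water availability, which in turn drives farm income, and a query is answered by exact enumeration.

```python
import numpy as np

# Minimal sketch of a discrete Bayesian network (made-up structure and
# probabilities, not the Guadiana model):
#   Climate -> WaterAvailable <- Policy,  WaterAvailable -> FarmIncome.
# States are binary (0 = low/dry/restrictive, 1 = high/wet/permissive).

P_climate = np.array([0.4, 0.6])            # P(dry), P(wet) (assumed)

# P(water | climate, policy): indexed [climate, policy, water]
P_water = np.array([[[0.9, 0.1], [0.7, 0.3]],    # dry climate
                    [[0.5, 0.5], [0.2, 0.8]]])   # wet climate

# P(income | water): indexed [water, income]
P_income = np.array([[0.8, 0.2],            # low water -> likely low income
                     [0.3, 0.7]])           # high water -> likely high income

# Exact inference by enumeration: P(income = high | policy = restrictive).
policy = 0
p_high = sum(P_climate[c] * P_water[c, policy, w] * P_income[w, 1]
             for c in (0, 1) for w in (0, 1))
print(f"P(high income | restrictive policy) = {p_high:.3f}")
```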

Abstract:

For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all the uncertainties that affect their design, performance and response to accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations predict response functions that are essential during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor. The impact of nuclear data uncertainties on these response functions therefore needs to be addressed for a proper evaluation of the safety margins. Methodologies for uncertainty propagation calculations need to be implemented in order to analyse this impact, and it is also necessary to understand the current status of nuclear data and their uncertainties in order to handle this type of data. Great efforts are under way to enhance the capability to analyse, process and produce covariance data, especially for isotopes that are important for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using such data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for this PhD thesis.

Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed. The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of the nuclear data with uncertainties. Different approaches are presented depending on the energy-group structure of the cross sections: one group, one group with correlated sampling, and multigroup; their differences and applicability criteria are discussed. Sequences have been developed for using nuclear data libraries stored in different formats: ENDF-6 (for evaluated libraries), COVERX (for the multigroup libraries of SCALE) and EAF (for activation libraries).

The review of the state of the art of fission yield data reveals a lack of uncertainty information, specifically of complete covariance matrices. Moreover, the international community has expressed renewed interest in this issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) Subgroup 37 (SG37), dedicated to assessing nuclear data improvement needs. A review of the methodologies for generating covariance data for fission yields has therefore been carried out, and a Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to address the lack of complete covariance matrices.

Once the Hybrid Method had been implemented, developed and extended, together with the fission yield covariance generation capability, different nuclear applications were studied. The fission pulse decay heat problem is tackled first, because of its importance for any event after reactor shutdown and because it is a clean exercise for showing the impact and importance of decay data and fission yield uncertainties together with the new complete covariance matrices. The fuel cycles of two advanced reactors are then studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), for which the impact of nuclear data uncertainties on the isotopic composition, decay heat and radiotoxicity is analysed. Different nuclear data libraries are used and compared in these studies. The applications also serve as frameworks for comparing the different approaches of the Hybrid Method and other uncertainty propagation methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
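
A minimal sketch of the Monte Carlo sampling idea behind the Hybrid Method, though not the ACAB-based implementation itself: the decay constants of a toy two-nuclide chain are sampled from an assumed covariance and propagated through the Bateman solution, giving the uncertainty on the inventory.

```python
import numpy as np

# Minimal sketch (not the Hybrid Method/ACAB implementation) of Monte Carlo
# nuclear data uncertainty propagation through a toy depletion problem: a
# two-nuclide decay chain A -> B -> (stable), with uncertain decay constants
# sampled from an assumed (diagonal) relative covariance.

rng = np.random.default_rng(5)

lam = np.array([1.0e-4, 5.0e-5])            # nominal decay constants [1/s] (assumed)
rel_sd = np.array([0.05, 0.08])             # assumed 5% and 8% relative std devs

def n_B(lam_a, lam_b, nA0=1.0e20, t=3600.0):
    """Bateman solution for nuclide B at time t, starting from pure A."""
    return nA0 * lam_a / (lam_b - lam_a) * (np.exp(-lam_a * t) - np.exp(-lam_b * t))

# Sample the decay constants (lognormal keeps them positive) and propagate.
N = 50_000
samples = lam * rng.lognormal(mean=0.0, sigma=rel_sd, size=(N, 2))
nB = n_B(samples[:, 0], samples[:, 1])

print(f"N_B(1 h) = {nB.mean():.3e} +/- {nB.std():.3e} "
      f"({nB.std() / nB.mean():.2%} relative uncertainty)")
```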

Abstract:

In the field of dimensional metrology, the use of optical measuring machines requires the handling of a large number of measurement points, or scanning points, taken from the image of the measurand. The presence of correlation between these measurement points has a significant influence on the uncertainty of the result. The aim of this work is the development of a procedure for estimating the uncertainty of measurement of a geometrically elliptical shape, taking into account the correlation between the scanning points. These points are obtained from an image produced using a commercial flat-bed scanner. The characteristic parameters of the ellipse (coordinates of the centre, semi-axes and the angle of the semi-major axis with respect to the horizontal) are determined using a least-squares fit with orthogonal distance regression. The uncertainty is estimated using the information from the autocorrelation function of the residuals and is propagated through the fitting algorithm according to the rules described in Evaluation of Measurement Data—Supplement 2 to the 'Guide to the Expression of Uncertainty in Measurement'—Extension to any number of output quantities. By introducing the concept of cut-off length, it is shown how the correlation can be taken into account in the uncertainty estimation in a very simple way, while avoiding underestimation.
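
A simplified sketch of the ingredients described above, using an algebraic least-squares conic fit in place of the orthogonal distance regression, and synthetic, deliberately correlated scanning points: it fits the ellipse, computes the residuals and their autocorrelation function, and derives a cut-off length.

```python
import numpy as np

# Minimal sketch (not the paper's implementation): algebraic least-squares
# ellipse fit to scanned points, followed by the autocorrelation function of
# the residuals, from which a cut-off length could be chosen as in the text.

rng = np.random.default_rng(6)

# Synthetic scanning points on an ellipse, with correlated noise along the
# contour (a moving-average filter imitates scanner-induced correlation).
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
a, b, x0, y0 = 20.0, 12.0, 3.0, -1.0
noise = np.convolve(rng.normal(0.0, 0.05, t.size), np.ones(9) / 9.0, "same")
x = x0 + (a + noise) * np.cos(t)
y = y0 + (b + noise) * np.sin(t)

# Fit the conic A x^2 + B xy + C y^2 + D x + E y = 1 by linear least squares.
M = np.column_stack([x**2, x * y, y**2, x, y])
coef, *_ = np.linalg.lstsq(M, np.ones_like(x), rcond=None)

# Algebraic residuals and their normalized autocorrelation function.
r = M @ coef - 1.0
acf = np.correlate(r - r.mean(), r - r.mean(), "full")[r.size - 1:]
acf /= acf[0]

# A simple cut-off: first lag where the ACF drops below 1/e.
cutoff = int(np.argmax(acf < np.exp(-1.0)))
print(f"residual std = {r.std():.2e}, correlation cut-off = {cutoff} points")
```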

Abstract:

This work presents the first application of total-reflection X-ray fluorescence (TXRF) spectrometry, a new and powerful alternative analytical method, to the evaluation of the bioaccumulation kinetics of gold nanorods (GNRs) in various tissues upon intravenous administration in mice. The analytical parameters of the developed TXRF methodology were evaluated by the parallel analysis of bovine liver certified reference material samples (BCR-185R) doped with 10 μg/g gold. The average values (n = 5) achieved for gold measurements in lyophilized tissue were as follows: recovery 99.7%, expanded uncertainty (k = 2) 7%, repeatability 1.7%, detection limit 112 ng/g and quantification limit 370 ng/g. The GNR bioaccumulation kinetics was analyzed in several vital organs, such as liver, spleen, brain and lung, at different times. Additionally, urine samples were analyzed to study the kinetics of elimination of the GNRs by this excretion route. The main achievement was to differentiate clearly between two kinds of behavior: GNRs were quickly bioaccumulated by highly vascularized filtration organs such as the liver and spleen, while they showed no appreciable bioaccumulation in the brain or lung over the period investigated. In parallel, urine also showed a lack of GNR accumulation. TXRF has proven to be a powerful, versatile and precise analytical technique for the evaluation of GNR content in biological systems and, more generally, of any kind of metallic nanoparticle.
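
For illustration, figures of merit of this kind can be computed from replicate measurements along the following lines (made-up replicate and blank values; the published expanded uncertainty will also include components beyond the type A term shown here).

```python
import numpy as np

# Minimal sketch of the figures of merit quoted above, computed from made-up
# replicate measurements of a 10 ug/g gold-doped reference material (n = 5).

measured = np.array([9.95, 10.12, 9.88, 10.02, 9.90])   # ug/g (assumed)
certified = 10.0                                        # ug/g doped level

recovery = measured.mean() / certified * 100.0
repeatability = measured.std(ddof=1) / measured.mean() * 100.0   # RSD, %
u_mean = measured.std(ddof=1) / np.sqrt(measured.size)  # type A std uncertainty
U_expanded = 2.0 * u_mean / measured.mean() * 100.0     # k = 2, relative %

# Detection/quantification limits from the blank (3s and 10s criteria).
blank = np.array([0.010, 0.014, 0.012, 0.009, 0.013])   # ug/g (assumed)
lod = 3.0 * blank.std(ddof=1) * 1000.0                  # ng/g
loq = 10.0 * blank.std(ddof=1) * 1000.0                 # ng/g

print(f"recovery {recovery:.1f}%, repeatability {repeatability:.1f}%, "
      f"U(k=2) {U_expanded:.1f}%, LOD {lod:.0f} ng/g, LOQ {loq:.0f} ng/g")
```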

Abstract:

The relationship between saturated hydraulic conductivity (Ks) and grain-size distribution was evaluated for 49 sites underlain by either glacially overconsolidated or normally consolidated fluvio-glacial deposits in the Puget Lowland. A linear regression comprising pairs of grain-size analyses and pilot infiltration tests predicts Ks with a 1-sigma uncertainty of a factor of about 3.5, with 70% of the population variance accounted for. The correlation coefficient R^2 of about 0.90 shows that there is a strong correlation between grain-size distribution and Ks. In contrast, a widely applied analysis proposed by Massmann (2003) explains only 20% of the population variance for normally consolidated materials, with an R^2 of only 0.15, and entirely fails to explain the population variance for overconsolidated materials. The method developed in this study is recommended for the determination of Ks for fluvio-glacial deposits of the Puget Lowland.
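
A minimal sketch of such a regression, on synthetic data rather than the 49 sites: log10(Ks) is regressed on the effective grain size d10 (a hypothetical choice of grain-size statistic), and the factor-of-N 1-sigma uncertainty is recovered from the log-space residuals.

```python
import numpy as np

# Minimal sketch of the kind of regression described above (synthetic data,
# not the 49 Puget Lowland sites): log10(Ks) regressed on a grain-size
# statistic (here the effective grain size d10), with the factor-of-N
# 1-sigma uncertainty recovered from the log-space residuals.

rng = np.random.default_rng(7)

d10 = 10 ** rng.uniform(-2.0, 0.0, 49)             # mm, spanning silt..sand
log_ks_true = 2.0 * np.log10(d10) - 2.0            # assumed underlying law
log_ks = log_ks_true + rng.normal(0.0, 0.5, 49)    # field scatter

# Ordinary least squares in log-log space.
A = np.column_stack([np.log10(d10), np.ones_like(d10)])
(slope, intercept), *_ = np.linalg.lstsq(A, log_ks, rcond=None)

resid = log_ks - A @ [slope, intercept]
sigma_log = resid.std(ddof=2)
r2 = 1.0 - resid.var() / log_ks.var()

print(f"log10(Ks) = {slope:.2f} log10(d10) + {intercept:.2f}, R^2 = {r2:.2f}")
print(f"1-sigma uncertainty: a factor of {10 ** sigma_log:.1f} on Ks")
```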

Abstract:

Objective: To evaluate whether the introduction of a national, co-ordinated screening program using the faecal occult blood test represents 'value for money' from the perspective of the Australian Government as third-party funder. Methods: The annual equivalent costs and consequences of a biennial screening program in 'steady-state' operation were estimated for the Australian population, using 1996 as the reference year. The disability-adjusted life years (DALYs) and years of life lost (YLLs) averted, and the health service costs, were modelled based on the epidemiology and costs of colorectal cancer in Australia, together with the mortality reduction achieved in randomised controlled trials. Uncertainty in the model was examined using Monte Carlo simulation methods. Results: We estimate that a minimum or 'base' program screening those aged 55 to 69 years could avert 250 deaths per annum (95% uncertainty interval 99-400), at a gross cost of $A55 million (95% UI $A46 million to $A96 million) and a gross incremental cost-effectiveness ratio of $A17,000/DALY (95% UI $A13,000/DALY to $A52,000/DALY). Extending the program to include the 70 to 74-year-olds is a more effective option (cheaper and higher health gain) than including the 50 to 54-year-olds. Conclusions: The findings of this study support the case for a national program directed at the 55 to 69-year-old age group, with extension to the 70 to 74-year-olds if there are sufficient resources. The pilot tests recently announced in Australia provide an important opportunity to consider the age range for screening and the sources of uncertainty identified in the modelled evaluation, to assist decisions on implementing a full national program.
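
A minimal sketch of the Monte Carlo uncertainty analysis named in the Methods, with made-up input distributions rather than the study's calibrated model: costs and DALYs averted are sampled and the cost-effectiveness ratio is summarized with a 95% uncertainty interval.

```python
import numpy as np

# Minimal sketch of a Monte Carlo uncertainty analysis for a screening
# program (made-up input distributions, not the study's calibrated model):
# sample costs and DALYs averted, then summarize the cost-effectiveness ratio.

rng = np.random.default_rng(8)
N = 100_000

# Assumed input uncertainty: gross program cost and DALYs averted per year.
cost = rng.lognormal(mean=np.log(55e6), sigma=0.20, size=N)     # $A
dalys = rng.lognormal(mean=np.log(3200.0), sigma=0.35, size=N)  # DALYs averted

icer = cost / dalys     # $A per DALY averted

lo, mid, hi = np.percentile(icer, [2.5, 50.0, 97.5])
print(f"ICER median = $A{mid:,.0f}/DALY (95% UI $A{lo:,.0f} to $A{hi:,.0f})")

# Probability the program is 'value for money' at a given threshold:
threshold = 50_000.0
print(f"P(ICER < $A{threshold:,.0f}/DALY) = {np.mean(icer < threshold):.2f}")
```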

Abstract:

Ecosystems, and the species and communities within them, are highly complex systems that defy prediction with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds spacing rules similar to those proposed elsewhere, roughly 20-200 km, but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges of managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.
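
One simple way to make the spacing evaluation concrete, a sketch under assumed distributions rather than the paper's models: treat the mean larval dispersal distance as uncertain and compute, for each candidate spacing, the chance that a larva covers the gap between reserves.

```python
import numpy as np

# Minimal sketch (not the paper's models): evaluating reserve spacing rules
# under uncertainty about the mean larval dispersal distance. Dispersal
# follows an exponential kernel whose mean is itself uncertain (lognormal,
# assumed).

rng = np.random.default_rng(9)
N = 20_000

spacings_km = np.array([10.0, 20.0, 50.0, 100.0, 200.0])

# Uncertain mean dispersal distance: median 50 km, broad uncertainty (assumed).
mean_disp = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=N)

# For an exponential kernel, the chance a larva travels at least distance d
# is exp(-d / mean). Average over the uncertainty in the mean.
for d in spacings_km:
    p_connect = np.mean(np.exp(-d / mean_disp))
    print(f"spacing {d:5.0f} km -> P(larva reaches next reserve) = {p_connect:.2f}")
```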

Abstract:

To evaluate an investment project in the competitive electricity market, several key factors affect the project's value: the present value that the project could bring to the investor, the possible future courses of action open to the investor, and the project's management flexibility. The traditional net present value (NPV) criterion captures the present value of the project's future cash flows, but it fails to assess the value brought by market uncertainty and management flexibility. In contrast to NPV, the real options approach (ROA) has the advantage of combining uncertainty and flexibility in the evaluation process. In this paper, a framework for using ROA to evaluate generation investment opportunities is proposed. Through a detailed case study, the proposed framework is compared with NPV and shown to produce different results.
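
A minimal sketch of how ROA departs from NPV, assuming an option to defer the investment valued on a Cox-Ross-Rubinstein binomial lattice (the project value, cost, volatility and option lifetime are illustrative, not the paper's case study): the static NPV is negative while the deferral option retains positive value.

```python
import numpy as np

# Minimal sketch of a real-options valuation via a binomial lattice
# (Cox-Ross-Rubinstein), contrasted with static NPV. All inputs are
# illustrative assumptions.

V0 = 100.0      # present value of the project's expected cash flows [M$]
I = 105.0       # investment cost [M$]  -> static NPV is negative
r = 0.05        # risk-free rate per year
sigma = 0.30    # volatility of the project value
T, n = 3.0, 36  # option to defer for 3 years, 36 lattice steps

dt = T / n
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral probability

# Project values at the final lattice nodes, and the payoff of investing.
j = np.arange(n + 1)
V_T = V0 * u**j * d**(n - j)
option = np.maximum(V_T - I, 0.0)

# Backward induction with early exercise (American-style deferral option).
for step in range(n, 0, -1):
    option = np.exp(-r * dt) * (p * option[1:] + (1 - p) * option[:-1])
    V_node = V0 * u**np.arange(step) * d**(step - 1 - np.arange(step))
    option = np.maximum(option, V_node - I)

print(f"static NPV      = {V0 - I:6.2f} M$")
print(f"ROA (defer opt) = {option[0]:6.2f} M$")
```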