925 results for Uncertainty quantification


Relevance: 30.00%

Abstract:

Unobserved mortalities of nontarget species are among the most troubling and difficult issues associated with fishing, especially when those species are targeted by other fisheries. Of such concern are mortalities of crab species of the Bering Sea, which are exposed to bottom trawling from groundfish fisheries. Uncertainty in the management of these fisheries has been exacerbated by unknown mortality rates for crabs struck by trawls. In this study, the mortality rates for three species of commercially important crabs—red king crab (Paralithodes camtschaticus), snow crab (Chionoecetes opilio), and southern Tanner crab (C. bairdi)—that encounter different components of bottom trawls were estimated through capture of crabs behind the bottom trawl and by evaluation of immediate and delayed mortalities. We used a reflex action mortality predictor to predict delayed mortalities. Estimated mortality rates varied by species and by the part of the trawl gear encountered. Red king crab were more vulnerable than snow or southern Tanner crabs. Crabs were more likely to die after encountering the footrope than the sweeps of the trawl, and higher death rates were noted for the side sections of the footrope than for the center footrope section. Mortality rates were ≤16%, except for red king crab that passed under the trawl wings (32%). Herding devices (sweeps) can greatly expand the area of seafloor from which flatfishes are captured, and they subject crabs in that additional area to lower (4–9%) mortality rates. Raising sweep cables off the seafloor reduced red king crab mortality rates from 10% to 4%.

Relevance: 30.00%

Abstract:

The role of the ocean in the cycling of oxygenated volatile organic compounds (OVOCs) remains largely unresolved due to a paucity of datasets. We describe the development of a membrane inlet-proton transfer reaction/mass spectrometer (MI-PTR/MS) method for the efficient analysis of methanol, acetaldehyde and acetone in seawater. Validation of the technique with water standards shows that the optimised responses are linear and reproducible. Limits of detection are 27 nM for methanol, 0.7 nM for acetaldehyde and 0.3 nM for acetone. Acetone and acetaldehyde concentrations generated by MI-PTR/MS are compared to a second, independent method based on purge and trap-gas chromatography/flame ionisation detection (P&T-GC/FID) and show excellent agreement. Chromatographic separation of the isomeric species acetone and propanal permits correction of the mass 59 signal generated by the PTR/MS and overcomes a known uncertainty in reporting acetone concentrations via mass spectrometry. A third bioassay technique using radiolabelled acetone further supported the results generated by this method. We present the development and optimisation of the MI-PTR/MS technique as a reliable and convenient tool for analysing seawater samples for these trace gases. We compare this method with other analytical techniques and discuss its potential use in improving the current understanding of the cycling of oceanic OVOCs.
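
As an aside on how detection limits of the kind quoted above (e.g. 0.3 nM for acetone) are commonly derived, the sketch below fits a linear calibration and applies the 3-sigma convention. The concentrations and responses are invented placeholders, not data from this study.

```python
# Illustrative only: deriving a limit of detection (LOD) from a linear
# calibration using the 3-sigma convention. Values are made-up placeholders.
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])           # standard concentrations (nM)
signal = np.array([12.0, 55.0, 98.0, 230.0, 455.0])   # instrument response (counts)

slope, intercept = np.polyfit(conc, signal, 1)        # linear calibration fit
residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

lod = 3.0 * residual_sd / slope                        # 3-sigma detection limit
print(f"slope = {slope:.1f} counts/nM, LOD ~ {lod:.2f} nM")
```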

Relevance: 30.00%

Abstract:

An interlaboratory comparison (ILC) was conducted to evaluate the proficiency of multiple laboratories to quantify dimethylsulfide (DMS) in aqueous solution. Ten participating laboratories were each supplied with blind duplicate test solutions containing dimethylsulfoniopropionate hydrochloride (DMSP HCl) dissolved in acidified artificial seawater. The test solutions were prepared by the coordinating laboratory from a DMSP HCl reference material that was synthesized and purity certified for this purpose. A concentration range was specified for the test solutions and the participating laboratories were requested to dilute them as required for their analytical procedure, together with the addition of excess alkali under gas-tight conditions to convert the DMSP to DMS. Twenty-two DMS concentrations and their estimated expanded measurement uncertainties (95% confidence level) were received from the laboratories. With two exceptions, the within-laboratory variability was 5% or less and the between-laboratory variability was ~ 25%. The magnitude of expanded measurement uncertainties reported from all participants ranged from 1% to 33% relative to the result. The information gained from this pilot ILC indicated the need for further test sample distribution studies of this type so that participating laboratories can identify systematic errors in their analysis procedures and realistically evaluate their measurement uncertainty. The outcome of ILC studies provides insights into the comparability of data in the global surface seawater DMS database.
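
A minimal sketch of how within- and between-laboratory variability and an expanded uncertainty (coverage factor k = 2, roughly 95% confidence) can be computed from blind duplicate results of this kind; the numbers below are invented, not ILC data.

```python
# Sketch (invented data): within- and between-laboratory variability for blind
# duplicate DMS results, plus an expanded uncertainty with coverage factor k=2.
import numpy as np

# one row per laboratory: [replicate A, replicate B] in nmol/L
results = np.array([
    [10.2, 10.4],
    [ 9.6,  9.9],
    [12.1, 11.8],
    [10.0, 10.1],
])

lab_means = results.mean(axis=1)
within_cv = (results.std(axis=1, ddof=1) / lab_means).mean() * 100   # % within-lab
between_cv = lab_means.std(ddof=1) / lab_means.mean() * 100          # % between-lab

# standard uncertainty of one lab's duplicate mean (repeatability only),
# expanded with k=2 for ~95% coverage
u_std = results[0].std(ddof=1) / np.sqrt(results.shape[1])
U_expanded = 2.0 * u_std
print(f"within-lab CV ~ {within_cv:.1f}%, between-lab CV ~ {between_cv:.1f}%")
print(f"lab 1: {lab_means[0]:.2f} ± {U_expanded:.2f} nmol/L (k=2)")
```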

Relevance: 30.00%

Abstract:

Identifying the processes that shape species' geographical ranges is a prerequisite for understanding environmental change. Currently, species distribution modelling methods do not offer credible statistical tests of the relative influence of climate factors and typically ignore other processes (e.g. biotic interactions and dispersal limitation). We use a hierarchical model fitted with Markov chain Monte Carlo to combine ecologically plausible niche structures, using regression splines to describe unimodal but potentially skewed response terms. We apply spatially explicit error terms that account for (and may help identify) missing variables. Using three example distributions of European bird species, we map model results to show sensitivity to change in each covariate. We show that the overall strength of climatic association differs between species and that each species has considerable spatial variation in both the strength of the climatic association and the sensitivity to climate change. Our methods are widely applicable to many species distribution modelling problems and enable accurate assessment of the statistical importance of biotic and abiotic influences on distributions.
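
The full hierarchical Bayesian model with spatially explicit error terms is not reproduced here; as a minimal, non-spatial stand-in, the sketch below fits a presence/absence response to a single climate covariate using a B-spline basis inside a logistic regression, which can capture unimodal and skewed response shapes. Data are simulated and scikit-learn is assumed.

```python
# Minimal, non-spatial stand-in for the hierarchical model: a spline basis on
# one climate covariate inside a logistic regression, able to represent
# unimodal, skewed occurrence responses. Data are simulated.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
temp = rng.uniform(-5, 25, 800)                 # mean annual temperature (degC)
width = np.where(temp < 8.0, 5.0, 9.0)          # narrower left tail -> skewed response
p_true = np.exp(-((temp - 8.0) / width) ** 2)   # hidden "true" occurrence probability
present = rng.binomial(1, 0.9 * p_true)

model = make_pipeline(
    SplineTransformer(n_knots=6, degree=3),     # cubic B-spline basis
    LogisticRegression(C=1.0, max_iter=1000),
)
model.fit(temp.reshape(-1, 1), present)

grid = np.linspace(-5, 25, 7).reshape(-1, 1)
print(np.round(model.predict_proba(grid)[:, 1], 2))   # fitted response curve
```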

Relevance: 30.00%

Abstract:

Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
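
A minimal sketch of the kind of correlated Monte Carlo analysis described above, re-expressed in Python rather than Crystal Ball over a spreadsheet model; all appraisal figures and the assumed correlation are hypothetical.

```python
# Sketch of a correlated Monte Carlo over a simple residual-profit appraisal,
# in the spirit of the approach described above. All figures are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# sale price and build cost (per m2), positively correlated (rho = 0.4)
mean = [4500.0, 2200.0]
sd = [450.0, 180.0]
rho = 0.4
cov = [[sd[0]**2, rho*sd[0]*sd[1]],
       [rho*sd[0]*sd[1], sd[1]**2]]
price, cost = rng.multivariate_normal(mean, cov, size=n).T

area = 10_000.0        # saleable floor area (m2)
land = 8_000_000.0     # land cost already committed
profit = area * price - area * cost - land

print(f"expected profit : {profit.mean():,.0f}")
print(f"P(loss)         : {np.mean(profit < 0):.1%}")
print(f"5th / 95th pct  : {np.percentile(profit, 5):,.0f} / {np.percentile(profit, 95):,.0f}")
```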

Relevance: 30.00%

Abstract:

During long-range transport, many distinct processes – including photochemistry, deposition, emissions and mixing – contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset and quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower tropospheric transport) or production (an upper tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (by Arnold et al., 2007). Therefore, it is a useful new method to simulate air mass evolution and variability, and its sensitivity to process parameters.
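
A toy illustration of the partitioning idea: integrating chemical and mixing tendencies separately along a trajectory so that the net change in O3 splits into chemical and physical components. Rates and values are illustrative, not taken from the ITCT-Lagrangian cases.

```python
# Toy partitioning of the net O3 change along an air-mass trajectory into a
# chemical (production minus loss) and a physical (mixing with a background
# profile) contribution. Rates and values are illustrative.
import numpy as np

dt = 1.0                        # time step (hours)
hours = np.arange(0, 120, dt)   # 5 days of transport
o3 = 60.0                       # initial O3 (ppbv)
o3_bg = 35.0                    # background O3 the air mass mixes with (ppbv)
k_mix = 0.01                    # mixing rate (1/hour)
chem_rate = -5.0 / 24.0         # net photochemical tendency (-5 ppbv per day)

d_chem = d_phys = 0.0
for _ in hours:
    chem = chem_rate * dt
    phys = k_mix * (o3_bg - o3) * dt
    o3 += chem + phys
    d_chem += chem
    d_phys += phys

print(f"dO3_chem = {d_chem:+.1f} ppbv, dO3_phys = {d_phys:+.1f} ppbv, "
      f"net dO3 = {d_chem + d_phys:+.1f} ppbv")
```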

Relevance: 30.00%

Abstract:

The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.

Relevance: 30.00%

Abstract:

Of the many sources of urban greenhouse gas (GHG) emissions, solid waste is the only one for which management decisions are undertaken primarily by municipal governments themselves, and it is hence often the largest component of cities’ corporate inventories. It is essential that decision-makers select an appropriate quantification methodology and have an appreciation of methodological strengths and shortcomings. This work compares four different waste emissions quantification methods: the Intergovernmental Panel on Climate Change (IPCC) 1996 guidelines, the IPCC 2006 guidelines, the U.S. Environmental Protection Agency (EPA) Waste Reduction Model (WARM), and the Federation of Canadian Municipalities-Partners for Climate Protection (FCM-PCP) quantification tool. Waste disposal data for the Greater Toronto Area (GTA) in 2005 are used for all methodologies; treatment options (including landfill, incineration, compost, and anaerobic digestion) are examined where available in the methodologies. Landfill was shown to be the greatest source of GHG emissions, contributing more than three-quarters of total emissions associated with waste management. Results from the different landfill gas (LFG) quantification approaches ranged from an emissions source of 557 kt carbon dioxide equivalents (CO2e) (FCM-PCP) to a carbon sink of −53 kt CO2e (EPA WARM). Similar values were obtained between the IPCC approaches. The IPCC 2006 method was found to be more appropriate for inventorying applications because it uses a waste-in-place (WIP) approach rather than a methane commitment (MC) approach, despite perceived onerous data requirements for WIP. MC approaches were found to be useful from a planning standpoint; however, uncertainty associated with their projections of future parameter values limits their applicability for GHG inventorying. MC and WIP methods provided similar results in this case study; however, this is case specific because of similarity in assumptions about present and future landfill parameters and because quantities of annual waste deposited in recent years have been relatively consistent.
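
A hedged sketch contrasting the two accounting philosophies mentioned above: a waste-in-place first-order-decay estimate of the IPCC 2006 type versus a methane-commitment figure booked entirely in the year of disposal. Parameter values are illustrative defaults, not the GTA study inputs.

```python
# Sketch: waste-in-place first-order decay (FOD) vs methane commitment (MC)
# landfill gas accounting. Parameters are illustrative, not the GTA inputs.
import numpy as np

k = 0.05                                        # decay constant (1/yr)
doc, doc_f, mcf, f_ch4 = 0.18, 0.5, 1.0, 0.5    # IPCC-style factors (illustrative)
waste = np.full(20, 1_000_000.0)                # tonnes landfilled per year, 20 years

ddocm = waste * doc * doc_f * mcf               # decomposable carbon deposited each year

def fod_emissions(year_index):
    """CH4 (tonnes) generated in one inventory year from all past deposits."""
    years_since = year_index - np.arange(year_index + 1)
    decayed = ddocm[:year_index + 1] * (np.exp(-k * years_since) - np.exp(-k * (years_since + 1)))
    return decayed.sum() * f_ch4 * 16.0 / 12.0

mc_per_year = ddocm[0] * f_ch4 * 16.0 / 12.0    # MC: lifetime CH4 booked at disposal

print(f"FOD, inventory year 20: {fod_emissions(19):,.0f} t CH4")
print(f"MC, any disposal year : {mc_per_year:,.0f} t CH4")
```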

Relevance: 30.00%

Abstract:

A procedure for characterizing the global uncertainty of a rainfall-runoff simulation model based on grey numbers is presented. With the grey numbers technique, the uncertainty is characterized by an interval; once the parameters of the rainfall-runoff model have been properly defined as grey numbers, grey mathematics and functions make it possible to obtain simulated discharges in the form of grey numbers whose envelope defines a band representing the vagueness/uncertainty associated with the simulated variable. The grey numbers representing the model parameters are estimated so that the band obtained from the envelope of simulated grey discharges includes an assigned percentage of observed discharge values while remaining as narrow as possible. The approach is applied to a real case study, highlighting that a rigorous application of the procedure for direct simulation through the rainfall-runoff model with grey parameters involves long computational times. However, these times can be significantly reduced using a simplified computing procedure with minimal approximations in the quantification of the grey numbers representing the simulated discharges. Relying on this simplified procedure, the conceptual rainfall-runoff grey model is calibrated, and the uncertainty bands obtained downstream of both the calibration and the validation processes are compared with those obtained by a well-established approach for characterizing uncertainty, the GLUE approach. The results of the comparison show that the proposed approach may represent a valid tool for characterizing the global uncertainty associated with the output of a rainfall-runoff simulation model.
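
A toy illustration of the grey-band idea: propagating an interval-valued (grey) parameter through a very simple linear-reservoir runoff model and taking the envelope of the resulting discharges. The paper's calibration of the grey parameters is not reproduced; values are illustrative.

```python
# Toy grey-number propagation: an interval-valued outflow coefficient is swept
# through a linear-reservoir runoff model and the envelope of the simulations
# gives a grey discharge band. Values are illustrative.
import numpy as np

rain = [0.0, 5.0, 12.0, 3.0, 0.0, 0.0, 0.0, 8.0, 2.0, 0.0]  # rainfall per step (mm)
k_grey = (0.15, 0.25)                                        # grey outflow coefficient [lower, upper]

def simulate(k):
    """Linear reservoir: storage gains rain, releases k * storage each step."""
    storage, q = 0.0, []
    for p in rain:
        storage += p
        out = k * storage
        storage -= out
        q.append(out)
    return q

sims = np.array([simulate(k) for k in np.linspace(*k_grey, 11)])
band_lo, band_hi = sims.min(axis=0), sims.max(axis=0)
for t, (lo, hi) in enumerate(zip(band_lo, band_hi)):
    print(f"t={t}: Q in [{lo:.2f}, {hi:.2f}] mm")
```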

Relevance: 30.00%

Abstract:

A common approach used to estimate landscape resistance involves comparing correlations of ecological and genetic distances calculated among individuals of a species. However, the locations of sampled individuals may contain some degree of spatial uncertainty due to the natural variation of animals moving through their home range or measurement error in plant or animal locations. In this study, we evaluate the ways that spatial uncertainty, landscape characteristics, and genetic stochasticity interact to influence the strength and variability of conclusions about landscape-genetics relationships. We used a neutral landscape model to generate 45 landscapes composed of habitat and non-habitat, varying in percent habitat, aggregation, and structural connectivity (patch cohesion). We created true and alternate locations for 500 individuals, calculated ecological distances (least-cost paths), and simulated genetic distances among individuals. We compared correlations between ecological distances for true and alternate locations. We then simulated genotypes at 15 neutral loci and investigated whether the same influences could be detected both in simple Mantel tests and while controlling for the effects of isolation by distance using partial Mantel tests. Spatial uncertainty interacted with the percentage of habitat in the landscape, but led to only small reductions in correlations. Furthermore, the strongest correlations occurred with low percent habitat, high aggregation, and low to intermediate levels of cohesion. Overall, genetic stochasticity was relatively low and was influenced by landscape characteristics.
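
A minimal permutation-based Mantel test of the kind referred to above, correlating an ecological and a genetic distance matrix; both matrices are simulated placeholders.

```python
# Minimal permutation Mantel test between an "ecological" and a "genetic"
# distance matrix. Both matrices are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)
n = 40
coords = rng.uniform(0, 100, (n, 2))
eco = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # stand-in for least-cost paths
gen = 0.6 * eco + rng.normal(0, 10, (n, n))
gen = (gen + gen.T) / 2.0
np.fill_diagonal(gen, 0.0)

iu = np.triu_indices(n, k=1)               # use the upper triangle only

def mantel_r(a, b):
    return np.corrcoef(a[iu], b[iu])[0, 1]

r_obs = mantel_r(eco, gen)
perms = []
for _ in range(999):
    order = rng.permutation(n)
    perms.append(mantel_r(eco, gen[np.ix_(order, order)]))  # permute rows+cols together
p_value = (np.sum(np.array(perms) >= r_obs) + 1) / 1000.0

print(f"Mantel r = {r_obs:.3f}, one-sided p = {p_value:.3f}")
```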

Relevance: 30.00%

Abstract:

The characterization of soil CO2 emissions (FCO2) is important for the study of the global carbon cycle. This phenomenon presents great variability in space and time, a characteristic that makes attempts at modeling and forecasting FCO2 challenging. Although spatial estimates have been performed in several studies, the association of these estimates with the uncertainties inherent in the estimation procedures is not considered. This study aimed to evaluate the local, spatial, local-temporal and spatial-temporal uncertainties of short-term FCO2 after the harvest period in a sugar cane area. FCO2 was measured on a 60 m × 60 m sampling grid containing 127 points, with minimum separation distances between points ranging from 0.5 to 10 m. FCO2 was evaluated 7 times within a total period of 10 days. The variability of FCO2 was described by descriptive statistics and variogram modeling. To calculate the uncertainties, 300 realizations produced by sequential Gaussian simulation were considered. Local uncertainties were evaluated using the probability of exceeding certain critical thresholds, while spatial uncertainties considered the probability that regions with high probability values jointly exceed the adopted limits. Using the daily uncertainties, the local-temporal and spatial-temporal uncertainties (Ftemp) were obtained. The daily and mean emissions showed a variability structure that was described by spherical and Gaussian models. The differences between the daily maps were related to variations in the magnitude of FCO2, with mean values ranging from 1.28 ± 0.11 μmol m−2 s−1 (F197) to 1.82 ± 0.07 μmol m−2 s−1 (F195). The Ftemp showed low spatial uncertainty coupled with high local uncertainty estimates. The average emission showed great spatial uncertainty of the simulated values. The evaluation of uncertainties associated with the knowledge of temporal and spatial variability is an important tool for understanding many phenomena over time, such as the quantification of greenhouse gases or the identification of areas with high crop productivity.
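
A sketch of how local and joint (spatial) exceedance probabilities can be read off a stack of simulation realizations, as described above; the realizations here are random placeholders rather than sequential Gaussian simulations conditioned on field data.

```python
# Sketch of local vs joint (spatial) exceedance probabilities from a stack of
# simulated realizations. The 300 "realizations" below are random placeholders,
# not sequential Gaussian simulations of measured FCO2.
import numpy as np

rng = np.random.default_rng(3)
n_real, ny, nx = 300, 20, 20
realizations = rng.normal(1.5, 0.25, (n_real, ny, nx))   # fake FCO2 fields (umol m-2 s-1)

threshold = 1.7   # critical emission value

# local uncertainty: per-node probability of exceeding the threshold
p_local = (realizations > threshold).mean(axis=0)

# spatial uncertainty: probability that an entire 5x5 block jointly exceeds it
block = realizations[:, :5, :5]
p_joint = np.all(block.reshape(n_real, -1) > threshold, axis=1).mean()

print(f"local exceedance prob., field mean: {p_local.mean():.2f}")
print(f"joint exceedance prob., 5x5 block : {p_joint:.3f}")
```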

Relevance: 30.00%

Abstract:

In many designs for radioactive waste repositories, cement and clay will come into direct contact. The geochemical contrast between cement and clay will lead to mass fluxes across the interface, which consequently result in alteration of the structural and transport properties of both materials and may affect the performance of the multi-barrier system. We present an experimental approach to study cement-clay interactions with a cell that accommodates small samples of cement and clay. The cell design allows both in situ measurement of water content across the sample using neutron radiography and measurement of transport parameters using through-diffusion tracer experiments. The aim of the high-resolution neutron radiography experiments was to monitor changes in water content (porosity) and their spatial extent. Neutron radiographs of several evolving cement-clay interfaces delivered quantitative data which allow resolving local water contents within the sample domain. In the present work we explored the uncertainties of the derived water contents with regard to various input parameters and to the applied image correction procedures. Temporal variation of measurement conditions created an absolute uncertainty of the water content on the order of ±0.1 m3/m3, which could not be fully accounted for by correction procedures. Smaller relative changes in water content between two images can be derived by specific calibrations to two sample regions with different, invariant water contents.
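
A rough sketch, under assumed attenuation values, of how a relative water-content change between two radiographs can be derived from Beer-Lambert attenuation after normalising to a reference region of invariant water content, as described above.

```python
# Rough sketch (assumed values): deriving a water-content change between two
# neutron radiographs via Beer-Lambert attenuation, after normalising each
# image to a reference region whose water content is invariant.
import numpy as np

sigma_w = 3.5     # effective attenuation coefficient of water (1/cm), assumed
thickness = 1.0   # sample thickness along the beam (cm), assumed

# transmitted intensities (arbitrary units) for a pixel and a reference region
I_t0, I_t1 = 420.0, 395.0         # pixel of interest at times t0, t1
ref_t0, ref_t1 = 500.0, 480.0     # reference region (water content invariant)

# normalise out beam/detector drift using the reference region, then apply
# Beer-Lambert: delta_theta = -ln(I1/I0) / (sigma_w * d)
ratio = (I_t1 / ref_t1) / (I_t0 / ref_t0)
d_theta = -np.log(ratio) / (sigma_w * thickness)

print(f"water content change ~ {d_theta:+.3f} m3/m3")
```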

Relevance: 30.00%

Abstract:

Cryoablation for small renal tumors has demonstrated sufficient clinical efficacy over the past decade as a non-surgical, nephron-sparing approach for treating renal masses in patients who are not surgical candidates. Minimally invasive percutaneous cryoablations have been performed with image guidance from CT, ultrasound, and MRI. During an MRI-guided cryoablation procedure, the interventional radiologist visually compares the iceball size on monitoring images with the original tumor on separate planning images. The comparisons made during the monitoring step are time consuming, inefficient and sometimes lack the precision needed for decision making, requiring the radiologist to make further changes later in the procedure. This study sought to mitigate uncertainty in these visual comparisons by quantifying tissue response to cryoablation and providing visualization of the response during the procedure. Based on retrospective analysis of MR-guided cryoablation patient data, registration and segmentation algorithms were investigated and implemented for periprocedural visualization, delivering iceball position and size with respect to planning images registered to within 3.3 mm with at least 70% overlap, and a quantitative logit model was developed to relate perfusion deficit in renal parenchyma, visualized in verification images, to the iceball extent visualized in monitoring images. Through retrospective study of 20 patient cases, the relationship between the likelihood of perfusion loss in renal parenchyma and the distance within the iceball was quantified and iteratively fit to a logit curve. Using the parameters from the logit fit, the margin for 95% perfusion loss likelihood was found to be 4.28 mm within the iceball. The observed margin corresponds well with the clinically accepted margin of 3-5 mm within the iceball. In order to display the iceball position and perfusion loss likelihood to the radiologist, algorithms were implemented to create a fast segmentation and registration module which executed in under 2 minutes, within the clinically relevant 3-minute monitoring period. Using 16 patient cases, the average Hausdorff distance was reduced from 10.1 mm to 3.21 mm and the average DSC increased from 46.6% to 82.6% after registration.
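
A sketch of the logit relationship described above: the probability of perfusion loss as a function of distance within the iceball, and the distance at which that probability reaches 95%. The data below are simulated, not the 20 patient cases, so the fitted margin is illustrative only.

```python
# Sketch of a logit model relating perfusion loss to distance within the
# iceball, and the distance at which the loss probability reaches 95%.
# Data are simulated, not patient data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
depth = rng.uniform(0, 12, 500).reshape(-1, 1)                # mm inside the iceball edge
p_true = 1.0 / (1.0 + np.exp(-(1.7 * depth.ravel() - 4.4)))   # hidden "true" curve
lost = rng.binomial(1, p_true)                                # 1 = perfusion deficit observed

clf = LogisticRegression(C=1e6).fit(depth, lost)              # near-unpenalised logit fit
b0, b1 = clf.intercept_[0], clf.coef_[0, 0]

# solve logit(p) = b0 + b1*d for p = 0.95
d95 = (np.log(0.95 / 0.05) - b0) / b1
print(f"estimated 95% perfusion-loss margin ~ {d95:.1f} mm inside the iceball")
```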

Relevance: 30.00%

Abstract:

Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the “with measures” scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. Implications: A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provides useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they can be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information for deriving nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
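
A toy version of the sensitivity-based approach described above: a sector's emissions are projected to 2020 under a central driving-factor assumption and the factor is perturbed to obtain a nonstatistical uncertainty band. All values are illustrative.

```python
# Toy sensitivity-based uncertainty band: project one sector's emissions to
# 2020 under a central activity-growth assumption and perturb the driving
# factor to span a band. Values are illustrative.
import numpy as np

base_year, horizon = 2010, 2020
e_base = 100.0                        # sector emissions in the base year (kt)
growth_central = 0.02                 # central activity growth (per year)
growth_band = (-0.01, 0.05)           # low / high growth from the sensitivity analysis

years = np.arange(base_year, horizon + 1)

def project(rate):
    return e_base * (1.0 + rate) ** (years - base_year)

central = project(growth_central)
lower = np.minimum(project(growth_band[0]), project(growth_band[1]))
upper = np.maximum(project(growth_band[0]), project(growth_band[1]))

print(f"2020 central: {central[-1]:.1f} kt, band: [{lower[-1]:.1f}, {upper[-1]:.1f}] kt")
```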

Relevance: 30.00%

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. The lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and the failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
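
A simplified, undiscounted illustration of the DALY metric referred to above, combining years of life lost (YLL) and years lived with disability (YLD) in a single unit; the figures are hypothetical, not GBD estimates.

```python
# Simplified, undiscounted illustration of the DALY metric: DALYs = YLL + YLD,
# combining premature mortality and non-fatal outcomes in one unit.
# Figures are hypothetical, not GBD estimates.
deaths = 1_200                 # deaths from the condition in the population
life_expectancy_lost = 30.0    # average remaining life expectancy at age of death (years)

incident_cases = 50_000        # new non-fatal cases per year
disability_weight = 0.35       # severity on a 0 (perfect health) to 1 (death) scale
avg_duration = 2.5             # average duration of disability (years)

yll = deaths * life_expectancy_lost
yld = incident_cases * disability_weight * avg_duration
dalys = yll + yld

print(f"YLL = {yll:,.0f}, YLD = {yld:,.0f}, DALYs = {dalys:,.0f}")
```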