31 results for Orion DBMS, Database, Uncertainty, Uncertain values, Benchmark

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

40.00%

Publisher:

Abstract:

The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's abilities to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 were tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long time period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and looks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of the hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition have altered the nitrogen balance with a shift from possible reduction in soil fertility but little environmental impact in 1922, to a situation of nitrogen accumulation in the soil, groundwater and instream biota in 2002. In 1922 and 2002 it was estimated that approximately 2 and 18 kg N ha-1 yr-1 respectively were exported from the land to the stream. The utility of the approach and further considerations for the best use of models are discussed. (C) 2008 Elsevier B.V. All rights reserved.
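The per-hectare export figures reported in the abstract can be scaled to a whole-catchment load. In this sketch the catchment area is a hypothetical placeholder, not a figure from the study; only the 2 and 18 kg N ha-1 yr-1 rates come from the text:

```python
# Scale areal N export (kg N ha-1 yr-1, from the abstract) to a catchment load.
CATCHMENT_AREA_HA = 23_400      # hypothetical area; not taken from the study
EXPORT_1922 = 2.0               # kg N ha-1 yr-1 in 1922
EXPORT_2002 = 18.0              # kg N ha-1 yr-1 in 2002

# kg ha-1 yr-1 * ha / 1000 -> tonnes N per year
load_1922_tonnes = EXPORT_1922 * CATCHMENT_AREA_HA / 1000.0
load_2002_tonnes = EXPORT_2002 * CATCHMENT_AREA_HA / 1000.0
```

Whatever area is assumed, the ratio of the two loads (a nine-fold increase) is fixed by the abstract's per-hectare rates.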

Relevance:

40.00%

Publisher:

Abstract:

Future stratospheric ozone concentrations will be determined both by changes in the concentration of ozone depleting substances (ODSs) and by changes in stratospheric and tropospheric climate, including those caused by changes in anthropogenic greenhouse gases (GHGs). Since future economic development pathways and resultant emissions of GHGs are uncertain, anthropogenic climate change could be a significant source of uncertainty for future projections of stratospheric ozone. In this pilot study, using an "ensemble of opportunity" of chemistry-climate model (CCM) simulations, the contribution of scenario uncertainty from different plausible emissions pathways for ODSs and GHGs to future ozone projections is quantified relative to the contribution from model uncertainty and internal variability of the chemistry-climate system. For both the global, annual mean ozone concentration and for ozone in specific geographical regions, differences between CCMs are the dominant source of uncertainty for the first two-thirds of the 21st century, up to and after the time when ozone concentrations return to 1980 values. In the last third of the 21st century, dependent upon the set of greenhouse gas scenarios used, scenario uncertainty can be the dominant contributor. This result suggests that investment in chemistry-climate modelling is likely to continue to refine projections of stratospheric ozone and estimates of the return of stratospheric ozone concentrations to pre-1980 levels.
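The partitioning described above can be sketched numerically. The ensemble below is entirely invented (model names, scenario names and ozone anomalies are placeholders), but it shows how spread across models and spread across scenario-mean projections can be compared at two lead times:

```python
import statistics

# Hypothetical ensemble: ozone anomalies (DU) from three invented CCMs under
# two invented GHG scenarios, for a mid-century and a late-century period.
projections = {
    # (model, scenario): [mid-century value, late-century value]
    ("CCM-A", "low_GHG"):  [5.0, 12.0],
    ("CCM-B", "low_GHG"):  [8.0, 14.0],
    ("CCM-C", "low_GHG"):  [2.0, 10.0],
    ("CCM-A", "high_GHG"): [5.5, 20.0],
    ("CCM-B", "high_GHG"): [8.5, 22.0],
    ("CCM-C", "high_GHG"): [2.5, 18.0],
}
models = sorted({m for m, _ in projections})
scenarios = sorted({s for _, s in projections})

def uncertainty_partition(period):
    # Model uncertainty: variance across models of their scenario-mean value;
    # scenario uncertainty: variance across scenarios of their model-mean value.
    model_means = [statistics.mean(projections[(m, s)][period] for s in scenarios)
                   for m in models]
    scen_means = [statistics.mean(projections[(m, s)][period] for m in models)
                  for s in scenarios]
    return statistics.pvariance(model_means), statistics.pvariance(scen_means)

mid_model_var, mid_scen_var = uncertainty_partition(0)
late_model_var, late_scen_var = uncertainty_partition(1)
```

With these invented numbers, inter-model spread dominates at mid-century while scenario spread dominates late in the century: the same qualitative pattern the abstract reports.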

Relevance:

40.00%

Publisher:

Abstract:

Catastrophe risk models used by the insurance industry are likely subject to significant uncertainty, but due to their proprietary nature and strict licensing conditions they are not available for experimentation. In addition, even if such experiments were conducted, these would not be repeatable by other researchers because commercial confidentiality issues prevent the details of proprietary catastrophe model structures from being described in public domain documents. However, such experimentation is urgently required to improve decision making in both insurance and reinsurance markets. In this paper we therefore construct our own catastrophe risk model for flooding in Dublin, Ireland, in order to assess the impact of typical precipitation data uncertainty on loss predictions. As we consider only a city region rather than a whole territory and have access to detailed data and computing resources typically unavailable to industry modellers, our model is significantly more detailed than most commercial products. The model consists of four components: a stochastic rainfall module, a hydrological and hydraulic flood hazard module, a vulnerability module, and a financial loss module. Using these we undertake a series of simulations to test the impact of driving the stochastic event generator with four different rainfall data sets: ground gauge data, gauge-corrected rainfall radar, meteorological reanalysis data (European Centre for Medium-Range Weather Forecasts Reanalysis-Interim; ERA-Interim) and a satellite rainfall product (The Climate Prediction Center morphing method; CMORPH). Catastrophe models are unusual because they use the upper three components of the modelling chain to generate a large synthetic database of unobserved and severe loss-driving events for which estimated losses are calculated.
We find the loss estimates to be more sensitive to uncertainties propagated from the driving precipitation data sets than to other uncertainties in the hazard and vulnerability modules, suggesting that the range of uncertainty within catastrophe model structures may be greater than commonly believed.
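The four-module chain can be sketched as follows. Every relation and number here is invented (the threshold-excess depth rule, the linear depth-damage curve, the portfolio value and the two rainfall distributions are placeholders standing in for the paper's far more detailed modules), but the sketch shows how a bias or spread in the driving rainfall data propagates through hazard and vulnerability into the loss estimate:

```python
import random

random.seed(42)

# Hazard module: toy threshold-excess relation from event rainfall to depth.
def flood_depth_m(rain_mm):
    return max(0.0, (rain_mm - 60.0) / 50.0)

# Vulnerability module: toy depth-damage curve, saturating at total loss.
def damage_fraction(depth_m):
    return min(1.0, 0.4 * depth_m)

PORTFOLIO_VALUE = 50e6   # financial module: insured value (EUR), invented

def mean_event_loss(mean_rain, sd_rain, n_events=1000):
    # Stochastic rainfall module: draw synthetic severe events, then push
    # each through hazard -> vulnerability -> financial loss.
    losses = []
    for _ in range(n_events):
        rain = max(0.0, random.gauss(mean_rain, sd_rain))
        losses.append(PORTFOLIO_VALUE * damage_fraction(flood_depth_m(rain)))
    return sum(losses) / n_events

# Two hypothetical driving data sets with different bias and spread,
# e.g. a gauge-like product versus a wetter, noisier satellite product.
gauge_loss = mean_event_loss(mean_rain=70.0, sd_rain=15.0)
satellite_loss = mean_event_loss(mean_rain=80.0, sd_rain=25.0)
```

Even with hazard and vulnerability held fixed, the two rainfall inputs produce clearly different mean losses, which is the sensitivity the study quantifies.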

Relevance:

30.00%

Publisher:

Abstract:

A traditional method of validating the performance of a flood model when remotely sensed data of the flood extent are available is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made more straightforward the synoptic measurement of water surface elevations along flood waterlines, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1 in 5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. A result of this was that there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. 
The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may lead to an increased onus being placed on the model developer in the production of a valid model.
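The two performance measures can be sketched with invented data: the height-based measure is an r.m.s. difference between observed and modelled elevations at matched waterline points, while the areal measure scores the overlap of wet pixels (intersection over union, one common form of the areal pattern-match). All values below are illustrative, not from the study:

```python
import math

# Invented waterline elevations (m) at matched observed/modelled points.
observed_z = [10.2, 10.4, 10.5, 10.7, 10.9]
modelled_z = [10.3, 10.3, 10.6, 10.9, 11.0]

def rms_height_difference(obs, mod):
    # Height-based measure: r.m.s. elevation difference along the waterline.
    return math.sqrt(sum((o - m) ** 2 for o, m in zip(obs, mod)) / len(obs))

def areal_fit(obs_wet, mod_wet):
    # Areal measure: overlap of wet pixels / union of wet pixels (1=wet, 0=dry).
    inter = sum(1 for o, m in zip(obs_wet, mod_wet) if o and m)
    union = sum(1 for o, m in zip(obs_wet, mod_wet) if o or m)
    return inter / union

obs_grid = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0]   # invented wet/dry maps
mod_grid = [1, 1, 0, 0, 0, 1, 1, 1, 0, 0]

rmse_m = rms_height_difference(observed_z, modelled_z)
fit = areal_fit(obs_grid, mod_grid)
```

In a GLUE-style analysis either quantity can be used to weight or reject parameter sets; the paper's point is that the height-based one discriminates between friction values more sharply.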

Relevance:

30.00%

Publisher:

Abstract:

A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has a broad utility for evaluating other systems in Europe and around the world.
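A generalized sensitivity analysis of this kind can be sketched with a toy DO model (the model form, parameter ranges, behavioural threshold and observed value below are all invented, standing in for Q2 and real data): sample parameters, split runs into behavioural and non-behavioural by their DO error, then compare the two parameter distributions with a Kolmogorov-Smirnov-style statistic; a large separation flags a controlling parameter.

```python
import random

random.seed(1)

# Toy DO model: saturation DO minus a deficit set by the balance of BOD
# decay against reaeration. Entirely invented, standing in for Q2.
def simulate_do(k_reaeration, k_bod):
    return 10.0 - 4.0 * k_bod / (k_bod + k_reaeration)

OBSERVED_DO = 8.0   # mg/l, invented

behavioural, non_behavioural = [], []
for _ in range(600):
    ka = random.uniform(0.1, 2.0)    # reaeration rate (1/day), invented range
    kb = random.uniform(0.05, 1.0)   # BOD decay rate (1/day), invented range
    err = abs(simulate_do(ka, kb) - OBSERVED_DO)
    (behavioural if err < 0.3 else non_behavioural).append((ka, kb))

def ks_statistic(xs_a, xs_b):
    # Maximum separation of the two empirical CDFs; near 0 means the
    # behavioural split barely constrains this parameter.
    grid = sorted(xs_a + xs_b)
    def cdf(xs, x):
        return sum(1 for v in xs if v <= x) / len(xs)
    return max(abs(cdf(xs_a, g) - cdf(xs_b, g)) for g in grid)

ks_reaeration = ks_statistic([a for a, _ in behavioural],
                             [a for a, _ in non_behavioural])
```

The behavioural set also supports the probabilistic calibration the abstract mentions: parameter values that appear in it can be treated as an acceptable posterior sample.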

Relevance:

30.00%

Publisher:

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe and elsewhere. They are assessed by several elaborate and ingenious models, each of which requires many parameters, and have to be applied on a spatially-distributed basis. Often the values of the input parameters are poorly known, calling into question the validity of the calculated critical loads. This paper attempts to quantify the uncertainty in the critical loads due to this "parameter uncertainty", using examples from the UK. Models used for calculating critical loads for deposition of acidity and nitrogen in forest and heathland ecosystems were tested at four contrasting sites. Uncertainty was assessed by Monte Carlo methods. Each input parameter or variable was assigned a value, range and distribution in as objective a fashion as possible. Each model was run 5000 times at each site using parameters sampled from these input distributions. Output distributions of various critical load parameters were calculated. The results were surprising. Confidence limits of the calculated critical loads were typically considerably narrower than those of most of the input parameters. This may be due to a "compensation of errors" mechanism. The range of possible critical load values at a given site is however rather wide, and the tails of the distributions are typically long. The deposition reductions required for a high level of confidence that the critical load is not exceeded are thus likely to be large. The implication for pollutant regulation is that requiring a high probability of non-exceedance is likely to carry high costs. The relative contribution of the input variables to critical load uncertainty varied from site to site: any input variable could be important, and thus it was not possible to identify variables as likely targets for research into narrowing uncertainties.
Sites where a number of good measurements of input parameters were available had lower uncertainties, so use of in situ measurement could be a valuable way of reducing critical load uncertainty at particularly valuable or disputed sites. From a restricted number of samples, uncertainties in heathland critical loads appear comparable to those of coniferous forest, and nutrient nitrogen critical loads to those of acidity. It was important to include correlations between input variables in the Monte Carlo analysis, but choice of statistical distribution type was of lesser importance. Overall, the analysis provided objective support for the continued use of critical loads in policy development. (c) 2007 Elsevier B.V. All rights reserved.
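The Monte Carlo procedure can be sketched with a highly simplified mass-balance critical load. The input distributions, the crude correlation between uptake and weathering, and all numbers below are invented for illustration; the study's models are far more elaborate:

```python
import random
import statistics

random.seed(7)

# Highly simplified critical load of acidity (keq ha-1 yr-1):
# weathering + base-cation deposition - uptake - critical ANC leaching.
# Uptake is crudely correlated with weathering, since ignoring input
# correlations (as the abstract notes) can distort the output distribution.
def sample_critical_load():
    bc_w = random.gauss(1.0, 0.3)       # base-cation weathering
    bc_dep = random.gauss(0.4, 0.1)     # base-cation deposition
    bc_u = random.gauss(0.3, 0.1) + 0.2 * (bc_w - 1.0)   # correlated uptake
    anc_le = random.gauss(0.2, 0.05)    # critical ANC leaching
    return bc_w + bc_dep - bc_u - anc_le

samples = sorted(sample_critical_load() for _ in range(5000))
cl_median = statistics.median(samples)
cl_p5 = samples[int(0.05 * len(samples))]   # protective low end of the range
```

The gap between the median and a low percentile is what drives the abstract's point: insisting on a high probability of non-exceedance pushes the allowable deposition well below the central critical-load estimate.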

Relevance:

30.00%

Publisher:

Abstract:

Despite the success of studies attempting to integrate remotely sensed data and flood modelling, and despite the need to provide near-real-time data routinely on a global scale and to set up online data archives, there is to date a lack of spatially and temporally distributed hydraulic parameters to support ongoing efforts in modelling. Therefore, the objective of this project is to provide a global evaluation and benchmark data set of floodplain water stages with uncertainties and assimilation in a large scale flood model using space-borne radar imagery. An algorithm is developed for automated retrieval of water stages with uncertainties from a sequence of radar imagery and data are assimilated in a flood model using the Tewkesbury 2007 flood event as a feasibility study. The retrieval method that we employ is based on possibility theory, an extension of fuzzy set theory that encompasses probability theory. In our case we first attempt to identify main sources of uncertainty in the retrieval of water stages from radar imagery, for which we define physically meaningful ranges of parameter values. Possibilities of values are then computed for each parameter using a triangular ‘membership’ function. This procedure allows the computation of possible values of water stages at maximum flood extents along a river at many different locations. At a later stage in the project these data are then used in assimilation, calibration or validation of a flood model. The application is subsequently extended to a global scale using wide swath radar imagery and a simple global flood forecasting model, thereby providing improved river discharge estimates to update the latter.
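The triangular 'membership' function mentioned above has a simple closed form: possibility 0 outside a physically meaningful range, rising linearly to 1 at the most possible value. A minimal sketch, with an invented water-stage example:

```python
def triangular_possibility(x, low, mode, high):
    """Possibility of value x under a triangular membership function:
    0 outside [low, high], linear up to 1 at the most possible value (mode)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

# Invented example: SAR-derived water stage at one cross-section judged to
# lie between 8.0 and 9.0 m, with 8.4 m the most possible value.
possibility_at_8_2 = triangular_possibility(8.2, low=8.0, mode=8.4, high=9.0)
```

Evaluating such a function for each uncertain retrieval parameter gives the per-location possibility distributions of water stage that the project then feeds into assimilation, calibration or validation.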

Relevance:

30.00%

Publisher:

Abstract:

Climate science is coming under increasing pressure to deliver projections of future climate change at spatial scales as small as a few kilometres for use in impacts studies. But is our understanding and modelling of the climate system advanced enough to offer such predictions? Here we focus on the Atlantic–European sector, and on the effects of greenhouse gas forcing on the atmospheric and, to a lesser extent, oceanic circulations. We review the dynamical processes which shape European climate and then consider how each of these leads to uncertainty in the future climate. European climate is unique in many regards, and as such it poses a unique challenge for climate prediction. Future European climate must be considered particularly uncertain because (i) the spread between the predictions of current climate models is still considerable and (ii) Europe is particularly strongly affected by several processes which are known to be poorly represented in current models.

Relevance:

30.00%

Publisher:

Abstract:

The impacts of climate change on crop productivity are often assessed using simulations from a numerical climate model as an input to a crop simulation model. The precision of these predictions reflects the uncertainty in both models. We examined how uncertainty in a climate model (HadAM3) and a crop model (the General Large-Area Model for annual crops, GLAM) affects the mean and standard deviation of crop yield simulations in present and doubled carbon dioxide (CO2) climates by perturbation of parameters in each model. The climate sensitivity parameter (λ, the equilibrium response of global mean surface temperature to doubled CO2) was used to define the control climate. Observed 1966–1989 mean yields of groundnut (Arachis hypogaea L.) in India were simulated well by the crop model using the control climate and climates with values of λ near the control value. The simulations were used to measure the contribution to uncertainty of key crop and climate model parameters. The standard deviation of yield was more affected by perturbation of climate parameters than crop model parameters in both the present-day and doubled CO2 climates. Climate uncertainty was higher in the doubled CO2 climate than in the present-day climate. Crop transpiration efficiency was key to crop model uncertainty in both present-day and doubled CO2 climates. The response of crop development to mean temperature contributed little uncertainty in the present-day simulations but was among the largest contributors under doubled CO2. The ensemble methods used here to quantify physical and biological uncertainty offer a method to improve model estimates of the impacts of climate change.
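The perturbed-parameter comparison can be sketched with a toy yield model. The functional form and every number below are invented; the sketch only shows the mechanics of perturbing one model's parameters at a time and comparing the resulting yield spreads:

```python
import random
import statistics

random.seed(3)

# Toy yield model: yield falls with warming and scales with transpiration
# efficiency. Entirely invented, standing in for HadAM3-driven GLAM runs.
def simulate_yield(temp_c, transp_eff):
    return max(0.0, transp_eff * (30.0 - temp_c))   # t/ha, toy units

def ensemble_sd(perturb_climate, perturb_crop, n=2000):
    yields = []
    for _ in range(n):
        # Perturb the climate input and/or the crop parameter around controls.
        temp = 27.0 + (random.gauss(0.0, 1.0) if perturb_climate else 0.0)
        te = 0.5 + (random.gauss(0.0, 0.05) if perturb_crop else 0.0)
        yields.append(simulate_yield(temp, te))
    return statistics.pstdev(yields)

sd_from_climate_params = ensemble_sd(perturb_climate=True, perturb_crop=False)
sd_from_crop_params = ensemble_sd(perturb_climate=False, perturb_crop=True)
```

With these invented spreads the climate-side perturbations dominate the yield standard deviation, mirroring the abstract's finding.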

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the price effects of environmental certification on commercial real estate assets. It is argued that there are likely to be three main drivers of price differences between certified and noncertified buildings. These are additional occupier benefits, lower holding costs for investors and a lower risk premium. Drawing upon the CoStar database of U.S. commercial real estate assets, hedonic regression analysis is used to measure the effect of certification on both rent and price. The results suggest that, compared to buildings in the same submarkets, eco-certified buildings have both a rental and sale price premium.
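A hedonic regression of this kind estimates the certification premium as the coefficient on a certification dummy, holding other attributes constant. The mini-sample below is invented (the actual study draws on the CoStar database with many more controls); the OLS solver uses only the normal equations:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

# Invented mini-sample: columns are intercept, certified dummy, size ('000 sq ft).
buildings = [
    [1, 1, 50], [1, 1, 80], [1, 0, 50], [1, 0, 80],
    [1, 1, 60], [1, 0, 60], [1, 1, 70], [1, 0, 70],
]
log_rent = [3.26, 3.32, 3.20, 3.26, 3.28, 3.22, 3.30, 3.24]

b_const, b_cert, b_size = ols(buildings, log_rent)
# b_cert is the log-rent premium attributed to certification, net of size.
```

Because the dependent variable is a log rent, a coefficient of, say, 0.06 corresponds to roughly a 6% rental premium for certified buildings in the same submarket.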

Relevance:

30.00%

Publisher:

Abstract:

A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified reflecting differing performance in different situations. Our aim was to quantify uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After having covaried out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m2 unit area) but also highly uncertain. Peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, Brecon Beacons and the western Lake District.
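One way to expose niche-model parameter uncertainty, the dominant source for the GLMM above, is to redraw the fitted coefficients from their estimated uncertainty and watch the spread in predicted cover. The logistic form, the coefficients and their standard errors below are all invented, and the draws ignore coefficient correlations (a real analysis would sample from the fitted covariance matrix):

```python
import math
import random
import statistics

random.seed(5)

# Toy logistic niche model for Sphagnum cover vs. mean temperature (deg C)
# and N deposition (kg N ha-1 yr-1). Coefficients (mean, s.e.) are invented.
COEFS = {"intercept": (0.5, 0.4), "temp": (-0.3, 0.15), "ndep": (-0.05, 0.03)}

def draw_predicted_cover(temp, ndep):
    # One draw of the coefficients -> one draw of predicted cover fraction.
    b0 = random.gauss(*COEFS["intercept"])
    bt = random.gauss(*COEFS["temp"])
    bn = random.gauss(*COEFS["ndep"])
    eta = b0 + bt * temp + bn * ndep
    return 1.0 / (1.0 + math.exp(-eta))

draws = [draw_predicted_cover(temp=8.0, ndep=15.0) for _ in range(3000)]
cover_mean = statistics.mean(draws)
cover_sd = statistics.pstdev(draws)   # parameter-uncertainty contribution
```

The ratio of this spread to the spread from other sources (site-to-site variation, scenario differences) is the kind of partition the abstract reports for the GLMM and GAMM.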

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty affects all aspects of the property market but one area where the impact of uncertainty is particularly significant is within feasibility analyses. Any development is impacted by differences between market conditions at the conception of the project and the market realities at the time of completion. The feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcome relating to the sale price of the completed development, the construction costs and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time sensitive and analysis needs to be undertaken to show the impact of time on the viability of the project. The future is uncertain and a full feasibility analysis should be able to model the upside and downside risk pertaining to a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value but they tend to be single-point analyses based upon a single set of “likely” inputs. In this paper we look at the practical impact of uncertainty in variables using a simulation model (Crystal Ball ©) with an actual case study of an urban redevelopment plan for an Italian Municipality. This allows the appraiser to address the issues of uncertainty involved and thus provide the decision maker with a better understanding of the risk of development. This technique is then refined using a “two-dimensional technique” to distinguish between “uncertainty” and “variability” and thus create a more robust model.
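The move from a single-point feasibility study to a simulation can be sketched with a residual land value appraisal. Every distribution below is invented for illustration (this is not the paper's case-study data, and Crystal Ball is here replaced by plain Monte Carlo sampling):

```python
import random
import statistics

random.seed(11)

# Residual appraisal: land value = GDV - costs - finance - developer's profit.
def residual_land_value():
    gdv = random.gauss(12.0, 1.5)             # gross development value (EUR m)
    cost = random.gauss(7.0, 0.8)             # construction cost (EUR m)
    months = random.uniform(18.0, 30.0)       # uncertain build period
    finance = cost * 0.06 * (months / 12.0) / 2.0  # interest on half the draw
    profit = 0.15 * gdv                       # required developer's profit
    return gdv - cost - finance - profit

sims = sorted(residual_land_value() for _ in range(5000))
expected_rlv = statistics.mean(sims)
downside_rlv = sims[int(0.05 * len(sims))]    # 5th percentile: downside risk
```

A single-point analysis would report only something like the mean; the simulated distribution also shows the downside tail, which is precisely the upside/downside risk information the abstract argues the decision maker needs.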

Relevance:

30.00%

Publisher:

Abstract:

Valuation is the process of estimating price. The methods used to determine value attempt to model the thought processes of the market and thus estimate price by reference to observed historic data. This can be done using either an explicit model, that models the worth calculation of the most likely bidder, or an implicit model, that uses historic data suitably adjusted as a short cut to determine value by reference to previous similar sales. The former is generally referred to as the Discounted Cash Flow (DCF) model and the latter as the capitalisation (or All Risk Yield) model. However, regardless of the technique used, the valuation will be affected by uncertainties: uncertainty in the comparable data available; uncertainty in the current and future market conditions and uncertainty in the specific inputs for the subject property. These input uncertainties will translate into uncertainty in the output figure, the estimate of price. In a previous paper, we have considered the way in which uncertainty is allowed for in the capitalisation model in the UK. In this paper, we extend the analysis to look at the way in which uncertainty can be incorporated into the explicit DCF model. This is done by recognising that the input variables are uncertain and will have a probability distribution pertaining to each of them. Thus by utilising a probability-based valuation model (using Crystal Ball) it is possible to incorporate uncertainty into the analysis and address the shortcomings of the current model. Although the capitalisation model is discussed, the paper concentrates upon the application of Crystal Ball to the Discounted Cash Flow approach.
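The probability-based DCF can be sketched as follows. The rental growth and exit-yield distributions are invented placeholders for market-derived inputs, and plain Monte Carlo sampling stands in for Crystal Ball:

```python
import random
import statistics

random.seed(2)

# Probabilistic five-year DCF of a rack-rented property: uncertain rental
# growth and exit yield feed through to a distribution of present value.
def dcf_value(rent=100_000.0, discount=0.08, hold=5):
    growth = random.gauss(0.02, 0.01)           # annual rental growth, invented
    exit_yield = random.uniform(0.055, 0.075)   # capitalisation rate at sale
    pv = sum(rent * (1 + growth) ** (t - 1) / (1 + discount) ** t
             for t in range(1, hold + 1))
    exit_price = rent * (1 + growth) ** hold / exit_yield
    return pv + exit_price / (1 + discount) ** hold

values = [dcf_value() for _ in range(5000)]
value_mean = statistics.mean(values)
value_sd = statistics.pstdev(values)   # uncertainty carried into the estimate
```

Rather than the single price estimate of a conventional DCF, the output is a distribution, so the valuer can report the estimate of price together with the uncertainty attaching to it.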