106 results for ESTIMATING LIMIT LOADS

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

40.00%

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe and elsewhere. They are assessed by several elaborate and ingenious models, each of which requires many parameters and has to be applied on a spatially distributed basis. Often the values of the input parameters are poorly known, calling into question the validity of the calculated critical loads. This paper attempts to quantify the uncertainty in the critical loads due to this "parameter uncertainty", using examples from the UK. Models used for calculating critical loads for deposition of acidity and nitrogen in forest and heathland ecosystems were tested at four contrasting sites. Uncertainty was assessed by Monte Carlo methods. Each input parameter or variable was assigned a value, range and distribution in as objective a fashion as possible. Each model was run 5000 times at each site using parameters sampled from these input distributions, and output distributions of various critical load parameters were calculated. The results were surprising: confidence limits of the calculated critical loads were typically considerably narrower than those of most of the input parameters, which may be due to a "compensation of errors" mechanism. The range of possible critical load values at a given site is, however, rather wide, and the tails of the distributions are typically long. The deposition reductions required for a high level of confidence that the critical load is not exceeded are thus likely to be large. The implication for pollutant regulation is that requiring a high probability of non-exceedance is likely to carry high costs. The relative contribution of the input variables to critical load uncertainty varied from site to site: any input variable could be important, so it was not possible to identify variables as likely targets for research into narrowing uncertainties. Sites where a number of good measurements of input parameters were available had lower uncertainties, so in situ measurement could be a valuable way of reducing critical load uncertainty at particularly valuable or disputed sites. From a restricted number of samples, uncertainties in heathland critical loads appear comparable to those of coniferous forest, and nutrient nitrogen critical loads to those of acidity. It was important to include correlations between input variables in the Monte Carlo analysis, but the choice of statistical distribution type was of lesser importance. Overall, the analysis provided objective support for the continued use of critical loads in policy development. (c) 2007 Elsevier B.V. All rights reserved.
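The propagation scheme is straightforward to sketch in code. Below is a minimal illustration, assuming a toy one-line mass balance and invented parameter distributions; the paper's actual models and site data are not reproduced here, and units are simplified.

```python
# Minimal sketch of the Monte Carlo propagation described above. The simple
# mass-balance formula and all parameter distributions are illustrative
# placeholders, not the paper's models or site data.
import numpy as np

rng = np.random.default_rng(42)
N = 5000  # model runs per site, as in the paper

# Hypothetical input distributions (names, values and units are assumptions):
weathering = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=N)    # base-cation weathering
bc_uptake  = rng.normal(loc=0.3, scale=0.1, size=N).clip(min=0.0)  # base-cation uptake
leaching   = rng.uniform(low=0.0, high=0.05, size=N)               # critical ANC leaching

def critical_load_acidity(w, u, l):
    """Toy steady-state mass balance: weathering minus uptake minus ANC leaching."""
    return w - u - l

cl = critical_load_acidity(weathering, bc_uptake, leaching)

# Output distribution of the critical load: median and 90% confidence limits.
lo, med, hi = np.percentile(cl, [5, 50, 95])
print(f"CL(A): median={med:.2f}, 90% CI=({lo:.2f}, {hi:.2f})")
```

Comparing the spread of `cl` with the spreads of the inputs is what reveals the "compensation of errors" effect reported above.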

Relevance:

20.00%

Abstract:

We have developed an ensemble Kalman filter (EnKF) to estimate 8-day regional surface fluxes of CO2 from space-borne CO2 dry-air mole fraction observations (XCO2), and evaluate the approach using a series of synthetic experiments in preparation for data from the NASA Orbiting Carbon Observatory (OCO). The 32-day duty cycle of OCO alternates every 16 days between nadir and glint measurements of backscattered solar radiation at short-wave infrared wavelengths. The EnKF uses an ensemble of states to represent the error covariances needed to estimate 8-day CO2 surface fluxes over 144 geographical regions. We use a 12×8-day lag window, recognising that XCO2 measurements include surface flux information from prior time windows. The observation operator that relates surface CO2 fluxes to atmospheric distributions of XCO2 includes: a) the GEOS-Chem transport model, which relates surface fluxes to global 3-D distributions of CO2 concentrations sampled at the time and location of OCO measurements that are cloud-free and have aerosol optical depths <0.3; and b) scene-dependent averaging kernels that relate the CO2 profiles to XCO2, accounting for differences between nadir and glint measurements and for the associated scene-dependent observation errors. We show that OCO XCO2 measurements significantly reduce the uncertainties of surface CO2 flux estimates. Glint measurements are generally better at constraining ocean CO2 flux estimates. Nadir XCO2 measurements over the terrestrial tropics are sparse throughout the year because of either clouds or smoke; glint measurements provide the most effective constraint for estimating tropical terrestrial CO2 fluxes by accurately sampling fresh continental outflow over neighbouring oceans. We also present results from sensitivity experiments that investigate how the flux estimates change with 1) biased and unbiased errors, 2) alternative duty cycles, 3) measurement density and correlations, 4) the spatial resolution of the estimated fluxes, and 5) reducing the length of the lag window and the size of the ensemble. While this manuscript was under revision, the OCO instrument failed to reach orbit after its launch on 24 February 2009; the EnKF formulation presented here is, however, also applicable to GOSAT measurements of CO2 and CH4.
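As a rough illustration of the EnKF machinery, the sketch below performs a single perturbed-observation analysis step. The random matrix standing in for GEOS-Chem transport plus the averaging kernels, and all dimensions and error values, are assumptions made for illustration only.

```python
# One stochastic-EnKF analysis step updating an ensemble of regional flux
# estimates with XCO2 observations. H is a random stand-in for the transport
# model plus averaging kernels; sizes and errors are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_obs, n_ens = 144, 50, 100              # regions, XCO2 soundings, members

X = rng.normal(0.0, 1.0, size=(n_state, n_ens))   # prior flux ensemble (anomalies)
H = rng.normal(0.0, 0.1, size=(n_obs, n_state))   # stand-in observation operator
R = np.eye(n_obs) * 0.5**2                        # obs error covariance (ppm^2)
y = rng.normal(0.0, 1.0, size=n_obs)              # XCO2 observations

# Represent the error covariances by the ensemble itself.
A  = X - X.mean(axis=1, keepdims=True)            # state perturbations
HX = H @ X
HA = HX - HX.mean(axis=1, keepdims=True)          # perturbations in obs space

PHt = A @ HA.T / (n_ens - 1)                      # P H^T
S   = HA @ HA.T / (n_ens - 1) + R                 # H P H^T + R
K   = PHt @ np.linalg.inv(S)                      # Kalman gain

# Perturbed observations give each member its own update.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_a = X + K @ (Y - HX)

print("posterior/prior ensemble spread:",
      X_a.std(axis=1).mean() / X.std(axis=1).mean())
```

In the paper's setting the state would additionally carry the 12×8-day lag window of fluxes, so that each batch of XCO2 soundings can also update fluxes from earlier windows.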

Relevance:

20.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling design with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
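A minimal sketch of the accumulation idea, assuming a balanced three-stage design with invented lag distances and simulated data (the balanced ANOVA case; for unbalanced designs REML would replace these estimators):

```python
# Rough variogram from a balanced nested design: estimate a component of
# variance per stage, then accumulate from the shortest lag upwards.
# Layout, lags and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
a, b, c = 9, 3, 3                       # classes, subclasses, samples per subclass
lags = [600.0, 60.0, 6.0]               # metres; geometric progression by stage

# Simulate data with a variance component at each stage.
y = (rng.normal(0, 1.0, (a, 1, 1))      # stage 1
     + rng.normal(0, 0.7, (a, b, 1))    # stage 2
     + rng.normal(0, 0.5, (a, b, c)))   # stage 3

m2 = y.mean(axis=2)                     # subclass means
m1 = y.mean(axis=(1, 2))                # class means

# Mean squares for each stage of the hierarchy.
ms3 = ((y - m2[..., None])**2).sum() / (a * b * (c - 1))
ms2 = c * ((m2 - m1[:, None])**2).sum() / (a * (b - 1))
ms1 = b * c * ((m1 - y.mean())**2).sum() / (a - 1)

# Components of variance from the expected mean squares (balanced case).
s3 = ms3
s2 = (ms2 - ms3) / c
s1 = (ms1 - ms2) / (b * c)

# Accumulating components from the shortest lag gives the rough variogram.
for lag, gamma in zip(lags[::-1], np.cumsum([s3, s2, s1])):
    print(f"lag {lag:6.1f} m : semivariance ~ {gamma:.3f}")
```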

Relevance:

20.00%

Abstract:

Temporal and spatial patterns of soil water content affect many soil processes, including evaporation, infiltration, groundwater recharge, erosion and vegetation distribution. This paper describes the analysis of a soil moisture dataset comprising a combination of continuous time series of measurements at a few depths and locations, and occasional roving measurements at a large number of depths and locations. The objectives of the paper are: (i) to develop a technique for combining continuous measurements of soil water content at a limited number of depths within a soil profile with occasional measurements at a large number of depths, to enable accurate estimation of the vertical soil moisture pattern and the integrated profile water content; and (ii) to estimate time series of soil moisture content at locations where only occasional soil water measurements are available, together with continuous records from nearby locations. The vertical interpolation technique presented here can strongly reduce errors in the estimation of profile soil water and its changes with time. The temporal interpolation technique, in turn, is tested for different sampling strategies in space and time, and the errors generated in each case are compared.
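Objective (i) can be pictured with a short sketch: an occasional full-depth profile provides the vertical shape, and the continuous sensors rescale it through time. The depths, readings and linear scaling scheme below are invented for illustration and are not necessarily the paper's method.

```python
# Estimate a full moisture profile from two continuously logged depths plus
# an occasional roving profile, then integrate to profile water content.
# All depths, values and the scaling scheme are illustrative assumptions.
import numpy as np

depths = np.array([5.0, 15.0, 30.0, 50.0, 80.0, 120.0])     # cm, roving survey
theta_ref = np.array([0.28, 0.26, 0.24, 0.22, 0.21, 0.20])  # vol. water content

depths_cont = np.array([15.0, 50.0])     # cm, continuously logged sensors
theta_cont = np.array([0.22, 0.20])      # sensor readings at a later time t

# Ratio of current to reference content at the sensor depths, interpolated
# linearly over depth (np.interp holds the ratio constant beyond end sensors).
ratio = theta_cont / np.interp(depths_cont, depths, theta_ref)
theta_t = theta_ref * np.interp(depths, depths_cont, ratio)  # profile at t

# Profile water content (mm) by trapezoidal integration (cm of water x 10).
mid = 0.5 * (theta_t[1:] + theta_t[:-1])
print(f"profile water content at t: {np.sum(mid * np.diff(depths)) * 10:.1f} mm")
```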

Relevance:

20.00%

Abstract:

Accurate estimation of the soil water balance (SWB) is important for a number of applications (e.g. environmental, meteorological, agronomic and hydrological). The objective of this study was to develop and test techniques for the estimation of soil water fluxes and SWB components (particularly infiltration, evaporation and drainage below the root zone) from soil water records. The work presented here is based on profile soil moisture data measured using dielectric methods, at 30-min resolution, at an experimental site with different vegetation covers (barley, sunflower and bare soil). Estimates of infiltration were derived by assuming that observed gains in the soil profile water content during rainfall were due to infiltration. Inaccuracies related to diurnal fluctuations present in the dielectric-based soil water records were resolved by filtering the data with suitable threshold values, and inconsistencies caused by the redistribution of water after rain events were corrected by allowing for a redistribution period before computing water gains. Estimates of evaporation and drainage were derived from water losses above and below the deepest zero flux plane (ZFP), respectively. The evaporation estimates for the sunflower field were compared to evaporation data obtained with an eddy covariance (EC) system located elsewhere in the field. The EC estimate of total evaporation for the growing season was about 25% larger than that derived from the soil water records. This was consistent with differences in crop growth (based on direct measurements of biomass, and field mapping of vegetation using laser altimetry) between the EC footprint and the area of the field used for soil moisture monitoring. Copyright (c) 2007 John Wiley & Sons, Ltd.
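A minimal sketch of the bookkeeping, assuming a fixed ZFP, a single noise threshold and a synthetic three-day record; in the paper the ZFP is located from the measured profiles and the thresholds are chosen to suit the sensor noise.

```python
# Derive SWB components from a 30-min storage record: gains during rain count
# as infiltration, filtered losses above the ZFP as evaporation, and losses
# below it as drainage. Data, threshold and fixed ZFP are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 48 * 3                                   # three days of 30-min records
raining = np.zeros(n, dtype=bool)
raining[30:40] = True                        # one synthetic rain event

# Synthetic storage (mm) above and below the ZFP: drying plus a rain pulse.
above = 120.0 - 0.15 * np.arange(n) + rng.normal(0, 0.01, n)
above[30:40] += np.linspace(0.0, 8.0, 10)    # infiltration gain during the event
below = 200.0 - 0.05 * np.arange(n) + rng.normal(0, 0.01, n)

thresh = 0.03                                # mm per step; filters sensor noise
d_above, d_below = np.diff(above), np.diff(below)

infiltration = d_above[(d_above > thresh) & raining[1:]].sum()
evaporation = -d_above[(d_above < -thresh) & ~raining[1:]].sum()
drainage = -d_below[d_below < -thresh].sum()

print(f"infiltration ~ {infiltration:.1f} mm, evaporation ~ {evaporation:.1f} mm, "
      f"drainage ~ {drainage:.1f} mm")
```

Allowing a redistribution period, as described above, could be represented here by extending the `raining` mask for some hours after each event so that redistribution losses are not booked as evaporation.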

Relevance:

20.00%

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; the sampling design, however, did not resolve the variation present at larger scales. A sampling interval of 20-25 m should ensure that the main spatial structures in isoproturon degradation rate and sorption are identified without too great a loss of information in this field.
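To see how such a design economizes on samples, the sketch below generates an unbalanced nested layout in which every station is paired at the first stage but only half of the branches are subdivided at the finer stages; the distances and replication fractions are invented, not those of the 9 ha survey.

```python
# Generate an unbalanced nested sampling layout: at each stage, a (possibly
# partial) set of existing points gains a partner a fixed distance away in a
# random direction. Stage distances and fractions are illustrative.
import numpy as np

rng = np.random.default_rng(3)
stage_dist = [60.0, 20.0, 6.0, 2.0]          # separating distance (m) per stage

# Nine first-stage stations on a coarse grid across the field.
pts = [np.array([x, y]) for x in (0.0, 150.0, 300.0) for y in (0.0, 150.0, 300.0)]

for i, d in enumerate(stage_dist):
    frac = 1.0 if i == 0 else 0.5            # subdivide only half at finer stages
    partners = []
    for p in pts:
        if rng.random() < frac:
            ang = rng.uniform(0.0, 2.0 * np.pi)
            partners.append(p + d * np.array([np.cos(ang), np.sin(ang)]))
    pts += partners                          # partners join the design

print(f"{len(pts)} sampling locations")
```

Each stage contributes point pairs at its own separating distance, which is what allows the REML analysis to resolve a variance component, and hence a variogram estimate, per lag.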

Relevance:

20.00%

Abstract:

The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data for constraining the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced, suggesting that some retention processes or losses, either in peatland/wetland areas or in the river, were not included in the model. The results suggested that soft data provide a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based on both runoff and/or nutrient concentration data and the qualitative knowledge of experimentalists. (c) 2006 Elsevier B.V. All rights reserved.
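A minimal sketch of GLUE with a soft-data constraint, assuming a toy one-parameter stand-in for INCA-N and invented bounds on annual plant N uptake:

```python
# GLUE with soft data: keep parameter sets that both fit the observed nitrate
# series acceptably and keep annual N uptake inside an experimentally
# plausible range. Model, data and bounds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
obs = np.array([2.1, 2.5, 3.0, 2.7, 2.2, 1.8])    # hypothetical nitrate, mg/l

def toy_model(k):
    """Stand-in for INCA-N: a seasonal pattern scaled by rate parameter k."""
    base = np.array([1.0, 1.2, 1.4, 1.3, 1.05, 0.9])
    return base * k, 40.0 * k                      # nitrate series, uptake kg/ha/yr

behavioural = []
for _ in range(10000):
    k = rng.uniform(0.5, 4.0)                      # sample from the prior range
    sim, uptake = toy_model(k)
    ns = 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)
    if ns > 0.5 and 60.0 <= uptake <= 85.0:        # likelihood + soft-data filter
        behavioural.append(k)

print(f"{len(behavioural)} behavioural sets, "
      f"k in [{min(behavioural):.2f}, {max(behavioural):.2f}]")
```

Dropping the uptake bound widens the retained range of `k`; that widening is the equifinality the soft data are used to reduce.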

Relevance:

20.00%

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe. The implementation of these policies involves large expenditures, and it is reasonable for policymakers to ask what degree of certainty can be attached to the underlying critical load and exceedance estimates. This paper is a literature review of studies which attempt to estimate the uncertainty attached to critical loads. Critical load models and uncertainty analysis are briefly outlined. Most studies have used Monte Carlo analysis of some form to investigate the propagation of uncertainties in the definition of the input parameters through to uncertainties in critical loads. Though the input parameters are often poorly known, the critical load uncertainties are typically surprisingly small because of a "compensation of errors" mechanism. These results depend on the quality of the uncertainty estimates of the input parameters, and a "pedigree" classification for these is proposed. Sensitivity analysis shows that some input parameters are more important in influencing critical load uncertainty than others, but there have not been enough studies to form a general picture. Methods used for dealing with spatial variation are briefly discussed. Application of alternative models to the same site or modifications of existing models can lead to widely differing critical loads, indicating that research into the underlying science needs to continue.

Relevance:

20.00%

Abstract:

This paper reports an uncertainty analysis of critical loads for acid deposition for a site in southern England, using the Steady State Mass Balance Model. The uncertainty bounds, distribution type and correlation structure for each of the 18 input parameters were considered explicitly, and the overall uncertainty was estimated by Monte Carlo methods. Estimates of deposition uncertainty were made from measured data and an atmospheric dispersion model, so the uncertainty in exceedance could also be calculated. The uncertainties of the calculated critical loads were generally much lower than those of the input parameters, due to a "compensation of errors" mechanism: coefficients of variation ranged from 13% for CLmaxN to 37% for CL(A). With 1990 deposition, the probability that the critical load was exceeded was >0.99; to reduce this probability to 0.50 would require a 63% reduction in deposition, and to 0.05, an 82% reduction. With 1997 deposition, which was lower than that in 1990, exceedance probabilities declined and uncertainties in exceedance narrowed as deposition uncertainty had less effect. The parameters contributing most to the uncertainty in critical loads were weathering rates, base cation uptake rates and the choice of critical chemical value, indicating possible research priorities. However, the different critical load parameters were to some extent sensitive to different input parameters. The application of such probabilistic results to environmental regulation is discussed.
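The exceedance calculation can be sketched as follows, assuming just two correlated inputs, a toy mass balance and invented deposition statistics (the paper treats all 18 parameters and a dispersion-model deposition estimate):

```python
# Monte Carlo exceedance sketch: sample correlated inputs, form a toy
# critical load, and estimate the exceedance probability under an uncertain
# deposition. All values and the 2-parameter setup are illustrative.
import numpy as np

rng = np.random.default_rng(5)
N = 5000

# Correlated weathering and base-cation uptake (rho = 0.6), keq/ha/yr.
mean = [0.8, 0.3]
cov = [[0.10**2, 0.6 * 0.10 * 0.05],
       [0.6 * 0.10 * 0.05, 0.05**2]]
weathering, uptake = rng.multivariate_normal(mean, cov, size=N).T

cl = weathering - uptake                   # toy critical load of acidity
dep = rng.normal(1.2, 0.2, size=N)         # uncertain acid deposition

print(f"P(exceedance) = {np.mean(dep > cl):.2f}")

# Smallest deposition reduction giving P(exceedance) <= 0.05.
for r in np.linspace(0.0, 0.95, 96):
    if np.mean(dep * (1.0 - r) > cl) <= 0.05:
        print(f"required reduction ~ {100 * r:.0f}%")
        break
```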

Relevance:

20.00%

Abstract:

With the rapid development of technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It would therefore seem important to develop and extend the understanding of complexity so that industry in general, and the construction industry in particular, can work with greater accuracy and efficiency and provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement, in order to assess its influence upon the accuracy of the quantity surveying profession in UK new-build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty of defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.