13 results for perceived environmental uncertainty

in CentAUR: Central Archive, University of Reading - UK


Relevance:

100.00%

Abstract:

This research has responded to the need for diagnostic reference tools explicitly linking the influence of environmental uncertainty and performance within the supply chain. Uncertainty is a key factor influencing performance and an important measure of the operating environment. We develop and demonstrate a novel reference methodology based on data envelopment analysis (DEA) for examining the performance of value streams within the supply chain with specific reference to the level of environmental uncertainty they face. In this paper, using real industrial data, 20 product supply value streams within the European automotive industry sector are evaluated. Two are found to be efficient. The peer reference groups for the underperforming value streams are identified and numerical improvement targets are derived. The paper demonstrates how DEA can be used to guide supply chain improvement efforts through role-model identification and target setting, in a way that recognises the multiple dimensions/outcomes of the supply chain process and the influence of its environmental conditions. We have facilitated the contextualisation of environmental uncertainty and its incorporation into a specific diagnostic reference tool.
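The peer-benchmarking idea behind DEA can be sketched in a few lines. In the special case of one input and one output, CCR efficiency reduces to each unit's output/input ratio relative to the best observed ratio; the value-stream figures below are invented for illustration, and the paper's multi-input, multi-output analysis of 20 value streams would require a full linear-programming DEA formulation.

```python
def dea_ccr_single(inputs, outputs):
    # With one input and one output, CCR efficiency reduces to each
    # unit's output/input ratio relative to the best observed ratio.
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical value-stream data: input = total cost, output = delivered volume
cost = [10.0, 8.0, 12.0, 9.0]
volume = [20.0, 20.0, 18.0, 27.0]
scores = dea_ccr_single(cost, volume)
# the unit with the best ratio scores 1.0 (efficient); the scores of the
# others indicate how far they sit from their role-model peers
```

Underperformers' numerical improvement targets then follow directly: a score of 0.5 means the same output should be achievable with half the input, relative to the efficient peer.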

Relevance:

30.00%

Abstract:

Despite the many models developed for phosphorus concentration prediction at differing spatial and temporal scales, there has been little effort to quantify uncertainty in their predictions. Model prediction uncertainty quantification is desirable for informed decision-making in river-systems management. An uncertainty analysis of the process-based model, the Integrated Catchment model of Phosphorus (INCA-P), within the generalised likelihood uncertainty estimation (GLUE) framework is presented. The framework is applied to the Lugg catchment (1,077 km2), a River Wye tributary on the England–Wales border. Daily discharge and monthly phosphorus (total reactive and total) observations, for a limited number of reaches, are used to assess the uncertainty and sensitivity of 44 model parameters identified as being most important for discharge and phosphorus predictions. This study demonstrates that parameter homogeneity assumptions (spatial heterogeneity is treated as land-use-type fractional areas) can achieve higher model fits than a previously expertly calibrated parameter set. The model is capable of reproducing the hydrology, but a threshold Nash-Sutcliffe coefficient of determination (E or R²) of 0.3 is not achieved when simulating observed total phosphorus (TP) data in the upland reaches or total reactive phosphorus (TRP) in any reach. Despite this, the model reproduces the general dynamics of TP and TRP in the point-source-dominated lower reaches. This paper discusses why this application of INCA-P fails to find any parameter sets that simultaneously describe all observed data acceptably. The discussion focuses on the uncertainty of readily available input data, and on whether such process-based models should be used when there are insufficient data to support their many parameters.
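The GLUE procedure the abstract applies can be sketched minimally, using an invented two-parameter linear model in place of INCA-P: sample parameter sets from priors, score each against observations with a Nash-Sutcliffe efficiency, and retain the "behavioural" sets above the threshold (E > 0.3, as in the study).

```python
import random

def nse(obs, sim):
    # Nash-Sutcliffe efficiency, the behavioural criterion used in the study
    mean_obs = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var

def toy_model(a, b, x):
    # stand-in for INCA-P: a two-parameter linear response (for illustration only)
    return [a * xi + b for xi in x]

random.seed(1)
x = [1, 2, 3, 4, 5]
obs = [2.1, 4.0, 6.2, 7.9, 10.1]       # synthetic "observations" (roughly a=2, b=0)

behavioural = []
for _ in range(2000):
    a = random.uniform(0.0, 4.0)        # sample from uniform priors
    b = random.uniform(-2.0, 2.0)
    score = nse(obs, toy_model(a, b, x))
    if score > 0.3:                     # E > 0.3 behavioural threshold
        behavioural.append((score, a, b))
```

The behavioural sets jointly define the prediction uncertainty bounds; the paper's finding is that for TRP no parameter set at all clears the threshold in any reach.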

Relevance:

30.00%

Abstract:

A new dynamic model of water quality, Q(2), has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalised sensitivity analysis (GSA) to Q(2) for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify the key parameters controlling model behaviour and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q(2) model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
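The core of a generalised sensitivity analysis can be illustrated with a toy model (invented here; Q(2) itself is far more elaborate): Monte Carlo parameter sets are split into behavioural and non-behavioural groups, and a parameter is flagged as influential when the two groups' marginal distributions diverge.

```python
import random

def max_cdf_gap(sample_a, sample_b):
    # Kolmogorov-Smirnov-style distance between two empirical CDFs
    gap = 0.0
    for v in sorted(sample_a + sample_b):
        fa = sum(1 for s in sample_a if s <= v) / len(sample_a)
        fb = sum(1 for s in sample_b if s <= v) / len(sample_b)
        gap = max(gap, abs(fa - fb))
    return gap

random.seed(0)
# toy stand-in for a DO simulation: misfit depends on parameter k but
# not on a second (dummy) parameter
behav = {"k": [], "dummy": []}
nonbehav = {"k": [], "dummy": []}
for _ in range(500):
    k = random.uniform(0.0, 1.0)
    dummy = random.uniform(0.0, 1.0)
    misfit = abs(k - 0.5)               # model "behaves" when k is near 0.5
    group = behav if misfit < 0.1 else nonbehav
    group["k"].append(k)
    group["dummy"].append(dummy)

gap_k = max_cdf_gap(behav["k"], nonbehav["k"])
gap_dummy = max_cdf_gap(behav["dummy"], nonbehav["dummy"])
# gap_k is large, gap_dummy is small: k is identified as a key parameter
```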

Relevance:

30.00%

Abstract:

This paper reports an uncertainty analysis of critical loads for acid deposition at a site in southern England, using the Steady State Mass Balance Model. The uncertainty bounds, distribution type and correlation structure for each of the 18 input parameters were considered explicitly, and the overall uncertainty was estimated by Monte Carlo methods. Estimates of deposition uncertainty were made from measured data and an atmospheric dispersion model, so the uncertainty in exceedance could also be calculated. The uncertainties of the calculated critical loads were generally much lower than those of the input parameters, due to a "compensation of errors" mechanism: coefficients of variation ranged from 13% for CLmaxN to 37% for CL(A). With 1990 deposition, the probability that the critical load was exceeded was > 0.99; to reduce this probability to 0.50, a 63% reduction in deposition is required; to reduce it to 0.05, an 82% reduction. With 1997 deposition, which was lower than that in 1990, exceedance probabilities declined and uncertainties in exceedance narrowed, as deposition uncertainty had less effect. The parameters contributing most to the uncertainty in critical loads were weathering rates, base cation uptake rates, and the choice of critical chemical value, indicating possible research priorities. However, the different critical load parameters were to some extent sensitive to different input parameters. The application of such probabilistic results to environmental regulation is discussed.
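A Monte Carlo uncertainty analysis of a critical load works along these lines; the two-parameter mass balance and all numbers below are invented placeholders, not the paper's 18-parameter model.

```python
import random
random.seed(42)

deposition = 2.0                           # hypothetical acid deposition (keq ha-1 yr-1)
critical_loads = []
for _ in range(10000):
    weathering = random.gauss(1.0, 0.3)    # base cation weathering (uncertain input)
    uptake = random.gauss(0.4, 0.1)        # base cation net uptake (uncertain input)
    critical_loads.append(weathering - uptake + 0.8)   # toy steady-state balance

mean_cl = sum(critical_loads) / len(critical_loads)
sd_cl = (sum((c - mean_cl) ** 2 for c in critical_loads)
         / len(critical_loads)) ** 0.5
cv_cl = sd_cl / mean_cl
# cv_cl comes out below the 30% CV of the weathering input:
# a small-scale illustration of the "compensation of errors" mechanism

# exceedance probability: fraction of draws in which deposition > critical load
p_exceed = sum(1 for cl in critical_loads if deposition > cl) / len(critical_loads)
```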

Relevance:

30.00%

Abstract:

Cyanobacteria (blue-green algae) blooms in water bodies present serious public health issues with attendant economic and ecological impacts. Llyn Tegid (Lake Bala) is an important conservation and amenity asset within Snowdonia National Park, Wales, which since the mid-1990s has experienced multiple toxic cyanobacteria blooms threatening the ecology and tourism-dependent local economy. Multiple working hypotheses explain the emergence of this problem, including climate change, land management linked to increased nutrient flux, hydromorphological alterations or changing trophic structure - any of which may operate individually or cumulatively to impair lake function. This paper reports the findings of a sediment fingerprinting study using dated lake cores to explore the linkages between catchment and lake management practices and the emergence of the algal bloom problem. Since AD 1900, lake-bed sedimentation rates have varied from 0.06 to 1.07 g cm−2 yr−1, with a pronounced acceleration since the early 1980s. Geochemical analysis revealed increases in the concentrations of total phosphorus (TP), calcium and heavy metals such as zinc and lead, consistent with eutrophication and a rising pollution burden, particularly since the late 1970s. An uncertainty-inclusive sediment fingerprinting approach was used to apportion the relative fluxes from the major catchment land cover types of improved pasture, rough grazing, forestry and channel banks. This showed that improved pasture and channel banks are the dominant diffuse sources of sediment in the catchment, though forestry sources were important historically. Conversion of rough grazing to improved grassland, coupled with intensified land management and year-round livestock grazing, is concluded to be the principal source of rising TP levels.
Lake Habitat Survey and particle size analysis of lake cores demonstrate the hydromorphological impact of the River Dee Regulation Scheme, which controls water level and periodically diverts flow into Llyn Tegid from the adjacent Afon Tryweryn catchment. This hydromorphological impact has also been most pronounced since the late 1970s. It is concluded that an integrated approach, combining land management to reduce agricultural runoff with improved water level regulation enabling recovery of littoral macrophytes, offers the greatest chance of halting the ongoing cyanobacteria problem in Llyn Tegid.
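The un-mixing step of a sediment fingerprinting approach can be sketched as follows, with invented tracer signatures for two of the source groups: find the source proportions whose mixture best matches the tracer signature of a dated core slice.

```python
# hypothetical tracer concentrations (e.g. mg kg-1) for two source groups
pasture = [120.0, 35.0]      # improved pasture: tracer 1, tracer 2
banks = [60.0, 80.0]         # channel banks
lake = [90.0, 57.0]          # signature measured in a dated core slice

def misfit(p):
    # squared error between the core signature and a p:(1-p) source mixture
    return sum((p * a + (1.0 - p) * b - obs) ** 2
               for a, b, obs in zip(pasture, banks, lake))

# grid search for the best-fitting pasture proportion
best_p = min((i / 1000.0 for i in range(1001)), key=misfit)
# the core slice is apportioned ~50% pasture, ~50% channel banks
```

In an uncertainty-inclusive version, source signatures are sampled from distributions and the apportionment is repeated many times, yielding a range rather than a single proportion.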

Relevance:

30.00%

Abstract:

We examine to what degree we can expect to obtain accurate temperature trends for the last two decades near the surface and in the lower troposphere. We compare temperatures obtained from surface observations and radiosondes as well as satellite-based measurements from the Microwave Sounding Units (MSU), which have been adjusted for orbital decay and non-linear instrument-body effects, and reanalyses from the European Centre for Medium-Range Weather Forecasts (ERA) and the National Centers for Environmental Prediction (NCEP). In regions with abundant conventional data coverage, where the MSU has no major influence on the reanalysis, temperature anomalies obtained from microwave sounders, radiosondes and both reanalyses agree reasonably well. Where coverage is insufficient, in particular over the tropical oceans, large differences are found between the MSU and either reanalysis. These differences apparently relate to changes in satellite data availability and to differing satellite retrieval methodologies, to which both reanalyses are quite sensitive over the oceans. For NCEP, this results from the use of raw radiances directly incorporated into the analysis, which makes the reanalysis sensitive to changes in the underlying algorithms, e.g. those introduced in August 1992. For ERA, the bias-correction of the one-dimensional variational analysis may introduce an error when the satellite relative to which the correction is calculated is itself biased, or when radiances change on a time scale longer than a couple of months, e.g. due to orbit decay. ERA inhomogeneities are apparent in April 1985, October/November 1986 and April 1989. These dates can be identified with the replacements of satellites. It is possible that a negative bias in the sea surface temperatures (SSTs) used in the reanalyses may have been introduced over the period of the satellite record.
This could have resulted from a decrease in the number of ship measurements, a concomitant increase in the importance of satellite-derived SSTs, and a likely cold bias in the latter. Alternatively, a warm bias in SSTs could have been caused by an increase in the percentage of buoy measurements (relative to deeper ship intake measurements) in the tropical Pacific. No indications of uncorrected inhomogeneities in land surface temperatures could be found. Near-surface temperatures have biases in the boundary layer in both reanalyses, presumably due to the incorrect treatment of snow cover. The increase of near-surface relative to lower tropospheric temperatures in the last two decades may be due to a combination of several factors, including high-latitude near-surface winter warming due to an enhanced NAO and upper-tropospheric cooling due to stratospheric ozone decrease.

Relevance:

30.00%

Abstract:

Remotely sensed land cover maps are increasingly used as inputs into environmental simulation models whose outputs inform decisions and policy-making. Risks associated with these decisions are dependent on model output uncertainty, which is in turn affected by the uncertainty of land cover inputs. This article presents a method of quantifying the uncertainty that results from potential mis-classification in remotely sensed land cover maps. In addition to quantifying uncertainty in the classification of individual pixels in the map, we also address the important case where land cover maps have been upscaled to a coarser grid to suit the users’ needs and are reported as proportions of land cover type. The approach is Bayesian and incorporates several layers of modelling but is straightforward to implement. First, we incorporate data in the confusion matrix derived from an independent field survey, and discuss the appropriate way to model such data. Second, we account for spatial correlation in the true land cover map, using the remotely sensed map as a prior. Third, spatial correlation in the mis-classification characteristics is induced by modelling their variance. The result is that we are able to simulate posterior means and variances for individual sites and the entire map using a simple Monte Carlo algorithm. The method is applied to the Land Cover Map 2000 for the region of England and Wales, a map used as an input into a current dynamic carbon flux model.
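The first layer of such a Bayesian treatment, inverting a confusion matrix with Bayes' rule to obtain the probability of each true class given the mapped class, can be shown in a few lines. The two classes, confusion matrix and prior are hypothetical; the paper adds spatial correlation and upscaling on top of this step.

```python
# classes: 0 = woodland, 1 = grassland (hypothetical two-class example)
# confusion[i][j] = P(map shows class j | true class is i), from a field survey
confusion = [[0.9, 0.1],
             [0.2, 0.8]]
prior = [0.3, 0.7]           # assumed prior proportions of the true classes

def posterior_true(mapped):
    # Bayes' rule: P(true = i | map = mapped) is proportional to
    # prior[i] * confusion[i][mapped]
    joint = [prior[i] * confusion[i][mapped] for i in range(len(prior))]
    total = sum(joint)
    return [j / total for j in joint]

post = posterior_true(0)     # a pixel the map labels as woodland
# P(true woodland | mapped woodland) = 0.27 / 0.41, roughly 0.66:
# mis-classification leaves substantial residual uncertainty
```

Simulating many such posterior draws per pixel, and aggregating over a coarser grid, yields the posterior means and variances of land cover proportions that the article reports.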

Relevance:

30.00%

Abstract:

This paper is intended both as a contribution to the conceptual work on process in economic thought and as an attempt to connect a non-institutionalist, non-evolutionary thinker to it. The paper has two principal objectives: (i) to delineate a broad, philosophically grounded conception of what an economic process theory (EPT) is; and (ii) to locate the contributions of George Shackle within this broad conception of EPT. In pursuing these two objectives, I hope to draw out the originality and significance of Shackle’s economics with a particular emphasis on what he adds to process conceptions developed within other heterodox traditions such as institutional and evolutionary economics. I will also highlight some of the perceived limitations of Shackle’s approach and link them to the limitations of process philosophy.

Relevance:

30.00%

Abstract:

Vine-growing in the Less-Favoured Areas of Greece faces multiple challenges that might lead to its abandonment. In an attempt to maintain rural populations, Rural Development Schemes have been created that offer rural households the opportunity to maintain or expand their farming businesses, including vine-growing. This paper stems from a study that used data from a cross-sectional survey of 204 farmers to investigate how farming systems and farmers' perception of corruption, amongst other socio-economic factors, affected their decisions to continue vine-growing through participation in Rural Development Schemes in three remote Less-Favoured Areas of Greece. The Theory of Planned Behaviour was used to frame the research problem, with the assumption being that an individual's intention to participate in a Scheme is based on their prior beliefs about it. Data from the survey were reduced and simplified by non-linear principal component analysis, and the resulting variables were used in selectivity-corrected ordered probit models to reveal farmers' attitudes towards viticulture and rural development. It was found that economic factors, perceived corruption and farmers' attitudes were significant determinants of whether to participate in the Schemes. The research findings highlight the important role of perceived corruption and the need for policies that facilitate farmers' access to decision-making centres.
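The latent-variable structure underlying an ordered probit model can be illustrated with a simulated data-generating process (the coefficient, thresholds and response labels below are all invented): a continuous latent propensity is cut by thresholds into ordered response categories.

```python
import random
random.seed(7)

beta = 1.5                # assumed effect of an attitude score on the latent scale
cuts = [-0.5, 0.8]        # assumed thresholds dividing the three ordered responses

def ordered_response(x):
    # ordered-probit data-generating process: latent propensity plus N(0,1) error
    latent = beta * x + random.gauss(0.0, 1.0)
    if latent < cuts[0]:
        return 0          # e.g. "will not participate in the Scheme"
    if latent < cuts[1]:
        return 1          # "undecided"
    return 2              # "will participate"

# favourable vs unfavourable attitude scores shift the whole response distribution
favourable = [ordered_response(1.0) for _ in range(2000)]
unfavourable = [ordered_response(-1.0) for _ in range(2000)]
mean_fav = sum(favourable) / len(favourable)
mean_unfav = sum(unfavourable) / len(unfavourable)
```

Estimation runs this logic in reverse, recovering beta and the cut points from observed categories; the selectivity correction in the paper additionally accounts for non-random participation in the survey sample.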

Relevance:

30.00%

Abstract:

Of the many sources of urban greenhouse gas (GHG) emissions, solid waste is the only one for which management decisions are undertaken primarily by municipal governments themselves and is hence often the largest component of cities' corporate inventories. It is essential that decision-makers select an appropriate quantification methodology and have an appreciation of methodological strengths and shortcomings. This work compares four different waste emissions quantification methods, including Intergovernmental Panel on Climate Change (IPCC) 1996 guidelines, IPCC 2006 guidelines, U.S. Environmental Protection Agency (EPA) Waste Reduction Model (WARM), and the Federation of Canadian Municipalities - Partners for Climate Protection (FCM-PCP) quantification tool. Waste disposal data for the Greater Toronto Area (GTA) in 2005 are used for all methodologies; treatment options (including landfill, incineration, compost, and anaerobic digestion) are examined where available in methodologies. Landfill was shown to be the greatest source of GHG emissions, contributing more than three-quarters of total emissions associated with waste management. Results from the different landfill gas (LFG) quantification approaches ranged from an emissions source of 557 kt carbon dioxide equivalents (CO2e) (FCM-PCP) to a carbon sink of −53 kt CO2e (EPA WARM). Similar values were obtained between IPCC approaches. The IPCC 2006 method was found to be more appropriate for inventorying applications because it uses a waste-in-place (WIP) approach, rather than a methane commitment (MC) approach, despite perceived onerous data requirements for WIP. MC approaches were found to be useful from a planning standpoint; however, uncertainty associated with their projections of future parameter values limits their applicability for GHG inventorying.
MC and WIP methods provided similar results in this case study; however, this agreement is case-specific, arising from similar assumptions about present and future landfill parameters and from the relatively consistent quantities of waste deposited annually in recent years.
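The difference between the two accounting conventions can be sketched numerically (the decay rate, methane yield and tonnages below are arbitrary assumptions): a waste-in-place style calculation sums the first-order-decay emissions actually released in a given year from all waste already landfilled, while methane commitment books the waste's entire lifetime yield against the year of deposit.

```python
import math

K = 0.05       # assumed first-order decay rate (yr^-1)
L0 = 100.0     # assumed lifetime methane yield per tonne of waste (arbitrary units)

def wip_emissions(deposits, year):
    # waste-in-place style: emissions released during `year` from every
    # tonne already in the landfill, decayed according to its age
    return sum(mass * L0 * K * math.exp(-K * (year - t))
               for t, mass in enumerate(deposits[:year + 1]))

def methane_commitment(mass):
    # MC style: the deposit year is charged with the waste's full lifetime yield
    return mass * L0

deposits = [10.0, 10.0, 10.0]              # tonnes landfilled in years 0, 1, 2
wip_y2 = wip_emissions(deposits, 2)        # what is released in year 2
mc_y2 = methane_commitment(deposits[2])    # what MC books against year 2
```

The MC figure front-loads decades of slow decay into one year, which is why its inventory totals hinge on projected future parameters, whereas the WIP figure reflects only the current year's releases.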