838 results for Uncertainty in Wind Energy
Abstract:
We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB–elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9%) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0%) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the “no feedback” case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.
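The "causal chain" propagation can be pictured as Monte Carlo sampling over the three uncertainty sources. Below is a minimal Python sketch, assuming independent Gaussian contributions with purely illustrative spreads; the study's actual framework combines discrete GCM/ISM ensembles with a statistical treatment of the parameterisation, so this is a schematic of the idea, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative spreads (percentage points of added sea level contribution
# at 2100) for the three modelling uncertainties; not the paper's values.
gcm   = rng.normal(0.0, 1.0, n)   # GCM structural uncertainty
ism   = rng.normal(0.0, 0.6, n)   # ISM structural uncertainty
param = rng.normal(0.0, 0.4, n)   # feedback parameterisation uncertainty

# Centre the combined distribution on the quoted best estimate of 4.3%.
feedback = 4.3 + gcm + ism + param

lo, hi = np.percentile(feedback, [2.5, 97.5])
print(f"best estimate 4.3%, 95% interval ({lo:.1f}%, {hi:.1f}%)")
```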
Abstract:
The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project, researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but large variation in how the methods used for uncertainty assessment were judged. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to give a more complete picture of impacts and their uncertainties. This in turn implies that research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.
Abstract:
The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainty and can lead to a high number of false or missed warnings. Weather forecasts using multiple NWPs from various weather centres, applied to catchment hydrology, can provide significantly improved early flood warning. The availability of global ensemble weather prediction systems through the ‘THORPEX Interactive Grand Global Ensemble’ (TIGGE) offers a new opportunity for the development of state-of-the-art early flood forecasting systems. This paper presents a case study using the TIGGE database for flood warning on a meso-scale catchment (4062 km²) located in the Midlands region of England. For the first time, a research attempt is made to set up a coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE database. The study shows that precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial precipitation variability on such a comparatively small catchment, which indicates a need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. The spread of discharge forecasts varies from centre to centre, but it is generally large and implies a significant level of uncertainty. Nevertheless, the results show that the TIGGE database is a promising tool for forecasting flood inundation, producing forecasts comparable with those driven by rain-gauge observations.
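At its simplest, a probabilistic warning from an ensemble is an exceedance frequency across members. A minimal sketch with a hypothetical ensemble and an assumed warning threshold (neither taken from the study):

```python
import numpy as np

# Hypothetical ensemble of peak discharge forecasts (m^3/s) for one event,
# pooled across centres; values are illustrative only.
rng = np.random.default_rng(1)
members = rng.lognormal(mean=np.log(250.0), sigma=0.4, size=200)

threshold = 400.0  # assumed bankfull discharge triggering a flood warning

# Probabilistic warning: fraction of ensemble members exceeding the threshold.
p_exceed = np.mean(members > threshold)
print(f"P(discharge > {threshold} m^3/s) = {p_exceed:.2f}")
```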
Abstract:
Methods to explicitly represent uncertainties in weather and climate models have been developed and refined over the past decade, and have reduced biases and improved forecast skill when implemented in the atmospheric component of models. These methods have not yet been applied to the land surface component of models. Since the land surface is strongly coupled to the atmospheric state at certain times and in certain places (such as the European summer of 2003), improvements in the representation of land surface uncertainty may potentially lead to improvements in atmospheric forecasts for such events. Here we analyse seasonal retrospective forecasts for 1981–2012 performed with the European Centre for Medium-Range Weather Forecasts’ (ECMWF) coupled ensemble forecast model. We consider two methods of incorporating uncertainty into the land surface model (H-TESSEL): stochastic perturbation of tendencies, and static perturbation of key soil parameters. We find that the perturbed parameter approach considerably improves the forecast of extreme air temperature for summer 2003, through better representation of negative soil moisture anomalies and upward sensible heat flux. Averaged across all the reforecasts, the perturbed parameter experiment shows relatively little impact on the mean bias, suggesting that perturbations of at least this magnitude can be applied to the land surface without any degradation of model climate. There is also little impact on skill averaged across all reforecasts, and some evidence of overdispersion for soil moisture. The stochastic tendency experiments show a large overdispersion for the soil temperature fields, indicating that the perturbation here is too strong. There is also some indication that the forecast of the 2003 warm event is improved for the stochastic experiments; however, the improvement is not as large as that observed for the perturbed parameter experiment.
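The two strategies can be contrasted schematically. The sketch below assumes a generic stochastically-perturbed-tendencies scheme (multiplicative noise on the physics tendency) and a simple one-off scaling of a soil parameter per member; the amplitudes and details of the actual H-TESSEL experiments are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def step_stochastic(T, tendency, dt):
    """One time step with a stochastically perturbed tendency.

    Multiplies the physics tendency by (1 + r), r ~ U(-0.5, 0.5); the
    +/-50% amplitude is an assumption for illustration, not the value
    used in the ECMWF experiments.
    """
    r = rng.uniform(-0.5, 0.5, size=T.shape)
    return T + (1.0 + r) * tendency * dt

def perturbed_parameter(value, member, n_members):
    """Static parameter perturbation: scale a soil parameter once per
    ensemble member and hold it fixed for the whole forecast
    (illustrative +/-20% range)."""
    factors = np.linspace(0.8, 1.2, n_members)
    return value * factors[member]

# Example: advance a toy 2x2 soil temperature field one hour.
T = np.full((2, 2), 285.0)        # K
tendency = np.full((2, 2), 1e-5)  # K/s, assumed
print(step_stochastic(T, tendency, dt=3600.0))
```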
Abstract:
Model-based estimates of future uncertainty are generally based on the in-sample fit of the model, as when Box-Jenkins prediction intervals are calculated. However, this approach will generate biased uncertainty estimates in real time when there are data revisions. A simple remedy is suggested, and used to generate more accurate prediction intervals for 25 macroeconomic variables, in line with the theory. A simulation study based on an empirically-estimated model of data revisions for US output growth is used to investigate small-sample properties.
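The mechanism is straightforward to reproduce in simulation: intervals calibrated on fully revised data understate the uncertainty faced in real time. A minimal sketch, assuming an AR(1) process with known coefficient and additive revision noise (both values illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
T, phi = 5000, 0.9

# "Final" (fully revised) series: an AR(1) process with unit-variance shocks.
e = rng.normal(0.0, 1.0, T)
final = np.zeros(T)
for t in range(1, T):
    final[t] = phi * final[t - 1] + e[t]

# Real-time vintage: final data plus a revision error removed later
# (assumed s.d. 1.0, purely illustrative).
real_time = final + rng.normal(0.0, 1.0, T)

# One-step-ahead errors when the model is evaluated in-sample on final data:
in_sample_err = final[1:] - phi * final[:-1]
# Errors actually faced in real time, forecasting from the latest,
# still-unrevised observation:
real_time_err = final[1:] - phi * real_time[:-1]

# The in-sample spread (~1.0) understates the real-time spread (~1.35),
# so intervals based on in-sample fit are too narrow.
print(f"in-sample s.d.: {in_sample_err.std():.2f}")
print(f"real-time s.d.: {real_time_err.std():.2f}")
```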
Abstract:
Model simulations of the next few decades are widely used in assessments of climate change impacts and as guidance for adaptation. Their non-linear nature reveals a level of irreducible uncertainty which it is important to understand and quantify, especially for projections of near-term regional climate. Here we use large idealised initial condition ensembles of the FAMOUS global climate model with a 1%/year compound increase in CO2 levels to quantify the range of future temperatures in model-based projections. These simulations explore the role of both atmospheric and oceanic initial conditions and are the largest such ensembles to date. Short-term simulated trends in global temperature are diverse, and cooling periods are more likely to be followed by larger warming rates. The spatial pattern of near-term temperature change varies considerably, but the proportion of the surface showing a warming is more consistent. In addition, ensemble spread in inter-annual temperature declines as the climate warms, especially in the North Atlantic. Over Europe, atmospheric initial condition uncertainty can, for certain ocean initial conditions, lead to 20-year trends in winter and summer in which every location can exhibit either strong cooling or rapid warming. However, the details of the distribution are highly sensitive to the ocean initial condition chosen, and particularly the state of the Atlantic meridional overturning circulation. On longer timescales, the warming signal becomes clearer and more consistent amongst different initial condition ensembles. An ensemble using a range of different oceanic initial conditions produces a larger spread in temperature trends than ensembles using a single ocean initial condition, for all lead times. This highlights the potential benefits from initialising climate predictions from ocean states informed by observations. These results suggest that climate projections need to be performed with many more ensemble members than at present, using a range of ocean initial conditions, if the uncertainty in near-term regional climate is to be adequately quantified.
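For orientation, the idealised forcing fixes the experiment's timescale; a quick worked check (not from the paper) shows that CO2 doubles after roughly 70 years, so the 20-year trends discussed here sit well inside the pre-doubling transient.

```latex
% Doubling time under a 1%/year compound CO2 increase:
(1.01)^{t} = 2
\quad\Longrightarrow\quad
t = \frac{\ln 2}{\ln 1.01} \approx 69.7 \text{ years}
```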
Abstract:
Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations, to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05–0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. We also calculate the sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
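Under the simplifying assumption that clear-sky pixels form a random subsample of the grid cell, the sampling uncertainty of the cell mean follows the usual finite-population formula; cloud cover is not random in practice, which is one reason the paper fits an empirical parameterisation instead. A minimal sketch with illustrative numbers:

```python
import numpy as np

def sampling_uncertainty(cell_pixels, clear_mask):
    """Estimate grid-cell-mean SST sampling uncertainty from clear pixels.

    Simple finite-population estimate, assuming the clear pixels are a
    random subsample of the cell (an idealisation; see lead-in above).
    """
    clear = cell_pixels[clear_mask]
    n, N = clear.size, cell_pixels.size
    s = clear.std(ddof=1)  # SST variability within the clear subsample
    return s / np.sqrt(n) * np.sqrt(1.0 - n / N)

# Illustrative 0.05-degree cell of 25x25 ~1-km pixels, ~30% clear sky:
rng = np.random.default_rng(4)
cell = 290.0 + rng.normal(0, 0.15, 625)  # K, assumed within-cell variability
mask = rng.random(625) < 0.30
print(f"sampling uncertainty ~ {sampling_uncertainty(cell, mask):.3f} K")
```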
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex, heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
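The sampling scheme can be sketched as Dirichlet draws over rows of the confusion matrix. The snippet below is a simplified, non-spatial version with hypothetical counts; the paper's method additionally exploits spatial information in the map.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical confusion matrix for 3 land cover classes: rows = mapped
# class, columns = true class (counts from validation data).
confusion = np.array([[80,  5,  3],
                      [ 6, 60,  9],
                      [ 4, 10, 70]])

def sample_true_proportions(mapped_counts, confusion, n_draws=1000):
    """Monte Carlo draws of true class proportions for one site.

    For pixels mapped as class k, the probabilities of each true class
    are drawn from a Dirichlet posterior on row k of the confusion
    matrix (flat prior); a sketch of the Bayesian scheme described.
    """
    draws = np.zeros((n_draws, confusion.shape[1]))
    for k, n_k in enumerate(mapped_counts):
        p = rng.dirichlet(confusion[k] + 1, size=n_draws)
        draws += n_k * p
    return draws / sum(mapped_counts)

# Site with 500 pixels mapped as class 0, 300 as class 1, 200 as class 2:
props = sample_true_proportions([500, 300, 200], confusion)
print("posterior mean proportions:", props.mean(axis=0).round(3))
```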
Abstract:
Primary beam spectra were obtained for an industrial X-ray unit (40–150 kV) and for a clinical mammography apparatus (25–35 kV) from beams scattered at angles close to 90 degrees, measured with a CdTe Compton spectrometer. Actual scattering angles were determined from the Compton energy shift of characteristic X-rays or of the spectrum end-point energy. The evaluated contribution of coherent scattering amounts to more than 15% of the fluence in mammographic beams. This technique can be used in clinical environments.
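The angle determination rests on the standard Compton relation between the incident energy E and the scattered energy E′; measuring the shift of a feature of known energy (a characteristic line or the end point) therefore fixes the actual scattering angle. As a reminder of the algebra (not specific to this paper):

```latex
% Compton shift for a photon scattering from E to E' at angle \theta
% (m_e c^2 = 511 keV):
E' = \frac{E}{1 + \frac{E}{m_e c^2}\,(1 - \cos\theta)}
\quad\Longrightarrow\quad
\cos\theta = 1 - m_e c^2 \left(\frac{1}{E'} - \frac{1}{E}\right)
```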
Abstract:
We show that single- and multi-slit experiments involving matter waves may be constructed to assess dispersively generated correlations between the position and momentum of a single free particle. These correlations give rise to position-dependent phases which develop dynamically as a result of dispersion and may play an important role in the interference patterns. To the extent that initial transverse coherence is preserved throughout the proposed diffraction setup, such interference patterns are noticeably different from those of a classical dispersion-free wave.
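How dispersion generates the correlation can be seen in the standard textbook case of a free Gaussian packet, assuming (for illustration only) an initially uncorrelated minimum-uncertainty state of position width sigma_0:

```latex
% Free evolution in the Heisenberg picture, \hat{x}(t) = \hat{x} + \hat{p}\,t/m,
% turns an initially uncorrelated Gaussian (\sigma_{xp}(0) = 0,
% \sigma_p = \hbar/2\sigma_0) into a position-momentum-correlated one:
\sigma_{xp}(t)
  \equiv \tfrac{1}{2}\langle\{\hat{x}(t),\hat{p}\}\rangle
       - \langle\hat{x}(t)\rangle\langle\hat{p}\rangle
  = \frac{t}{m}\,\sigma_p^2
  = \frac{\hbar^2 t}{4 m \sigma_0^2}
```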
Abstract:
We propose a model for the antihyperon polarization in high-energy proton-nucleus inclusive reactions, based on the final-state interactions between the antihyperons and other produced particles (predominantly pions). To formulate this idea, we use the low-energy pion-(anti-)hyperon interaction previously obtained from effective chiral Lagrangians, and a hydrodynamic parametrization of the background matter, which expands and decouples at a certain freeze-out temperature.
Abstract:
Addressing building energy use is a pressing issue for building sector decision makers across Europe. In Sweden, some regions have adopted a target of reducing energy use in buildings by 50% by 2050. However, building codes do not currently support objectives as ambitious as these, and novel approaches to addressing energy use in buildings from a regional perspective are called for. The purpose of this licentiate thesis was to provide a deeper understanding of the most relevant issues with regard to energy use in buildings from a broad perspective and to suggest pathways towards reaching the long-term savings objective. Current trends in building sector structure and energy use point to detached houses constructed before 1981 playing a key role in the energy transition, especially in the rural areas of Sweden. In the Swedish county of Dalarna, which was used as a study area in this thesis, these houses account for almost 70% of the residential heating demand. Building energy simulations of eight sample houses from the county show that there is considerable techno-economic potential for energy savings in these houses, but not quite enough to reach the 50% savings objective. Two case studies from rural Sweden show that savings well beyond 50% are achievable, both when access to capital and high technology is available and when it is not. However, on a broader scale, both direct and indirect rebound effects must be expected, which calls for more refined approaches to energy savings. Furthermore, research has shown that the techno-economic potential is in fact never realised, not even in the most well-designed intervention programmes, due to the inherent complexity of human behaviour with respect to energy use. This is not taken into account in either current or previous Swedish energy-use legislation. Therefore, an approach based on Community-Based Social Marketing methodology, which considers the technical prerequisites, economic aspects, and the perspective of the many homeowners, is suggested as a way forward towards reaching the energy savings target.
Abstract:
Lucas (1987) has shown the surprising result that the welfare cost of business cycles is quite small. Using standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-WWII era using the multivariate Beveridge-Nelson decomposition for trends and cycles, which considers not only business-cycle uncertainty but also uncertainty from the stochastic trend in consumption. The post-WWII period is relatively quiet, with the welfare costs of uncertainty being about 0.9% of per-capita consumption. Although changing the decomposition method substantially changed the initial results, the welfare cost of uncertainty is qualitatively small in the post-WWII era: about $175.00 a year per-capita in the U.S. We also computed the marginal welfare cost of macroeconomic uncertainty using this same technique. It is about twice as large as the welfare cost, at about $350.00 a year per-capita.
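The order of magnitude can be checked with the classic Lucas compensation approximation for CRRA preferences, plus simple arithmetic on the quoted figures; this is the textbook formula, not the paper's multivariate calculation.

```latex
% Lucas-style welfare cost of consumption risk under CRRA utility with
% relative risk aversion \gamma and consumption variance \sigma_c^2:
\lambda \approx \tfrac{1}{2}\,\gamma\,\sigma_c^2
% Back-of-envelope check of the dollar figure: a cost of 0.9% of
% per-capita consumption equal to \$175/yr implies a consumption base of
% about \$175 / 0.009 \approx \$19{,}400 per person per year.
```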
Abstract:
With standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-war U.S. using the Beveridge-Nelson decomposition. Welfare costs are about 0.9% of per-capita consumption ($175.00), and marginal welfare costs are about twice as large.