899 results for Uncertainty of forecasts
Abstract:
Under increasing greenhouse gas concentrations, ocean heat uptake moderates the rate of climate change, and thermal expansion makes a substantial contribution to sea level rise. In this paper we quantify the differences in projections among atmosphere-ocean general circulation models of the Coupled Model Intercomparison Project in terms of transient climate response, ocean heat uptake efficiency and expansion efficiency of heat. The CMIP3 and CMIP5 ensembles have statistically indistinguishable distributions in these parameters. The ocean heat uptake efficiency varies by a factor of two across the models, explaining about 50% of the spread in ocean heat uptake in CMIP5 models with CO2 increasing at 1%/year. It correlates with the ocean global-mean vertical profiles both of temperature and of temperature change, and comparison with observations suggests the models may overestimate ocean heat uptake and underestimate surface warming, because their stratification is too weak. The models agree on the location of maxima of shallow ocean heat uptake (above 700 m) in the Southern Ocean and the North Atlantic, and on deep ocean heat uptake (below 2000 m) in areas of the Southern Ocean, in some places amounting to 40% of the top-to-bottom integral in the CMIP3 SRES A1B scenario. The Southern Ocean dominates global ocean heat uptake; consequently the eddy-induced thickness diffusivity parameter, which is particularly influential in the Southern Ocean, correlates with the ocean heat uptake efficiency. The thermal expansion produced by ocean heat uptake is 0.12 m YJ⁻¹, with an uncertainty of about 10% (1 YJ = 10²⁴ J).
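As a rough illustration of the expansion efficiency of heat quoted in this abstract, the Python sketch below converts a hypothetical cumulative ocean heat uptake into thermal expansion; the heat-uptake value is invented for illustration and is not a result of the paper.

```python
# Worked example of the expansion efficiency of heat from the abstract:
# thermal expansion = epsilon * ocean heat uptake, with epsilon = 0.12 m per YJ
# (1 YJ = 1e24 J). The heat-uptake value below is purely illustrative.
epsilon_m_per_YJ = 0.12      # expansion efficiency of heat (m of sea level per YJ)
heat_uptake_YJ = 0.25        # hypothetical cumulative ocean heat uptake (YJ)

thermal_expansion_m = epsilon_m_per_YJ * heat_uptake_YJ
uncertainty_m = 0.10 * thermal_expansion_m   # ~10% uncertainty quoted in the abstract

print(f"thermal expansion = {thermal_expansion_m:.3f} m +/- {uncertainty_m:.3f} m")
# -> thermal expansion = 0.030 m +/- 0.003 m
```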
Abstract:
We introduce an algorithm (called REDFITmc2) for spectrum estimation in the presence of timescale errors. It is based on the Lomb-Scargle periodogram for unevenly spaced time series, in combination with Welch's Overlapped Segment Averaging procedure, bootstrap bias correction and persistence estimation. The timescale errors are modelled parametrically and included in the simulations for determining (1) the upper levels of the spectrum of the red-noise AR(1) alternative and (2) the uncertainty of the frequency of a spectral peak. Application of REDFITmc2 to ice core and stalagmite records of palaeoclimate allowed a more realistic evaluation of spectral peaks than when ignoring this source of uncertainty. The results support qualitatively the intuition that stronger effects on the spectrum estimate (decreased detectability and increased frequency uncertainty) occur at higher frequencies. The additional information provided by REDFITmc2 is that those effects are quantified. Regarding timescale construction, not only the fixpoints, dating errors and the functional form of the age-depth model play a role; the joint distribution of all time points (serial correlation, stratigraphic order) also determines spectrum estimation.
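The abstract describes the algorithm rather than giving one, so the following Python sketch only illustrates the core idea of propagating timescale errors into a spectrum estimate: perturb the time axis by Monte Carlo simulation and recompute a Lomb-Scargle periodogram each time. It is a simplified stand-in, not the REDFITmc2 implementation (no WOSA segmenting, bias correction or AR(1) significance levels), and all data and parameter values are invented.

```python
# Simplified sketch: propagate parametric timescale errors into a Lomb-Scargle
# spectrum estimate by Monte Carlo simulation (not the REDFITmc2 algorithm itself).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# Hypothetical unevenly spaced proxy record with 1-sigma dating errors.
t = np.sort(rng.uniform(0.0, 1000.0, 120))                     # ages, e.g. years BP
x = np.sin(2 * np.pi * t / 100.0) + rng.normal(0.0, 0.5, t.size)
t_sigma = np.full(t.size, 5.0)                                 # timescale error per point

ang_freqs = 2 * np.pi * np.linspace(1 / 1000.0, 1 / 20.0, 200)  # angular frequencies

spectra = []
for _ in range(500):
    # Perturb the ages within their errors; re-sort to preserve stratigraphic order.
    t_sim = np.sort(t + rng.normal(0.0, t_sigma))
    spectra.append(lombscargle(t_sim, x - x.mean(), ang_freqs))
spectra = np.asarray(spectra)

median_spectrum = np.median(spectra, axis=0)
spread_95 = np.percentile(spectra, 95, axis=0)   # spread attributable to timescale errors
```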
Abstract:
Some proponents of local knowledge, such as Sillitoe (2010), have expressed second thoughts about its capacity to effect development on the ‘revolutionary’ scale once predicted. Our argument in this article follows a similar route. Recent research into the management of livestock in South Africa makes clear that rural African livestock farmers experience uncertainty in relation to the control of stock diseases. State provision of veterinary services has been significantly reduced over the past decade. Both white and African livestock owners are to a greater extent left to their own devices. In some areas of animal disease management, African livestock owners have recourse to tried-and-tested local remedies, which are largely plant-based. But especially in the critical sphere of tick control, efficacious treatments are less evident, and livestock owners struggle to find adequate solutions to high tick loads. This is particularly important in South Africa in the early twenty-first century because land reform and the freedom to purchase land in the post-apartheid context afford African stockowners opportunities to expand livestock holdings. Our research suggests that the limits of local knowledge in dealing with ticks are among the central problems faced by African livestock owners. We judge this not only in relation to efficacy but also in relation to the perceptions of livestock owners themselves. While confidence and practice vary, and there is increasing resort to chemical acaricides, we were struck by the uncertainty of livestock owners over the best strategies.
Abstract:
This paper investigates the application and use of development viability models in the formation of planning policies in the UK. Particular attention is paid to three key areas: the assumed development scheme in development viability models, the use of forecasts and the debate concerning Threshold Land Value. The empirical section reports on the results of an interview survey involving the main producers of development viability models and appraisals. It is concluded that, although development viability models have intrinsic limitations associated with model composition and input uncertainties, the most significant limitations are related to the ways that they have been adapted for use in the planning system. In addition, it is suggested that the contested nature of Threshold Land Value is an example of calculative practices providing a façade of technocratic rationality in the planning system.
Abstract:
Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
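As a hedged sketch of the kind of sensitivity discussed in this abstract, the Python fragment below estimates a Hurst exponent from the low-frequency slope of a periodogram after removing a deterministic seasonal cycle. The synthetic series, the chosen frequency band and the relation β = 2H − 1 (valid for stationary, fractional-Gaussian-noise-like processes) are assumptions for illustration, not the paper's procedure.

```python
# Hedged sketch: estimate a Hurst exponent from the low-frequency slope of a
# periodogram after removing the deterministic seasonal cycle. Synthetic data;
# the relation beta = 2H - 1 assumes a stationary fGn-like process.
import numpy as np

rng = np.random.default_rng(1)
n_months = 480
t = np.arange(n_months)

# Hypothetical monthly series: seasonal cycle plus AR(1) ("red") noise.
noise = np.zeros(n_months)
eps = rng.normal(0.0, 1.0, n_months)
for i in range(1, n_months):
    noise[i] = 0.8 * noise[i - 1] + eps[i]
series = 5.0 * np.sin(2 * np.pi * t / 12.0) + noise

# 1) Filter the deterministic (periodic) signal: subtract the monthly climatology.
climatology = np.array([series[m::12].mean() for m in range(12)])
anomalies = series - climatology[t % 12]

# 2) Periodogram of the anomalies.
power = np.abs(np.fft.rfft(anomalies - anomalies.mean())) ** 2
freqs = np.fft.rfftfreq(n_months, d=1.0)          # cycles per month

# 3) Fit the log-log slope over the low-frequency band only (periods > 2 years),
#    since only the low-frequency limit is relevant to long-term correlations.
band = (freqs > 0) & (freqs < 1.0 / 24.0)
beta = -np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)[0]
hurst = (beta + 1.0) / 2.0
print(f"spectral slope beta ~ {beta:.2f}, Hurst exponent H ~ {hurst:.2f}")
```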
Abstract:
Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models which were used were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts for the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km and a vertical extension of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5° and 15 vertical levels and covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat, release of latent heat, etc. in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre’s model.
Abstract:
The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications in order to make the Cumulative Distribution Function (CDF) of the observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems will be discussed, and a novel way to solve the problem, combining extreme value analysis and non-parametric regression methods, will be outlined. The method will be illustrated by examples of hydrological stream-flow forecasts.
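Since the abstract does not spell the transform out, here is a minimal Python sketch of the NQT itself, using Weibull plotting positions (one common but assumed choice, not necessarily the paper's formulation). The small-sample tail problem the paper addresses is exactly that such an empirical mapping says nothing about probabilities beyond roughly 1/(n+1).

```python
# Minimal sketch of the Normal Quantile Transform (NQT): map each value to the
# standard-normal quantile of its empirical (plotting-position) probability.
import numpy as np
from scipy.stats import norm, rankdata

def nqt(values: np.ndarray) -> np.ndarray:
    """Transform a sample so its empirical CDF becomes approximately standard normal."""
    n = len(values)
    ranks = rankdata(values)            # ranks 1..n, ties get the average rank
    probs = ranks / (n + 1.0)           # Weibull plotting positions, avoids 0 and 1
    return norm.ppf(probs)

# Example: a heavy-tailed "discharge" sample made approximately Gaussian.
discharge = np.random.default_rng(2).lognormal(mean=3.0, sigma=1.0, size=50)
z = nqt(discharge)
# With only 50 values the transform is undefined beyond probabilities of about
# 1/(n+1), which is the small-sample tail problem discussed in the abstract.
```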
Abstract:
Wind-generated waves at the sea surface are of outstanding importance both for their practical relevance in many respects, such as coastal erosion, coastal protection, or safety of navigation, and for their scientific relevance in modifying fluxes at the air-sea interface. So far, long-term changes in ocean wave climate have been studied mostly from a regional perspective, with global dynamical studies emerging only recently. Here a global wave climate study is presented, in which a global wave model (WAM) is driven by atmospheric forcing from a global climate model (ECHAM5) for present-day and potential future climate conditions represented by the IPCC (Intergovernmental Panel on Climate Change) A1B emission scenario. It is found that changes in mean and extreme wave climate towards the end of the twenty-first century are small to moderate, with the largest signals being a poleward shift in the annual mean and extreme significant wave heights in the mid-latitudes of both hemispheres, more pronounced in the Southern Hemisphere, and most likely associated with a corresponding shift in mid-latitude storm tracks. These changes are broadly consistent with results from the few studies available so far. The projected changes in the mean wave periods, associated with the changes in the wave climate in the mid to high latitudes, are also shown, revealing a moderate increase on the equatorial eastern side of the ocean basins. This study presents a step towards the larger ensemble of global wave climate projections required to better assess the robustness and uncertainty of potential future wave climate change.
Abstract:
Two aircraft instruments for the measurement of total odd nitrogen (NOy) were compared side by side aboard a Learjet 35A in April 2003 during a campaign of the AFO2000 project SPURT (Spurengastransport in der Tropopausenregion). The instruments, albeit employing the same measurement principle (gold converter and chemiluminescence), had different inlet configurations. The ECO-Physics instrument operated by ETH-Zürich in SPURT had the gold converter mounted outside the aircraft, whereas the instrument operated by FZ-Jülich in the European project MOZAIC III (Measurements of ozone, water vapour, carbon monoxide and nitrogen oxides aboard Airbus A340 in-service aircraft) employed a Rosemount probe with 80 cm of FEP tubing connecting the inlet to the gold converter. The NOy concentrations during the flight ranged between 0.3 and 3 ppb. The two data sets were compared in a blind fashion and each team followed its normal operating procedures. On average, the measurements agreed within 7%, i.e. within the combined uncertainty of the two instruments. This puts an upper limit on potential losses of HNO3 in the Rosemount inlet of the MOZAIC instrument. Larger transient deviations were observed during periods after calibrations and when the aircraft entered the stratosphere. The time lag of the MOZAIC instrument observed in these instances is in accordance with the time constant of the MOZAIC inlet line determined in the laboratory for HNO3.
Abstract:
We consider tests of forecast encompassing for probability forecasts, for both quadratic and logarithmic scoring rules. We propose test statistics for the null of forecast encompassing, present the limiting distributions of the test statistics, and investigate the impact of estimating the forecasting models' parameters on these distributions. The small-sample performance is investigated, in terms of small numbers of forecasts and model estimation sample sizes. We show the usefulness of the tests for the evaluation of recession probability forecasts from logit models with different leading indicators as explanatory variables, and for evaluating survey-based probability forecasts.
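For concreteness, the sketch below (a hedged illustration, not the authors' test statistics) evaluates the two scoring rules named in the abstract for binary-outcome probability forecasts and locates the score-minimising weight in a simple forecast combination, which is the intuition behind forecast encompassing; all data are simulated.

```python
# Hedged sketch of the quadratic and logarithmic scoring rules for probability
# forecasts of binary outcomes, plus a schematic combination check (not the
# paper's encompassing test statistics).
import numpy as np

def quadratic_score(p, y):
    """Brier-type quadratic score (lower is better)."""
    return np.mean((p - y) ** 2)

def log_score(p, y, eps=1e-12):
    """Logarithmic score (lower is better); eps guards against log(0)."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 200)                                   # realised binary outcomes
p_a = np.clip(y * 0.6 + rng.normal(0.2, 0.2, 200), 0, 1)      # forecast A (more informative)
p_b = np.clip(y * 0.3 + rng.normal(0.35, 0.2, 200), 0, 1)     # forecast B

# Forecast A "encompasses" B if a combination lam*p_a + (1-lam)*p_b cannot beat
# p_a alone, i.e. the score is minimised near lam = 1.
lams = np.linspace(0, 1, 101)
scores = [quadratic_score(lam * p_a + (1 - lam) * p_b, y) for lam in lams]
print("score-minimising weight on forecast A:", lams[int(np.argmin(scores))])
print("log scores:", log_score(p_a, y), log_score(p_b, y))
```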
Abstract:
Remotely sensed land cover maps are increasingly used as inputs into environmental simulation models whose outputs inform decisions and policy-making. Risks associated with these decisions are dependent on model output uncertainty, which is in turn affected by the uncertainty of land cover inputs. This article presents a method of quantifying the uncertainty that results from potential mis-classification in remotely sensed land cover maps. In addition to quantifying uncertainty in the classification of individual pixels in the map, we also address the important case where land cover maps have been upscaled to a coarser grid to suit the users’ needs and are reported as proportions of land cover type. The approach is Bayesian and incorporates several layers of modelling but is straightforward to implement. First, we incorporate data in the confusion matrix derived from an independent field survey, and discuss the appropriate way to model such data. Second, we account for spatial correlation in the true land cover map, using the remotely sensed map as a prior. Third, spatial correlation in the mis-classification characteristics is induced by modelling their variance. The result is that we are able to simulate posterior means and variances for individual sites and the entire map using a simple Monte Carlo algorithm. The method is applied to the Land Cover Map 2000 for the region of England and Wales, a map used as an input into a current dynamic carbon flux model.
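The following Python sketch illustrates, in a deliberately simplified form, the Monte Carlo idea described here: sample classification probabilities from a confusion matrix, simulate "true" classes for the mapped pixels, and summarise upscaled proportions by their posterior mean and variance. It omits the paper's spatial-correlation modelling, and the confusion-matrix counts and flat Dirichlet prior are assumptions for illustration.

```python
# Simplified Monte Carlo sketch (ignoring spatial correlation) of propagating
# confusion-matrix uncertainty from a classified map to upscaled land-cover
# proportions. All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_classes = 3

# Hypothetical confusion matrix from a field survey:
# rows = mapped class, columns = true class (counts of surveyed pixels).
confusion = np.array([[80,  8,  2],
                      [10, 60, 10],
                      [ 5,  5, 70]])

# A 20x20 block of mapped pixel classes to be upscaled to one coarse cell.
mapped = rng.integers(0, n_classes, size=(20, 20))

n_sims = 200
proportions = np.empty((n_sims, n_classes))
for s in range(n_sims):
    # Dirichlet draw of P(true class | mapped class) for each mapped class
    # (flat Dirichlet(1, ..., 1) prior on each row: an assumption).
    p_true_given_mapped = np.vstack([rng.dirichlet(confusion[i] + 1)
                                     for i in range(n_classes)])
    # Simulate a "true" class for every pixel and record coarse-cell proportions.
    true = np.array([rng.choice(n_classes, p=p_true_given_mapped[c])
                     for c in mapped.ravel()])
    proportions[s] = np.bincount(true, minlength=n_classes) / true.size

post_mean = proportions.mean(axis=0)   # posterior mean proportions for the coarse cell
post_var = proportions.var(axis=0)     # posterior variances for the coarse cell
```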
Abstract:
The main uncertainty in anthropogenic forcing of the Earth’s climate stems from pollution aerosols, particularly their “indirect effect” whereby aerosols modify cloud properties. We develop a new methodology to derive a measurement-based estimate using almost exclusively information from an Earth radiation budget instrument (CERES) and a radiometer (MODIS). We derive a statistical relationship between planetary albedo and cloud properties, and, further, between the cloud properties and column aerosol concentration. Combining these relationships with a data set of satellite-derived anthropogenic aerosol fraction, we estimate an anthropogenic radiative forcing of -0.9 ± 0.4 Wm⁻² for the aerosol direct effect and of -0.2 ± 0.1 Wm⁻² for the cloud albedo effect. Because of uncertainties in both satellite data and the method, the uncertainty of this result is likely larger than the values given here, which correspond only to the quantifiable error estimates. The results nevertheless indicate that current global climate models may overestimate the cloud albedo effect.
Abstract:
Aerosols affect the Earth's energy budget directly by scattering and absorbing radiation and indirectly by acting as cloud condensation nuclei and, thereby, affecting cloud properties. However, large uncertainties exist in current estimates of aerosol forcing because of incomplete knowledge concerning the distribution and the physical and chemical properties of aerosols as well as aerosol-cloud interactions. In recent years, a great deal of effort has gone into improving measurements and datasets. It is thus feasible to shift the estimates of aerosol forcing from largely model-based to increasingly measurement-based. Our goal is to assess current observational capabilities and identify uncertainties in the aerosol direct forcing through comparisons of different methods with independent sources of uncertainties. Here we assess the aerosol optical depth (τ), direct radiative effect (DRE) by natural and anthropogenic aerosols, and direct climate forcing (DCF) by anthropogenic aerosols, focusing on satellite and ground-based measurements supplemented by global chemical transport model (CTM) simulations. The multi-spectral MODIS measures global distributions of aerosol optical depth (τ) on a daily scale, with a high accuracy of ±0.03 ± 0.05τ over ocean. The annual average τ is about 0.14 over the global ocean, of which about 21% ± 7% is contributed by human activities, as estimated by the MODIS fine-mode fraction. The multi-angle MISR derives an annual average AOD of 0.23 over global land with an uncertainty of ~20% or ±0.05. These high-accuracy aerosol products and broadband flux measurements from CERES make it feasible to obtain observational constraints for the aerosol direct effect, especially over the global ocean. A number of measurement-based approaches estimate the clear-sky DRE (on solar radiation) at the top-of-atmosphere (TOA) to be about -5.5 ± 0.2 Wm⁻² (median ± standard error from various methods) over the global ocean. Accounting for thin cirrus contamination of the satellite-derived aerosol field will reduce the TOA DRE to -5.0 Wm⁻². Because of a lack of measurements of aerosol absorption and difficulty in characterizing land surface reflection, estimates of DRE over land and at the ocean surface are currently realized through a combination of satellite retrievals, surface measurements, and model simulations, and are less constrained. Over the oceans the surface DRE is estimated to be -8.8 ± 0.7 Wm⁻². Over land, an integration of satellite retrievals and model simulations derives a DRE of -4.9 ± 0.7 Wm⁻² and -11.8 ± 1.9 Wm⁻² at the TOA and surface, respectively. CTM simulations derive a wide range of DRE estimates that on average are smaller than the measurement-based DRE by about 30-40%, even after accounting for thin cirrus and cloud contamination. A number of issues remain. Current estimates of the aerosol direct effect over land are poorly constrained. Uncertainties of DRE estimates are also larger on regional scales than on a global scale, and large discrepancies exist between different approaches. The characterization of aerosol absorption and vertical distribution remains challenging. The aerosol direct effect in the thermal infrared range and in cloudy conditions remains relatively unexplored and quite uncertain, because of a lack of global systematic aerosol vertical profile measurements. A coordinated research strategy needs to be developed for integration and assimilation of satellite measurements into models to constrain model simulations. Enhanced measurement capabilities in the next few years and high-level scientific cooperation will further advance our knowledge.
Abstract:
Sixteen years (1994–2009) of ozone profiling by ozonesondes at Valentia Meteorological and Geophysical Observatory, Ireland (51.94° N, 10.23° W), along with a co-located MkIV Brewer spectrophotometer for the period 1993–2009, are analyzed. Simple and multiple linear regression methods are used to infer the recent trend, if any, in stratospheric column ozone over the station. The decadal trend from 1994 to 2010 is also calculated from the monthly mean data of the Brewer and column ozone data derived from satellite observations. Both of these show a 1.5% increase per decade during this period, with an uncertainty of about ±0.25%. Monthly mean data for March show a much stronger trend of ~4.8% increase per decade for both ozonesonde and Brewer data. The ozone profile is divided into three vertical slots of 0–15 km, 15–26 km, and 26 km to the top of the atmosphere, and an 11-year running average is calculated. Ozone values for the month of March only are observed to increase at each level, with a maximum change of +9.2 ± 3.2% per decade (between the years 1994 and 2009) being observed in the vertical region from 15 to 26 km. In the tropospheric region from 0 to 15 km, the trend is positive but with poor statistical significance. However, for the top level above 26 km the trend is significantly positive at about 4% per decade. The March integrated ozonesonde column ozone during this period is found to increase at a rate of ~6.6% per decade, compared with the Brewer and satellite positive trends of ~5% per decade.
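As a hedged illustration of the simple-linear-regression trend estimation mentioned in this abstract, the sketch below fits a straight line to synthetic monthly-mean column ozone values and expresses the slope as a percentage change per decade; the data and numbers are invented, not the station record.

```python
# Illustrative decadal-trend calculation by simple linear regression
# (synthetic monthly means, not the Valentia record).
import numpy as np

rng = np.random.default_rng(5)
years = 1994 + np.arange(16 * 12) / 12.0                  # monthly time axis, 1994-2009
ozone_du = 330.0 + 0.5 * (years - years[0]) + rng.normal(0.0, 8.0, years.size)

slope_du_per_year, intercept = np.polyfit(years, ozone_du, 1)
trend_pct_per_decade = 10.0 * slope_du_per_year / ozone_du.mean() * 100.0

# A rough standard error of the slope from the residual scatter
# (ignores serial correlation of the monthly values).
resid = ozone_du - (slope_du_per_year * years + intercept)
se_slope = np.sqrt(resid.var(ddof=2) / np.sum((years - years.mean()) ** 2))
se_pct_per_decade = 10.0 * se_slope / ozone_du.mean() * 100.0

print(f"trend = {trend_pct_per_decade:.1f} +/- {se_pct_per_decade:.1f} % per decade")
```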
Abstract:
We present a method of simulating both the avalanche and surge components of pyroclastic flows generated by lava collapsing from a growing Pelean dome. This is used to successfully model the pyroclastic flows generated on 12 May 1996 by the Soufriere Hills volcano, Montserrat. In simulating the avalanche component we use a simple 3-fold parameterisation of flow acceleration for which we choose values using an inverse method. The surge component is simulated by a 1D hydraulic balance of sedimentation of clasts and entrainment of air away from the avalanche source. We show how multiple simulations based on uncertainty of the starting conditions and parameters, specifically location and size (mass flux), could be used to map hazard zones.