134 results for lead-time structure
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, linear inverse modelling and constructed analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which method is most successful depends on the region considered, the GCM data used and the prediction lead time. However, the constructed analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different from regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
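The constructed-analogue method named in this abstract can be sketched in a few lines: express the current anomaly state as a least-squares combination of past states in a library, then apply the same weights to the library states' known evolutions. The function name, array layout and the linear toy dynamics below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def constructed_analogue(library, evolved, state):
    """Predict the evolution of `state` by the constructed-analogue idea.

    library : (d, n) array, columns are past anomaly states
    evolved : (d, n) array, the same states a fixed lead time later
    state   : (d,)   current anomaly state whose evolution is wanted
    """
    # weights that best reconstruct the current state from the library
    w, *_ = np.linalg.lstsq(library, state, rcond=None)
    # apply the same weights to the library's known evolutions
    return evolved @ w
```

In this toy setting the "truth" evolves by a fixed linear operator, so a state lying in the library's span is predicted exactly; with real SST fields the reconstruction is only approximate and the weights are usually regularised.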
Abstract:
In a recent study, Williams introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy. In the present paper, the effects of the modification are comprehensively evaluated in the Simplified Parameterizations, Primitive Equation Dynamics (SPEEDY) atmospheric general circulation model. First, the authors search for significant changes in the monthly climatology due to the introduction of the new filter. After testing both at the local level and at the field level, no significant changes are found, which is advantageous in the sense that the new scheme does not require a retuning of the parameterized model physics. Second, the authors examine whether the new filter improves the skill of short- and medium-term forecasts. January 1982 data from the NCEP–NCAR reanalysis are used to evaluate the forecast skill. Improvements are found in all the model variables (except the relative humidity, which is hardly changed). The improvements increase with lead time and are especially evident in medium-range forecasts (96–144 h). For example, in tropical surface pressure predictions, 5-day forecasts made using the RAW filter have approximately the same skill as 4-day forecasts made using the RA filter. The results of this work are encouraging for the implementation of the RAW filter in other models currently using the RA filter.
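For readers unfamiliar with the two filters being compared, the following is a minimal sketch of a filtered leapfrog integration, using a single complex oscillation du/dt = iωu as the test problem (its true amplitude stays exactly 1). The ν/2 convention and the parameter values are assumptions for illustration; α = 1 recovers the classical RA filter, while Williams recommended α ≈ 0.53.

```python
import cmath

def leapfrog_oscillation(omega=1.0, dt=0.01, nsteps=1000, nu=0.2, alpha=0.53):
    """Integrate du/dt = i*omega*u with leapfrog plus the RAW time filter.

    alpha = 1.0 recovers the classical Robert-Asselin (RA) filter;
    alpha ~ 0.53 is the value recommended by Williams (assumed here).
    Returns the final amplitude |u|, which is 1 for the exact solution.
    """
    u_prev = 1.0 + 0.0j                  # u at t = 0
    u_curr = cmath.exp(1j * omega * dt)  # exact value at t = dt
    for _ in range(nsteps):
        u_next = u_prev + 2.0 * dt * 1j * omega * u_curr  # leapfrog step
        d = 0.5 * nu * (u_prev - 2.0 * u_curr + u_next)   # filter displacement
        u_prev = u_curr + alpha * d            # RA(W) filtering of the middle level
        u_curr = u_next + (alpha - 1.0) * d    # extra RAW nudge of the new level
    return abs(u_curr)
```

Running this with α = 0.53 versus α = 1.0 illustrates the abstract's point: the RA filter visibly damps the physical oscillation, while the RAW variant keeps the amplitude much closer to 1.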
Abstract:
A key strategy for improving the skill of quantitative predictions of precipitation, as well as of hazardous weather such as severe thunderstorms and flash floods, is to exploit observations of convective activity (e.g. from radar). In this paper, a convection-permitting ensemble prediction system (EPS) aimed at addressing the problems of forecasting localized weather events with relatively short predictability time scales, based on a 1.5 km grid-length version of the Met Office Unified Model, is presented. Particular attention is given to the impact of using predicted observations of radar-derived precipitation intensity in the ensemble transform Kalman filter (ETKF) used within the EPS. Our initial results, based on a 24-member ensemble of forecasts for two summer case studies, show that the convective-scale EPS produces fairly reliable forecasts of temperature, horizontal winds and relative humidity at 1 h lead time, as evident from inspection of rank histograms. On the other hand, the rank histograms also suggest that the EPS generates too much spread for forecasts of (i) surface pressure and (ii) surface precipitation intensity. This may indicate that, for (i), the surface pressure observation error standard deviation used to generate the rank histograms is too large, and that (ii) may result from non-Gaussian precipitation observation errors. However, further investigations are needed to better understand these findings. Finally, the inclusion of predicted observations of precipitation from radar in the 24-member EPS considered in this paper does not seem to improve the 1 h lead-time forecast skill.
Abstract:
The consistency of ensemble forecasts from three global medium-range prediction systems with the observed transition behaviour of a three-cluster model of the North Atlantic eddy-driven jet is examined. The three clusters consist of a mid jet cluster taken to represent an undisturbed jet and south and north jet clusters representing southward and northward shifts of the jet. The ensemble forecasts span a period of three extended winters (October–February) from October 2007–February 2010. The mean probabilities of transitions between the clusters calculated from the ensemble forecasts are compared with those calculated from a 23-extended-winter climatology taken from the European Centre for Medium-Range Weather Forecasts 40-Year Re-analysis (ERA40) dataset. No evidence of a drift with increasing lead time of the ensemble forecast transition probabilities towards values inconsistent with the 23-extended-winter climatology is found. The ensemble forecasts of transition probabilities are found to have positive Brier skill at 15-day lead times. It is found that for the three-extended-winter forecast set, probabilistic forecasts initialized in the north jet cluster are generally less skilful than those initialized in the other clusters. This is consistent with the shorter persistence time-scale of the north jet cluster observed in the ERA40 23-extended-winter climatology. Copyright © 2011 Royal Meteorological Society
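The climatological transition probabilities discussed here amount to an empirical first-order Markov estimate over the three cluster labels. A minimal sketch (the function name and the integer label encoding are illustrative assumptions):

```python
def transition_matrix(labels, n_states):
    """Empirical transition probabilities from a sequence of cluster labels.

    labels   : sequence of integer cluster labels (0 .. n_states-1)
    n_states : number of clusters (three for the jet model above)
    Returns a row-stochastic matrix P with P[i][j] = Pr(next=j | current=i).
    """
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(labels, labels[1:]):  # consecutive pairs
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs
```

Applied to a winter's worth of daily cluster labels, each row of the result gives the observed probabilities of remaining in, or leaving, that jet regime; forecast transition probabilities can then be verified against these with a Brier score.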
Abstract:
The initial condition effect on climate prediction skill over a 2-year hindcast time-scale has been assessed from ensemble HadCM3 climate model runs using anomaly initialization over the period 1990–2001, and making comparisons with runs without initialization (equivalent to climatological conditions), and to anomaly persistence. It is shown that the assimilation improves the prediction skill in the first year globally, and in a number of limited areas out into the second year. Skill in hindcasting surface air temperature anomalies is most marked over ocean areas, and is coincident with areas of high sea surface temperature and ocean heat content skill. Skill improvement over land areas is much more limited but is still detectable in some cases. We found little difference in the skill of hindcasts using three different sets of ocean initial conditions, and we obtained the best results by combining these to form a grand ensemble hindcast set. Results are also compared with the idealized predictability studies of Collins (Clim. Dynam. 2002; 19: 671–692), which used the same model. The maximum lead time for which initialization gives enhanced skill over runs without initialization varies in different regions but is very similar to lead times found in the idealized studies, therefore strongly supporting the process representation in the model as well as its use for operational predictions. The limited 12-year period of the study, however, means that the regional details of model skill should probably be further assessed under a wider range of observational conditions.
Abstract:
Aim: A nested case-control discovery study was undertaken to test whether information within the serum peptidome can improve on the utility of CA125 for early ovarian cancer detection. Materials and Methods: High-throughput matrix-assisted laser desorption ionisation mass spectrometry (MALDI-MS) was used to profile 295 serum samples from women pre-dating their ovarian cancer diagnosis and from 585 matched control samples. Classification rules incorporating CA125 and MS peak intensities were tested for discriminating ability. Results: Two peaks were found which in combination with CA125 discriminated cases from controls up to 15 and 11 months before diagnosis, respectively, and earlier than using CA125 alone. One peak was identified as connective tissue-activating peptide III (CTAPIII), whilst the other was putatively identified as platelet factor 4 (PF4). ELISA data supported the down-regulation of PF4 in early cancer cases. Conclusion: Serum peptide information with CA125 improves lead time for early detection of ovarian cancer. The candidate markers are platelet-derived chemokines, suggesting a link between platelet function and tumour development.
Abstract:
Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM.
DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years. However, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
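The trend-with-AR1 benchmark described in this abstract can be sketched for a single grid box: fit a linear regression of SST on the equivalent CO2 series for the forced trend, then carry the last residual forward with the estimated lag-1 autocorrelation. The function name and the use of `np.polyfit` are illustrative assumptions rather than the paper's code.

```python
import numpy as np

def trend_ar1_forecast(sst, co2, co2_future, lead):
    """Benchmark statistical forecast for one grid box (sketch).

    sst, co2   : historical annual series (same length)
    co2_future : equivalent CO2 value at the forecast year
    lead       : forecast lead time in years
    """
    a, b = np.polyfit(co2, sst, 1)            # forced trend: sst ~ a*co2 + b
    resid = sst - (a * co2 + b)               # internal variability
    phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # lag-1 autocorrelation
    # AR1 forecast of the residual decays towards zero as phi**lead
    return a * co2_future + b + (phi ** lead) * resid[-1]
```

The CA variant of the benchmark replaces the last term with a constructed-analogue forecast of the residual field; the trend component is identical.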
Abstract:
The mechanisms involved in Atlantic meridional overturning circulation (AMOC) decadal variability and predictability over the last 50 years are analysed in the IPSL–CM5A–LR model using historical and initialised simulations. The initialisation procedure only uses nudging towards sea surface temperature anomalies with a physically based restoring coefficient. When compared to two independent AMOC reconstructions, both the historical and nudged ensemble simulations exhibit skill at reproducing AMOC variations from 1977 onwards, and in particular two maxima occurring respectively around 1978 and 1997. We argue that one source of skill is related to the large Mount Agung volcanic eruption starting in 1963, which reset an internal 20-year variability cycle in the North Atlantic in the model. This cycle involves the East Greenland Current intensity, and advection of active tracers along the subpolar gyre, which leads to an AMOC maximum around 15 years after the Mount Agung eruption. The 1997 maximum occurs approximately 20 years after the former one. The nudged simulations better reproduce this second maximum than the historical simulations. This is due to the initialisation of a cooling of the convection sites in the 1980s under the effect of a persistent North Atlantic oscillation (NAO) positive phase, a feature not captured in the historical simulations. Hence we argue that the 20-year cycle excited by the 1963 Mount Agung eruption together with the NAO forcing both contributed to the 1990s AMOC maximum. These results support the existence of a 20-year cycle in the North Atlantic in the observations. Hindcasts following the CMIP5 protocol are launched from a nudged simulation every 5 years for the 1960–2005 period. They exhibit significant correlation skill compared with an independent reconstruction of the AMOC for 4-year lead-time averages.
This encouraging result is accompanied by increased correlation skill in reproducing the observed 2-m air temperature in the regions bordering the North Atlantic as compared to non-initialized simulations. To a lesser extent, predicted precipitation tends to correlate with the nudged simulation in the tropical Atlantic. We argue that this skill is due to the initialisation and predictability of the AMOC in the present prediction system. The mechanisms evidenced here support the idea of volcanic eruptions as a pacemaker for internal variability of the AMOC. Together with the existence of a 20-year cycle in the North Atlantic, these mechanisms offer a novel and complementary explanation for the AMOC variations over the last 50 years.
Abstract:
Drought is a global problem that has far-reaching impacts, especially on vulnerable populations in developing regions. This paper highlights the need for a Global Drought Early Warning System (GDEWS), the elements that constitute its underlying framework (GDEWF) and the recent progress made towards its development. Many countries lack drought monitoring systems, as well as the capacity to respond via appropriate political, institutional and technological frameworks, and these have inhibited the development of integrated drought management plans or early warning systems. The GDEWS will provide a source of drought tools and products via the GDEWF for countries and regions to develop tailored drought early warning systems for their own users. A key goal of a GDEWS is to maximize the lead time for early warning, allowing drought managers and disaster coordinators more time to put mitigation measures in place to reduce the vulnerability to drought. To address this, the GDEWF will take both a top-down approach to provide global real-time drought monitoring and seasonal forecasting, and a bottom-up approach that builds upon existing national and regional systems to provide continental to global coverage. A number of challenges must be overcome, however, before a GDEWS can become a reality, including the lack of in-situ measurement networks and modest seasonal forecast skill in many regions, and the lack of infrastructure to translate data into usable information. A set of international partners, through a series of recent workshops and evolving collaborations, has made progress towards meeting these challenges and developing a global system.
Abstract:
A fingerprint method for detecting anthropogenic climate change is applied to new simulations with a coupled ocean-atmosphere general circulation model (CGCM) forced by increasing concentrations of greenhouse gases and aerosols covering the years 1880 to 2050. In addition to the anthropogenic climate change signal, the space-time structure of the natural climate variability for near-surface temperatures is estimated from instrumental data over the last 134 years and two 1000 year simulations with CGCMs. The estimates are compared with paleoclimate data over 570 years. The space-time information on both the signal and the noise is used to maximize the signal-to-noise ratio of a detection variable obtained by applying an optimal filter (fingerprint) to the observed data. The inclusion of aerosols slows the predicted future warming. The probability that the observed increase in near-surface temperatures in recent decades is of natural origin is estimated to be less than 5%. However, this number is dependent on the estimated natural variability level, which is still subject to some uncertainty.
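In its simplest textbook form, the optimal-fingerprint construction described above weights the signal pattern by the inverse noise covariance before projecting the observations onto it, down-weighting directions of strong natural variability. A minimal sketch under that simplification (the scaling convention and function name are assumptions):

```python
import numpy as np

def optimal_fingerprint(signal, noise_cov, obs):
    """Optimal-fingerprint detection variable (textbook sketch).

    signal    : (d,) anthropogenic signal pattern from the CGCM
    noise_cov : (d, d) covariance of natural variability (the noise)
    obs       : (d,) observed anomaly pattern
    Returns the signal amplitude estimated in the observations.
    """
    f = np.linalg.solve(noise_cov, signal)  # fingerprint f = C^-1 g
    # normalise so that obs == signal gives amplitude 1
    return (f @ obs) / (f @ signal)
```

The detection question then becomes whether this amplitude exceeds the range expected from natural variability alone, which is where the paper's less-than-5% probability estimate comes from.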
Abstract:
Wine production is largely governed by atmospheric conditions, such as air temperature and precipitation, together with soil management and viticultural/enological practices. Therefore, anthropogenic climate change is likely to have important impacts on the winemaking sector worldwide. An important winemaking region is the Portuguese Douro Valley, known for its world-famous Port Wine. The identification of robust relationships between atmospheric factors and wine parameters is of great relevance for the region. A multivariate linear regression analysis of a long wine production series (1932–2010) reveals that high rainfall and cool temperatures during budburst, shoot and inflorescence development (February–March) and warm temperatures during flowering and berry development (May) are generally favourable to high production. The probabilities of occurrence of three production categories (low, normal and high) are also modelled using multinomial logistic regression. Results show that both statistical models are valuable tools for predicting the production in a given year with a lead time of 3–4 months prior to harvest. These statistical models are applied to an ensemble of 16 regional climate model experiments following the SRES A1B scenario to estimate possible future changes. Wine production is projected to increase by about 10 % by the end of the 21st century, while the occurrence of high production years is expected to increase from 25 % to over 60 %. Nevertheless, further model development will be needed to include other aspects that may shape production in the future. In particular, the rising heat stress and/or changes in ripening conditions could limit the projected production increase in future decades.
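The multinomial (three-category) logistic model for the low/normal/high production classes can be sketched as softmax regression fitted by gradient descent on the cross-entropy loss. The training loop below is a generic illustration with made-up hyperparameters, not the authors' implementation:

```python
import numpy as np

def fit_softmax(X, y, n_classes, lr=0.5, epochs=3000):
    """Multinomial logistic (softmax) regression via gradient descent.

    X : (n, p) predictor matrix (e.g. seasonal rainfall and temperature)
    y : (n,) integer class labels (e.g. 0=low, 1=normal, 2=high production)
    """
    Xb = np.hstack([np.ones((len(X), 1)), X])  # prepend intercept column
    W = np.zeros((Xb.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                   # one-hot targets
    for _ in range(epochs):
        Z = Xb @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)      # class probabilities
        W -= lr * Xb.T @ (P - Y) / len(Xb)     # cross-entropy gradient step
    return W

def predict(W, X):
    """Most probable class for each row of X."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return np.argmax(Xb @ W, axis=1)
```

The fitted model outputs a probability for each of the three categories, which is exactly the form needed for the 3–4 month lead-time categorical production forecasts discussed above.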
Abstract:
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
Abstract:
In this paper the properties of a hydro-meteorological forecasting system for forecasting river flows have been analysed using a probabilistic forecast convergence score (FCS). The focus on fixed event forecasts provides a forecaster's approach to system behaviour and adds an important perspective to the suite of forecast verification tools commonly used in this field. A low FCS indicates a more consistent forecast. The annual maximum of the FCS has decreased over the last 10 years. With lead time, the FCS of the ensemble forecast decreases, whereas that of the control and high-resolution forecasts increases. The FCS is influenced by the lead time, threshold, and catchment size and location. This indicates that one should use seasonality-based decision rules to issue flood warnings.
Abstract:
We present a case study using the TIGGE database for flood warning in the Upper Huai catchment (ca. 30 672 km²). TIGGE ensemble forecasts from 6 meteorological centres with 10-day lead time were extracted and disaggregated to drive the Xinanjiang model to forecast discharges for flood events in July–September 2008. The results demonstrated satisfactory flood-forecasting skill, with clear signals of floods up to 10 days in advance. The forecasts occasionally show discrepancies in both time and space. Forecast quality could potentially be improved by using temporal and spatial corrections of the forecast precipitation.
Abstract:
In projections of twenty-first century climate, Arctic sea ice declines and at the same time exhibits strong interannual anomalies. Here, we investigate the potential to predict these strong sea-ice anomalies under a perfect-model assumption, using the Max-Planck-Institute Earth System Model in the same setup as in the Coupled Model Intercomparison Project Phase 5 (CMIP5). We study two cases of strong negative sea-ice anomalies: a 5-year-long anomaly for present-day conditions, and a 10-year-long anomaly for conditions projected for the middle of the twenty-first century. We treat these anomalies in the CMIP5 projections as the truth, and use exactly the same model configuration for predictions of this synthetic truth. We start ensemble predictions at different times during the anomalies, considering lagged-perfect and sea-ice-assimilated initial conditions. We find that the onset and amplitude of the interannual anomalies are not predictable. However, the further deepening of the anomaly can typically be predicted at a lead time of 1 year if predictions start after the onset but before the maximal amplitude of the anomaly. The magnitude of an extremely low summer sea-ice minimum is hard to predict: the skill of the prediction ensemble is not better than a damped-persistence forecast for lead times of more than a few months, and is not better than a climatology forecast for lead times of two or more years. Predictions of the present-day anomaly are more skillful than predictions of the mid-century anomaly. Predictions using sea-ice-assimilated initial conditions are competitive with those using lagged-perfect initial conditions for lead times of a year or less, but yield degraded skill for longer lead times. The results presented here suggest that there is limited prospect of predicting the large interannual sea-ice anomalies expected to occur throughout the twenty-first century.