993 results for minimum surface air temperature
Abstract:
Aeolian dust modelling has improved significantly over the last ten years and many institutions now consistently model dust uplift, transport and deposition in general circulation models (GCMs). However, the representation of dust in GCMs is highly variable between modelling communities due to differences in the uplift schemes employed and the representation of the global circulation that subsequently leads to dust deflation. In this study two different uplift schemes are incorporated in the same GCM. This approach enables a clearer comparison of the dust uplift schemes themselves, without the added complexity of several different transport and deposition models. The global annual mean dust aerosol optical depths (at 550 nm) using two different dust uplift schemes were found to be 0.014 and 0.023—both lying within the estimates from the AeroCom project. However, the models also have appreciably different representations of the dust size distribution adjacent to the West African coast and very different deposition at various sites throughout the globe. The different dust uplift schemes were also capable of influencing the modelled circulation, surface air temperature, and precipitation despite the use of prescribed sea surface temperatures. This has important implications for the use of dust models in AMIP-style (Atmospheric Modelling Intercomparison Project) simulations and Earth-system modelling.
Abstract:
In the Coupled Model Intercomparison Project Phase 5 (CMIP5), the model-mean increase in global mean surface air temperature T under the 1pctCO2 scenario (atmospheric CO2 increasing at 1% yr−1) during the second doubling of CO2 is 40% larger than the transient climate response (TCR), i.e. the increase in T during the first doubling. We identify four possible contributory effects. First, the surface climate system loses heat less readily into the ocean beneath as the latter warms. The model spread in the thermal coupling between the upper and deep ocean largely explains the model spread in ocean heat uptake efficiency. Second, CO2 radiative forcing may rise more rapidly than logarithmically with CO2 concentration. Third, the climate feedback parameter may decline as the CO2 concentration rises. With CMIP5 data, we cannot distinguish the second and third possibilities. Fourth, the climate feedback parameter declines as time passes or T rises; in 1pctCO2, this effect is less important than the others. We find that T projected for the end of the twenty-first century correlates more highly with T at the time of quadrupled CO2 in 1pctCO2 than with the TCR, and we suggest that the TCR may be underestimated from observed climate change.
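A minimal sketch, on a purely synthetic series, of how the first- and second-doubling warming quoted above is commonly diagnosed from a 1pctCO2 run: 20-year means centred on the doubling times (roughly years 70 and 140 for 1% yr−1 growth). The window choice and the synthetic data are illustrative assumptions, not the authors' exact procedure (real 1pctCO2 runs are often only 140 years long, so the second window is frequently taken as the final 20 years instead).

import numpy as np

def doubling_warming(T_anom, doubling_year, window=20):
    """Mean warming over a `window`-year period centred on `doubling_year`.

    T_anom: annual global-mean surface air temperature anomaly (K), relative
    to a pre-industrial control, for years 1..N of a 1% yr^-1 CO2 increase
    (1pctCO2) run.
    """
    half = window // 2
    return T_anom[doubling_year - half:doubling_year + half].mean()

# Purely synthetic 150-year stand-in for a 1pctCO2 run; CO2 doubles near
# year 70 and quadruples near year 140.
rng = np.random.default_rng(0)
years = np.arange(1, 151)
T_1pct = 0.025 * years + 0.1 * rng.standard_normal(years.size)

tcr = doubling_warming(T_1pct, 70)            # warming at the first doubling
second = doubling_warming(T_1pct, 140) - tcr  # additional warming at the second doubling
print(f"TCR ~ {tcr:.2f} K, second-doubling warming ~ {second:.2f} K, ratio {second / tcr:.2f}")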
Abstract:
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
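A minimal sketch, on assumed synthetic data, of the kind of multiple linear regression such an empirical system uses: a trend-like CO2-equivalent predictor plus one large-scale mode index regressed on a seasonal-mean predictand, with the hindcast residual spread reused to issue a Gaussian probabilistic forecast. The two-predictor choice, variable names and numbers are illustrative, not the paper's configuration.

import numpy as np

rng = np.random.default_rng(1)
n_years = 53                                  # e.g. a 1961-2013 hindcast period
co2eq = np.linspace(330.0, 480.0, n_years)    # hypothetical CO2-equivalent predictor (ppm)
enso = rng.standard_normal(n_years)           # hypothetical ENSO index for the preceding season
y = 0.01 * (co2eq - co2eq.mean()) + 0.4 * enso + 0.3 * rng.standard_normal(n_years)

# Ordinary least squares with an intercept: y ~ b0 + b1*co2eq + b2*enso
X = np.column_stack([np.ones(n_years), co2eq, enso])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_sd = np.std(y - X @ beta, ddof=X.shape[1])

# Probabilistic forecast for a new season: a Gaussian centred on the regression
# prediction, with spread taken from the hindcast residuals.
x_new = np.array([1.0, 485.0, 1.2])
mean, sd = x_new @ beta, resid_sd
print(f"forecast ~ N({mean:.2f}, {sd:.2f}^2)")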
Abstract:
The atmospheric response to an idealized decline in Arctic sea ice is investigated in a novel fully coupled climate model experiment. In this experiment two ensembles of single-year model integrations are performed starting on 1 April, the approximate start of the ice melt season. Perturbing the initial conditions of sea ice thickness (SIT) induces declines in both sea ice concentration and SIT, resulting in sea ice distributions similar to the recent sea ice minima of 2007 and 2012. In the ice loss regions there are strong (~3 K) local increases in sea surface temperature (SST); additionally, there are remote increases in SST in the central North Pacific and subpolar gyre in the North Atlantic. Over the central Arctic there are increases in surface air temperature (SAT) of ~8 K due to increases in ocean–atmosphere heat fluxes. There are increases in SAT over continental North America that are in good agreement with recent changes as seen in reanalysis data. It is estimated that up to two-thirds of the observed increase in SAT in this region could be related to Arctic sea ice loss. In early summer there is a significant but weak atmospheric circulation response that projects onto the summer North Atlantic Oscillation (NAO). In early summer and early autumn there is an equatorward shift of the eddy-driven jet over the North Atlantic as a result of a reduction in the meridional temperature gradients. In winter there is no projection onto a particular phase of the NAO.
Abstract:
In both the observational record and atmosphere-ocean general circulation model (AOGCM) simulations of the last ∼150 years, short-lived negative radiative forcing due to volcanic aerosol, following explosive eruptions, causes sudden global-mean cooling of up to ∼0.3 K. This is about five times smaller than expected from the transient climate response parameter (TCRP, K of global-mean surface air temperature change per W m−2 of radiative forcing increase) evaluated under atmospheric CO2 concentration increasing at 1 % yr−1. Using the step model (Good et al. in Geophys Res Lett 38:L01703, 2011. doi:10.1029/2010GL045208), we confirm the previous finding (Held et al. in J Clim 23:2418–2427, 2010. doi:10.1175/2009JCLI3466.1) that the main reason for the discrepancy is the damping of the response to short-lived forcing by the thermal inertia of the upper ocean. Although the step model includes this effect, it still overestimates the volcanic cooling simulated by AOGCMs by about 60 %. We show that this remaining discrepancy can be explained by the magnitude of the volcanic forcing, which may be smaller in AOGCMs (by 30 % for the HadCM3 AOGCM) than in off-line calculations that do not account for rapid cloud adjustment, and the climate sensitivity parameter, which may be smaller than for increasing CO2 (40 % smaller than for 4 × CO2 in HadCM3).
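A minimal sketch of the step-model idea referred to above: the response to an arbitrary forcing series is emulated as a superposition of scaled, time-shifted copies of the response to a single step forcing, which is why a short-lived volcanic pulse is strongly damped by upper-ocean thermal inertia. The step response, forcing magnitude and time constant below are assumed placeholders, not diagnostics from HadCM3 or the cited papers.

import numpy as np

def step_model_response(forcing, T_step, F_step):
    """Emulate the temperature response to an arbitrary annual forcing series
    (W m^-2) by summing scaled, time-shifted copies of T_step, the annual
    global-mean temperature response (K) to an abrupt step forcing of
    magnitude F_step (W m^-2), e.g. from an abrupt-CO2 experiment."""
    forcing = np.asarray(forcing, dtype=float)
    increments = np.diff(np.concatenate(([0.0], forcing)))  # yearly forcing changes
    T = np.zeros(forcing.size)
    for t, dF in enumerate(increments):
        n = forcing.size - t
        T[t:] += (dF / F_step) * T_step[:n]
    return T

# Illustrative use: a 2-year volcanic forcing pulse produces far less cooling
# than a sustained forcing of the same magnitude, because the step response
# takes decades to approach its full amplitude.
years = np.arange(300)
T_step = 3.0 * (1.0 - np.exp(-years / 30.0))      # assumed step response (K)
volcanic = np.zeros(300)
volcanic[10:12] = -3.0                            # assumed pulse forcing (W m^-2)
print(f"peak cooling ~ {step_model_response(volcanic, T_step, F_step=3.7).min():.2f} K")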
Abstract:
This paper describes the development and basic evaluation of decadal predictions produced using the HiGEM coupled climate model. HiGEM is a higher resolution version of the HadGEM1 Met Office Unified Model. The horizontal resolution in HiGEM has been increased to 1.25° × 0.83° in longitude and latitude for the atmosphere, and 1/3° × 1/3° globally for the ocean. The HiGEM decadal predictions are initialised using an anomaly assimilation scheme that relaxes anomalies of ocean temperature and salinity to observed anomalies. 10-year hindcasts are produced for 10 start dates (1960, 1965,..., 2000, 2005). To determine the relative contributions to prediction skill from initial conditions and external forcing, the HiGEM decadal predictions are compared to uninitialised HiGEM transient experiments. The HiGEM decadal predictions have substantial skill for predictions of annual mean surface air temperature and upper 100 m ocean temperature. For lead times up to 10 years, anomaly correlations (ACC) over large areas of the North Atlantic Ocean, the Western Pacific Ocean and the Indian Ocean exceed values of 0.6. Initialisation of the HiGEM decadal predictions significantly increases skill over regions of the Atlantic Ocean, the Maritime Continent and regions of the subtropical North and South Pacific Ocean. In particular, HiGEM produces skilful predictions of the North Atlantic subpolar gyre for lead times of up to 4 years (with ACC > 0.7), significantly outperforming the uninitialised HiGEM transient experiments.
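A minimal sketch of the anomaly correlation (ACC) skill measure referred to above, computed here for hypothetical hindcast and observed anomaly series at a single location and lead time; the centring, ensemble averaging and significance testing used for HiGEM follow the paper, not this toy example.

import numpy as np

def anomaly_correlation(hindcast_anom, obs_anom):
    """Pearson correlation between hindcast and observed anomalies
    (each series centred on its own mean before correlating)."""
    h = hindcast_anom - hindcast_anom.mean()
    o = obs_anom - obs_anom.mean()
    return (h * o).sum() / np.sqrt((h ** 2).sum() * (o ** 2).sum())

# Hypothetical ensemble-mean hindcast anomalies for 10 start dates at one lead time.
rng = np.random.default_rng(2)
obs = rng.standard_normal(10)
hindcast = 0.8 * obs + 0.4 * rng.standard_normal(10)
print(f"ACC = {anomaly_correlation(hindcast, obs):.2f}")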
Abstract:
As the climate warms, heat waves (HW) are projected to be more intense and to last longer, with serious implications for public health. Urban residents face higher health risks because urban heat islands (UHIs) exacerbate HW conditions. One strategy to mitigate negative impacts of urban thermal stress is the installation of green roofs (GRs) given their evaporative cooling effect. However, the effectiveness of GRs and the mechanisms by which they have an effect at the scale of entire cities are still largely unknown. The Greater Beijing Region (GBR) is modeled for a HW scenario with the Weather Research and Forecasting (WRF) model coupled with a state-of-the-art urban canopy model (PUCM) to examine the effectiveness of GRs. The results suggest GRs would decrease near-surface air temperature (ΔT2max = 2.5 K) and wind speed (ΔUV10max = 1.0 m s−1) but increase atmospheric humidity (ΔQ2max = 1.3 g kg−1). GRs are simulated to lessen the overall thermal stress as indicated by apparent temperature (ΔAT2max = 1.7 °C). The modifications by GRs scale almost linearly with the fraction of the surface they cover. Investigation of the surface-atmosphere interactions indicates that GRs with plentiful soil moisture dissipate more of the surface energy as latent heat flux and subsequently inhibit the development of the daytime planetary boundary layer (PBL). This reduces atmospheric heating through entrainment at the PBL top. Additionally, urban GRs modify regional circulation regimes, leading to decreased advective heating under HW conditions.
Abstract:
Regional climate change projections for the last half of the twenty-first century have been produced for South America, as part of the CREAS (Cenarios REgionalizados de Clima Futuro da America do Sul) regional project. Three regional climate models (RCMs: Eta CCS, RegCM3 and HadRM3P) were nested within the HadAM3P global model. The simulations cover a 30-year period representing present climate (1961-1990) and projections for the IPCC A2 high emission scenario for 2071-2100. The focus was on the changes in the mean circulation and surface variables, in particular, surface air temperature and precipitation. There is a consistent pattern of changes in circulation, rainfall and temperatures as depicted by the three models. The HadRM3P shows intensification and a more southward position of the subtropical Pacific high, while a pattern of intensification/weakening during summer/winter is projected by the Eta CCS/RegCM3. There is a tendency for a weakening of the subtropical westerly jet from the Eta CCS and HadRM3P, consistent with other studies. There are indications that regions such as Northeast Brazil and central-eastern and southern Amazonia may experience rainfall deficiency in the future, while the Northwest coast of Peru-Ecuador and northern Argentina may experience rainfall excesses in a warmer future, and these changes may vary with the seasons. The three models show stronger warming in the A2 scenario in the tropical region, especially in the 5°N–15°S band, both in summer and especially in winter, reaching up to 6–8 °C warmer than in the present. In southern South America, the warming in summer varies between 2 and 4 °C and in winter between 3 and 5 °C in the same region across the three models. These changes are consistent with changes in low level circulation from the models, and they are comparable with changes in rainfall and temperature extremes reported elsewhere. In summary, some aspects of projected future climate change are quite robust across this set of model runs for some regions, such as the Northwest coast of Peru-Ecuador, northern Argentina, Eastern Amazonia and Northeast Brazil, whereas for other regions, such as the Pantanal region of West Central Brazil and southeastern Brazil, they are less robust.
Abstract:
The variability of the Atlantic meridional overturning circulation (AMOC) strength is investigated in control experiments and in transient simulations of up to the last millennium using the low-resolution Community Climate System Model version 3. In the transient simulations the AMOC exhibits enhanced low-frequency variability that is mainly caused by infrequent transitions between two semi-stable circulation states, which amount to a 10 percent change of the maximum overturning. One transition is also found in a control experiment, but the time-varying external forcing significantly increases the probability of the occurrence of such events, even though it does not have a direct, linear impact on the AMOC. The transition from a high to a low AMOC state starts with a reduction of the convection in the Labrador and Irminger Seas and goes along with a changed barotropic circulation of both gyres in the North Atlantic and a gradual strengthening of the convection in the Greenland-Iceland-Norwegian (GIN) Seas. In contrast, the transition from a weak to a strong overturning is induced by decreased mixing in the GIN Seas. As a consequence of the transition, regional sea surface temperature (SST) anomalies are found in the midlatitude North Atlantic and in the convection regions with an amplitude of up to 3 K. The atmospheric response to the SST forcing associated with the transition indicates a significant impact on the Scandinavian surface air temperature (SAT) on the order of 1 K. Thus, the changes of the ocean circulation make a major contribution to the Scandinavian SAT variability in the last millennium.
Abstract:
This paper presents a comparison of principal component (PC) regression and regularized expectation maximization (RegEM) to reconstruct European summer and winter surface air temperature over the past millennium. Reconstruction is performed within a surrogate climate using the National Center for Atmospheric Research (NCAR) Climate System Model (CSM) 1.4 and the climate model ECHO-G 4, assuming different white and red noise scenarios to define the distortion of pseudoproxy series. We show how sensitivity tests lead to valuable “a priori” information that provides a basis for improving real-world proxy reconstructions. Our results emphasize the need to carefully test and evaluate reconstruction techniques with respect to the temporal resolution and the spatial scale they are applied to. Furthermore, we demonstrate that uncertainties inherent to the predictand and predictor data have to be more rigorously taken into account. The comparison of the two statistical techniques, in the specific experimental setting presented here, indicates that more skilful results are achieved with RegEM as low-frequency variability is better preserved. We further detect seasonal differences in reconstruction skill at the continental scale; for example, the target temperature average is more adequately reconstructed for summer than for winter. For the specific predictor network given in this paper, both techniques underestimate the target temperature variations to an increasing extent as more noise is added to the signal, albeit less so with RegEM than with PC regression. We conclude that climate field reconstruction techniques can be improved and need to be further optimized in future applications.
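A minimal sketch of the pseudoproxy construction implied above: model temperature series are degraded with white or red (AR(1)) noise at a chosen signal-to-noise ratio before being passed to a reconstruction method. The SNR convention (ratio of standard deviations), the AR(1) coefficient and the noise levels are illustrative assumptions, not the experiment's settings.

import numpy as np

def make_pseudoproxy(temp, snr=0.5, red_noise_phi=0.0, seed=0):
    """Degrade a model temperature series into a pseudoproxy.

    snr           : signal-to-noise ratio, defined here as std(signal)/std(noise)
    red_noise_phi : AR(1) coefficient; 0 gives white noise, >0 gives red noise
    """
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(temp.size)
    noise = np.empty_like(eps)
    noise[0] = eps[0]
    for t in range(1, eps.size):
        noise[t] = red_noise_phi * noise[t - 1] + eps[t]
    noise *= temp.std() / (snr * noise.std())   # rescale noise to the requested SNR
    return temp + noise

# Example: a white-noise and a red-noise pseudoproxy from the same grid-point series.
temp = np.sin(np.linspace(0, 20, 1000)) + 0.2 * np.random.default_rng(5).standard_normal(1000)
white = make_pseudoproxy(temp, snr=0.5)
red = make_pseudoproxy(temp, snr=0.5, red_noise_phi=0.7)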
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
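A small sketch of the additivity check described above: summing the surface air temperature responses of the single-forcing simulations and comparing them against the all-forcings run. The arrays here are placeholder synthetic data (constructed to be nearly additive) standing in for the corresponding EMIC output; only the comparison itself is the point.

import numpy as np

# Hypothetical annual global-mean SAT anomalies (K): one row per single-forcing
# historical simulation, plus the standard all-forcings run, for 850-2005 CE.
rng = np.random.default_rng(3)
single_forcing_runs = 0.1 * rng.standard_normal((7, 1156))
all_forcings_run = single_forcing_runs.sum(axis=0) + 0.02 * rng.standard_normal(1156)

linear_sum = single_forcing_runs.sum(axis=0)
rmse = np.sqrt(np.mean((all_forcings_run - linear_sum) ** 2))
print(f"RMSE of linear-sum reconstruction: {rmse:.3f} K")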
Abstract:
This paper summarizes the results of an intercomparison project with Earth System Models of Intermediate Complexity (EMICs) undertaken in support of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). The focus is on long-term climate projections designed to 1) quantify the climate change commitment of different radiative forcing trajectories and 2) explore the extent to which climate change is reversible on human time scales. All commitment simulations follow the four representative concentration pathways (RCPs) and their extensions to year 2300. Most EMICs simulate substantial surface air temperature and thermosteric sea level rise commitment following stabilization of the atmospheric composition at year-2300 levels. The meridional overturning circulation (MOC) is weakened temporarily and recovers to near-preindustrial values in most models for RCPs 2.6-6.0. The MOC weakening is more persistent for RCP8.5. Elimination of anthropogenic CO2 emissions after 2300 results in slowly decreasing atmospheric CO2 concentrations. At year 3000 atmospheric CO2 is still at more than half its year-2300 level in all EMICs for RCPs 4.5-8.5. Surface air temperature remains constant or decreases slightly and thermosteric sea level rise continues for centuries after elimination of CO2 emissions in all EMICs. Restoration of atmospheric CO2 from RCP to preindustrial levels over 100-1000 years requires large artificial removal of CO2 from the atmosphere and does not result in the simultaneous return to preindustrial climate conditions, as surface air temperature and sea level response exhibit a substantial time lag relative to atmospheric CO2.
Abstract:
The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase of 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential, given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10−15 yr W m−2 per kg CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10−15 yr W m−2 per kg CO2. Estimates for the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions than for present day, and lower for smaller pulses than for larger ones. In contrast, the response in temperature, sea level and ocean heat content is less sensitive to these choices. Although choices of pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2 and the GWP is the time horizon.
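A minimal sketch of the AGWP construction described above: the time-integrated airborne fraction of a CO2 pulse over the chosen horizon, multiplied by the radiative efficiency of CO2. The impulse-response coefficients and the radiative efficiency below are illustrative placeholders, not the multi-model fit reported in the abstract.

import numpy as np

# Illustrative impulse-response form for the airborne fraction of a CO2 pulse:
# a constant term plus three decaying exponentials (placeholder coefficients).
a = np.array([0.22, 0.22, 0.28, 0.28])        # fractions (assumed), summing to 1
tau = np.array([np.inf, 400.0, 37.0, 4.3])    # e-folding times in years (assumed)

horizon = 100.0                               # yr, the GWP time horizon
# Time-integrated airborne fraction over the horizon, from the analytic integral
# of the sum of exponentials: a0*H + sum_i a_i * tau_i * (1 - exp(-H / tau_i)).
iirf = a[0] * horizon + np.sum(a[1:] * tau[1:] * (1.0 - np.exp(-horizon / tau[1:])))

radiative_efficiency = 1.7e-15                # W m^-2 per kg of atmospheric CO2 (assumed)
agwp = iirf * radiative_efficiency            # yr W m^-2 per kg CO2
print(f"AGWP_100(CO2) ~ {agwp:.2e} yr W m^-2 per kg CO2")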
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
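A minimal sketch of the benchmarking idea described above: a synthetic "clean" monthly temperature series is perturbed with a known step-change inhomogeneity, and any homogenisation algorithm's output can then be scored against the known truth. The break size, break date and scoring metric here are arbitrary illustrations, not the ISTI benchmark design.

import numpy as np

rng = np.random.default_rng(4)
n_months = 12 * 100                                 # a century of monthly means
clean = 0.005 * np.arange(n_months) / 12 + rng.standard_normal(n_months)  # known truth

# Insert a known inhomogeneity: a -0.8 K step after a hypothetical station move.
break_index, break_size = 12 * 60, -0.8
inhomogeneous = clean.copy()
inhomogeneous[break_index:] += break_size

def score_adjustment(adjusted, truth):
    """RMSE of a homogenisation algorithm's output against the known clean series."""
    return float(np.sqrt(np.mean((adjusted - truth) ** 2)))

# With the benchmark in hand, candidate algorithms can be compared objectively;
# doing nothing gives the baseline score:
print(f"baseline RMSE: {score_adjustment(inhomogeneous, clean):.2f} K")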
Abstract:
Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and from a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ∼ 1000-member ensemble of the Bern3D-LPJ carbon–climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68 % confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in Atlantic meridional overturning, Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68 % c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results by CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.
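A small worked sketch of the TRE scaling quoted above: if warming responds roughly linearly to cumulative emissions, the central estimate of 1.9 °C per 1000 Gt C can be rescaled to other cumulative totals. The rescaling is only as good as the linearity noted in the abstract, and it ignores the quoted uncertainty range.

# Central TRE estimate from the abstract: 1.9 degC of global-mean surface air
# temperature change per 1000 Gt C of cumulative fossil CO2 emissions
# (68% c.i. 1.3 to 2.7 degC).
tre_per_1000_gtc = 1.9

def warming_for_emissions(cumulative_gtc):
    """Linear scaling of global-mean SAT change (degC) with cumulative emissions (Gt C)."""
    return tre_per_1000_gtc * cumulative_gtc / 1000.0

print(warming_for_emissions(500.0))   # ~0.95 degC for 500 Gt C, under the linearity assumption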