957 results for Climate, Dengue, Models, Projection, Scenarios
Abstract:
Three prominent quasi-global patterns of variability and change are observed using the Met Office's sea surface temperature (SST) analysis and the almost independent night marine air temperature analysis. The first is a global warming signal that is very highly correlated with global mean SST. The second is a decadal to multidecadal fluctuation with some geographical similarity to the El Niño–Southern Oscillation (ENSO). It is associated with the Pacific Decadal Oscillation (PDO), and its Pacific-wide manifestation has been termed the Interdecadal Pacific Oscillation (IPO). We present model investigations of the relationship between the IPO and ENSO. The third mode is an interhemispheric variation on multidecadal timescales which, in view of climate model experiments, is likely to be at least partly due to natural variations in the thermohaline circulation. Observed climatic impacts of this mode also appear in model simulations. Smaller-scale, regional atmospheric phenomena also affect climate on decadal to interdecadal timescales. We concentrate on one such mode, the winter North Atlantic Oscillation (NAO). This shows strong decadal to interdecadal variability and a correspondingly strong influence on surface climate variability, which is largely additional to the effects of recent regional anthropogenic climate change. The winter NAO is likely influenced by both SST forcing and stratospheric variability. A full understanding of decadal changes in the NAO and European winter climate may require a detailed representation of the stratosphere that is hitherto missing in the major climate models used to study climate change.
Abstract:
1. It has been postulated that climate warming may pose the greatest threat to species in the tropics, where ectotherms have evolved more thermal specialist physiologies. Although species could rapidly respond to environmental change through adaptation, little is known about the potential for thermal adaptation, especially in tropical species. 2. In the light of the limited empirical evidence available and predictions from mutation-selection theory, we might expect tropical ectotherms to have limited genetic variance to enable adaptation. However, as a consequence of thermodynamic constraints, we might expect this disadvantage to be at least partially offset by a fitness advantage, that is, the ‘hotter-is-better’ hypothesis. 3. Using an established quantitative genetics model and metabolic scaling relationships, we integrate the consequences of the opposing forces of thermal specialization and thermodynamic constraints on adaptive potential by evaluating extinction risk under climate warming. We conclude that the potential advantage of a higher maximal development rate can in theory more than offset the potential disadvantage of lower genetic variance associated with a thermal specialist strategy. 4. Quantitative estimates of extinction risk are fundamentally very sensitive to estimates of generation time and genetic variance. However, our qualitative conclusion that the relative risk of extinction is likely to be lower for tropical species than for temperate species is robust to assumptions regarding the effects of effective population size, mutation rate and birth rate per capita. 5. With a view to improving ecological forecasts, we use this modelling framework to review the sensitivity of our predictions to the model’s underpinning theoretical assumptions and the empirical basis of macroecological patterns that suggest thermal specialization and fitness increase towards the tropics. We conclude by suggesting priority areas for further empirical research.
Abstract:
The evidence provided by modelled assessments of future climate impact on flooding is fundamental to water resources and flood risk decision making. Impact models usually rely on climate projections from global and regional climate models (GCM/RCMs). However, challenges in representing precipitation events at catchment-scale resolution mean that decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs. Here the impacts on projected high flows of differing ensemble approaches and of applying Model Output Statistics to RCM precipitation are evaluated while assessing climate change impact on flood hazard in the Upper Severn catchment in the UK. Various ensemble projections are used together with the HBV hydrological model with direct forcing and also compared to a response surface technique. We consider an ensemble of single-model RCM projections from the current UK Climate Projections (UKCP09); multi-model ensemble RCM projections from the European Union's FP6 ‘ENSEMBLES’ project; and a joint probability distribution of precipitation and temperature from a GCM-based perturbed physics ensemble. The ensemble distribution of results shows that flood hazard in the Upper Severn is likely to increase compared to present conditions, but the study highlights the differences between the results from different ensemble methods and the strong assumptions made in using Model Output Statistics to produce the estimates of future river discharge. The results underline the challenges in using the current generation of RCMs for local climate impact studies on flooding. Copyright © 2012 Royal Meteorological Society
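For readers unfamiliar with Model Output Statistics in this setting, the idea is to statistically adjust RCM precipitation toward the observed distribution before it drives the hydrological model. The sketch below shows one common form of this, empirical quantile mapping, using synthetic daily precipitation series; the function name, distribution parameters and data are illustrative assumptions, not the pre-processing actually applied in the study:

import numpy as np

def quantile_map(rcm_hist, obs_hist, rcm_future, n_quantiles=100):
    """Empirical quantile mapping: adjust projected RCM precipitation so that
    the historical RCM distribution matches the observed one (illustrative
    bias-correction sketch, not the study's Model Output Statistics)."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    rcm_q = np.quantile(rcm_hist, q)
    obs_q = np.quantile(obs_hist, q)
    ranks = np.interp(rcm_future, rcm_q, q)   # position within historical RCM distribution
    return np.interp(ranks, q, obs_q)         # map onto observed distribution

# Synthetic daily precipitation (mm/day): the RCM drizzles too often, too lightly
rng = np.random.default_rng(1)
obs_hist = rng.gamma(shape=0.7, scale=6.0, size=3650)
rcm_hist = rng.gamma(shape=0.9, scale=3.0, size=3650)
rcm_future = rng.gamma(shape=0.9, scale=3.5, size=3650)
corrected = quantile_map(rcm_hist, obs_hist, rcm_future)
print(round(float(rcm_future.mean()), 2), round(float(corrected.mean()), 2))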
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to address the question of what resolution, both horizontal and vertical, is needed in atmospheric and ocean models for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.
Abstract:
The World Weather Research Programme (WWRP) and the World Climate Research Programme (WCRP) have identified collaborations and scientific priorities to accelerate advances in analysis and prediction at subseasonal-to-seasonal time scales, which include i) advancing knowledge of mesoscale–planetary-scale interactions and their prediction; ii) developing high-resolution global–regional climate simulations, with advanced representation of physical processes, to improve the predictive skill of subseasonal and seasonal variability of high-impact events, such as seasonal droughts and floods, blocking, and tropical and extratropical cyclones; iii) contributing to the improvement of data assimilation methods for monitoring and prediction in coupled ocean–atmosphere–land and Earth system models; and iv) developing and transferring diagnostic and prognostic information tailored to socioeconomic decision making. The document puts forward specific underpinning research, linkages, and requirements necessary to achieve the goals of the proposed collaboration.
Abstract:
The extra-tropical response to El Niño in configurations of a coupled model with increased horizontal resolution in the oceanic component is shown to be more realistic than in configurations with a low resolution oceanic component. This general conclusion is independent of the atmospheric resolution. Resolving small-scale processes in the ocean produces a more realistic oceanic mean state, with a reduced cold tongue bias, which in turn allows the atmospheric model component to be forced more realistically. A realistic atmospheric basic state is critical in order to represent Rossby wave propagation in response to El Niño, and hence the extra-tropical response to El Niño. Through the use of high and low resolution configurations of the forced atmosphere-only model component we show that, in isolation, atmospheric resolution does not significantly affect the simulation of the extra-tropical response to El Niño. It is demonstrated, through perturbations to the SST forcing of the atmospheric model component, that biases in the climatological SST field typical of coupled model configurations with low oceanic resolution can account for the erroneous atmospheric basic state seen in these coupled model configurations. These results highlight the importance of resolving small-scale oceanic processes in producing a realistic large-scale mean climate in coupled models, and suggest that it may be possible to "squeeze out" valuable extra performance from coupled models through increases to oceanic resolution alone.
Abstract:
The Chartered Institution of Building Services Engineers (CIBSE) produced a technical memorandum (TM36) presenting research on the impact of future climate on building energy use and thermal comfort. One climate projection for each of four CO2 emissions scenarios was used in TM36, providing a deterministic outlook. As part of the UK Climate Impacts Programme (UKCIP), probabilistic climate projections are being studied in relation to building energy simulation techniques. Including uncertainty in climate projections is considered an important advance to climate impacts modelling and is included in the latest UKCIP data (UKCP09). Incorporating the stochastic nature of these new climate projections in building energy modelling requires a significant increase in data handling and careful statistical interpretation of the results to provide meaningful conclusions. This paper compares the results from building energy simulations when applying deterministic and probabilistic climate data. This is based on two case study buildings: (i) a mixed-mode office building with exposed thermal mass and (ii) a mechanically ventilated, light-weight office building. Building (i) represents an energy efficient building design that provides passive and active measures to maintain thermal comfort. Building (ii) relies entirely on mechanical means for heating and cooling, with its light-weight construction raising concern over increased cooling loads in a warmer climate. Devising an effective probabilistic approach highlighted greater uncertainty in predicting building performance, depending on the type of building modelled and the performance factors under consideration. Results indicate that the range of calculated quantities depends not only on the building type but also strongly on the performance parameters of interest. Uncertainty is likely to be particularly marked with regard to thermal comfort in naturally ventilated buildings.
Abstract:
Ozone (O3) precursor emissions influence regional and global climate and air quality through changes in tropospheric O3 and oxidants, which also influence methane (CH4) and sulfate aerosols (SO4²⁻). We examine changes in the tropospheric composition of O3, CH4 and SO4²⁻, and in global net radiative forcing (RF), for 20% reductions in global CH4 burden and in anthropogenic O3 precursor emissions (NOx, NMVOC, and CO) from four regions (East Asia, Europe and Northern Africa, North America, and South Asia) using the Task Force on Hemispheric Transport of Air Pollution Source-Receptor global chemical transport model (CTM) simulations, assessing uncertainty (mean ± 1 standard deviation) across multiple CTMs. We evaluate steady-state O3 responses, including long-term feedbacks via CH4. With a radiative transfer model that includes greenhouse gases and the aerosol direct effect, we find that regional NOx reductions produce global, annually averaged positive net RFs (0.2 ± 0.6 to 1.7 ± 2 mW m⁻²/Tg N yr⁻¹), with some variation among models. Negative net RFs result from reductions in global CH4 (−162.6 ± 2 mW m⁻² for a change from 1760 to 1408 ppbv CH4) and in regional NMVOC (−0.4 ± 0.2 to −0.7 ± 0.2 mW m⁻²/Tg C yr⁻¹) and CO emissions (−0.13 ± 0.02 to −0.15 ± 0.02 mW m⁻²/Tg CO yr⁻¹). Including the effect of O3 on CO2 uptake by vegetation likely makes these net RFs more negative, by −1.9 to −5.2 mW m⁻²/Tg N yr⁻¹, −0.2 to −0.7 mW m⁻²/Tg C yr⁻¹, and −0.02 to −0.05 mW m⁻²/Tg CO yr⁻¹. Net RF impacts reflect the distribution of concentration changes, where RF is affected locally by changes in SO4²⁻, regionally to hemispherically by O3, and globally by CH4. Global annual average SO4²⁻ responses to oxidant changes range from 0.4 ± 2.6 to −1.9 ± 1.3 Gg for NOx reductions, 0.1 ± 1.2 to −0.9 ± 0.8 Gg for NMVOC reductions, and −0.09 ± 0.5 to −0.9 ± 0.8 Gg for CO reductions, suggesting additional research is needed. The 100-year global warming potentials (GWP100) are calculated for the global CH4 reduction (20.9 ± 3.7 without stratospheric O3 or water vapor, 24.2 ± 4.2 including those components), and for the regional NOx, NMVOC, and CO reductions (−18.7 ± 25.9 to −1.9 ± 8.7 for NOx, 4.8 ± 1.7 to 8.3 ± 1.9 for NMVOC, and 1.5 ± 0.4 to 1.7 ± 0.5 for CO). Variation in GWP100 for NOx, NMVOC, and CO suggests that regionally specific GWPs may be necessary and could support the inclusion of O3 precursors in future policies that address air quality and climate change simultaneously. Both global net RF and GWP100 are more sensitive to NOx and NMVOC reductions from South Asia than to those from the other three regions.
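As background to the GWP100 values quoted above: the 100-year global warming potential of a species i is conventionally the time-integrated radiative forcing of a pulse emission of i divided by the time-integrated forcing of an equal mass of CO2 over a 100-year horizon. This is the standard metric definition, not a formula reproduced from the paper:

\mathrm{GWP}_i(H) \;=\; \frac{\int_0^{H} \mathrm{RF}_i(t)\,dt}{\int_0^{H} \mathrm{RF}_{\mathrm{CO_2}}(t)\,dt}, \qquad H = 100~\mathrm{yr}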
Abstract:
By comparing annual and seasonal changes in precipitation over land and ocean since 1950 simulated by the CMIP5 (Coupled Model Intercomparison Project, phase 5) climate models in which natural and anthropogenic forcings have been included, we find that clear global-scale and regional-scale changes due to human influence are expected to have occurred over both land and ocean. These include moistening over northern high latitude land and ocean throughout all seasons and over the northern subtropical oceans during boreal winter. However, we show that this signal of human influence is less distinct when considered over the relatively small area of land for which there are adequate observations to make assessments of multi-decadal scale trends. These results imply that extensive and significant changes in precipitation over the land and ocean may have already happened, even though inadequacies in observations in some parts of the world make it difficult to conclusively identify such a human fingerprint on the global water cycle. In some regions and seasons, due to aliasing of different kinds of variability as a result of subsampling by the sparse and changing observational coverage, observed trends appear to have been increased, underscoring the difficulties of interpreting the apparent magnitude of observed changes in precipitation.
Abstract:
An assessment of the Coupled Model Intercomparison Project phase 5 (CMIP5) models’ simulation of the near-surface westerly wind jet position and strength over the Atlantic, Indian and Pacific sectors of the Southern Ocean is presented. Compared with reanalysis climatologies, there is an equatorward bias of 3.7° (inter-model standard deviation of ± 2.2°) in the ensemble mean position of the zonal mean jet. The ensemble mean strength is biased slightly weak, with the largest biases over the Pacific sector (−1.6 ± 1.1 m/s, −22%). An analysis of atmosphere-only (AMIP) experiments indicates that 41% of the zonal mean position bias comes from coupling of the ocean/ice models to the atmosphere. The response to future emissions scenarios (RCP4.5 and RCP8.5) is characterized by two phases: (i) the period of most rapid ozone recovery (2000–2049), during which there is insignificant change in summer; and (ii) the period 2050–2098, during which RCP4.5 simulations show no significant change but RCP8.5 simulations show poleward shifts (0.30, 0.19 and 0.28°/decade over the Atlantic, Indian and Pacific sectors respectively) and increases in strength (0.06, 0.08 and 0.15 m/s/decade respectively). The models with larger equatorward position biases generally show larger poleward shifts (i.e. state dependence). This inter-model relationship is strongest over the Pacific sector (r = −0.89) and insignificant over the Atlantic sector (r = −0.50). However, an assessment of jet structure shows that over the Atlantic sector the jet shift is significantly correlated with jet width, whereas over the Pacific sector the distance between the sub-polar and sub-tropical westerly jets appears to be more important.
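To make the jet diagnostics above concrete, the sketch below estimates jet latitude and strength from a zonal-mean zonal wind profile using a parabolic fit around the gridded maximum. This is a common diagnostic approach, not necessarily the one used in the assessment; the function name, latitude bounds and test profile are illustrative assumptions.

import numpy as np

def jet_position_strength(lat, u):
    """Estimate Southern Hemisphere westerly jet latitude and strength from a
    zonal-mean zonal wind profile u(lat) on a regular latitude grid, using a
    parabolic fit around the gridded maximum (illustrative diagnostic only)."""
    sh = (lat <= -30.0) & (lat >= -75.0)        # search the SH mid/high latitudes
    lat_sh, u_sh = lat[sh], u[sh]
    i = int(np.argmax(u_sh))                    # gridded maximum
    if 0 < i < len(u_sh) - 1:
        y0, y1, y2 = u_sh[i - 1], u_sh[i], u_sh[i + 1]
        denom = y0 - 2.0 * y1 + y2              # curvature of the fitted parabola
        shift = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
        dlat = lat_sh[1] - lat_sh[0]
        return lat_sh[i] + shift * dlat, y1 - 0.25 * (y0 - y2) * shift
    return lat_sh[i], u_sh[i]

# Idealised test profile: a jet centred near 52°S with an 8 m/s peak
lat = np.arange(-80.0, -20.0, 1.5)
u = 8.0 * np.exp(-((lat + 52.0) / 6.0) ** 2)
print(jet_position_strength(lat, u))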
Abstract:
High spatial resolution environmental data gives us a better understanding of the environmental factors affecting plant distributions at fine spatial scales. However, large environmental datasets dramatically increase compute times and the size of species model outputs, stimulating the need for an alternative computing solution. Cluster computing offers such a solution, by allowing both multiple plant species Environmental Niche Models (ENMs) and individual tiles of high spatial resolution models to be computed concurrently on the same compute cluster. We apply our methodology to a case study of 4,209 species of Mediterranean flora (around 17% of species believed present in the biome). We demonstrate a 16-fold speed-up of ENM computation time when 16 CPUs were used on the compute cluster. Our custom Java ‘Merge’ and ‘Downsize’ programs reduce ENM output file sizes by 94%. The median test AUC score of 0.98 across species ENMs is aided by various species occurrence data filtering techniques. Finally, by calculating the percentage change of individual grid cell values, we map the projected percentages of plant species vulnerable to climate change in the Mediterranean region between 1950–2000 and 2020.
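The final mapping step described above reduces to a cell-by-cell comparison of baseline (1950–2000) and projected (2020) suitability grids. The following minimal sketch of that percentage-change calculation assumes plain NumPy arrays and a hypothetical function name, rather than the study's actual file formats or Java tooling:

import numpy as np

def percent_change(baseline, future, eps=1e-9):
    """Cell-wise percentage change between a baseline and a projected
    suitability grid; cells with near-zero baseline values are left as NaN
    to avoid division by zero (illustrative helper, not the study's code)."""
    baseline = np.asarray(baseline, dtype=float)
    future = np.asarray(future, dtype=float)
    change = np.full(baseline.shape, np.nan)
    valid = np.abs(baseline) > eps
    change[valid] = 100.0 * (future[valid] - baseline[valid]) / baseline[valid]
    return change

# Toy 3x3 grids of modelled habitat suitability
baseline = np.array([[0.8, 0.6, 0.0],
                     [0.5, 0.4, 0.2],
                     [0.9, 0.7, 0.1]])
future = np.array([[0.6, 0.6, 0.1],
                   [0.2, 0.5, 0.1],
                   [0.9, 0.3, 0.0]])
print(np.round(percent_change(baseline, future), 1))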
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
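The pattern correlations quoted above (0.62 for initialized versus 0.31 for uninitialized forecasts) are spatial correlations between forecast and observed anomaly maps. A minimal sketch of an area-weighted (cosine-of-latitude) pattern correlation is shown below; the synthetic anomaly fields and the function name are illustrative assumptions, not the verification code used by the authors:

import numpy as np

def pattern_correlation(forecast, observed, lat):
    """Centred, area-weighted pattern correlation between two anomaly fields
    of shape (nlat, nlon), weighting each grid cell by cos(latitude)."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(forecast)
    w = w / w.sum()
    f = forecast - (w * forecast).sum()      # remove area-weighted means
    o = observed - (w * observed).sum()
    cov = (w * f * o).sum()
    return cov / np.sqrt((w * f * f).sum() * (w * o * o).sum())

# Hypothetical 2° x 2° anomaly fields for illustration only
lat = np.arange(-89.0, 90.0, 2.0)
lon = np.arange(0.0, 360.0, 2.0)
rng = np.random.default_rng(0)
obs = rng.standard_normal((lat.size, lon.size))
fcst = 0.6 * obs + 0.8 * rng.standard_normal(obs.shape)   # partially skilful forecast
print(round(pattern_correlation(fcst, obs, lat), 2))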
Abstract:
The extent and thickness of the Arctic sea ice cover have decreased dramatically in the past few decades, with minima in sea ice extent in September 2005 and 2007. These minima were not predicted in the IPCC AR4 report, suggesting that the sea ice component of climate models should more realistically represent the processes controlling the sea ice mass balance. One of the processes poorly represented in sea ice models is the formation and evolution of melt ponds. Melt ponds accumulate on the surface of sea ice from snow and sea ice melt, and their presence reduces the albedo of the ice cover, leading to further melt. Toward the end of the melt season, melt ponds cover up to 50% of the sea ice surface. We have developed a melt pond evolution theory. Here, we have incorporated this melt pond theory into the Los Alamos CICE sea ice model, which has required us to include the refreezing of melt ponds. We present results showing that the presence, or otherwise, of a representation of melt ponds has a significant effect on the predicted sea ice thickness and extent. We also present a sensitivity study to uncertainty in the sea ice permeability, the number of thickness categories in the model representation, the meltwater redistribution scheme, and the pond albedo. We conclude with a recommendation that our melt pond scheme be included in sea ice models and that the number of thickness categories be increased and concentrated at lower thicknesses.
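The albedo effect described above can be illustrated with a simple area-weighted average over ponded and unponded ice (an illustrative relation, not the CICE melt pond parameterization itself): with pond fraction f_p,

\alpha_{\mathrm{eff}} \;=\; f_p\,\alpha_{\mathrm{pond}} + (1 - f_p)\,\alpha_{\mathrm{ice}}

so that, taking for example alpha_pond ≈ 0.3 and alpha_ice ≈ 0.6, a pond fraction of 0.5 lowers the surface albedo from 0.6 to about 0.45, increasing absorbed shortwave radiation and hence further melt.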
Abstract:
Systematic climate shifts have been linked to multidecadal variability in observed sea surface temperatures in the North Atlantic Ocean [1]. These links are extensive, influencing a range of climate processes such as hurricane activity [2] and African Sahel [3,4,5] and Amazonian [5] droughts. The variability is distinct from historical global-mean temperature changes and is commonly attributed to natural ocean oscillations [6,7,8,9,10]. A number of studies have provided evidence that aerosols can influence long-term changes in sea surface temperatures [11,12], but climate models have so far failed to reproduce these interactions [6,9] and the role of aerosols in decadal variability remains unclear. Here we use a state-of-the-art Earth system climate model to show that aerosol emissions and periods of volcanic activity explain 76 per cent of the simulated multidecadal variance in detrended 1860–2005 North Atlantic sea surface temperatures. After 1950, simulated variability is within observational estimates; our estimates for 1910–1940 capture twice the warming of previous generation models but do not explain the entire observed trend. Other processes, such as ocean circulation, may also have contributed to variability in the early twentieth century. Mechanistically, we find that inclusion of aerosol–cloud microphysical effects, which were included in few previous multimodel ensembles, dominates the magnitude (80 per cent) and the spatial pattern of the total surface aerosol forcing in the North Atlantic. Our findings suggest that anthropogenic aerosol emissions influenced a range of societally important historical climate events such as peaks in hurricane activity and Sahel drought. Decadal-scale model predictions of regional Atlantic climate will probably be improved by incorporating aerosol–cloud microphysical interactions and estimates of future concentrations of aerosols, emissions of which are directly addressable by policy actions.