36 results for NEAR-TERM PARAPLACENTA

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Abstract:

The importance of aerosol emissions for near-term climate projections is investigated by analysing simulations with the HadGEM2-ES model under two different emissions scenarios: RCP2.6 and RCP4.5. It is shown that the near-term warming projected under RCP2.6 is greater than under RCP4.5, even though the greenhouse gas forcing is lower. Rapid and substantial reductions in sulphate aerosol emissions due to a reduction of coal burning in RCP2.6 lead to a reduction in the negative shortwave forcing due to aerosol direct and indirect effects. Indirect effects play an important role over the northern hemisphere oceans, especially the subtropical northeastern Pacific, where an anomaly of 5–10 W m$^{-2}$ develops. The pattern of surface temperature change is consistent with the expected response to this surface radiation anomaly, whilst also exhibiting features that reflect the redistribution of energy and feedbacks within the climate system. These results demonstrate the importance of aerosol emissions as a key source of uncertainty in near-term projections of global and regional climate.

Relevance:

100.00%

Abstract:

We examine the climate effects of the emissions of near-term climate forcers (NTCFs) from 4 continental regions (East Asia, Europe, North America and South Asia) using radiative forcing from the Task Force on Hemispheric Transport of Air Pollution source-receptor global chemical transport model simulations. These simulations model the transport of 3 aerosol species (sulphate, particulate organic matter and black carbon) and 4 ozone precursors (methane, nitrogen oxides (NOx), volatile organic compounds and carbon monoxide). From the equilibrium radiative forcing results we calculate global climate metrics, global warming potentials (GWPs) and global temperature change potentials (GTPs), and show how these depend on emission region and can vary as functions of time. For the aerosol species, the GWP(100) values are −37±12, −46±20, and 350±200 for SO2, POM and BC respectively, for the direct effects only. The corresponding GTP(100) values are −5.2±2.4, −6.5±3.5, and 50±33. This analysis is further extended by examining the temperature-change impacts in 4 latitude bands. This shows that the latitudinal pattern of the temperature response to emissions of the NTCFs does not directly follow the pattern of the diagnosed radiative forcing. For instance, temperatures in the Arctic latitudes are particularly sensitive to NTCF emissions in the northern mid-latitudes. At the 100-yr time horizon, the absolute regional temperature change potentials (ARTPs) show that NOx emissions can have a warming effect in the northern mid and high latitudes, but a cooling effect in the tropics and Southern Hemisphere. The northern mid-latitude temperature response to northern mid-latitude emissions of most NTCFs is approximately twice as large as would be implied by the global average.
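
For reference, the two metrics quoted above are conventionally defined as ratios of the time-integrated radiative forcing and of the temperature response to a unit pulse emission, relative to CO2 (the standard IPCC convention; the definitions are not spelled out in the abstract itself):

```latex
% GWP: ratio of time-integrated radiative forcing RF_x(t) from a unit
% pulse emission of species x to that of CO2, over time horizon H.
% GTP: the analogous ratio for the global-mean temperature response.
\mathrm{GWP}_x(H) = \frac{\int_0^H \mathrm{RF}_x(t)\,\mathrm{d}t}
                         {\int_0^H \mathrm{RF}_{\mathrm{CO}_2}(t)\,\mathrm{d}t},
\qquad
\mathrm{GTP}_x(H) = \frac{\Delta T_x(H)}{\Delta T_{\mathrm{CO}_2}(H)}
```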

Relevance:

100.00%

Abstract:

Model simulations of the next few decades are widely used in assessments of climate change impacts and as guidance for adaptation. The non-linear nature of the climate system gives rise to a level of irreducible uncertainty which it is important to understand and quantify, especially for projections of near-term regional climate. Here we use large idealised initial-condition ensembles of the FAMOUS global climate model with a 1% per year compound increase in CO2 levels to quantify the range of future temperatures in model-based projections. These simulations explore the role of both atmospheric and oceanic initial conditions and are the largest such ensembles to date. Short-term simulated trends in global temperature are diverse, and cooling periods are more likely to be followed by larger warming rates. The spatial pattern of near-term temperature change varies considerably, but the proportion of the surface showing a warming is more consistent. In addition, ensemble spread in inter-annual temperature declines as the climate warms, especially in the North Atlantic. Over Europe, atmospheric initial-condition uncertainty can, for certain ocean initial conditions, lead to 20-year trends in winter and summer in which every location can exhibit either strong cooling or rapid warming. However, the details of the distribution are highly sensitive to the ocean initial condition chosen, and particularly to the state of the Atlantic meridional overturning circulation. On longer timescales, the warming signal becomes clearer and more consistent amongst different initial-condition ensembles. An ensemble using a range of different oceanic initial conditions produces a larger spread in temperature trends than ensembles using a single ocean initial condition, for all lead times. This highlights the potential benefits of initialising climate predictions from ocean states informed by observations. These results suggest that climate projections need to be performed with many more ensemble members than at present, using a range of ocean initial conditions, if the uncertainty in near-term regional climate is to be adequately quantified.
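
As a minimal illustration of the idealised forcing scenario used in these ensembles (the 1% per year compound CO2 increase is from the abstract; the 280 ppm starting concentration is an illustrative assumption, not a value taken from the paper):

```python
# Minimal sketch of the idealised 1%/year compound CO2 forcing
# scenario described above. The 280 ppm pre-industrial baseline is
# an illustrative assumption, not a value from the abstract.
import math

def co2_at_year(year: int, c0: float = 280.0, rate: float = 0.01) -> float:
    """CO2 concentration (ppm) after `year` years of compound growth."""
    return c0 * (1.0 + rate) ** year

doubling_time = math.log(2.0) / math.log(1.01)  # ~69.7 years
print(f"CO2 doubles after ~{doubling_time:.0f} years "
      f"({co2_at_year(70):.0f} ppm at year 70)")
```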

Relevance:

80.00%

Abstract:

Carbon dioxide capture and geological storage (CCS) has the potential to make a significant contribution to the decarbonisation of the UK. Amid concerns over maintaining security, and hence diversity, of supply, CCS could allow the continued use of coal, oil and gas whilst avoiding the CO2 emissions currently associated with fossil fuel use. This project has explored some of the geological, environmental, technical, economic and social implications of this technology. The UK is well placed to exploit CCS, with a large offshore storage capacity both in disused oil and gas fields and in saline aquifers. This capacity should be sufficient to store CO2 from the power sector (at current levels) for at least one century, using well-understood, and therefore likely lower-risk, depleted hydrocarbon fields and contained parts of aquifers. It is very difficult to produce reliable estimates of the (potentially much larger) storage capacity of the less well understood geological reservoirs, such as non-confined parts of aquifers. With the majority of its large coal-fired power stations due to be retired during the next 15 to 20 years, the UK is at a natural decision point with respect to the future of power generation from coal; the existence of both national reserves and the infrastructure for receiving imported coal makes clean coal technology a realistic option. The notion of CCS as a ‘bridging’ or ‘stop-gap’ technology (i.e. whilst we develop ‘genuinely’ sustainable renewable energy technologies) needs to be examined somewhat critically, especially given the scale of global coal reserves. If CCS plant is built, then it is likely that technological innovation will bring down the costs of CO2 capture, such that it could become increasingly attractive. As with any capital-intensive option, there is a danger of becoming ‘locked in’ to a CCS system. The costs of CCS in our model, for UK power stations in the East Midlands and Yorkshire to reservoirs in the North Sea, are between £25 and £60 per tonne of CO2 captured, transported and stored. This is between about 2 and 4 times the current traded price of a tonne of CO2 in the EU Emissions Trading Scheme. In addition to the technical and economic requirements of the CCS technology, it should also be socially and environmentally acceptable. Our research has shown that, given an acceptance of the severity and urgency of addressing climate change, CCS is viewed favourably by members of the public, provided it is adopted within a portfolio of other measures. The most commonly voiced concern from the public is that of leakage, and this remains perhaps the greatest uncertainty with CCS. It is not possible to make general statements concerning storage security; assessments must be site-specific. The impacts of any potential leakage are also somewhat uncertain, but should be balanced against the deleterious effects of increased acidification in the oceans, due to uptake of elevated atmospheric CO2, that have already been observed. Provided adequate long-term monitoring can be ensured, any leakage of CO2 from a storage site is likely to have minimal localised impacts, as long as leaks are rapidly repaired. A regulatory framework for CCS will need to include risk assessment of potential environmental and health and safety impacts, accounting and monitoring, and liability for the long term.
In summary, although there remain uncertainties to be resolved through research and demonstration projects, our assessment demonstrates that CCS holds great potential for significant cuts in CO2 emissions as we develop long-term alternatives to fossil fuel use. CCS can contribute to reducing emissions of CO2 into the atmosphere in the near term (i.e. peak-shaving the future atmospheric concentration of CO2), with the potential to continue to deliver significant CO2 reductions over the long term.
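
The cost comparison above implies a then-current ETS price in the low teens of pounds per tonne; a trivial check of that arithmetic (all input figures are from the abstract):

```python
# Simple arithmetic check of the cost comparison quoted above:
# CCS at GBP 25-60 per tonne of CO2 is stated to be roughly 2-4x
# the then-current EU ETS traded price, implying an ETS price of
# roughly GBP 12-15 per tonne at the time of the study.
ccs_cost_low, ccs_cost_high = 25.0, 60.0   # GBP per tonne CO2 (from abstract)
multiple_low, multiple_high = 2.0, 4.0     # CCS cost as multiple of ETS price

implied_ets_low = ccs_cost_low / multiple_low     # 12.5
implied_ets_high = ccs_cost_high / multiple_high  # 15.0
print(f"Implied ETS price: GBP {implied_ets_low:.1f}-{implied_ets_high:.1f}/tCO2")
```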

Relevance:

80.00%

Abstract:

In this paper a look is taken at how implant technology can be used either to extend the range of human abilities and/or to diminish the effects of a neural illness, such as Parkinson's disease. The key element is the need for a clear interface linking the human brain directly with a computer. The area of interest here is the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Pilot tests and experimentation are invariably carried out a priori to investigate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies are discussed here. The paper goes on to describe human experimentation, in particular that carried out by the author himself, which led to him receiving a neural implant which linked his nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to directly control them. In particular, feedback to the brain was obtained from the fingertips of a robot hand, and ultrasonic (extra) sensory input was provided. A view is taken as to the prospects for the future, both in the near term as a therapeutic device and in the long term as a form of enhancement.

Relevance:

80.00%

Abstract:

A range of forecasts of global oil production made between 1956 and the present day are listed. For the majority of these, the methodology used to generate the forecast is described. The paper distinguishes between three types of forecast: group 1, quantitative analyses which predict that global oil production will reach a resource-limited peak in the near term, and certainly before the year 2020; group 2, forecasts that use quantitative methods but which see no production peak within the forecast's time horizon (typically 2020 or 2030); and group 3, non-quantitative analyses that rule out a resource-limited oil peak within the foreseeable future. The paper analyses these forecast types and suggests that group 1 forecasts are the most realistic.
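
Group 1 forecasts are typically built on Hubbert-style logistic depletion models (Hubbert's analysis is discussed in the following abstract); a minimal sketch, in which the ultimate recoverable resource (URR), steepness and peak year are illustrative placeholders rather than values from the paper:

```python
# Minimal sketch of a Hubbert-style logistic production curve, the
# kind of resource-limited model underlying many "group 1" forecasts.
# URR, steepness b and peak year t_peak are illustrative placeholders.
import math

def hubbert_production(t: float, urr: float = 2000.0,
                       b: float = 0.05, t_peak: float = 2010.0) -> float:
    """Annual production (Gb/yr), the derivative of a logistic
    cumulative-production curve Q(t) = URR / (1 + exp(-b (t - t_peak)))."""
    e = math.exp(-b * (t - t_peak))
    return urr * b * e / (1.0 + e) ** 2

for year in (1990, 2010, 2030):
    print(year, round(hubbert_production(year), 1), "Gb/yr")  # peaks in 2010
```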

Relevance:

80.00%

Abstract:

This paper examines aspects of the case against global oil peaking, and in particular sets out to answer a viewpoint that the world can have abundant supplies of oil "for years to come". Arguments supporting the latter view include: past forecasts of oil shortage have proved incorrect, so current predictions should also be discounted; many modellers depend on Hubbert's analysis, but this contained fundamental flaws; new oil supply will result from reserves growth and from the wider deployment of advanced extraction technology; and the world contains large resources of unconventional oil that can come on-stream if the production of conventional oil declines. These arguments are examined in turn and shown to be incorrect, or to need setting into a broader context. The paper concludes, therefore, that such arguments cannot be used to rule out calculations that the resource-limited peak in the world's production of conventional oil will occur in the near term. Moreover, peaking of conventional oil is likely to impact the world's total availability of oil, even where the latter includes non-conventional oil and oil substitutes.

Relevance:

80.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change arise mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to address the question of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
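
Simple arithmetic on the computing roadmap quoted above (all capability figures are from the abstract; the growth-rate calculation is an illustrative extrapolation):

```python
# Arithmetic on the computing roadmap quoted above: 20 petaflops in
# the near term, 200 petaflops within five years, 1 exaflop by the
# end of the next decade. A tenfold increase over five years implies
# a sustained growth rate of roughly 58% per year.
import math

near_term_pf = 20.0
five_year_pf = 200.0
annual_growth = (five_year_pf / near_term_pf) ** (1.0 / 5.0) - 1.0
print(f"Implied growth: {annual_growth:.0%} per year")  # ~58% per year

# Further years from 200 PF to 1 exaflop (1000 PF) at that rate:
years_to_exaflop = math.log(1000.0 / 200.0) / math.log(1.0 + annual_growth)
print(f"~{years_to_exaflop:.1f} further years to 1 exaflop")  # ~3.5 years
```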

Relevance:

80.00%

Abstract:

The Metafor project has developed a common information model (CIM), using the ISO19100 series formalism, to describe the numerical experiments carried out by the Earth system modelling community, the models they use, and the simulations that result. Here we describe the mechanism by which the CIM was developed, and its key properties. We introduce the conceptual and application versions, along with the controlled vocabularies developed in the context of supporting the fifth Coupled Model Intercomparison Project (CMIP5). We also describe how the CIM has been used in experiments to document model coupling properties, and outline the expected near-term evolution of the CIM.

Relevance:

80.00%

Abstract:

The efficiency with which the oceans take up heat has a significant influence on the rate of global warming. Warming of the ocean above 700 m over the past few decades has been well documented. However, most of the ocean lies below 700 m. Here we analyse observations of heat uptake into the deep North Atlantic. We find that the extratropical North Atlantic as a whole warmed by 1.45 ± 0.5 × 10$^{22}$ J between 1955 and 2005, but Lower North Atlantic Deep Water cooled, most likely as an adjustment from an early twentieth-century warm period. In contrast, the heat content of Upper North Atlantic Deep Water exhibited strong decadal variability. We demonstrate and quantify the importance of density-compensated temperature anomalies for long-term heat uptake into the deep North Atlantic. These anomalies form in the subpolar gyre and propagate equatorwards. High salinity in the subpolar gyre is a key requirement for this mechanism. In the past 50 years, suitable conditions have occurred only twice: first during the 1960s and again during the past decade. We conclude that heat uptake through density-compensated temperature anomalies will contribute to deep ocean heat uptake in the near term. In the longer term, the importance of this mechanism will be determined by competition between the multiple processes that influence subpolar gyre salinity in a changing climate.
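
"Density-compensated" can be summarised with the linearised equation of state of seawater (standard oceanographic notation; this background equation is not part of the abstract itself):

```latex
% Linearised equation of state for seawater: a temperature anomaly
% \Delta T is density-compensated when a co-located salinity anomaly
% \Delta S offsets its effect on density (\alpha and \beta are the
% thermal expansion and haline contraction coefficients).
\frac{\Delta \rho}{\rho_0} \approx -\alpha\,\Delta T + \beta\,\Delta S \approx 0
\quad\Longleftrightarrow\quad
\alpha\,\Delta T \approx \beta\,\Delta S
```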

Relevance:

80.00%

Abstract:

Improved crop yield forecasts could enable more effective adaptation to climate variability and change. Here, we explore how to combine historical observations of crop yields and weather with climate model simulations to produce crop yield projections on decision-relevant timescales. Firstly, the effects of improved technology, precipitation and daily maximum temperatures on historical crop yields are modelled empirically, accounting for a nonlinear technology trend and interactions between temperature and precipitation; the approach is applied to a case study of maize in France. The relative importance of precipitation variability for maize yields in France has decreased significantly since the 1960s, likely due to increased irrigation. In addition, heat stress is found to be as important for yield as precipitation since around 2000. A significant reduction in maize yield is found for each day with a maximum temperature above 32 °C, in broad agreement with previous estimates. The recent increase in such hot days has likely contributed to the observed yield stagnation. Furthermore, a general method for producing near-term crop yield projections, based on climate model simulations, is developed and applied. We use projections of future daily maximum temperatures to assess the likely change in yields due to variations in climate. Importantly, we calibrate the climate model projections against observed data to ensure reliable characteristics for both the temperature mean and its daily variability, and we demonstrate that these methods work using retrospective predictions. We conclude that, to offset the projected increase in daily maximum temperatures over France, improved technology will need to raise base-level yields by 12% for us to be confident of maintaining current levels of yield for the period 2016–2035; the current rate of yield technology increase is not sufficient to meet this target.
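
An illustrative sketch of an empirical yield model of the kind described above: a smooth technology trend plus weather terms, including a penalty per day above 32 °C (the threshold reported in the abstract). All coefficients and functional forms are placeholders, not the fitted values from the study:

```python
# Illustrative empirical yield model: nonlinear technology trend +
# precipitation term + heat-stress penalty per hot day (Tmax > 32 C,
# the threshold from the abstract). Coefficients are placeholders.
import numpy as np

def predicted_yield(year: np.ndarray, precip: np.ndarray,
                    hot_days: np.ndarray) -> np.ndarray:
    """Yield (t/ha) from year, seasonal precipitation (mm) and the
    count of days with maximum temperature above 32 C."""
    tech_trend = 3.0 + 2.5 * np.log1p(0.05 * (year - 1960))  # placeholder
    precip_term = 0.004 * (precip - 500.0)                   # placeholder
    heat_penalty = -0.05 * hot_days                          # placeholder
    return tech_trend + precip_term + heat_penalty

years = np.array([1980, 2000, 2020])
print(predicted_yield(years,
                      precip=np.array([550.0, 500.0, 450.0]),
                      hot_days=np.array([2.0, 5.0, 10.0])))
```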

Relevance:

80.00%

Abstract:

A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed frequencies when verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is performed over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend lies outside the range of modelled trends in many more regions than would be expected from the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that, for near-term local climate forecasts, the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
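
A minimal sketch of the reliability concept used above: binned forecast probabilities should match observed frequencies, with grid points playing the role of verification cases (the spatial dimension). The arrays below are synthetic toy data, not CMIP5 output:

```python
# Minimal reliability check: for each forecast-probability bin,
# compare the bin centre against the observed event frequency.
# Cases here are grid points; inputs are illustrative toy data.
import numpy as np

def reliability_table(forecast_prob: np.ndarray, observed: np.ndarray,
                      n_bins: int = 5) -> None:
    """Print observed event frequency per forecast-probability bin."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(forecast_prob, bins) - 1, 0, n_bins - 1)
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            print(f"forecast {bins[b]:.1f}-{bins[b + 1]:.1f}: "
                  f"observed frequency {observed[mask].mean():.2f} "
                  f"({mask.sum()} grid points)")

rng = np.random.default_rng(0)
p = rng.uniform(size=1000)                      # ensemble-derived probabilities
o = (rng.uniform(size=1000) < p).astype(float)  # perfectly reliable toy "obs"
reliability_table(p, o)
```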

Relevance:

80.00%

Abstract:

This paper provides an update on research in the relatively new and fast-moving field of decadal climate prediction, and addresses the use of decadal climate predictions both for potential users of such information and for improving our understanding of processes in the climate system. External forcing influences the predictions throughout, but its contribution to predictive skill becomes dominant once most of the improved skill from initialization with observations has vanished, after about six to nine years. Recent multi-model results suggest that there is relatively more decadal predictive skill in the North Atlantic, western Pacific, and Indian Oceans than in other regions of the world oceans. Aspects of the decadal variability of sea surface temperatures (SSTs), such as the mid-1970s shift in the Pacific, the mid-1990s shift in the northern North Atlantic and western Pacific, and the early-2000s hiatus, are better represented in initialized hindcasts than in uninitialized simulations. There is evidence of higher skill in initialized multi-model ensemble decadal hindcasts than in single-model results, with initialized multi-model predictions for near-term climate showing somewhat less global warming than uninitialized simulations. Some decadal hindcasts have shown statistically reliable predictions of surface temperature over various land and ocean regions for lead times of up to 6–9 years, but this needs to be investigated in a wider set of models. As in the early days of El Niño-Southern Oscillation (ENSO) prediction, improvements to models will reduce the need for bias adjustment and increase the reliability, and thus the usefulness, of decadal climate predictions in the future.
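
The bias adjustment mentioned in the closing sentence is conventionally estimated per lead time across many hindcast start dates; a minimal sketch (array shapes and data are illustrative, not from any particular prediction system):

```python
# Lead-time-dependent bias adjustment for decadal hindcasts: the mean
# hindcast-minus-observation error is estimated per lead year across
# all start dates, then subtracted from a new prediction.
import numpy as np

def bias_adjust(hindcasts: np.ndarray, observations: np.ndarray,
                prediction: np.ndarray) -> np.ndarray:
    """hindcasts, observations: (n_start_dates, n_lead_years);
    prediction: (n_lead_years,). Returns the adjusted prediction."""
    lead_time_bias = (hindcasts - observations).mean(axis=0)
    return prediction - lead_time_bias

rng = np.random.default_rng(1)
obs = rng.normal(size=(30, 10))                          # toy observations
hind = obs + 0.5 + rng.normal(scale=0.2, size=(30, 10))  # warm-biased model
new_forecast = rng.normal(loc=0.5, size=10)
print(bias_adjust(hind, obs, new_forecast))
```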