66 results for poetry of XXth century


Relevance:

100.00%

Publisher:

Abstract:

This article seeks to explore the absence of the body in the depiction of dying women in a selection of seventeenth-century diaries. It considers the cultural forces that made this absence inevitable, and the means by which the physical body was replaced in death by a spiritual presence. The elevation of a dying woman from physical carer to spiritual nurturer in the days before death ensured that gender codes were not broken. The centrality of the body of the dying woman, within a female circle of care and support, was paradoxically juxtaposed with an effacement of the body in descriptions of a good death. In death, a woman might achieve the stillness, silence and compliance so essential to perfect early modern womanhood, and retrospective diary entries can achieve this ideal by replacing the body with images that deflect from the essential physicality of the woman.

Relevance:

100.00%

Publisher:

Abstract:

Comparison of single-forcing varieties of 20th century historical experiments in a subset of models from the Fifth Coupled Model Intercomparison Project (CMIP5) reveals that South Asian summer monsoon rainfall increases towards the present day in Greenhouse Gas (GHG)-only experiments with respect to pre-industrial levels, while it decreases in anthropogenic aerosol-only experiments. Comparison of these single-forcing experiments with the all-forcings historical experiment suggests aerosol emissions have dominated South Asian monsoon rainfall trends in recent decades, especially during the 1950s to 1970s. The variations in South Asian monsoon rainfall in these experiments follow approximately the time evolution of the inter-hemispheric temperature gradient over the same period, suggesting a contribution from the large-scale background state relating to the asymmetric distribution of aerosol emissions about the equator. By examining the 24 available all-forcings historical experiments, we show that models including aerosol indirect effects dominate the negative rainfall trend. Indeed, models including only the direct radiative effect of aerosol show an increase in monsoon rainfall, consistent with the dominance of increasing greenhouse gas emissions and planetary warming on monsoon rainfall in those models. For South Asia, reduced rainfall in the models with indirect effects is related to decreased evaporation at the land surface rather than to anomalies in horizontal moisture flux, suggesting the impact of indirect effects on local aerosol emissions. This is confirmed by examination of aerosol loading and cloud droplet number trends over the South Asia region. Thus, while remote aerosols and their asymmetric distribution about the equator play a role in setting the inter-hemispheric temperature distribution on which the South Asian monsoon, as one of the global monsoons, operates, the addition of indirect aerosol effects acting on very local aerosol emissions also plays a role in declining monsoon rainfall. The disparity between the response of monsoon rainfall to increasing aerosol emissions in models containing direct aerosol effects only and those also containing indirect effects needs to be urgently investigated, since the suggested future decline in Asian anthropogenic aerosol emissions inherent to the representative concentration pathways (RCPs) used for future climate projection may turn out to be optimistic. In addition, both groups of models show declining rainfall over China, also relating to local aerosol mechanisms. We hypothesize that aerosol emissions over China are large enough, in the CMIP5 models, to cause declining monsoon rainfall even in the absence of indirect aerosol effects. The same is not true for India.
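A minimal sketch of this style of comparison in Python with xarray, assuming hypothetical file names, a precipitation variable `pr`, and coordinates `lat`, `lon`, `time` (all placeholders rather than actual CMIP5 archive paths):

```python
# Illustrative only: compare JJA South Asian rainfall trends in single-forcing
# vs. all-forcings historical runs. File and variable names are hypothetical.
import numpy as np
import xarray as xr

def jja_rainfall_trend(path, lat=slice(5, 30), lon=slice(70, 90)):
    """Linear trend (per decade) of area-mean JJA precipitation."""
    pr = xr.open_dataset(path)["pr"].sel(lat=lat, lon=lon)
    pr = pr.where(pr.time.dt.season == "JJA", drop=True)
    weights = np.cos(np.deg2rad(pr["lat"]))        # crude area weighting
    series = pr.weighted(weights).mean(dim=("lat", "lon"))
    annual = series.groupby("time.year").mean()
    slope = np.polyfit(annual["year"].values, annual.values, 1)[0]
    return 10.0 * slope                            # units per decade

for run in ["historicalGHG.nc", "historicalAA.nc", "historical.nc"]:
    print(run, jja_rainfall_trend(run))
```

The cosine-latitude weighting stands in for proper grid-cell areas, and the latitude-longitude box is only a rough proxy for the South Asian monsoon region.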

Relevance:

100.00%

Publisher:

Abstract:

An anthology, comprising introduction, text, translation, and notes, of Britain's most ancient surviving poetry, written in Latin and Greek and presented with an English translation.

Relevance:

100.00%

Publisher:

Abstract:

More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed to nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment that meets the needs of UK researchers so that they can respond agilely to challenges, create knowledge and skills, and lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, together with a new wave of innovations in digital-information technologies that enabled them. The Strategy argues that the UK must now harness and leverage its own, and the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community that exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader in well-curated national data assets and computational infrastructure, expertly used to shape policy, support decisions, empower researchers and roll out the results for the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.

Relevance:

100.00%

Publisher:

Abstract:

The response of stratospheric climate and circulation to increasing amounts of greenhouse gases (GHGs) and ozone recovery in the twenty-first century is analyzed in simulations of 11 chemistry–climate models using near-identical forcings and experimental setup. In addition to an overall global cooling of the stratosphere in the simulations (0.59 ± 0.07 K decade⁻¹ at 10 hPa), ozone recovery causes a warming of the Southern Hemisphere polar lower stratosphere in summer with enhanced cooling above. The rate of warming correlates with the rate of ozone recovery projected by the models and, on average, changes from 0.8 to 0.48 K decade⁻¹ at 100 hPa as the rate of recovery declines from the first to the second half of the century. In the winter northern polar lower stratosphere the increased radiative cooling from the growing abundance of GHGs is, in most models, balanced by adiabatic warming from stronger polar downwelling. In the Antarctic lower stratosphere the models simulate an increase in low temperature extremes required for polar stratospheric cloud (PSC) formation, but the positive trend is decreasing over the twenty-first century in all models. In the Arctic, none of the models simulates a statistically significant increase in Arctic PSCs throughout the twenty-first century. The subtropical jets accelerate in response to climate change and the ozone recovery produces a westward acceleration of the lower-stratospheric wind over the Antarctic during summer, though this response is sensitive to the rate of recovery projected by the models. There is a strengthening of the Brewer–Dobson circulation throughout the depth of the stratosphere, which reduces the mean age of air nearly everywhere at a rate of about 0.05 yr decade⁻¹ in those models with this diagnostic. On average, the annual mean tropical upwelling in the lower stratosphere (~70 hPa) increases by almost 2% decade⁻¹, with 59% of this trend forced by the parameterized orographic gravity wave drag in the models. This is a consequence of the eastward acceleration of the subtropical jets, which increases the upward flux of (parameterized) momentum reaching the lower stratosphere in these latitudes.
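As a hedged illustration of how a multi-model rate of the form "X ± Y K decade⁻¹" can be summarised (mean and spread of per-model linear trends), with a purely hypothetical dictionary of annual-mean series as input:

```python
# Illustrative only: per-model decadal trends, then multi-model mean and spread.
import numpy as np

def decadal_trend_summary(years, series_by_model):
    """years: 1D array; series_by_model: {model name: annual-mean values}."""
    slopes = np.array([10.0 * np.polyfit(years, s, 1)[0]      # K per decade
                       for s in series_by_model.values()])
    return slopes.mean(), slopes.std(ddof=1)                  # mean and spread
```

Whether a quoted uncertainty in a given study is an inter-model spread or a standard error is not specified here; the sketch only shows the mechanics.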

Relevance:

100.00%

Publisher:

Abstract:

The evolution of stratospheric ozone from 1960 to 2100 is examined in simulations from 14 chemistry-climate models, driven by prescribed levels of halogens and greenhouse gases. There is general agreement among the models that total column ozone reached a minimum around the year 2000 at all latitudes, a minimum that is projected to be followed by an increase over the first half of the 21st century. In the second half of the 21st century, ozone is projected to continue increasing, level off, or even decrease depending on the latitude. Separation into partial columns above and below 20 hPa reveals that these latitudinal differences are almost completely caused by differences in the model projections of ozone in the lower stratosphere. At all latitudes, upper stratospheric ozone increases throughout the 21st century and is projected to return to 1960 levels well before the end of the century, although there is a spread among models in the dates at which ozone returns to specific historical values. We find that decreasing halogens and declining upper-atmospheric temperatures, driven by increasing greenhouse gases, contribute almost equally to increases in upper stratospheric ozone. In the tropical lower stratosphere, an increase in upwelling causes a steady decrease in ozone through the 21st century, and total column ozone does not return to 1960 levels in most of the models. In contrast, lower stratospheric and total column ozone in middle and high latitudes increases during the 21st century, returning to 1960 levels well before the end of the century in most models.
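A hedged sketch of one way to estimate a "return to 1960 levels" date from a single model's ozone series; the arrays, units, and prior smoothing are assumptions, not outputs of the models above:

```python
# Illustrative only: first year after the ozone minimum at which a smoothed
# total-column ozone series regains its 1960 baseline value.
import numpy as np

def return_year(years, o3, baseline_year=1960):
    baseline = o3[years == baseline_year][0]
    i_min = int(np.argmin(o3))                     # column-ozone minimum (~year 2000)
    recovered = np.where(o3[i_min:] >= baseline)[0]
    return int(years[i_min + recovered[0]]) if recovered.size else None
```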

Relevance:

100.00%

Publisher:

Abstract:

Confidence in projections of global-mean sea level rise (GMSLR) depends on an ability to account for GMSLR during the twentieth century. There are contributions from ocean thermal expansion, mass loss from glaciers and ice sheets, groundwater extraction, and reservoir impoundment. Progress has been made toward solving the “enigma” of twentieth-century GMSLR, which is that the observed GMSLR has previously been found to exceed the sum of estimated contributions, especially for the earlier decades. The authors propose the following: thermal expansion simulated by climate models may previously have been underestimated because of their not including volcanic forcing in their control state; the rate of glacier mass loss was larger than previously estimated and was not smaller in the first half than in the second half of the century; the Greenland ice sheet could have made a positive contribution throughout the century; and groundwater depletion and reservoir impoundment, which are of opposite sign, may have been approximately equal in magnitude. It is possible to reconstruct the time series of GMSLR from the quantified contributions, apart from a constant residual term, which is small enough to be explained as a long-term contribution from the Antarctic ice sheet. The reconstructions account for the observation that the rate of GMSLR was not much larger during the last 50 years than during the twentieth century as a whole, despite the increasing anthropogenic forcing. Semiempirical methods for projecting GMSLR depend on the existence of a relationship between global climate change and the rate of GMSLR, but the implication of the authors' closure of the budget is that such a relationship is weak or absent during the twentieth century.
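A minimal sketch of the budget closure described above, assuming hypothetical annual time series (e.g. in mm) for each contribution; taking the residual as the time-mean misfit is purely illustrative:

```python
# Illustrative only: sum the quantified contributions to GMSLR and compute a
# constant residual, which the abstract suggests could reflect a long-term
# Antarctic contribution.
import numpy as np

def close_budget(observed, thermal, glaciers, greenland, groundwater, reservoirs):
    quantified = thermal + glaciers + greenland + groundwater + reservoirs
    residual = float(np.mean(observed - quantified))   # constant offset
    reconstruction = quantified + residual
    return reconstruction, residual
```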

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates the impact of aerosol forcing uncertainty on the robustness of estimates of the twentieth-century warming attributable to anthropogenic greenhouse gas emissions. Attribution analyses on three coupled climate models with very different sensitivities and aerosol forcing are carried out. The Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), Parallel Climate Model (PCM), and GFDL R30 models all provide good simulations of twentieth-century global mean temperature changes when they include both anthropogenic and natural forcings. Such good agreement could result from a fortuitous cancellation of errors, for example, by balancing too much (or too little) greenhouse warming with too much (or too little) aerosol cooling. Despite a very large uncertainty in estimates of the possible range of sulfate aerosol forcing obtained from measurement campaigns, results show that the spatial and temporal nature of observed twentieth-century temperature change constrains the component of past warming attributable to anthropogenic greenhouse gases to be significantly greater (at the 5% level) than the observed warming over the twentieth century. The cooling effects of aerosols are detected in all three models. Both spatial and temporal aspects of observed temperature change are responsible for constraining the relative roles of greenhouse warming and sulfate cooling over the twentieth century. This is because there are distinctive temporal structures in differential warming rates between the hemispheres, between land and ocean, and between mid- and low latitudes. As a result, consistent estimates of warming attributable to greenhouse gas emissions are obtained from all three models, and predictions are relatively robust to the use of more or less sensitive models. The transient climate response following a 1% yr⁻¹ increase in CO2 is estimated to lie between 2.2 and 4 K century⁻¹ (5-95 percentiles).
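A worked conversion may help to interpret this rate: under a 1% yr⁻¹ compound increase, CO2 doubles after about ln 2 / ln 1.01 ≈ 70 years, so a transient response of 2.2-4 K century⁻¹ corresponds to roughly 1.5-2.8 K of warming at the time of CO2 doubling (an illustrative conversion rather than a figure quoted in the study itself).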

Relevance:

100.00%

Publisher:

Abstract:

A suite of climate model experiments indicates that 20th century increases in ocean heat content and sea level (via thermal expansion) were substantially reduced by the 1883 eruption of Krakatoa. The volcanically induced cooling of the ocean surface is subducted into deeper ocean layers, where it persists for decades. Temporary reductions in ocean heat content associated with the comparable eruptions of El Chichón (1982) and Pinatubo (1991) were much shorter lived because they occurred relative to a non-stationary background of large, anthropogenically forced ocean warming. Our results suggest that inclusion of the effects of Krakatoa (and perhaps even earlier eruptions) is important for reliable simulation of 20th century ocean heat uptake and thermal expansion. Inter-model differences in the oceanic thermal response to Krakatoa are large and arise from differences in external forcing, model physics, and experimental design. Systematic experimentation is required to quantify the relative importance of these factors. The next generation of historical forcing experiments may require more careful treatment of pre-industrial volcanic aerosol loadings.

Relevance:

100.00%

Publisher:

Abstract:

We separate and quantify the sources of uncertainty in projections of regional (~2,500 km) precipitation changes for the twenty-first century using the CMIP3 multi-model ensemble, allowing a direct comparison with a similar analysis for regional temperature changes. For decadal means of seasonal mean precipitation, internal variability is the dominant uncertainty for predictions of the first decade everywhere, and for many regions until the third decade ahead. Model uncertainty is generally the dominant source of uncertainty for longer lead times. Scenario uncertainty is found to be small or negligible for all regions and lead times, apart from close to the poles at the end of the century. For the global mean, model uncertainty dominates at all lead times. The signal-to-noise ratio (S/N) of the precipitation projections is highest at the poles but less than 1 almost everywhere else, and is far lower than for temperature projections. In particular, the tropics have the highest S/N for temperature, but the lowest for precipitation. We also estimate a 'potential S/N' by assuming that model uncertainty could be reduced to zero, and show that, for regional precipitation, the gains in S/N are fairly modest, especially for predictions of the next few decades. This finding suggests that adaptation decisions will need to be made in the context of high uncertainty concerning regional changes in precipitation. The potential to narrow uncertainty in regional temperature projections is far greater. These conclusions on S/N are for the current generation of models; the real signal may be larger or smaller than the CMIP3 multi-model mean. Also note that the S/N for extreme precipitation, which is more relevant for many climate impacts, may be larger than for the seasonal mean precipitation considered here.
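A hedged sketch of this style of variance partition, using a simplified decomposition into model, scenario and internal components; the nested dictionary of smoothed projection anomalies and the externally estimated internal variance are hypothetical inputs, and published methods differ in weighting and in how the signal is defined:

```python
# Illustrative only: simplified partition of projection uncertainty and S/N.
import numpy as np

def signal_to_noise(proj, internal_var):
    """proj[scenario][model] -> smoothed anomaly series (equal-length 1D arrays)."""
    runs = np.array([proj[s][m] for s in proj for m in proj[s]])
    signal = runs.mean(axis=0)                                   # multi-scenario, multi-model mean
    model_var = np.mean([np.var([proj[s][m] for m in proj[s]], axis=0)
                         for s in proj], axis=0)                 # spread across models
    scen_means = [np.mean([proj[s][m] for m in proj[s]], axis=0) for s in proj]
    scen_var = np.var(scen_means, axis=0)                        # spread across scenario means
    total = model_var + scen_var + internal_var                  # internal_var estimated separately
    return signal / np.sqrt(total)
```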

Relevance:

100.00%

Publisher:

Abstract:

On the time scale of a century, the Atlantic thermohaline circulation (THC) is sensitive to the global surface salinity distribution. The advection of salinity toward the deep convection sites of the North Atlantic is one of the driving mechanisms for the THC. There are both northward and southward contributions. The northward salinity advection (Nsa) is related to the evaporation in the subtropics and contributes to increased salinity in the convection sites. The southward salinity advection (Ssa) is related to the Arctic freshwater forcing and tends, on the contrary, to diminish salinity in the convection sites. The THC changes result from a delicate balance between these opposing mechanisms. In this study we evaluate these two effects using the IPSL-CM4 ocean-atmosphere-sea-ice coupled model (used for IPCC AR4). Perturbation experiments have been integrated for 100 years under modern insolation and trace gases. River runoff and evaporation minus precipitation are successively set to zero for the ocean during the coupling procedure. This allows the effects of processes Nsa and Ssa to be estimated with their specific time scales. It is shown that the convection sites in the North Atlantic exhibit various sensitivities to these processes. The Labrador Sea exhibits a dominant sensitivity to local forcing and Ssa with a typical time scale of 10 years, whereas the Irminger Sea is mostly sensitive to Nsa with a 15-year time scale. The GIN Seas respond to both effects, with a time scale of 10 years for Ssa and 20 years for Nsa. It is concluded that, in the IPSL-CM4, the global freshwater forcing damps the THC on centennial time scales.

Relevance:

100.00%

Publisher: