940 results for American--20th century


Relevance: 80.00%

Abstract:

This essay appears in the first book to examine feminist curatorship over the last 40 years. It undertakes an extended reading of Cathy de Zegher's influential exhibition, Inside the Visible: An Elliptical Traverse of 20th Century Art in, of and from the Feminine (1995), which proposed that modern art should be understood through cyclical shifts involving the constant reinvention of artistic method, and which identified four key moments in 20th century history to structure its project. The essay analyses Inside the Visible's concept of an elliptical traverse to raise questions about repetitions and recurrences in feminist exhibitions of the early 1980s, the mid-1990s and 2007, asking whether and in what ways questions of feminist curating have been continuously repeated and reinvented. The essay argues that Inside the Visible was a key project in second-wave feminism and exemplified debates about 'women's time', first theorised by Julia Kristeva. It concludes, however, that 'women's time' has had its moment, and that new conceptions of feminism and its history are needed if feminist curating is not endlessly to recycle its past. The essay informs a wider collaborative project on the sexual politics of violence, feminism and contemporary art, undertaken with Edinburgh and one of the editors of this collection.

Relevance: 80.00%

Abstract:

The observed decline in summer sea ice extent since the 1970s is predicted to continue until the Arctic Ocean is seasonally ice free during the 21st century. This will lead to a much-perturbed Arctic climate with large changes in ocean surface energy flux. Svalbard, located on the present-day sea ice edge, contains many low-lying ice caps and glaciers and is expected to experience rapid warming over the 21st century. The total sea level rise if all the land ice on Svalbard were to melt completely is 0.02 m. The purpose of this study is to quantify the impact of climate change on Svalbard's surface mass balance (SMB) and to determine, in particular, what proportion of the projected changes in precipitation and SMB results from changes to the Arctic sea ice cover. To investigate this, a regional climate model was forced with monthly mean climatologies of sea surface temperature (SST) and sea ice concentration for the periods 1961–1990 and 2061–2090 under two emissions scenarios. In a novel forcing experiment, 20th century SSTs and 21st century sea ice were used to force one simulation, in order to isolate the role of sea ice forcing. This experiment results in a 3.5 m water equivalent increase in Svalbard's SMB compared to the present day. This is because over 50% of the projected increase in winter precipitation over Svalbard under the A1B emissions scenario is due to an increase in lower-atmosphere moisture content associated with evaporation from the ice-free ocean. These results indicate that increases in precipitation due to sea ice decline may act to moderate mass loss from Svalbard's glaciers due to future Arctic warming.
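
The forcing design amounts to a small factorial set of SST and sea ice boundary conditions. A minimal sketch of this protocol, with hypothetical experiment labels (the paper does not name them this way):

    # Hypothetical labels for the SST / sea ice forcing combinations described above.
    experiments = {
        "control":  {"sst": "1961-1990", "sea_ice": "1961-1990"},
        "future":   {"sst": "2061-2090", "sea_ice": "2061-2090"},
        "ice_only": {"sst": "1961-1990", "sea_ice": "2061-2090"},  # the novel mixed forcing
    }

    def sea_ice_contribution(smb):
        """Fraction of the total SMB change attributable to sea ice decline,
        given a dict of simulated SMB values keyed by experiment label."""
        total = smb["future"] - smb["control"]
        ice = smb["ice_only"] - smb["control"]
        return ice / total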

Relevance: 80.00%

Abstract:

Queensland experiences considerable interannual and decadal rainfall variability, which impacts water-resource management, agriculture and infrastructure. To understand the mechanisms by which large-scale atmospheric and coupled air–sea processes drive these variations, empirical orthogonal teleconnection (EOT) analysis is applied to seasonal Queensland rainfall for 1900–2010. Fields from observations and the 20th Century Reanalysis are regressed onto the EOT time series to associate the EOTs with large-scale drivers. In winter, spring and summer the leading, state-wide EOTs are highly correlated with the El Niño–Southern Oscillation (ENSO); the Inter-decadal Pacific Oscillation modulates the summer ENSO teleconnection. In autumn, the leading EOT is associated with locally driven, late-season monsoon variations, while ENSO affects only tropical northern Queensland. Examining EOTs beyond the first, southeastern Queensland and the Cape York Peninsula emerge as regions of coherent rainfall variability. In the southeast, rainfall anomalies respond to the strength and moisture content of onshore easterlies, controlled by Tasman Sea blocking. The summer EOT associated with onshore flow and blocking has been negative since 1970, consistent with the observed decline in rainfall along the heavily populated coast. The southeastern Queensland EOTs show considerable multi-decadal variability that is independent of large-scale drivers. Summer rainfall in Cape York is associated with tropical-cyclone activity.
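
EOT analysis itself is a simple iterative regression: at each step, choose the grid point whose rainfall time series explains the most variance summed over all other points, take that series as the EOT, regress it out of the whole field, and repeat. A minimal numpy sketch of this idea (an illustration, not the authors' code):

    import numpy as np

    def eot_analysis(field, n_modes=3):
        """field: rainfall anomalies of shape (time, points).
        Returns EOT time series and their base-point indices."""
        X = field - field.mean(axis=0)              # anomalies at each point
        series, bases = [], []
        for _ in range(n_modes):
            cov = X.T @ X                           # unnormalised covariance (points, points)
            var = np.diag(cov)
            # variance explained at all points by regressing on each candidate base point
            score = (cov**2 / np.where(var > 0, var, np.inf)).sum(axis=0)
            b = int(np.argmax(score))               # base point explaining most variance
            t = X[:, b].copy()                      # the EOT time series
            beta = (X.T @ t) / (t @ t)              # slope of every point's series on t
            X -= np.outer(t, beta)                  # remove this mode, then iterate
            series.append(t)
            bases.append(b)
        return np.array(series), bases

Regressing external fields (here, the 20th Century Reanalysis) onto the resulting time series then ties each EOT to candidate large-scale drivers such as ENSO.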

Relevance: 80.00%

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking that accompanied the rise of the Internet in the 1990s. The cyber-gurus of the past decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or a semi-private one (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world according to (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but many of these were reluctant to let go of the fantasy of digital freedom.

Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or as minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, and a conceptual text piece.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of the 1990s art group Bank, is an artist, curator and writer whose work explores the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Nobel opens up a dialogue between the contemporary and the legacy of 20th century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together they have developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk

Relevance: 80.00%

Abstract:

Analysis of 20th century simulations by the High-resolution Global Environment Model (HiGEM) and the Third Coupled Model Intercomparison Project (CMIP3) models shows that most have a cold sea-surface temperature (SST) bias in the northern Arabian Sea during boreal winter. The association between Arabian Sea SST and the South Asian monsoon has been widely studied in observations and models, and winter cold biases are known to be detrimental to rainfall simulation during the subsequent monsoon in coupled general circulation models (GCMs). However, the causes of these SST biases are not well understood; indeed, this is one of the first papers to address them. The models show anomalously strong north-easterly winter monsoon winds and cold air temperatures over north-west India, Pakistan and beyond, leading to anomalous advection of cold, dry air over the Arabian Sea. The cold land region is also associated with an anomalously strong meridional surface temperature gradient during winter, contributing to the enhanced low-level convergence and excessive precipitation over the western equatorial Indian Ocean seen in many models.

Relevance: 80.00%

Abstract:

At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has continued practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the latest El Niño event in 1997/1998 is an example of such a successful prediction. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research programme directed towards building comprehensive prediction systems for climate and weather. In this article, I will discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to exploiting the available computing power efficiently and to using observations from new types of observing systems. Weather prediction is not an exact science, owing to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, probably more so the longer the forecast range is. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I will outline how I think development may proceed in the future.
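
The ensemble idea mentioned above is easy to illustrate on the Lorenz (1963) equations, a standard toy model of atmospheric chaos: integrate many copies of the model from slightly perturbed initial states and use the spread among members as a measure of forecast reliability. A self-contained sketch (illustrative only, not an operational scheme):

    import numpy as np

    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0/3.0):
        # One forward-Euler step of the Lorenz (1963) equations.
        x, y, z = state
        return state + dt * np.array([sigma*(y - x), x*(rho - z) - y, x*y - beta*z])

    rng = np.random.default_rng(0)
    truth = np.array([1.0, 1.0, 1.0])
    # An ensemble of initial states perturbed within an assumed observation error.
    ensemble = truth + 1e-3 * rng.standard_normal((20, 3))

    for _ in range(1500):                      # ~15 time units of integration
        truth = lorenz_step(truth)
        ensemble = np.array([lorenz_step(m) for m in ensemble])

    spread = ensemble.std(axis=0)              # large spread flags low predictability
    print("ensemble mean:", ensemble.mean(axis=0), "spread:", spread)

In an operational setting the initial perturbations are chosen far more carefully (for example, singular vectors or bred modes), but the principle is the same: ensemble spread quantifies confidence in the forecast.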

Relevance: 80.00%

Abstract:

Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty, and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic Ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities would lead to significant improvements in the precision of projections of future Arctic climate change.
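
One simple way to compare the two sources of uncertainty discussed here is to compute the spread of a projected quantity across each ensemble separately. A hedged sketch with invented numbers (the paper's actual values differ):

    import numpy as np

    # Hypothetical projected Arctic-mean warming (K) at CO2 doubling;
    # the values are illustrative, not taken from the paper.
    cmip3_multimodel = np.array([3.1, 4.0, 2.7, 5.2, 3.8, 4.5])   # structural variations
    hadcm3_perturbed = np.array([3.5, 4.2, 3.0, 4.9, 3.9, 4.4])   # parameter variations

    for name, ens in [("structural (CMIP3)", cmip3_multimodel),
                      ("parameter (THC-QUMP)", hadcm3_perturbed)]:
        print(f"{name}: mean {ens.mean():.1f} K, spread (1 s.d.) {ens.std(ddof=1):.2f} K")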

Relevance: 80.00%

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized 2× and 4× CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The surface air temperature response is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcing for some models. Finally, the pre-industrial portions of the last-millennium simulations are used to assess historical climate–carbon feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of the forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
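
The additivity result can be checked directly: sum the temperature responses from the single-forcing runs and compare with the all-forcings run. A sketch with invented numbers, purely to show the bookkeeping:

    # Illustrative additivity check (numbers are invented, not from the EMIC runs).
    single_forcing_dT = {          # 20th century warming (K) in single-forcing runs
        "solar": 0.05, "volcanic": -0.10, "CO2": 0.55,
        "other_GHG": 0.25, "land_use": -0.05, "sulphate": -0.20, "orbital": 0.00,
    }
    all_forcings_dT = 0.50          # warming in the run with every forcing included

    linear_sum = sum(single_forcing_dT.values())
    print(f"sum of single-forcing responses: {linear_sum:+.2f} K")
    print(f"all-forcings response:           {all_forcings_dT:+.2f} K")
    # Near-equality indicates a close-to-linear temperature response;
    # the carbon cycle response (not shown) need not add up this way.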

Relevance: 80.00%

Abstract:

The Walker circulation is one of the major components of the large-scale tropical atmospheric circulation, and variations in its strength are critical to equatorial Pacific Ocean circulation. It has been argued in the literature that during the 20th century the Walker circulation weakened, and that this weakening was attributable to anthropogenic climate change. Using updated observations, we show that there has been a rapid interdecadal enhancement of the Walker circulation since the late 1990s. Associated with this enhancement are enhanced precipitation in the tropical western Pacific, anomalous westerlies in the upper troposphere, descent in the central and eastern tropical Pacific, and anomalous surface easterlies in the western and central tropical Pacific. The associated oceanic changes are characterised by a strengthened thermocline slope and an enhanced zonal SST gradient across the tropical Pacific. Many characteristics of these changes are similar to those associated with the mid-1970s climate shift, but with the opposite sign. We also show that the interdecadal variability of the Walker circulation in the tropical Pacific is inversely correlated with the interdecadal variability of the zonal circulation in the tropical Atlantic: an enhancement of the Walker circulation in the tropical Pacific is associated with a weakening zonal circulation in the tropical Atlantic, and vice versa, implying an inter-Atlantic–Pacific connection of the zonal overturning circulation variation. Whether these recent changes will be sustained is not yet clear, but our research highlights the importance of understanding the interdecadal variability, as well as the long-term trends, that influence tropical circulation.

Relevance: 80.00%

Abstract:

Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol, driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or for global simulations. The models also varied in their methods of calculating wetland size and location: some models simulate wetland area prognostically, while others rely on remotely sensed inundation datasets or take an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models shows extensive disagreement in simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that determine wetland area independently. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, spatially uniform), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate because of extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
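
For orientation, the quoted ±40% spread about the 190 Tg CH4 yr−1 all-model mean corresponds to roughly 114–266 Tg CH4 yr−1:

    mean_emission = 190.0              # Tg CH4 per year, all-model mean
    spread = 0.40 * mean_emission      # the quoted +/- 40 % inter-model spread
    low, high = mean_emission - spread, mean_emission + spread
    print(f"range: {low:.0f} to {high:.0f} Tg CH4/yr")   # range: 114 to 266 Tg CH4/yr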

Relevance: 80.00%

Abstract:

Cities have developed into the hotspots of human economic activity. From the appearance of the first cities in the Neolithic to the 21st century metropolis, their impact on the environment has always been apparent. With more people now living in cities than in rural environments, it becomes crucial to understand these environmental impacts. With the emergence of megacities in the 20th century and their continued growth in both population and economic power, this environmental impact has reached the global scale. In this paper we examine megacity impacts on atmospheric composition and climate. We present basic concepts, discuss various definitions of footprints, summarize research on megacity impacts, and assess the impact of megacity emissions on air quality and on climate at regional to global scales. The intention and ambition of this paper is to give a comprehensive but brief overview of the science with regard to megacities and the environment.

Relevance: 80.00%

Abstract:

Mass loss by glaciers has been an important contributor to sea level rise in the past and is projected to contribute a substantial fraction of total sea level rise during the 21st century. Here, we use a model of the world's glaciers to quantify equilibrium sensitivities of global glacier mass to climate change, and to investigate the role of changes in glacier hypsometry for long-term mass changes. We find that 21st century glacier-mass loss is largely governed by the glaciers' response to 20th century climate change. This limits the influence of 21st century climate change on glacier-mass loss, and explains why there are relatively small differences in glacier-mass loss under greatly different climate change scenarios. The projected future changes in both temperature and precipitation experienced by glaciers are amplified relative to the global average. The projected increase in precipitation partly compensates for the mass loss caused by warming, but this compensation becomes negligible at higher temperature anomalies, since an increasing fraction of precipitation at the glacier sites falls as liquid. Loss of low-lying glacier area and, more importantly, the eventual complete disappearance of glaciers strongly limit the projected sea level contribution from glaciers in coming centuries. The adjustment of glacier hypsometry to changes in the forcing strongly reduces the rates of global glacier-mass loss caused by changes in global mean temperature, compared to the rates of mass loss when hypsometric changes are neglected. This result is a second reason for the relatively weak dependence of glacier-mass loss on future climate scenario, and helps explain why glacier-mass loss in the first half of the 20th century was of the same order of magnitude as in the second half, even though the rate of warming was considerably smaller.
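
The hypsometry effect can be made concrete with a toy glacier in which mass balance increases linearly with elevation: as the lowest (most negative balance) elevation bands melt away, the area-integrated loss rate falls, which is exactly the adjustment neglected in fixed-geometry estimates. A minimal sketch under assumed, illustrative numbers (not the authors' model):

    import numpy as np

    # Toy glacier: equal-area elevation bands, a linear mass-balance profile
    # b(z) = db_dz * (z - ELA), and an initial ice thickness per band.
    z = np.arange(250.0, 2250.0, 100.0)        # band mid-elevations (m)
    area = 1.0                                 # km^2 per band
    h = np.full(z.size, 50.0)                  # ice thickness (m w.e.) per band
    db_dz, ela = 0.007, 1500.0                 # balance gradient (1/yr) and ELA (m)

    for year in range(1, 301):
        b = db_dz * (z - ela)                  # m w.e. / yr, negative below the ELA
        h = np.maximum(h + b, 0.0)             # bands reaching h = 0 have disappeared
        losing = (h > 0) & (b < 0)
        loss_rate = -(b * area)[losing].sum()  # 10^6 m^3 w.e. / yr
        if year % 100 == 0:
            print(f"year {year}: {(h > 0).sum()} bands left, loss rate {loss_rate:.2f}")
    # The loss rate declines as low-lying bands vanish; holding the initial
    # geometry fixed would instead keep the initial, larger loss rate forever.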

Relevance: 80.00%

Abstract:

Results from all phases of the orbits of the Ulysses spacecraft have shown that the magnitude of the radial component of the heliospheric field is approximately independent of heliographic latitude. This result allows the use of near-Earth observations to compute the total open flux of the Sun. For example, using satellite observations of the interplanetary magnetic field, the average open solar flux was shown to have risen by 29% between 1963 and 1987, and using the aa geomagnetic index it was found to have doubled during the 20th century. It is therefore important to assess fully the accuracy of the result and to check that it applies to all phases of the solar cycle. The first perihelion pass of the Ulysses spacecraft was close to sunspot minimum, and recent data from the second perihelion pass show that the result also holds at solar maximum. The high level of correlation between the open flux derived from the various methods strongly supports the Ulysses discovery that the radial field component is independent of latitude. We show here that the errors introduced into open solar flux estimates by assuming that the heliospheric field's radial component is independent of latitude are similar for the two passes and are of order 25% for daily values, falling to 5% for averaging timescales of 27 days or greater. We compare the results of four methods for estimating the open solar flux with results from the first and second perihelion passes by Ulysses. We find that the errors are lowest (1–5% for averages over the entire perihelion passes, each lasting nearly 320 days) for near-Earth methods, based on either interplanetary magnetic field observations or the aa geomagnetic activity index. The corresponding errors for the Solanki et al. (2000) model are of the order of 9–15%, and for the PFSS method, based on solar magnetograms, of the order of 13–47%. The model of Solanki et al. is based on the continuity equation of open flux and uses the sunspot number to quantify the rate of open flux emergence. It predicts that the average open solar flux has been decreasing since 1987, as is observed in the variation of all the estimates of the open flux. This decline combines with the solar cycle variation to produce an open flux during the second (sunspot maximum) perihelion pass of Ulysses that is only slightly larger than that during the first (sunspot minimum) pass.
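
Given the Ulysses result that |Br| is independent of latitude, the total open flux follows from a single near-Earth measurement of the radial field via F = 4*pi*R^2*|Br|. A minimal sketch (note that some studies quote the signed flux of one magnetic polarity, which is half this unsigned total):

    import math

    def open_solar_flux(br_nT, r_m=1.496e11):
        """Unsigned open solar flux (Wb) through a heliocentric sphere of radius r_m,
        assuming |Br| is independent of heliographic latitude (the Ulysses result)."""
        return 4.0 * math.pi * r_m**2 * abs(br_nT) * 1e-9   # nT -> T

    # e.g. a typical near-Earth radial field magnitude of ~5 nT (illustrative value):
    print(f"{open_solar_flux(5.0):.2e} Wb")                 # ~1.41e+15 Wb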

Relevance: 80.00%

Abstract:

This study examines the atmospheric circulation patterns and surface features associated with the seven coldest winters in the UK since 1870, using the 20th Century Reanalysis. Six of these winters are outside the scope of previous reanalysis datasets; we examine them here for the first time. All winters show a marked lack of the climatological southwesterly flow over the UK, displaying easterly and northeasterly anomalies instead. Six of the seven winters (all except 1890) were associated with a negative phase of the North Atlantic Oscillation; 1890 was characterised by a blocking anticyclone over and to the northeast of the UK.

Relevance: 80.00%

Abstract:

As we enter an era of 'big data', asset information is becoming a deliverable of complex projects. Prior research suggests that digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change in Airbus, CERN and Crossrail through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data-sets, and use configuration management, a systems engineering approach with mid-20th century origins, to establish and maintain integrity. In them, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management, the potential use of analytics on complex projects, and implications for research and practice.