63 results for American--20th century
Abstract:
Queensland experiences considerable inter-annual and decadal rainfall variability, which impacts water-resource management, agriculture and infrastructure. To understand the mechanisms by which large-scale atmospheric and coupled air–sea processes drive these variations, empirical orthogonal teleconnection (EOT) analysis is applied to 1900–2010 seasonal Queensland rainfall. Fields from observations and the 20th Century Reanalysis are regressed onto the EOT time series to associate the EOTs with large-scale drivers. In winter, spring and summer the leading, state-wide EOTs are highly correlated with the El Niño–Southern Oscillation (ENSO); the Inter-decadal Pacific Oscillation modulates the summer ENSO teleconnection. In autumn, the leading EOT is associated with locally driven, late-season monsoon variations, while ENSO affects only tropical northern Queensland. Examining EOTs beyond the first, southeastern Queensland and the Cape York Peninsula emerge as regions of coherent rainfall variability. In the southeast, rainfall anomalies respond to the strength and moisture content of onshore easterlies, controlled by Tasman Sea blocking. The summer EOT associated with onshore flow and blocking has been negative since 1970, consistent with the observed decline in rainfall along the heavily populated coast. The southeastern Queensland EOTs show considerable multi-decadal variability, which is independent of large-scale drivers. Summer rainfall in Cape York is associated with tropical-cyclone activity.
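A minimal sketch of the EOT procedure summarised above, using synthetic data and assumed array shapes (the variable names and the usage example are illustrative, not the authors' code): each iteration picks the grid point whose rainfall time series explains the most domain-wide variance, keeps that series as the EOT mode, and removes its linear contribution from every point before searching again. Reanalysis fields (e.g. SLP or SST) would then be regressed onto each EOT time series to identify the large-scale drivers.

```python
import numpy as np

def eot_analysis(rain, n_modes=4):
    """Iterative EOT decomposition of a (n_times, n_points) rainfall array."""
    resid = rain - rain.mean(axis=0)                  # work with anomalies
    modes = []
    for _ in range(n_modes):
        var_j = resid.var(axis=0)                     # variance at each grid point
        cov = resid.T @ resid / resid.shape[0]        # point-by-point covariance matrix
        # Domain-total variance explained by regressing every point on point j
        explained = (cov ** 2).sum(axis=0) / np.where(var_j > 0, var_j, np.nan)
        base = int(np.nanargmax(explained))           # base point of this EOT mode
        ts = resid[:, base].copy()                    # the EOT time series
        # Remove the part of every point's series explained by the base point
        resid = resid - np.outer(ts, cov[base] / var_j[base])
        modes.append((base, ts))
    return modes

# Synthetic usage: 111 seasons (1900-2010) at 200 grid points
rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 30.0, size=(111, 200))
for k, (base, ts) in enumerate(eot_analysis(rain, n_modes=3), start=1):
    print(f"EOT {k}: base point {base}, time-series variance {ts.var():.1f}")
```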
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture, applying tools taken from the knowledge-management industry to organize the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom.
Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.
About the works and artists:
Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com
John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org
The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th century conceptual art around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com
Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com
Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk
FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together, Chaney and Hernly developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk
Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk
Abstract:
Analysis of 20th century simulations from the High-resolution Global Environment Model (HiGEM) and the Third Coupled Model Intercomparison Project (CMIP3) models shows that most have a cold sea-surface temperature (SST) bias in the northern Arabian Sea during boreal winter. The association between Arabian Sea SST and the South Asian monsoon has been widely studied in observations and models, with winter cold biases known to be detrimental to rainfall simulation during the subsequent monsoon in coupled general circulation models (GCMs). However, the causes of these SST biases are not well understood; indeed, this is one of the first papers to address the causes of the cold biases. The models show anomalously strong north-easterly winter monsoon winds and cold air temperatures in north-west India, Pakistan and beyond. This leads to the anomalous advection of cold, dry air over the Arabian Sea. The cold land region is also associated with an anomalously strong meridional surface temperature gradient during winter, contributing to the enhanced low-level convergence and excessive precipitation over the western equatorial Indian Ocean seen in many models.
Abstract:
At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has been going on, practically uninterrupted, since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the latest El Niño event, in 1997/1998, is an example of such a successful prediction. The great achievement is due to a number of factors, including the progress in computational technology and the establishment of global observing systems, combined with a systematic research program with an overall strategy towards building comprehensive prediction systems for climate and weather. In this article, I will discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to exploiting the available computing power efficiently and to using observations from new types of observing systems. Weather prediction is not an exact science, due to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, and probably more so the longer the forecast is. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I will outline the way I think development may proceed in the future.
Abstract:
Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty, and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.
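As a rough illustration of the comparison described above, the sketch below contrasts the spread of a projected Arctic change across a structural (multimodel) ensemble with the spread across a perturbed-parameter ensemble, and checks whether each member's simulated 20th century mean state predicts its projection. All numbers are synthetic placeholders; the ensemble sizes, values and relationships are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic placeholders for projected Arctic warming at CO2 doubling (K)
cmip3_warming = rng.normal(7.0, 2.0, size=17)   # structural (multimodel) ensemble
qump_warming = rng.normal(7.0, 1.5, size=16)    # perturbed-parameter ensemble

# Ensemble spread as a simple measure of projection uncertainty
print("structural spread (K):", round(cmip3_warming.std(ddof=1), 2))
print("parameter spread  (K):", round(qump_warming.std(ddof=1), 2))

# Emergent-constraint-style check: does a member's simulated 20th century mean
# state (here a stand-in for Arctic sea-ice volume) predict its projection?
ice_volume = 25.0 - 1.5 * qump_warming + rng.normal(0.0, 1.0, size=16)
print("corr(mean state, projected warming):",
      round(float(np.corrcoef(ice_volume, qump_warming)[0, 1]), 2))
```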
Abstract:
Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one-thousand-year-long, idealized 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the responses to the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the pre-industrial portions of the last millennium simulations are used to assess historical model carbon–climate feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.
Abstract:
The Walker circulation is one of the major components of the large-scale tropical atmospheric circulation, and variations in its strength are critical to equatorial Pacific Ocean circulation. It has been argued in the literature that during the 20th century the Walker circulation weakened, and that this weakening was attributable to anthropogenic climate change. Using updated observations, we show that there has been a rapid interdecadal enhancement of the Walker circulation since the late 1990s. Associated with this enhancement are enhanced precipitation in the tropical western Pacific, anomalous westerlies in the upper troposphere, descent in the central and eastern tropical Pacific, and anomalous surface easterlies in the western and central tropical Pacific. The characteristics of the associated oceanic changes are a strengthened thermocline slope and an enhanced zonal SST gradient across the tropical Pacific. Many characteristics of these changes are similar to those associated with the mid-1970s climate shift, but with the opposite sign. We also show that the interdecadal variability of the Walker circulation in the tropical Pacific is inversely correlated with the interdecadal variability of the zonal circulation in the tropical Atlantic. An enhancement of the Walker circulation in the tropical Pacific is associated with a weakening zonal circulation in the tropical Atlantic and vice versa, implying an Atlantic–Pacific connection of the zonal overturning circulation variations. Whether these recent changes will be sustained is not yet clear, but our research highlights the importance of understanding the interdecadal variability, as well as the long-term trends, that influence tropical circulation.
Abstract:
Global wetlands are believed to be climate sensitive and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol, driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods for calculating wetland size and location, with some simulating wetland area prognostically, while others relied on remotely sensed inundation datasets or an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, globally spatially uniform), on average, the models decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, globally spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently do not have wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
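For orientation, the quoted ±40% spread about the all-model mean corresponds roughly to the range computed below (simple arithmetic on the figures given in the abstract):

```python
mean_emission = 190.0              # all-model mean annual emission, Tg CH4 per year
spread = 0.40 * mean_emission      # models vary by roughly +/-40% of that mean
print(f"approximate inter-model range: "
      f"{mean_emission - spread:.0f}-{mean_emission + spread:.0f} Tg CH4 per year")
```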
Abstract:
Cities have developed into the hotspots of human economic activity. From the appearance of the first cities in the Neolithic to the 21st century metropolis, their impact on the environment has always been apparent. With more people now living in cities than in rural environments, it becomes crucial to understand these environmental impacts. With the emergence of megacities in the 20th century and their continued growth in both population and economic power, the environmental impact has reached the global scale. In this paper we examine megacity impacts on atmospheric composition and climate. We present basic concepts, discuss various definitions of footprints, summarize research on megacity impacts, and assess the impact of megacity emissions on air quality and on climate at the regional to global scale. The intention and ambition of this paper is to give a comprehensive but brief overview of the science with regard to megacities and the environment.
Abstract:
Mass loss by glaciers has been an important contributor to sea level rise in the past, and is projected to contribute a substantial fraction of total sea level rise during the 21st century. Here, we use a model of the world's glaciers to quantify equilibrium sensitivities of global glacier mass to climate change, and to investigate the role of changes in glacier hypsometry for long-term mass changes. We find that 21st century glacier-mass loss is largely governed by the glacier's response to 20th century climate change. This limits the influence of 21st century climate change on glacier-mass loss, and explains why there are relatively small differences in glacier-mass loss under greatly different scenarios of climate change. The projected future changes in both temperature and precipitation experienced by glaciers are amplified relative to the global average. The projected increase in precipitation partly compensates for the mass loss caused by warming, but this compensation is negligible at higher temperature anomalies since an increasing fraction of precipitation at the glacier sites is liquid. Loss of low-lying glacier area, and more importantly, eventual complete disappearance of glaciers, strongly limit the projected sea level contribution from glaciers in coming centuries. The adjustment of glacier hypsometry to changes in the forcing strongly reduces the rates of global glacier-mass loss caused by changes in global mean temperature compared to rates of mass loss when hypsometric changes are neglected. This result is a second reason for the relatively weak dependence of glacier-mass loss on future climate scenario, and helps explain why glacier-mass loss in the first half of the 20th century was of the same order of magnitude as in the second half of the 20th century, even though the rate of warming was considerably smaller.
Abstract:
Results from all phases of the orbits of the Ulysses spacecraft have shown that the magnitude of the radial component of the heliospheric field is approximately independent of heliographic latitude. This result allows the use of near-Earth observations to compute the total open flux of the Sun. For example, using satellite observations of the interplanetary magnetic field, the average open solar flux was shown to have risen by 29% between 1963 and 1987, and using the aa geomagnetic index it was found to have doubled during the 20th century. It is therefore important to assess fully the accuracy of the result and to check that it applies to all phases of the solar cycle. The first perihelion pass of the Ulysses spacecraft was close to sunspot minimum, and recent data from the second perihelion pass show that the result also holds at solar maximum. The high level of correlation between the open flux derived from the various methods strongly supports the Ulysses discovery that the radial field component is independent of latitude. We show here that the errors introduced into open solar flux estimates by assuming that the heliospheric field's radial component is independent of latitude are similar for the two passes and are of order 25% for daily values, falling to 5% for averaging timescales of 27 days or greater. We compare here the results of four methods for estimating the open solar flux with results from the first and second perihelion passes by Ulysses. We find that the errors are lowest (1–5% for averages over the entire perihelion passes, lasting near 320 days) for near-Earth methods, based on either interplanetary magnetic field observations or the aa geomagnetic activity index. The corresponding errors for the Solanki et al. (2000) model are of the order of 9–15%, and for the PFSS method, based on solar magnetograms, they are of the order of 13–47%. The model of Solanki et al. is based on the continuity equation of open flux, and uses the sunspot number to quantify the rate of open flux emergence. It predicts that the average open solar flux has been decreasing since 1987, as is observed in the variation of all the estimates of the open flux. This decline combines with the solar cycle variation to produce an open flux during the second (sunspot maximum) perihelion pass of Ulysses which is only slightly larger than that during the first (sunspot minimum) perihelion pass.
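Given the latitude-independence result described above, the unsigned magnetic flux threading a heliocentric sphere of radius r is 4π r² ⟨|Br|⟩, and the open flux of a single polarity is half of that. The sketch below evaluates this for a near-Earth radial field of 2.5 nT; the field value is an assumption chosen for illustration, not a figure from the paper.

```python
import math

AU = 1.496e11        # mean Sun-Earth distance, metres
br = 2.5e-9          # illustrative near-Earth radial field magnitude, tesla (2.5 nT)

# If |Br| is independent of heliographic latitude (the Ulysses result), the
# unsigned flux threading a sphere of radius 1 AU follows directly, and the
# open flux of a single polarity is half of it.
unsigned_flux = 4.0 * math.pi * AU**2 * br
open_flux = unsigned_flux / 2.0
print(f"unsigned flux through 1 AU sphere: {unsigned_flux:.2e} Wb")
print(f"open solar flux (one polarity):    {open_flux:.2e} Wb")
```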
Abstract:
This study examines the atmospheric circulation patterns and surface features associated with the seven coldest winters in the UK since 1870, using the 20th Century Reanalysis. Six of these winters are outside the scope of previous reanalysis datasets; we examine them here for the first time. All winters show a marked lack of the climatological southwesterly flow over the UK, displaying easterly and northeasterly anomalies. Six of the seven winters (all except 1890) were associated with a negative phase of the North Atlantic Oscillation; 1890 was characterised by a blocking anticyclone over and northeast of the UK.
Abstract:
As we enter an era of ‘big data’, asset information is becoming a deliverable of complex projects. Prior research suggests digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change at Airbus, CERN and Crossrail, through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data-sets, and use configuration management, a systems engineering approach with mid-20th century origins, to establish and maintain integrity. In them, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems, using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management, the potential use of analytics on complex projects, and implications for research and practice.
Abstract:
The intensification of agriculture and the development of synthetic insecticides enabled worldwide grain production to more than double in the last third of the 20th century. However, the heavy dependence on and, in some cases, overuse of insecticides have been responsible for negative environmental and ecological impacts across the globe, such as a reduction in biodiversity, insect resistance to pesticides, negative effects on non-target species (e.g. natural enemies) and the development of secondary pests. The use of recombinant DNA technology to develop genetically engineered (GE) insect-resistant crops could mitigate many of the negative side effects of pesticides. One such genetic alteration enables crops to express toxic crystalline (Cry) proteins from the soil bacterium Bacillus thuringiensis (Bt). Despite the widespread adoption of Bt crops, there are still a range of unanswered questions concerning longer-term agro-ecosystem interactions. For instance, insect species that are not susceptible to the expressed toxin can develop into secondary pests and cause significant damage to the crop. Here we review the main causes of secondary pest dynamics in Bt crops and the impact of such outbreaks. Regardless of the causes, if non-susceptible secondary pest populations exceed economic thresholds, insecticide spraying could become the immediate solution at farmers’ disposal, and the sustainable use of this genetic modification technology may be in jeopardy. Based on the literature, recommendations for future research are outlined that will help to improve knowledge of the possible long-term ecological trophic interactions of employing this technology.
Abstract:
In the 20th century, the scholarly study of human relationships grew dramatically and, at the same time, fragmented into various disciplines and subdisciplines. Although diversity of thought is generally considered helpful for the evolution of scientific fields, the value accrued from interdisciplinary discourse depends on the ability of scholars to integrate multiple perspectives and synthesize foundational works in a systematic manner. The goal of this study is to synthesize foundational theories from the social and behavioral sciences that have contributed to an understanding of relationship marketing. In seeking to provide a holistic understanding of the field, we incorporate contributions from the disciplines of marketing, management, psychology, and sociology. Building on our analysis, we synthesize our findings into a conceptual model that examines the systematic dimensions of relationship marketing. The article concludes by identifying key themes for contributors to the Journal of Relationship Marketing to consider going forward.