185 results for Topics of global scope
Abstract:
The importance of aerosol emissions for near-term climate projections is investigated by analysing simulations with the HadGEM2-ES model under two different emissions scenarios: RCP2.6 and RCP4.5. It is shown that the near-term warming projected under RCP2.6 is greater than under RCP4.5, even though the greenhouse gas forcing is lower. Rapid and substantial cuts in sulphate aerosol emissions, driven by reduced coal burning in RCP2.6, weaken the negative shortwave forcing due to aerosol direct and indirect effects. Indirect effects play an important role over the northern hemisphere oceans, especially the subtropical northeastern Pacific, where an anomaly of 5–10 W m⁻² develops. The pattern of surface temperature change is consistent with the expected response to this surface radiation anomaly, whilst also exhibiting features that reflect redistribution of energy, and feedbacks, within the climate system. These results demonstrate the importance of aerosol emissions as a key source of uncertainty in near-term projections of global and regional climate.
Abstract:
The Intergovernmental Panel on Climate Change Fourth Assessment Report, published in 2007, came to a more confident assessment of the causes of global temperature change than previous reports and concluded that ‘it is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent except Antarctica.’ Since then, warming over Antarctica has also been attributed to human influence, and further evidence has accumulated attributing a much wider range of climate changes to human activities. Such changes are broadly consistent with theoretical understanding, and with climate model simulations, of how the planet is expected to respond. This paper reviews this evidence from a regional perspective to reflect a growing interest in understanding the regional effects of climate change, which can differ markedly across the globe. We set out the methodological basis for detection and attribution and discuss the spatial scales on which it is possible to make robust attribution statements. We review the evidence showing significant human-induced changes in regional temperatures, and for the effects of external forcings on changes in the hydrological cycle, the cryosphere, circulation, the oceans, and extremes. We then discuss future challenges for the science of attribution. To better assess the pace of change, and to understand more about the regional changes to which societies need to adapt, we will need to refine our understanding of the effects of external forcing and internal variability.
Abstract:
Oxygen isotope records of stalagmites from China and Oman reveal a weak summer monsoon event, with a double-plunging structure, that started 8.21 ± 0.02 kyr B.P. An identical but antiphased pattern is also evident in two stalagmite records from eastern Brazil, indicating that the South American Summer Monsoon was intensified during the 8.2 kyr B.P. event. These records demonstrate that the event was of global extent and synchronous within dating errors of <50 years. In comparison with recent model simulations, it is plausible that the 8.2 kyr B.P. event can be tied to changes in the Atlantic Meridional Overturning Circulation triggered by a glacial lake draining event. This, in turn, affected North Atlantic climate and the latitudinal position of the Intertropical Convergence Zone, resulting in the observed low-latitude monsoonal precipitation patterns.
Abstract:
Large, well-documented wildfires have recently generated worldwide attention, and raised concerns about the impacts of humans and climate change on wildfire regimes. However, comparatively little is known about the patterns and driving forces of global fire activity before the twentieth century. Here we compile sedimentary charcoal records spanning six continents to document trends in both natural and anthropogenic biomass burning for the past two millennia. We find that global biomass burning declined from AD 1 to 1750, before rising sharply between 1750 and 1870. Global burning then declined abruptly after 1870. The early decline in biomass burning occurred in concert with a global cooling trend and despite a rise in the human population. We suggest the subsequent rise was linked to increasing human influences, such as population growth and land-use changes. Our compilation suggests that the final decline occurred despite increasing air temperatures and population. We attribute this reduction in the amount of biomass burned over the past 150 years to the global expansion of intensive grazing, agriculture and fire management.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, neither the scientific workforce nor the computational capability required to bring about such a revolution is available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution atmospheric and ocean models require for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such investigations, which are now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
The efficiency with which the oceans take up heat has a significant influence on the rate of global warming. Warming of the ocean above 700 m over the past few decades has been well documented. However, most of the ocean lies below 700 m. Here we analyse observations of heat uptake into the deep North Atlantic. We find that the extratropical North Atlantic as a whole warmed by 1.45 ± 0.5 × 10²² J between 1955 and 2005, but Lower North Atlantic Deep Water cooled, most likely as an adjustment from an early twentieth-century warm period. In contrast, the heat content of Upper North Atlantic Deep Water exhibited strong decadal variability. We demonstrate and quantify the importance of density-compensated temperature anomalies for long-term heat uptake into the deep North Atlantic. These anomalies form in the subpolar gyre and propagate equatorwards. High salinity in the subpolar gyre is a key requirement for this mechanism. In the past 50 years, suitable conditions have occurred only twice: first during the 1960s and again during the past decade. We conclude that heat uptake through density-compensated temperature anomalies will contribute to deep ocean heat uptake in the near term. In the longer term, the importance of this mechanism will be determined by competition between the multiple processes that influence subpolar gyre salinity in a changing climate.
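As a quick consistency check on the figure quoted above, the heat-content change can be converted into an average heating rate. The sketch below is simple arithmetic on the abstract's numbers; the seconds-per-year constant is the only added input.

```python
# Back-of-envelope check on the extratropical North Atlantic warming quoted
# above: 1.45 ± 0.5 × 10^22 J over 1955-2005. This is illustrative arithmetic
# on the abstract's figures, not a value taken from the paper itself.

SECONDS_PER_YEAR = 3.156e7           # approximate length of a year in seconds

delta_heat_joules = 1.45e22          # total heat-content change, 1955-2005
interval_years = 2005 - 1955         # 50-year observation window

mean_power_watts = delta_heat_joules / (interval_years * SECONDS_PER_YEAR)
print(f"Mean heating rate: {mean_power_watts:.2e} W")   # ~9.2e12 W (~9 TW)
```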
Abstract:
Different models for the electronic structure of carbon monoxide are suggested in influential textbooks. This makes the electronic structure of CO an interesting teaching subject, because it can be used as an example to relate seemingly conflicting concepts. Understanding the connections between ostensibly different methods, and between different concepts, related or conflicting, is important in academic study. The related reactivities of CO, O₂, and N₂ and the notation of molecular orbitals are topics of interest and are discussed in detail.
Abstract:
A synthesis of global climate model results and inferences from proxy records suggests an increased sea surface temperature gradient between the tropical Indian and Pacific Oceans during medieval times.
Abstract:
Radiative forcing is a useful tool for predicting equilibrium global temperature change. However, it is not so useful for predicting global precipitation changes, as changes in precipitation strongly depend on the climate change mechanism and how it perturbs the atmospheric and surface energy budgets. Here a suite of climate model experiments and radiative transfer calculations are used to quantify and assess this dependency across a range of climate change mechanisms. It is shown that the precipitation response can be split into two parts: a fast atmospheric response that strongly correlates with the atmospheric component of radiative forcing, and a slower response to global surface temperature change that is independent of the climate change mechanism, ∼2–3% per unit of global surface temperature change. We highlight the precipitation response to black carbon aerosol forcing as falling within this range despite having an equilibrium response that is of opposite sign to the radiative forcing and global temperature change.
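The two-part decomposition described in this abstract can be written as a single expression. The sketch below encodes it; the slow-response coefficient uses the ∼2–3% range quoted above, while the fast-response coefficient and the example inputs are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch of the fast/slow precipitation-response decomposition
# described above:  dP/P = k_fast * F_atm + k_slow * dT.
# k_slow (~2-3 %/K) is taken from the abstract; k_fast and the example
# forcing/temperature inputs below are hypothetical placeholders.

def precip_response_percent(f_atm_wm2: float, d_t_kelvin: float,
                            k_fast: float = -0.5,   # % per W m^-2, illustrative
                            k_slow: float = 2.5):   # % per K, from the abstract
    """Global-mean precipitation change (%) as fast + slow components."""
    fast = k_fast * f_atm_wm2        # rapid atmospheric adjustment to forcing
    slow = k_slow * d_t_kelvin       # surface-temperature-mediated response
    return fast + slow

# Example: a positive atmospheric forcing (e.g. an absorbing aerosol) can make
# the fast term offset the slow warming term, so the net response has the
# opposite sign to the temperature change, as the abstract notes for black carbon.
print(precip_response_percent(f_atm_wm2=2.0, d_t_kelvin=0.3))   # -0.25 (%)
```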
Abstract:
The extent and thickness of the Arctic sea ice cover have decreased dramatically in the past few decades, with minima in sea ice extent in September 2007 and 2011; climate models did not predict this decline. One of the processes poorly represented in sea ice models is the formation and evolution of melt ponds. Melt ponds form on Arctic sea ice during the melting season, and their presence affects the heat and mass balances of the ice cover, mainly by decreasing the surface albedo by up to 20%. We have developed a melt pond model suitable for forecasting the presence of melt ponds based on sea ice conditions. This model has been incorporated into the Los Alamos CICE sea ice model, the sea ice component of several IPCC climate models. Simulations for the period 1990 to 2007 are in good agreement with observed ice concentration. In comparison with simulations without ponds, the September ice volume is nearly 40% lower. Sensitivity studies within the range of uncertainty reveal that, of the parameters pertinent to the present melt pond parameterization and for our prescribed atmospheric and oceanic forcing, variations in optical properties and in the amount of snowfall have the strongest impact on sea ice extent and volume. We conclude that melt ponds will play an increasingly important role in the melting of the Arctic ice cover and that their incorporation in the sea ice component of global circulation models is essential for accurate future sea ice forecasts.
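To illustrate the albedo effect the abstract quantifies (a decrease of up to 20%), a minimal area-weighted mixing calculation is sketched below. The albedo values and pond fraction are assumed for illustration only; this is not the CICE melt pond parameterization itself.

```python
# Illustrative area-weighted surface albedo with melt ponds, following the
# abstract's statement that ponds can lower the albedo by up to ~20%.
# The bare-ice and pond albedos and the pond fraction are assumed values.

def surface_albedo(pond_fraction: float,
                   bare_ice_albedo: float = 0.65,   # assumed bare-ice value
                   pond_albedo: float = 0.25):      # assumed pond value
    """Grid-cell albedo as an area-weighted mix of ponded and bare ice."""
    return (1.0 - pond_fraction) * bare_ice_albedo + pond_fraction * pond_albedo

a_dry = surface_albedo(0.0)    # no ponds
a_wet = surface_albedo(0.3)    # 30% pond cover
print(f"Albedo drop: {100 * (a_dry - a_wet) / a_dry:.0f}%")   # ~18%, of order 20%
```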
Abstract:
We compare the strategies of manufacturing and service multinational enterprise (MNE) subsidiaries in South East Asia to investigate whether they follow global versus regional strategy. We examine foreign direct investment (FDI) motives, types of FDI, product and service offerings, and sales strategies of these two groups. Using a unique primary data set of 101 British MNE subsidiaries in six South East Asian countries over the five-year period (2003–2007), we find that manufacturing and service subsidiaries pursue regional strategies. Both groups have a strong regional focus in their sales. We explore the possible reasons for the relative lack of global strategy of these subsidiaries.
Abstract:
In order to influence global policy effectively, conservation scientists need to be able to provide robust predictions of the impact of alternative policies on biodiversity and measure progress towards goals using reliable indicators. We present a framework for using biodiversity indicators predictively to inform policy choices at a global level. The approach is illustrated with two case studies in which we project forwards the impacts of feasible policies on trends in biodiversity and in relevant indicators. The policies are based on targets agreed at the Convention on Biological Diversity (CBD) meeting in Nagoya in October 2010. The first case study compares protected area policies for African mammals, assessed using the Red List Index; the second example uses the Living Planet Index to assess the impact of a complete halt, versus a reduction, in bottom trawling. In the protected areas example, we find that the indicator can aid in decision-making because it is able to differentiate between the impacts of the different policies. In the bottom trawling example, the indicator exhibits some counter-intuitive behaviour, due to over-representation of some taxonomic and functional groups in the indicator, and contrasting impacts of the policies on different groups caused by trophic interactions. Our results support the need for further research on how to use predictive models and indicators to credibly track trends and inform policy. If their work is to be useful and relevant, scientists must make testable predictions about the impact of global policy on biodiversity to ensure that targets such as those set at Nagoya catalyse effective and measurable change.
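For context on how the second indicator responds to the scenarios above: the Living Planet Index is, at its core, a geometric mean of relative population changes. The sketch below is a simplified illustration of that core calculation only, not the full published LPI method (which adds taxonomic and regional weighting and trend smoothing); the over-representation issue the abstract notes arises from how populations are weighted in this mean.

```python
# Simplified Living-Planet-style index: the geometric mean of relative
# population changes across monitored populations. An unweighted illustration,
# not the full published LPI method; the input ratios are hypothetical.
import math

def simple_lpi(ratios):
    """Geometric mean of year-on-year population ratios (N_t / N_{t-1})."""
    log_mean = sum(math.log(r) for r in ratios) / len(ratios)
    return math.exp(log_mean)

# Hypothetical year-on-year ratios for five populations:
print(simple_lpi([0.95, 1.02, 0.80, 1.10, 0.97]))   # < 1 indicates decline
```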
Abstract:
In the last decade, a vast number of land surface schemes have been designed for use in global climate models, atmospheric weather prediction, mesoscale numerical models, ecological models, and models of global change. Since land surface schemes are designed for different purposes, they have various levels of complexity in the treatment of bare soil processes, vegetation, and soil water movement. This paper is a contribution to a small group of papers dealing with the intercomparison of differently designed and oriented land surface schemes. For that purpose we have chosen three schemes, classified according to Shao et al. (1995): i) for global climate models, BATS (Dickinson et al., 1986; Dickinson et al., 1992); ii) for mesoscale and ecological models, LEAF (Lee, 1992); and iii) for mesoscale models, LAPS (Mihailović, 1996; Mihailović and Kallos, 1997; Mihailović et al., 1999). These schemes were compared using surface flux and leaf temperature outputs obtained by time integrations of data sets derived from micrometeorological measurements above a maize field at an experimental site in De Sinderhoeve (The Netherlands) for 18 August, 8 September, and 4 October 1988. Finally, the comparison of the schemes was supported by a simple statistical analysis of the surface flux outputs.
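The abstract does not specify which statistics were used; a flux intercomparison of this kind typically reports bias and root-mean-square error against the observations. The sketch below shows those two standard metrics with hypothetical flux values.

```python
# Hedged sketch of the kind of "simple statistical analysis" used to compare
# simulated surface fluxes with observations. Bias and RMSE are standard
# choices; the abstract does not say which statistics were actually computed.
import numpy as np

def flux_stats(simulated: np.ndarray, observed: np.ndarray):
    """Bias and root-mean-square error of a simulated flux series."""
    bias = float(np.mean(simulated - observed))
    rmse = float(np.sqrt(np.mean((simulated - observed) ** 2)))
    return bias, rmse

# Hypothetical half-hourly sensible-heat fluxes (W m^-2) for one scheme:
obs = np.array([40.0, 120.0, 210.0, 180.0, 90.0])
sim = np.array([55.0, 130.0, 195.0, 190.0, 80.0])
print(flux_stats(sim, obs))   # (bias, rmse) = (2.0, ~12.2)
```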
Abstract:
This conference was an unusual and interesting event. Celebrating 25 years of Construction Management and Economics provides us with an opportunity to reflect on the research that has been reported over the years, to consider where we are now, and to think about the future of academic research in this area. Hence the sub-title of this conference: “past, present and future”. Looking through these papers, some things are clear. First, the range of topics considered interesting has expanded hugely since the journal was first published. Second, the research methods are also more diverse. Third, the involvement of wider groups of stakeholders is evident. There is a danger that this might lead to dilution of the field. But my instinct has always been to argue against the notion that Construction Management and Economics represents a discipline, as such. Granted, there are plenty of university departments around the world that would justify the idea of a discipline. But the vast majority of academic departments who contribute to the life of this journal carry different names to this. Indeed, the range and breadth of methodological approaches to the research reported in Construction Management and Economics indicate that there are several different academic disciplines being brought to bear on the construction sector. Some papers are based on economics, some on psychology and others on operational research, sociology, law, statistics, information technology, and so on. This is why I maintain that construction management is not an academic discipline, but a field of study to which a range of academic disciplines are applied. This may be why it is so interesting to be involved in this journal. The problems to which the papers are applied develop and grow. But the broad topics of the earliest papers in the journal are still relevant today. What has changed a lot is our interpretation of the problems that confront the construction sector all over the world, and the methodological approaches to resolving them. There is a constant difficulty in dealing with topics as inherently practical as these. While the demands of the academic world are driven by the need for the rigorous application of sound methods, the demands of the practical world are quite different. It can be difficult to meet the needs of both sets of stakeholders at the same time. However, increasing numbers of postgraduate courses in our area result in larger numbers of practitioners with a deeper appreciation of what research is all about, and how to interpret and apply the lessons from research. It also seems that there are contributions coming not just from construction-related university departments, but also from departments with identifiable methodological traditions of their own. I like to think that our authors can publish in journals beyond the construction-related areas, to disseminate their theoretical insights into other disciplines, and to contribute to the strength of this journal by citing our articles in more mono-disciplinary journals. This would contribute to the future of the journal in a very strong and developmental way. The greatest danger we face is in excessive self-citation, i.e. referring only to sources within the CM&E literature or, worse, referring only to other articles in the same journal. The only way to ensure a strong and influential position for journals and university departments like ours is to be sure that our work is informing other academic disciplines.
This is what I would see as the future, our logical next step. If, as a community of researchers, we are not producing papers that challenge and inform the fundamentals of research methods and analytical processes, then no matter how practically relevant our output is to the industry, it will remain derivative and secondary, based on the methodological insights of others. The balancing act between methodological rigour and practical relevance is a difficult one, but not, of course, a balance that has to be struck in every single paper.
Abstract:
At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has continued practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the successful prediction of the latest El Niño event in 1997/1998 is an example. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research program with an overall strategy of building comprehensive prediction systems for climate and weather. In this article, I discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to exploiting the available computing power efficiently and to using observations from new types of observing systems. Weather prediction is not an exact science, owing to unavoidable errors in initial data and in the models. Quantifying the reliability of a forecast is therefore essential, and probably more so the longer the forecast is. Ensemble prediction is thus a new and important concept in weather and climate prediction, one that I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I outline how I think development may proceed in the future.