39 results for Article 29 Working Group
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
While a quantitative climate theory of tropical cyclone formation remains elusive, considerable progress has been made recently in our ability to simulate tropical cyclone climatologies and understand the relationship between climate and tropical cyclone formation. Climate models are now able to simulate a realistic rate of global tropical cyclone formation, although simulation of the Atlantic tropical cyclone climatology remains challenging unless horizontal resolutions finer than 50 km are employed. This article summarizes published research from the idealized experiments of the Hurricane Working Group of U.S. CLIVAR (CLImate VARiability and predictability of the ocean-atmosphere system). This work, combined with results from other model simulations, has strengthened relationships between tropical cyclone formation rates and climate variables such as mid-tropospheric vertical velocity, with decreased climatological vertical velocities leading to decreased tropical cyclone formation. Systematic differences are shown between experiments in which only sea surface temperature is increased versus experiments where only atmospheric carbon dioxide is increased, with the carbon dioxide experiments more likely to demonstrate the decrease in tropical cyclone numbers previously shown to be a common response of climate models in a warmer climate. Experiments where the two effects are combined also show decreases in numbers, but these tend to be less for models that demonstrate a strong tropical cyclone response to increased sea surface temperatures. Further experiments are proposed that may improve our understanding of the relationship between climate and tropical cyclone formation, including experiments with two-way interaction between the ocean and the atmosphere and variations in atmospheric aerosols.
Abstract:
The right ventricle has become an increasing focus in cardiovascular research. In this position paper, we give a brief overview of the specific pathophysiological features of the right ventricle, with particular emphasis on functional and molecular modifications as well as therapeutic strategies in chronic overload, highlighting the differences from the left ventricle. Importantly, we put together recommendations on promising topics of research in the field, experimental study design, and functional evaluation of the right ventricle in experimental models, from non-invasive methodologies to haemodynamic evaluation and ex vivo set-ups.
Abstract:
The failing heart is characterized by complex tissue remodelling involving increased cardiomyocyte death, and impairment of sarcomere function, metabolic activity, endothelial and vascular function, together with increased inflammation and interstitial fibrosis. For years, therapeutic approaches for heart failure (HF) relied on vasodilators and diuretics which relieve cardiac workload and HF symptoms. The introduction in the clinic of drugs interfering with beta-adrenergic and angiotensin signalling have ameliorated survival by interfering with the intimate mechanism of cardiac compensation. Current therapy, though, still has a limited capacity to restore muscle function fully, and the development of novel therapeutic targets is still an important medical need. Recent progress in understanding the molecular basis of myocardial dysfunction in HF is paving the way for development of new treatments capable of restoring muscle function and targeting specific pathological subsets of LV dysfunction. These include potentiating cardiomyocyte contractility, increasing cardiomyocyte survival and adaptive hypertrophy, increasing oxygen and nutrition supply by sustaining vessel formation, and reducing ventricular stiffness by favourable extracellular matrix remodelling. Here, we consider drugs such as omecamtiv mecarbil, nitroxyl donors, cyclosporin A, SERCA2a (sarcoplasmic/endoplasmic Ca(2 +) ATPase 2a), neuregulin, and bromocriptine, all of which are currently in clinical trials as potential HF therapies, and discuss novel molecular targets with potential therapeutic impact that are in the pre-clinical phases of investigation. Finally, we consider conceptual changes in basic science approaches to improve their translation into successful clinical applications.
Abstract:
Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.
Abstract:
The Madden–Julian oscillation (MJO) interacts with and influences a wide range of weather and climate phenomena (e.g., monsoons, ENSO, tropical storms, midlatitude weather), and represents an important, and as yet unexploited, source of predictability at the subseasonal time scale. Despite the important role of the MJO in climate and weather systems, current global circulation models (GCMs) exhibit considerable shortcomings in representing this phenomenon. These shortcomings have been documented in a number of multimodel comparison studies over the last decade. However, diagnosis of model performance has been challenging, and model progress has been difficult to track, because of the lack of a coherent and standardized set of MJO diagnostics. One of the chief objectives of the U.S. Climate Variability and Predictability (CLIVAR) MJO Working Group is the development of observation-based diagnostics for objectively evaluating global model simulations of the MJO in a consistent framework. Motivation for this activity is reviewed, and the intent and justification for a set of diagnostics is provided, along with specification for their calculation, and illustrations of their application. The diagnostics range from relatively simple analyses of variance and correlation to more sophisticated space–time spectral and empirical orthogonal function analyses. These diagnostic techniques are used to detect MJO signals, to construct composite life cycles, to identify associations of MJO activity with the mean state, and to describe interannual variability of the MJO.
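The diagnostics described above range from variance analysis to EOF decomposition. As a hedged sketch on synthetic data (not the CLIVAR MJO Working Group's actual diagnostic code; the field shape, wave period and noise level are invented for illustration), an EOF analysis that isolates a propagating MJO-like signal looks like:

```python
import numpy as np

# Hedged illustration on synthetic data: EOF analysis of an eastward-
# propagating, wavenumber-1 "OLR anomaly" field of shape (time, longitude).
rng = np.random.default_rng(0)
ntime, nlon = 400, 144
lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
t = np.arange(ntime)

# Wave with a 48-step period plus white noise.
data = (np.cos(lon[None, :] - 2.0 * np.pi * t[:, None] / 48.0)
        + 0.3 * rng.standard_normal((ntime, nlon)))

# Remove the time mean, then take the SVD: the right singular vectors are
# the EOF spatial patterns; the scaled left singular vectors are the
# principal-component (PC) time series.
anom = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
var_frac = s**2 / np.sum(s**2)   # fraction of variance explained per EOF
pcs = u * s                      # PC time series
eofs = vt                        # EOF spatial patterns
```

A propagating disturbance projects onto a pair of EOFs in quadrature, which is why MJO indices are typically built from the two leading modes, and why those two modes dominate the explained variance here.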
Abstract:
A review of the implications of climate change for freshwater resources, based on Chapter 4 of the IPCC Working Group 2 report.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report which can be found at: http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computation-intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy.
The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06)—since emulated internationally—pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals. To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders.
A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
In November 2008, a group of scientists met at the 6th Meeting of the International Scientific Association of Probiotics and Prebiotics (ISAPP) in London, Ontario, Canada, to discuss the functionality of prebiotics. As a result of this, it was concluded that the prebiotic field is currently dominated by gastrointestinal events. However, in the future, it may be the case that other mixed microbial ecosystems may be modulated by a prebiotic approach, such as the oral cavity, skin and the urogenital tract. Therefore, a decision was taken to build upon the current prebiotic status and define a niche for ‘dietary prebiotics’. This review is co-authored by the working group of ISAPP scientists and sets the background for defining a dietary prebiotic as “a selectively fermented ingredient that results in specific changes in the composition and/or activity of the gastrointestinal microbiota, thus conferring benefit(s) upon host health”.
Abstract:
This paper presents single-column model (SCM) simulations of a tropical squall-line case observed during the Coupled Ocean-Atmosphere Response Experiment of the Tropical Ocean/Global Atmosphere Programme. This case-study was part of an international model intercomparison project organized by Working Group 4 ‘Precipitating Convective Cloud Systems’ of the GEWEX (Global Energy and Water-cycle Experiment) Cloud System Study. Eight SCM groups using different deep-convection parametrizations participated in this project. The SCMs were forced by temperature and moisture tendencies that had been computed from a reference cloud-resolving model (CRM) simulation using open boundary conditions. The comparison of the SCM results with the reference CRM simulation provided insight into the ability of current convection and cloud schemes to represent organized convection. The CRM results enabled a detailed evaluation of the SCMs in terms of the thermodynamic structure and the convective mass flux of the system, the latter being closely related to the surface convective precipitation. It is shown that the SCMs could reproduce reasonably well the time evolution of the surface convective and stratiform precipitation, the convective mass flux, and the thermodynamic structure of the squall-line system. The thermodynamic structure simulated by the SCMs depended on how the models partitioned the precipitation between convective and stratiform. However, structural differences persisted in the thermodynamic profiles simulated by the SCMs and the CRM. These differences could be attributed to the fact that the total mass flux used to compute the SCM forcing differed from the convective mass flux. The SCMs could not adequately represent such organized mesoscale circulations and the microphysical/radiative forcing associated with the stratiform region. This issue is generally known as the ‘scale-interaction’ problem that can only be properly addressed in fully three-dimensional simulations.
Sensitivity simulations run by several groups showed that the time evolution of the surface convective precipitation was considerably smoothed when the convective closure was based on convective available potential energy instead of moisture convergence. Finally, additional SCM simulations without using a convection parametrization indicated that the impact of a convection parametrization in forced SCM runs was more visible in the moisture profiles than in the temperature profiles because convective transport was particularly important in the moisture budget.
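The closure sensitivity described above hinges on how convective available potential energy (CAPE) is computed. As a hedged sketch (idealized virtual-temperature profiles, no moisture thermodynamics, not taken from any of the participating SCMs), the buoyancy integral at the heart of a CAPE-based closure looks like:

```python
import numpy as np

# Minimal sketch of a CAPE calculation of the kind a CAPE-based convective
# closure relies on. The parcel and environment virtual-temperature profiles
# are idealized stand-ins chosen so the parcel loses buoyancy below 12 km.
g = 9.81                                       # gravitational acceleration, m s^-2

z = np.linspace(0.0, 12_000.0, 121)            # height levels (m)
tv_env = 300.0 - 6.5e-3 * z                    # environment Tv (K), 6.5 K/km lapse
tv_parcel = 302.0 - 7.0e-3 * z                 # lifted-parcel Tv (K), 7.0 K/km lapse

# CAPE = integral of g * (Tv_parcel - Tv_env) / Tv_env dz over the layer
# where the parcel is positively buoyant.
buoy = g * (tv_parcel - tv_env) / tv_env       # parcel buoyancy (m s^-2)
pos = np.clip(buoy, 0.0, None)                 # keep positive-buoyancy layer only
cape = float(np.sum(0.5 * (pos[1:] + pos[:-1]) * np.diff(z)))  # J kg^-1
```

A CAPE closure relaxes this quantity toward zero over a convective time scale, which filters out the rapid fluctuations that a moisture-convergence closure passes straight into the surface precipitation.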
Abstract:
New reconstructions of changing vegetation patterns in the Mediterranean-Black Sea Corridor since the Last Glacial Maximum are being produced by an improved biomisation scheme that uses pollen and plant macrofossil data in conjunction. Changes in fire regimes over the same interval will also be reconstructed using both microscopic and macroscopic charcoal remains. These reconstructions will allow a diagnosis of the interactions between climate, fire and vegetation on millennial timescales, and will also help to clarify the role of coastline and other geomorphic changes, salinity, and the impacts of human activities in this region. These new data sets are being produced as a result of collaboration between the Palynology Working Group (WG-2) within the IGCP-521 project and the international Palaeovegetation Mapping Project (BIOME 6000). The main objective of this paper is to present the goals of this cooperation and its methodology, including limitations and planned improvements, and to show the initial results of some applications.
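The affinity-score step at the core of standard biomisation schemes can be sketched as follows; the taxa, biomes and taxon-to-biome memberships below are illustrative placeholders, not the actual IGCP-521/BIOME 6000 assignment tables:

```python
import numpy as np

# Hedged sketch of biomisation: each pollen taxon is allowed to contribute
# to one or more biomes, an affinity score is accumulated per biome, and
# the sample is assigned to the highest-scoring biome.
taxa = ["Quercus", "Pinus", "Artemisia", "Chenopodiaceae"]
biomes = ["temperate forest", "cool conifer forest", "steppe"]

# 1 where a taxon may contribute to a biome, else 0 (made-up memberships).
membership = np.array([
    [1, 0, 0],   # Quercus
    [0, 1, 0],   # Pinus
    [0, 0, 1],   # Artemisia
    [0, 0, 1],   # Chenopodiaceae
])

def assign_biome(pollen_pct, threshold=0.5):
    """Affinity score: sum over member taxa of sqrt(% above a small threshold)."""
    contrib = np.sqrt(np.clip(pollen_pct - threshold, 0.0, None))
    scores = contrib @ membership
    return biomes[int(np.argmax(scores))], scores

# Example sample: 40% Quercus, 10% Pinus, 30% Artemisia, 15% Chenopodiaceae.
biome, scores = assign_biome(np.array([40.0, 10.0, 30.0, 15.0]))
```

The square root damps the dominance of high-pollen-producing taxa, and the threshold suppresses trace occurrences; both choices follow the spirit of published biomisation methods rather than any specific tuned version.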
Abstract:
ESA’s first multi-satellite mission Cluster is unique in its concept of 4 satellites orbiting in controlled formations. This will give an unprecedented opportunity to study the structure and dynamics of the magnetosphere. In this paper we discuss ways in which ground-based remote-sensing observations of the ionosphere can be used to support the multipoint in-situ satellite measurements. There are a very large number of potentially useful configurations between the satellites and any one ground-based observatory; however, the number of ideal occurrences for any one configuration is low. Many of the ground-based instruments cannot operate continuously, and Cluster will take data only for a part of each orbit, depending on how much high-resolution (‘burst-mode’) data are acquired. In addition, there are a great many instrument modes and the formation, size and shape of the cluster of the four satellites to consider. These circumstances create a clear and pressing need for careful planning to ensure that the scientific return from Cluster is maximised by additional coordinated ground-based observations. For this reason, ESA established a working group to coordinate the observations on the ground with Cluster. We give a number of examples of how the combined spacecraft and ground-based observations can address outstanding questions in magnetospheric physics. An online computer tool has been prepared to allow for the planning of conjunctions and advantageous constellations between the Cluster spacecraft and individual or combined ground-based systems. During the mission, a ground-based database containing index and summary data will help to identify interesting datasets and allow intervals to be selected for coordinated studies. We illustrate the philosophy of our approach, using a few important examples of the many possible configurations between the satellites and the ground-based instruments.
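A conjunction check of the kind such a planning tool performs can be sketched as a simple footprint-proximity test; the footprint track, observatory location and 500 km criterion below are invented for illustration and are not Cluster mission parameters:

```python
import numpy as np

# Hypothetical sketch: flag times when a satellite's ionospheric footprint
# passes within a chosen great-circle distance of a ground observatory.
R_E = 6371.0  # mean Earth radius, km

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points given in degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlat = p2 - p1
    dlon = np.radians(np.asarray(lon2) - np.asarray(lon1))
    a = np.sin(dlat / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlon / 2) ** 2
    return 2.0 * R_E * np.arcsin(np.sqrt(a))

# Synthetic footprint track drifting past an observatory at (69.6N, 19.2E)
# (roughly Tromso, used only as an example location).
t = np.arange(0, 120)                        # time steps (minutes)
foot_lat = 60.0 + 0.2 * t                    # footprint latitude (deg)
foot_lon = 10.0 + 0.15 * t                   # footprint longitude (deg)

d = great_circle_km(foot_lat, foot_lon, 69.6, 19.2)
conjunction = d < 500.0                      # within 500 km counts as a conjunction
```

A real tool would map the spacecraft position along magnetic field lines to obtain the footprint and would repeat the test per instrument field of view, but the proximity logic is the same.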
Abstract:
The Working Group II contribution to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change critically reviewed and assessed tens of thousands of recent publications to inform about current scientific knowledge on climate change impacts, vulnerability and adaptation. Chapter 3 of the report focuses on freshwater resources, but water issues are also prominent in other sectoral chapters and in the regional chapters of the Working Group II report, as well as in various chapters of Working Group I. With this paper, the lead authors, a review editor and the chapter scientist of the freshwater chapter of the WGII AR5 wish to summarize their assessment of the most relevant risks of climate change related to freshwater systems and to show how assessment and reduction of those risks can be integrated into water management.
Abstract:
A recent intercomparison exercise proposed by the Working Group for Numerical Experimentation (WGNE) revealed that the parameterized, or unresolved, surface stress in weather forecast models is highly model-dependent, especially over orography. Models of comparable resolution differ over land by as much as 20% in zonal mean total subgrid surface stress (Ttot). The way Ttot is partitioned between the different parameterizations is also model-dependent. In this study, we simulated in a particular model an increase in Ttot comparable with the spread found in the WGNE intercomparison. This increase was simulated in two ways, namely by increasing independently the contributions to Ttot of the turbulent orographic form drag scheme (TOFD) and of the orographic low-level blocking scheme (BLOCK). Increasing the parameterized orographic drag leads to significant changes in surface pressure, zonal wind and temperature in the Northern Hemisphere during winter both in 10 day weather forecasts and in seasonal integrations. However, the magnitude of these changes in circulation strongly depends on which scheme is modified. In 10 day forecasts, stronger changes are found when the TOFD stress is increased, while on seasonal time scales the effects are of comparable magnitude, although different in detail. At these time scales, the BLOCK scheme affects the lower stratosphere winds through changes in the resolved planetary waves which are associated with surface impacts, while the TOFD effects are mostly limited to the lower troposphere. The partitioning of Ttot between the two schemes appears to play an important role at all time scales.
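The diagnostics in this comparison (zonal-mean total subgrid surface stress Ttot and its partition between schemes) reduce to simple area operations on gridded stress fields. A hedged sketch with randomly generated stand-in data (not WGNE model output; the magnitudes are arbitrary):

```python
import numpy as np

# Illustrative arithmetic: diagnose zonal-mean total subgrid surface stress
# and its partition between two orographic drag contributions (here labelled
# after the TOFD and blocking schemes, with made-up stress fields).
rng = np.random.default_rng(1)
nlat, nlon = 90, 180
tau_tofd = np.abs(rng.normal(0.05, 0.02, (nlat, nlon)))    # N m^-2, synthetic
tau_block = np.abs(rng.normal(0.03, 0.015, (nlat, nlon)))  # N m^-2, synthetic

ttot = tau_tofd + tau_block
ttot_zonal_mean = ttot.mean(axis=1)          # zonal-mean Ttot per latitude
tofd_share = tau_tofd.sum() / ttot.sum()     # fraction of Ttot from TOFD

# A model-spread-like perturbation: scaling one contribution by 20% changes
# both the total stress and its partitioning.
ttot_perturbed_mean = (1.2 * tau_tofd + tau_block).mean()
```

This is the sense in which two models can agree on Ttot while partitioning it very differently between schemes: the sum constrains only the total, not the shares.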