11 results for Rex

in CentAUR: Central Archive University of Reading - UK


Relevance: 20.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
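
Because the service is RESTful, a client needs nothing more than plain HTTP. The following minimal sketch (in Python rather than the Java client mentioned above) shows the general pattern of creating a remote run and streaming its output back while the run is in progress; the base URL, endpoint paths and JSON fields are hypothetical and are not the actual G-Rex API.

```python
# Minimal sketch of driving a G-Rex-style REST service over plain HTTP.
# The base URL, endpoint paths and JSON fields below are assumptions made
# for illustration; they are not the real G-Rex API.
import json
import time
import urllib.request

BASE = "http://cluster.example.ac.uk/grex"  # hypothetical server URL

def create_job(model_name: str) -> str:
    """POST a new job resource and return its identifier."""
    req = urllib.request.Request(
        f"{BASE}/jobs",
        data=json.dumps({"application": model_name}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]

def stream_output(job_id: str, local_path: str) -> None:
    """Poll the job's output resource and append new data to a local file,
    so results never accumulate on the remote system."""
    offset = 0
    while True:
        req = urllib.request.Request(
            f"{BASE}/jobs/{job_id}/output",
            headers={"Range": f"bytes={offset}-"},  # fetch only new bytes
        )
        with urllib.request.urlopen(req) as resp:
            chunk = resp.read()
        if chunk:
            with open(local_path, "ab") as f:
                f.write(chunk)
            offset += len(chunk)
        elif job_finished(job_id):
            break
        time.sleep(10)

def job_finished(job_id: str) -> bool:
    """Check the (hypothetical) job resource for a terminal state."""
    with urllib.request.urlopen(f"{BASE}/jobs/{job_id}") as resp:
        return json.load(resp)["state"] == "finished"

if __name__ == "__main__":
    job = create_job("nemo")
    stream_output(job, "nemo_output.nc")
```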

Relevance: 20.00%

Abstract:

G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.
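
The transfer-while-running behaviour described above can be pictured as a loop that drains the remote run directory as output appears. The sketch below is a conceptual illustration only, not the actual G-Rex server code; the directory layout, the "*.nc" file pattern and the "RUN_FINISHED" marker file are assumptions.

```python
# Conceptual sketch (not the actual G-Rex implementation) of the
# transfer-while-running idea: completed output files are copied out of the
# remote run directory as soon as they appear and then deleted, so large
# volumes of model output never accumulate on the cluster.
import shutil
import time
from pathlib import Path

def is_complete(path: Path, settle_seconds: int = 60) -> bool:
    """Treat a file as complete once the model has not written to it recently."""
    return time.time() - path.stat().st_mtime > settle_seconds

def drain_output(run_dir: Path, staging_dir: Path, poll_seconds: int = 30) -> None:
    """Move finished output files out of `run_dir` into `staging_dir`
    (standing in for the transfer back to the user's machine)."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    while True:
        for path in sorted(run_dir.glob("*.nc")):
            if is_complete(path):
                shutil.copy2(path, staging_dir / path.name)  # "download"
                path.unlink()                                # free remote disk
        if (run_dir / "RUN_FINISHED").exists():              # hypothetical marker
            break
        time.sleep(poll_seconds)

if __name__ == "__main__":
    drain_output(Path("/scratch/nemo/run01"), Path("./local_output"))
```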

Relevance: 20.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
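
Step (3) of the workflow above comes down to a single substitution in the existing script. The sketch below illustrates the idea in Python; the "GRexRun" command name comes from the text, but its arguments, the "--np" flag and the model executable are illustrative assumptions.

```python
# Sketch of the single substitution a scientist makes in an existing
# workflow: the model launch command is routed either to a local mpirun
# or to the (hypothetical invocation of the) GRexRun client, which uploads
# inputs, starts the run remotely and streams output back as it is produced.
import subprocess

def launch_model(nproc: int, remote: bool) -> int:
    model = ["./nemo.exe"]                                   # illustrative executable
    if remote:
        cmd = ["GRexRun", "--np", str(nproc)] + model        # hypothetical client call
    else:
        cmd = ["mpirun", "-np", str(nproc)] + model          # original local launch
    return subprocess.call(cmd)

if __name__ == "__main__":
    # Pre- and post-processing steps of the workflow stay exactly as before;
    # only the launch call changes when running on the NERC Cluster Grid.
    exit_code = launch_model(nproc=40, remote=True)
    print("model run exit code:", exit_code)
```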

Relevance: 20.00%

Abstract:

We present a descriptive overview of the meteorology in the south-eastern subtropical Pacific (SEP) during the VOCALS-REx intensive observations campaign, which was carried out between October and November 2008. Based mainly on data from operational analyses, forecasts, reanalysis, and satellite observations, we focus on spatio-temporal scales from synoptic to planetary. A climatological context is given within which the specific conditions observed during the campaign are placed, with particular reference to the relationships between the large-scale and the regional circulations. The mean circulations associated with the diurnal breeze systems are also discussed. We then provide a summary of the day-to-day synoptic-scale circulation, air-parcel trajectories, and cloud cover in the SEP during VOCALS-REx. Three meteorologically distinct periods are identified and the large-scale causes for their different character are discussed. The first period was characterised by significant variability associated with synoptic-scale systems affecting the SEP, while the two subsequent phases were affected by planetary-scale disturbances with a slower evolution. The changes between the initial and later periods can be partly explained by the regular march of the annual cycle, but contributions from subseasonal variability and its teleconnections were important. Across the two months under consideration we find a significant correlation between the depth of the inversion-capped marine boundary layer (MBL) and the amount of low cloud in the area of study. We discuss this correlation and argue that, at least as a crude approximation, a typical scaling may be applied relating MBL and cloud properties to the large-scale parameters of sea surface temperatures (SSTs) and tropospheric temperatures. These results are consistent with previously found empirical relationships involving lower-tropospheric stability.

Relevance: 10.00%

Abstract:

In the event of a release of toxic gas in the center of London, the emergency services would need to determine quickly the extent of the area contaminated. The transport of pollutants by turbulent flow within the complex street and building architecture of cities is not straightforward, and we might wonder whether it is at all possible to make a scientifically reasoned decision. Here we describe recent progress from a major UK project, ‘Dispersion of Air Pollution and its Penetration into the Local Environment’ (DAPPLE, www.dapple.org.uk). In DAPPLE, we focus on the movement of airborne pollutants in cities by developing a greater understanding of atmospheric flow and dispersion within urban street networks. In particular, we carried out full-scale dispersion experiments in central London (UK) during 2003, 2004, 2007, and 2008 to address the extent of the dispersion of tracers following their release at street level. These measurements complemented previous studies because (i) our focus was on dispersion within the first kilometer from the source, where most of the material was expected to remain within the street network rather than being mixed into the boundary layer aloft, (ii) measurements were made under a wide variety of meteorological conditions, and (iii) central London represents a European, rather than North American, city geometry. Interpretation of the results from the full-scale experiments was supported by extensive numerical and wind tunnel modeling, which allowed more detailed analysis under idealized and controlled conditions. In this article, we review the full-scale DAPPLE methodologies and show early results from the analysis of the 2007 field campaign data.

Relevance: 10.00%

Abstract:

Rodenticides with delayed action are generally more effective than fast-acting compounds because of the phenomenon of bait shyness. Calciferols have a stop-feed effect quite soon after dosing, and physiological effects are measurable within one day of dosing. We investigated whether bait shyness might result from these fairly rapid effects in the laboratory rat. We found evidence of bait shyness following recovery from sub-lethal dosing with two forms of calciferol. Use of intubation as well as feeding showed that the response was to the bait carrier rather than to detection of calciferols per se.

Relevance: 10.00%

Abstract:

The performance of 18 coupled Chemistry Climate Models (CCMs) in the Tropical Tropopause Layer (TTL) is evaluated using qualitative and quantitative diagnostics. Trends in tropopause quantities in the tropics and the extratropical Upper Troposphere and Lower Stratosphere (UTLS) are analyzed. A quantitative grading methodology for evaluating CCMs is extended to include variability and used to develop four different grades for tropical tropopause temperature and pressure, water vapor and ozone. Four of the 18 models and the multi-model mean meet quantitative and qualitative standards for reproducing key processes in the TTL. Several diagnostics are performed on a subset of the models, analyzing the Tropopause Inversion Layer (TIL), the Lagrangian cold point and the TTL transit time. Historical decreases in tropical tropopause pressure and decreases in water vapor are simulated, lending confidence to future projections. The models simulate continued decreases in tropopause pressure in the 21st century, along with ∼1 K per century increases in cold-point tropopause temperature and 0.5–1 ppmv per century increases in water vapor above the tropical tropopause. TTL water vapor increases below the cold point. In two models, these trends are associated with 35% increases in TTL cloud fraction. These changes indicate significant perturbations to TTL processes, specifically to deep convective heating and humidity transport. Ozone in the extratropical lowermost stratosphere shows significant and hemispherically asymmetric trends: O3 is projected to increase by nearly 30% due to ozone recovery in the Southern Hemisphere (SH) and due to enhancements in the stratospheric circulation. These UTLS ozone trends may have significant effects in the TTL and the troposphere.
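
The grading methodology mentioned above reduces each diagnostic to a number between 0 and 1 by comparing a simulated climatological mean with observations, scaled by observed variability. The sketch below shows one commonly used form of such a grade; the exact functional form, the scale factor of three and the example values are illustrative assumptions, not the paper's definition.

```python
# Sketch of a quantitative grading metric of the kind described above:
# a grade near 1 means the simulated climatological mean lies close to the
# observed mean relative to observed variability; negative grades are
# clipped to 0.  The functional form and scale factor are assumptions
# for illustration.
def grade(model_mean: float, obs_mean: float, obs_std: float,
          scale: float = 3.0) -> float:
    g = 1.0 - abs(model_mean - obs_mean) / (scale * obs_std)
    return max(g, 0.0)

if __name__ == "__main__":
    # Illustrative (not real) values for cold-point tropopause temperature [K]
    print(grade(model_mean=191.5, obs_mean=190.0, obs_std=1.0))  # -> 0.5
```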

Relevance: 10.00%

Abstract:

Evaluating CCMs with the presented framework will increase our confidence in predictions of stratospheric ozone change.

Relevance: 10.00%

Abstract:

Simulations of ozone loss rates using a three-dimensional chemical transport model and a box model during recent Antarctic and Arctic winters are compared with experimental loss rates. The study focused on the Antarctic winter 2003, during which the first Antarctic Match campaign was organized, and on the Arctic winters 1999/2000 and 2002/2003. The maximum ozone loss rates retrieved by the Match technique for the winters and levels studied reached 6 ppbv/sunlit hour, and both types of simulation could generally reproduce the observations at the 2-sigma error-bar level. In some cases, for example for the Arctic winter 2002/2003 at the 475 K level, an excellent agreement within the 1-sigma standard deviation was obtained. An overestimation was also found with the box model simulation at some isentropic levels for the Antarctic winter and the Arctic winter 1999/2000, indicating an overestimation of chlorine activation in the model. Loss rates in the Antarctic show signs of saturation in September, which has to be considered in the comparison. Sensitivity tests were performed with the box model in order to assess the impact of the kinetic parameters of the ClO-Cl2O2 catalytic cycle and of the total bromine content on the ozone loss rate. These tests resulted in a maximum change in ozone loss rates of 1.2 ppbv/sunlit hour, generally under high solar zenith angle conditions. In some cases a better agreement was achieved with the fastest photolysis of Cl2O2 and an additional source of total inorganic bromine, but at the expense of overestimating the smaller ozone loss rates derived later in the winter.
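
The model-observation agreement criterion above reduces to asking whether the simulated loss rate falls within a chosen multiple (1 or 2 sigma) of the Match uncertainty. A minimal sketch of that check, with illustrative values in ppbv per sunlit hour:

```python
# Sketch of the agreement test described above: a simulated ozone loss rate
# is judged consistent with the Match-derived rate if it falls within a
# chosen multiple of the observational standard deviation.  Values are
# illustrative only.
def agrees(simulated: float, observed: float, sigma: float, n_sigma: float = 2.0) -> bool:
    """Return True if |simulated - observed| <= n_sigma * sigma."""
    return abs(simulated - observed) <= n_sigma * sigma

if __name__ == "__main__":
    obs_rate, obs_sigma = 6.0, 1.0      # ppbv per sunlit hour (illustrative)
    sim_rate = 4.5
    print("within 2-sigma:", agrees(sim_rate, obs_rate, obs_sigma))        # True
    print("within 1-sigma:", agrees(sim_rate, obs_rate, obs_sigma, 1.0))   # False
```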

Relevance: 10.00%

Abstract:

We present a flexible framework to calculate the optical properties of atmospheric aerosols at a given relative humidity based on their composition and size distribution. The similarity of this framework to climate model parameterisations allows rapid and extensive sensitivity tests of the impact of uncertainties in data or of new measurements on climate-relevant aerosol properties. The data collected by the FAAM BAe-146 aircraft during the EUCAARI-LONGREX and VOCALS-REx campaigns have been used in a closure study to analyse the agreement between calculated and measured aerosol optical properties for two very different aerosol types. The agreement achieved for the EUCAARI-LONGREX flights is within the measurement uncertainties for both scattering and absorption. However, there is poor agreement between the calculated and the measured scattering for the VOCALS-REx flights. The high concentration of sulphate, which is a scattering aerosol with no absorption in the visible spectrum, made the absorption measurements during VOCALS-REx unreliable, and thus no closure study was possible for the absorption. The calculated hygroscopic scattering growth factor overestimates the measured values during EUCAARI-LONGREX and VOCALS-REx by ∼30% and ∼20%, respectively. We have also tested the sensitivity of the calculated aerosol optical properties to the uncertainties in the refractive indices, the hygroscopic growth factors and the aerosol size distribution. The largest source of uncertainty in the calculated scattering is the aerosol size distribution (∼35%), followed by the assumed hygroscopic growth factor for organic aerosol (∼15%), while the predominant source of uncertainty in the calculated absorption is the refractive index of organic aerosol (28–60%), although we would expect the refractive index of black carbon to be important for aerosol with a higher black carbon fraction.
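
A much-simplified sketch of the kind of calculation such a framework performs is given below: a dry scattering coefficient assembled from component mass concentrations and assumed mass scattering efficiencies, then enhanced by a power-law hygroscopic growth factor f(RH). The component list, the efficiency values and the f(RH) form are illustrative assumptions and are not the scheme described in the paper.

```python
# Simplified sketch of aerosol optical property calculation at a given
# relative humidity: dry scattering from component masses and assumed mass
# scattering efficiencies, enhanced by an empirical f(RH).  All values and
# the functional form are illustrative assumptions.

# assumed mass scattering efficiencies at 550 nm [m^2 g^-1] (illustrative)
MSE = {"sulphate": 3.0, "organic": 2.5, "black_carbon": 0.5, "sea_salt": 2.0}

def dry_scattering(mass_ug_m3: dict[str, float]) -> float:
    """Dry scattering coefficient [Mm^-1] from component masses [ug m^-3]."""
    return sum(MSE[k] * m for k, m in mass_ug_m3.items())

def f_rh(rh: float, gamma: float = 0.5, rh_ref: float = 0.3) -> float:
    """Power-law scattering enhancement ((1-rh_ref)/(1-rh))**gamma,
    a common empirical form; gamma is an assumed value."""
    return ((1.0 - rh_ref) / (1.0 - rh)) ** gamma

def ambient_scattering(mass_ug_m3: dict[str, float], rh: float) -> float:
    """Scattering coefficient at ambient relative humidity [Mm^-1]."""
    return dry_scattering(mass_ug_m3) * f_rh(rh)

if __name__ == "__main__":
    composition = {"sulphate": 4.0, "organic": 2.0, "black_carbon": 0.2, "sea_salt": 1.0}
    print(f"scattering at RH=80%: {ambient_scattering(composition, 0.80):.1f} Mm^-1")
```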

Relevance: 10.00%

Abstract:

Systematic review (SR) is a rigorous, protocol-driven approach designed to minimise error and bias when summarising the body of research evidence relevant to a specific scientific question. Taking as a comparator the use of SR in synthesising research in healthcare, we argue that SR methods could also pave the way for a “step change” in the transparency, objectivity and communication of chemical risk assessments (CRA) in Europe and elsewhere. We suggest that current controversies around the safety of certain chemicals are partly due to limitations in current CRA procedures which have contributed to ambiguity about the health risks posed by these substances. We present an overview of how SR methods can be applied to the assessment of risks from chemicals, and indicate how challenges in adapting SR methods from healthcare research to the CRA context might be overcome. Regarding the latter, we report the outcomes from a workshop exploring how to increase uptake of SR methods, attended by experts representing a wide range of fields related to chemical toxicology, risk analysis and SR. Priorities which were identified include: the conduct of CRA-focused prototype SRs; the development of a recognised standard of reporting and conduct for SRs in toxicology and CRA; and establishing a network to facilitate research, communication and training in SR methods. We see this paper as a milestone in the creation of a research climate that fosters communication between experts in CRA and SR and facilitates wider uptake of SR methods into CRA.