53 results for Grid simulations


Relevance: 40.00%

Abstract:

This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components, namely the use of Condor pools; Linux/Unix clusters with PBS and IBM's LoadLeveler job-handling tools; the use of Globus for security handling; the use of Condor-G tools for wrapping Globus job-submission commands; Condor's DAGMan tool for handling workflow; the Storage Resource Broker for handling data; and the CCLRC DataPortal and associated tools for both archiving data with metadata and making data available to other workers.
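
The workflow handling mentioned here can be made concrete with a small illustration. The sketch below writes a DAGMan description of the kind used to chain Condor-G jobs; all job and file names are hypothetical, since the abstract shows no actual workflow:

```python
# Illustrative only: a three-stage DAGMan workflow chaining Condor-G
# jobs (which wrap Globus job-submission commands). Job and submit-file
# names are hypothetical.
dag_lines = [
    "# stage data in, run the simulation, archive outputs with metadata",
    "JOB stage_in stage_in.sub",
    "JOB simulate simulate.sub",
    "JOB archive  archive.sub",
    "PARENT stage_in CHILD simulate",
    "PARENT simulate CHILD archive",
]

with open("minigrid.dag", "w") as f:
    f.write("\n".join(dag_lines) + "\n")

# The workflow would then be run with Condor's DAGMan tool:
#   condor_submit_dag minigrid.dag
```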

Relevance: 40.00%

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. The latter is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.

Relevance: 40.00%

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data that is used to populate the second component, and a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories — this is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.

Relevance: 30.00%

Abstract:

The characteristics of convectively generated gravity waves during an episode of deep convection near the coast of Wales are examined both in high-resolution mesoscale simulations [with the (UK) Met Office Unified Model] and in observations from a Mesosphere-Stratosphere-Troposphere (MST) wind-profiling Doppler radar. Deep convection reached the tropopause and generated vertically propagating, high-frequency waves in the lower stratosphere that produced vertical velocity perturbations of O(1 m/s). Wavelet analysis is applied to determine the characteristic periods and wavelengths of the waves. In both the simulations and the observations, the wavelet spectra contain several distinct preferred scales, indicated by multiple spectral peaks. The peaks are most pronounced in the horizontal spectra at several wavelengths below 50 km. Although these peaks are clearest and of largest amplitude in the highest-resolution simulations (with 1 km horizontal grid length), they are also evident in coarser simulations (with 4 km horizontal grid length). Peaks also exist in the vertical and temporal spectra (between approximately 2.5 and 4.5 km, and 10 to 30 minutes, respectively), with good agreement between simulation and observation. Two-dimensional (wavenumber-frequency) spectra demonstrate that each of the selected horizontal scales contains peaks at each of the preferred temporal scales revealed by the one-dimensional spectra alone.
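
The wavelet analysis step can be illustrated with a short sketch; the choice of a Morlet wavelet, the scales, and the synthetic input series below are assumptions for illustration (the abstract does not specify the paper's exact configuration):

```python
import numpy as np
import pywt

# Stand-in vertical-velocity perturbation series (m/s), sampled every
# 30 s; in practice this would be the radar or model w field.
dt = 30.0
w = np.random.randn(1024)

# Continuous wavelet transform with a Morlet wavelet over scales
# spanning periods of roughly 5 to 60 minutes.
scales = np.arange(8, 256)
coeffs, freqs = pywt.cwt(w, scales, "morl", sampling_period=dt)

# Time-averaged wavelet power; distinct maxima mark preferred periods
# (the paper finds peaks between roughly 10 and 30 minutes).
power = (np.abs(coeffs) ** 2).mean(axis=1)
periods_min = 1.0 / (freqs * 60.0)
print(f"dominant period ~{periods_min[np.argmax(power)]:.1f} minutes")
```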

Relevance: 30.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, both to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale grid resources such as the UK National Grid Service.
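
Because G-Rex has a REST architectural style usable from any basic HTTP client, the interaction pattern can be sketched in a few lines of Python. The endpoint paths and response fields below are hypothetical illustrations of such a workflow, not the actual G-Rex API:

```python
import requests

# Hypothetical sketch of driving a REST-style middleware such as G-Rex
# from Python; all URLs and JSON fields are illustrative assumptions.
BASE = "http://cluster.example.ac.uk/grex/nemo"

# Start a model run by uploading an input file.
with open("namelist", "rb") as f:
    resp = requests.post(f"{BASE}/instances", files={"input": f})
instance = resp.json()["instance"]

# While the run is in progress, download (and thereby allow the server
# to delete) output files, so they never accumulate remotely.
status = requests.get(f"{BASE}/instances/{instance}/status").json()
for name in status.get("outputs", []):
    data = requests.get(f"{BASE}/instances/{instance}/outputs/{name}")
    with open(name, "wb") as out:
        out.write(data.content)
```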

Relevance: 30.00%

Abstract:

Uncertainties associated with the representation of various physical processes in global climate models (GCMs) mean that, when projections from GCMs are used in climate change impact studies, the uncertainty propagates through to the impact estimates. A complete treatment of this ‘climate model structural uncertainty’ is necessary so that decision-makers are presented with an uncertainty range around the impact estimates. This uncertainty is often underexplored owing to the human and computer processing time required to perform the numerous simulations. Here, we present a 189-member ensemble of global river runoff and water resource stress simulations that adequately addresses this uncertainty. Following several adaptations and modifications, the ensemble creation time has been reduced from 750 h on a typical single-processor personal computer to 9 h of high-throughput computing on the University of Reading Campus Grid. We outline the changes that had to be made to the hydrological impacts model and to the Campus Grid, and present the main results. We show that, although there is considerable uncertainty in both the magnitude and the sign of regional runoff changes across different GCMs with climate change, there is much less uncertainty in runoff changes for regions that experience large runoff increases (e.g. the high northern latitudes and Central Asia) and large runoff decreases (e.g. the Mediterranean). Furthermore, there is consensus that the percentage of the global population at risk of water resource stress will increase with climate change.

Relevance: 30.00%

Abstract:

Process-based integrated modelling of weather and crop yield over large areas is becoming an important research topic. The production of the DEMETER ensemble hindcasts of weather allows this work to be carried out in a probabilistic framework. In this study, ensembles of crop yield (groundnut, Arachis hypogaea L.) were produced for ten 2.5 x 2.5 degree grid cells in western India using the DEMETER ensembles and the general large-area model (GLAM) for annual crops. Four key issues are addressed by this study. First, crop model calibration methods for use with weather ensemble data are assessed. Calibration using yield ensembles was more successful than calibration using reanalysis data (the European Centre for Medium-Range Weather Forecasts 40-yr reanalysis, ERA40). Second, the potential for probabilistic forecasting of crop failure is examined. The hindcasts show skill in the prediction of crop failure, with more severe failures being more predictable. Third, the use of yield ensemble means to predict interannual variability in crop yield is examined, and their skill is assessed relative to baseline simulations using ERA40. The accuracy of the multi-model yield ensemble means is equal to or greater than the accuracy using ERA40. Fourth, the impact of two key uncertainties, sowing window and spatial scale, is briefly examined. The impact of uncertainty in the sowing window is greater with ERA40 than with the multi-model yield ensemble mean. Subgrid heterogeneity affects model accuracy: where correlations are low on the grid scale, they may be significantly positive on the subgrid scale. The implications of these results for yield forecasting on seasonal time-scales are as follows. (i) There is potential for probabilistic forecasting of crop failure (defined by a threshold yield value); forecasting of yield terciles shows less potential. (ii) Any improvement in the skill of climate models has the potential to translate into improved deterministic yield prediction. (iii) Whilst model input uncertainties are important, uncertainty in the sowing window may not require specific modelling. The implications for yield forecasting on multidecadal (climate change) time-scales are as follows. (i) The skill in the ensemble mean suggests that perturbation, within uncertainty bounds, of crop and climate parameters could potentially average out some of the errors associated with mean yield prediction. (ii) For a given technology trend, decadal fluctuations in the yield-gap parameter used by GLAM may be relatively small, implying some predictability on those time-scales.

Relevance: 30.00%

Abstract:

We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, to give a 2 x 2 x 2 x 2 factorial started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, made at 90-day intervals with a detection probability and infected-plant removal efficiency of 80%, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
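
The factorial design described above maps directly onto a nested loop; a minimal sketch, in which run_model and the factor names are hypothetical stand-ins for the stochastic spread simulation:

```python
import itertools

def run_model(params, seed):
    """Hypothetical stand-in for one stochastic spread simulation,
    seeded at the 10 starting locations across England."""
    ...

# 2 x 2 x 2 x 2 factorial: each factor at two levels, as in the study.
levels = {
    "epidemic_pressure": ("low", "high"),
    "trade_linkage": ("weak", "strong"),
    "commercial_spread": (False, True),
    "inspections": (False, True),
}

for combo in itertools.product(*levels.values()):
    params = dict(zip(levels, combo))
    for rep in range(50):            # 50 replicate simulations per set
        run_model(params, seed=rep)
```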

Relevance: 30.00%

Abstract:

The transport of stratospheric air into the troposphere within deep convection was investigated using the Met Office Unified Model version 6.1. Three cases were simulated in which convective systems formed over the UK in the summer of 2005. For each case, simulations were performed on a grid with 4 km horizontal grid spacing, in which the convection was parameterized, and on a grid with 1 km horizontal grid spacing, which permitted explicit representation of the largest energy-containing scales of deep convection. Cross-tropopause transport was diagnosed using passive tracers that were initialized above the dynamically defined tropopause (the 2 potential vorticity unit surface) with a mixing ratio of 1. Although the synoptic-scale environment and triggering mechanisms varied between the cases, the total simulated transport was similar in all three. The total stratosphere-to-troposphere transport over the lifetime of the convective systems ranged from 25 to 100 kg/m2 across the simulated systems and resolutions, corresponding to ∼5-20% of the total mass in a stratospheric column extending 2 km above the tropopause. In all simulations, transport into the lower troposphere (below 3.5 km elevation) accounted for ∼1% of the total transport across the tropopause. In the 4 km runs most of the transport was due to parameterized convection, whereas in the 1 km runs it was due to explicitly resolved convection. The largest difference between resolutions occurred in the one case of midlevel convection considered, in which the total transport in the 1 km grid spacing simulation with explicit convection was 4 times that in the 4 km grid spacing simulation with parameterized convection. Although the total cross-tropopause transport was similar, stratospheric tracer penetrated more deeply, reaching near-surface elevations, in the convection-parameterizing simulations than in the convection-permitting simulations.

Relevance: 30.00%

Abstract:

With the increasing awareness of protein folding disorders, the explosion of genomic information, and the need for efficient ways to predict protein structure, protein folding and unfolding have become central issues in molecular sciences research. Molecular dynamics computer simulations are increasingly employed to understand the folding and unfolding of proteins. Running protein unfolding simulations is computationally expensive, and finding ways to enhance performance is a grid challenge in its own right. However, more and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing these data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data. This paper describes efforts to provide a grid-enabled data warehouse for protein unfolding data. We outline the challenges and present first results on the design and implementation of the data warehouse.

Relevance: 30.00%

Abstract:

A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes in insured loss potentials over Europe in the 21st century. Losses are computed from the daily maximum wind speed at each grid point. The loss model is calibrated using wind data from the ERA40 reanalysis and German loss data. The annual losses obtained for present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES scenarios A1B and A2, with three realisations considered for each. On average, insured loss potentials increase for all analysed European regions by the end of the 21st century. Changes are largest for Germany and France, and smallest for Portugal/Spain. Additionally, the spread between the individual realisations is large, ranging for Germany from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show an increasing interannual variability of storm damage. This increase is even more pronounced if no adaptation of building structures to climate change is assumed. The increased loss potentials are linked with enhanced values of the high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with a greater number and increased intensity of extreme cyclones over the British Isles and the North Sea.
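
The abstract does not give the loss formula itself. A form commonly used with daily maximum wind speeds per grid point is the cubic excess over the local 98th percentile (Klawa and Ulbrich, 2003); the sketch below shows that kind of calculation as an assumption, not the paper's exact calibrated model:

```python
import numpy as np

def loss_index(vmax, v98, exposure):
    """Cubic excess-over-percentile storm loss index.

    vmax:     daily maximum wind speed per grid point (2-D array, m/s)
    v98:      local 98th percentile of daily maximum wind (2-D array, m/s)
    exposure: population or insured-value weight per grid point (2-D array)

    Assumed form (Klawa and Ulbrich, 2003); the paper's calibration
    against ERA40 winds and German loss data may differ in detail.
    """
    excess = np.maximum(vmax / v98 - 1.0, 0.0)
    return float(np.sum(exposure * excess**3))

# Annual loss potential: accumulate the index over all days of a year,
# then scale it with the calibrated relation to insured losses.
```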

Relevance: 30.00%

Abstract:

We perform simulations of several convective events over the southern UK with the Met Office Unified Model (UM) at horizontal grid lengths ranging from 1.5 km to 200 m. Comparing the simulated storms on these days with observations from the Met Office rainfall radar network allows us to apply a statistical approach to evaluating the properties and evolution of the simulated storms over a range of conditions. Here we present results comparing storm morphology in the model and in reality, which show that the simulated storms become smaller as grid length decreases, and that the grid length that best fits the observations changes with the size of the observed cells. We investigate the sensitivity of storm morphology in the model to the mixing length used in the subgrid turbulence scheme. As the subgrid mixing length is decreased, the number of small storms with high area-averaged rain rates increases. We show that by changing the mixing length we can produce a lower-resolution simulation with morphologies similar to those of a higher-resolution simulation.

Relevance: 30.00%

Abstract:

We use a stratosphere–troposphere composition–climate model with interactive sulfur chemistry and aerosol microphysics to investigate the effect of the 1991 Mount Pinatubo eruption on stratospheric aerosol properties. Satellite measurements indicate that shortly after the eruption, between 14 and 23 Tg of SO2 (7 to 11.5 Tg of sulfur) was present in the tropical stratosphere. Best estimates of the peak global stratospheric aerosol burden are in the range 19 to 26 Tg, or 3.7 to 6.7 Tg of sulfur assuming a composition of between 59 and 77 % H2SO4. In light of this large uncertainty range, we performed two main simulations with 10 and 20 Tg of SO2 injected into the tropical lower stratosphere. Simulated stratospheric aerosol properties through the 1991 to 1995 period are compared against a range of available satellite and in situ measurements. Stratospheric aerosol optical depth (sAOD) and effective radius from both simulations show good qualitative agreement with the observations, with the timing of peak sAOD and the decay timescale matching the observations well in the tropics and mid-latitudes. However, injecting 20 Tg gives a stratospheric aerosol mass burden a factor of 2 too high compared to the satellite data, with consequent strong high biases in simulated sAOD and surface area density; the 10 Tg injection is in much better agreement. Our model cannot explain the large fraction of the injected sulfur that the satellite-derived SO2 and aerosol burdens indicate was removed within the first few months after the eruption. We suggest either that there is an additional loss pathway for the SO2 not included in our model (e.g. via accommodation into ash or ice in the volcanic cloud) or that a larger proportion of the injected sulfur was removed via cross-tropopause transport than in our simulations. We also critically evaluate the simulated evolution of the particle size distribution, comparing in detail to balloon-borne optical particle counter (OPC) measurements from Laramie, Wyoming, USA (41° N). Overall, the model captures remarkably well the complex variations in particle concentration profiles across the different OPC size channels. However, for the 19 to 27 km injection height range used here, both runs have a modest high bias in the lowermost stratosphere for the finest particles (radii less than 250 nm), and the decay timescale for these particles is longer in the model, with a much later return to background conditions. Also, whereas the 10 Tg run compares best to the satellite measurements, a significant low bias is apparent in the coarser size channels in the volcanically perturbed lower stratosphere. Overall, our results suggest that, with appropriate calibration, aerosol microphysics models are capable of capturing the observed variation in particle size distribution in the stratosphere across both volcanically perturbed and quiescent conditions. Furthermore, additional sensitivity simulations suggest that model predictions are robust to uncertainties in sub-grid particle formation and nucleation rates in the stratosphere.

Relevance: 30.00%

Abstract:

This study presents an evaluation of the size and strength of convective updraughts in high-resolution simulations by the UK Met Office Unified Model (UM). Updraught velocities have been estimated from range–height indicator (RHI) Doppler velocity measurements using the Chilbolton advanced meteorological radar, as part of the Dynamical and Microphysical Evolution of Convective Storms (DYMECS) project. Based on mass continuity and the vertical integration of the observed radial convergence, vertical velocities tend to be underestimated for convective clouds because the cross-radial convergence goes undetected. Velocity fields from the UM at a resolution corresponding to the radar observations are used to scale such estimates to mitigate the inherent biases. The analysis of more than 100 observed and simulated storms indicates that the horizontal scale of updraughts in the simulations tends to decrease with grid length; the 200 m grid length agreed most closely with the observations. Typical updraught mass fluxes in the 500 m grid length simulations were up to an order of magnitude greater than observed, and greater still in the 1.5 km grid length simulations. The effect of increasing the mixing length in the sub-grid turbulence scheme depends on the grid length. For the 1.5 km simulations, updraughts were weakened, though their horizontal scale remained largely unchanged. Progressively more so for the sub-kilometre grid lengths, updraughts were broadened and intensified; their horizontal scale was then determined by the mixing length rather than the grid length. In general, simulated updraughts were found to weaken too quickly with height. These findings are supported by an analysis of the widths of reflectivity patterns in both the simulations and the observations.
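
The retrieval step described here (vertically integrating the observed radial convergence under mass continuity) can be sketched as follows; the variable names and the trapezoidal integration are illustrative, and the model-based scaling used to mitigate the low bias is not shown:

```python
import numpy as np

def updraught_from_convergence(conv_r, z, rho):
    """Estimate vertical velocity from observed radial convergence.

    conv_r: radial convergence -du_r/dr at each height (1/s)
    z:      heights (m); rho: air density profile (kg/m^3)

    Anelastic mass continuity, d(rho*w)/dz = rho * convergence, is
    integrated upward from w = 0 at the lowest level. Because an RHI
    scan sees only the radial component of convergence, this w is
    biased low; the study scales such estimates with model fields.
    """
    layer = 0.5 * (rho[1:] * conv_r[1:] + rho[:-1] * conv_r[:-1]) * np.diff(z)
    rho_w = np.concatenate(([0.0], np.cumsum(layer)))
    return rho_w / rho  # w(z) in m/s
```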

Relevance: 30.00%

Abstract:

This study investigates serial (temporal) clustering of extra-tropical cyclones simulated by 17 climate models that participated in CMIP5. Clustering was estimated by calculating the dispersion (ratio of variance to mean) of 30 December-February counts of Atlantic storm tracks passing near each grid point. Results from single historical simulations of 1975-2005 were compared to those from the ERA40 reanalysis for 1958-2001 and from single future model projections of 2069-2099 under the RCP4.5 climate change scenario. The models were generally able to capture the broad features reported previously for the reanalyses: underdispersion/regularity (i.e. variance less than mean) in the western core of the Atlantic storm track, surrounded by overdispersion/clustering (i.e. variance greater than mean) to the north and south and over western Europe. Regression of counts onto North Atlantic Oscillation (NAO) indices revealed that much of the overdispersion in the historical reanalyses and model simulations can be accounted for by NAO variability. Future changes in dispersion were generally found to be small and not consistent across models. The overdispersion statistic, for any 30-year sample, is prone to large sampling uncertainty that obscures the climate change signal. For example, the projected increase in dispersion for storm counts near London in the CNRM-CM5 model is 0.1, compared to a standard deviation of 0.25. Projected changes in the mean and variance of the NAO are insufficient to create changes in overdispersion that are discernible above natural sampling variations.
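
The dispersion statistic and NAO regression used in the study are straightforward to reproduce; a minimal sketch with synthetic stand-in data (the Poisson counts and NAO values below are placeholders, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(8, size=30)   # stand-in: 30 DJF storm-track counts
nao = rng.standard_normal(30)      # stand-in: 30 winter NAO index values

# Dispersion = variance / mean; > 1 means clustering (overdispersion),
# < 1 means regularity (underdispersion).
psi_total = np.var(counts, ddof=1) / counts.mean()

# Regress counts onto the NAO; the dispersion of the residuals shows
# how much of the clustering NAO variability accounts for.
slope, intercept = np.polyfit(nao, counts, 1)
resid = counts - (slope * nao + intercept)
psi_resid = np.var(resid, ddof=2) / counts.mean()

print(f"dispersion: total {psi_total:.2f}, NAO removed {psi_resid:.2f}")
```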