Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al., 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, both to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al., 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al., 2008), a technique that constrains model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
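The REST-style interaction described above can be illustrated with a short client sketch. This is a minimal example in Python using the `requests` library; the service URL, endpoint paths and JSON fields are hypothetical stand-ins for the pattern the abstract describes (start a run over HTTP, then stream output files back while the run is in progress), not G-Rex's actual API.

```python
"""Minimal sketch of a REST-style model-run client.

The endpoints and status fields below are hypothetical illustrations
of the interaction pattern in the abstract, not G-Rex's actual API.
"""
import time
import requests

BASE = "http://cluster.example.org/grex/services/nemo/instances/42"  # hypothetical

# Upload an input file, then ask the service to start the run.
with open("namelist", "rb") as f:
    requests.put(f"{BASE}/inputs/namelist", data=f)
requests.post(f"{BASE}/control/start")

# Poll the run's status and stream each new output file back to the
# local machine as it appears, so output never accumulates remotely.
while True:
    status = requests.get(BASE, headers={"Accept": "application/json"}).json()
    for name in status.get("new_outputs", []):          # hypothetical field
        with requests.get(f"{BASE}/outputs/{name}", stream=True) as r:
            with open(name, "wb") as out:
                for chunk in r.iter_content(chunk_size=65536):
                    out.write(chunk)
    if status.get("state") == "FINISHED":               # hypothetical field
        break
    time.sleep(30)
```

Because the protocol is plain HTTP, the same steps could equally be driven by curl from inside an existing shell workflow, which is consistent with the abstract's point that calls to "GRexRun" can stand in for "mpirun" without restructuring the scripts.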
Abstract:
An algorithm is presented for the generation of molecular models of defective graphene fragments, containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which is then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities is then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria, avoiding the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for the evaluation of the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
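As a concrete illustration of the triangulation-and-refinement loop, here is a minimal sketch in Python, assuming SciPy for the Delaunay/Voronoi step. The target criterion and the random delete-and-add move are simplified placeholders for the paper's user-defined criteria and refinement rules.

```python
"""Sketch of the Voronoi-based ring generator described above.

SciPy's Voronoi diagram of a point set is the dual of its Delaunay
triangulation; bounded Voronoi cells play the role of rings here.
The acceptance criterion and refinement move are placeholders.
"""
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
points = rng.uniform(0.0, 10.0, size=(400, 2))  # initial random 2D array

def non_hexagon_count(pts):
    """Count bounded Voronoi cells that are not 6-sided (defect rings)."""
    vor = Voronoi(pts)
    return sum(1 for region in vor.regions
               if region and -1 not in region and len(region) != 6)

TARGET_DEFECTS = 10  # user-defined criterion (illustrative value)
for _ in range(20000):  # cap the refinement loop for this sketch
    if non_hexagon_count(points) <= TARGET_DEFECTS:
        break
    # Delete one point and add a fresh one, then re-tessellate; the
    # published algorithm applies more targeted deletion/addition rules.
    points = np.delete(points, rng.integers(len(points)), axis=0)
    points = np.vstack([points, rng.uniform(0.0, 10.0, size=(1, 2))])

# The surviving points and cell connectivities would next be converted
# to a molecular structure and geometry-optimized for final coordinates.
```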
Abstract:
Large temperature variations on land, in the air, and at the ocean surface, and a highly variable flux of ice-rafted debris (IRD) delivered to the North Atlantic Ocean, show that rapid climate fluctuations took place during the last glacial period. These quasi-periodic, high-amplitude climate variations followed a recognizable sequence of events: a rapid warming, then a phase of gradual cooling, terminating with more rapid cooling and an increased flux of IRD to the North Atlantic Ocean. Each cycle lasted ~1500 years and was followed by an almost identical sequence. These cycles are referred to as Dansgaard/Oeschger cycles (D/O cycles), and approximately every fourth cycle culminated in a more pronounced cooling with a massive discharge of IRD into the North Atlantic Ocean over an interval of ~500 years. These massive discharges of IRD are known as Heinrich layers. "Heinrich events" are thus characterized as a rapid transfer of IRD from a "source," the bed of the Laurentide Ice Sheet (LIS), to a "sink," the North Atlantic.
Abstract:
Mature (clitellate) Eisenia andrei Bouché (ultra-epigeic), Lumbricus rubellus Hoffmeister (epigeic), and Aporrectodea caliginosa (Savigny) (endogeic) earthworms were placed in soils treated with Pb(NO₃)₂ to give concentrations in the range 1000 to 10,000 mg Pb kg⁻¹. After 28 days the LC50 values (with 95% confidence limits) were: E. andrei 5824 (−361/+898) mg Pb kg⁻¹, L. rubellus 2867 (−193/+145) mg Pb kg⁻¹, and A. caliginosa 2747 (−304/+239) mg Pb kg⁻¹; the EC50 values for weight change were: E. andrei 2841 (−68/+150) mg Pb kg⁻¹, L. rubellus 1303 (−201/+204) mg Pb kg⁻¹, and A. caliginosa 1208 (−206/+212) mg Pb kg⁻¹. At any given soil Pb concentration, tissue Pb concentrations after 28 days were the same for all three earthworm species. In a soil avoidance test there was no difference in behaviour between the species. The lower sensitivity to Pb exhibited by E. andrei is most likely due to physiological adaptations associated with the earthworms' modes of life, and could have serious implications for the use of this earthworm as the species of choice in standard toxicological testing.
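For readers unfamiliar with how figures such as an LC50 are obtained, the following is a minimal sketch of a log-logistic dose-response fit in Python with SciPy. The mortality data here are invented for illustration and are not the study's measurements.

```python
"""Illustrative LC50 estimation via a two-parameter log-logistic fit.

The concentrations and mortality fractions are made-up example data,
not the study's observations.
"""
import numpy as np
from scipy.optimize import curve_fit

conc = np.array([1000., 2000., 4000., 6000., 8000., 10000.])  # mg Pb kg^-1
mortality = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.95])    # hypothetical

def loglogistic(c, lc50, slope):
    # Mortality fraction as a function of concentration c; by
    # construction the curve passes through 0.5 at c = lc50.
    return 1.0 / (1.0 + (lc50 / c) ** slope)

(lc50, slope), _ = curve_fit(loglogistic, conc, mortality, p0=(5000., 2.))
print(f"estimated LC50 = {lc50:.0f} mg Pb kg^-1")
```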
Abstract:
In developing techniques for monitoring the costs associated with different procurement routes, the central task is disentangling the various project costs incurred by organizations taking part in construction projects. While all firms are familiar with the need to analyse their own costs, it is unusual to apply the same kind of analysis to projects. The purpose of this research is to examine the claim that new ways of working, such as strategic alliancing and partnering, bring positive business benefits. This requires that the costs associated with marketing, estimating, pricing, negotiation of terms, monitoring of performance and enforcement of contract be collected for a cross-section of projects under differing arrangements, and from those across the supply chain: clients, consultants, contractors, sub-contractors and suppliers. Collaboration with industrial partners forms the basis for developing a research instrument, based on time sheets, that is relevant for all those taking part in the work. Early indications are that the costs associated with tendering are highly variable, ranging between 1% and 15% depending on precisely what is taken into account. The research to date reveals that there are mechanisms for measuring the costs of transactions, and that these will generate useful data for subsequent analysis.
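A minimal sketch of the kind of bookkeeping such a time-sheet instrument supports: each entry books costed hours against a project and a transaction-cost category, and the totals can then be compared across procurement routes. The field names and figures are illustrative assumptions, not the study's actual instrument.

```python
"""Aggregate hypothetical time-sheet entries by project and
transaction-cost category; all records here are invented examples."""
from collections import defaultdict

# (hours, hourly_rate, project, category) per time-sheet entry
entries = [
    (6.0, 55.0, "ProjA", "estimating"),
    (2.5, 70.0, "ProjA", "negotiation of terms"),
    (4.0, 55.0, "ProjB", "monitoring of performance"),
]

costs = defaultdict(float)
for hours, rate, project, category in entries:
    costs[(project, category)] += hours * rate

for (project, category), cost in sorted(costs.items()):
    print(f"{project:6s} {category:26s} {cost:8.2f}")
```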