902 results for Works in Progress


Relevance: 80.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.
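The RESTful interaction pattern described above can be sketched as follows. The resource paths, the `GRexJob` class, and the base URL are hypothetical illustrations of the style, not the actual G-Rex API (whose client is a Java program):

```python
# Illustrative sketch of a REST-style job client in the spirit of G-Rex.
# The resource layout and class are invented for illustration only.

class GRexJob:
    def __init__(self, base_url, service):
        self.base = f"{base_url}/services/{service}"
        self.instance = None

    def submit(self, input_files):
        """POST to the service resource creates a new job instance."""
        self.instance = f"{self.base}/instances/1"  # the server would assign the ID
        return ("POST", self.base, sorted(input_files))

    def poll_output(self):
        """GET on the instance's output resource streams results back
        while the run is in progress, so data never piles up remotely."""
        return ("GET", f"{self.instance}/output")

job = GRexJob("http://cluster.example.org/grex", "nemo")
print(job.submit(["namelist", "coordinates.nc"]))
print(job.poll_output())
```

Because the protocol is plain HTTP on named resources, the same interaction could equally be driven from a browser or curl, which is the usability point the abstract makes.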

Relevance: 80.00%

Abstract:

G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.

Relevance: 80.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. Fifty-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) Output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) The client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
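Step (3) above amounts to a one-line substitution in an existing workflow script. A minimal sketch of that rewrite, assuming a hypothetical helper (the "GRexRun" name is from the text, but the rewrite rule and service URL here are illustrative, not the actual G-Rex tooling):

```python
# Hedged sketch of step (3): the only change to an existing workflow
# script is replacing the local launch command with the G-Rex client.
# The rewrite rule below is illustrative, not actual G-Rex tooling.

def to_grex(command, service_url):
    """Swap a local MPI launch for a remote G-Rex invocation;
    leave every other workflow command untouched."""
    if command and command[0] == "mpirun":
        # e.g. "mpirun -np 40 nemo.exe" -> "GRexRun <service> nemo.exe"
        exe = command[-1]
        return ["GRexRun", service_url, exe]
    return command

local = ["mpirun", "-np", "40", "nemo.exe"]
print(to_grex(local, "http://cluster.example.org/grex/nemo"))
```

Keeping the rest of the script untouched is what lets existing pre- and post-processing utilities run unchanged on the scientist's own machine.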

Relevance: 80.00%

Abstract:

The Phosphorus Indicators Tool provides a catchment-scale estimation of diffuse phosphorus (P) loss from agricultural land to surface waters using the most appropriate indicators of P loss. The Tool provides a framework that may be applied across the UK to estimate P loss, which is sensitive not only to land use and management but also to environmental factors such as climate, soil type and topography. The model complexity incorporated in the P Indicators Tool has been adapted to the level of detail in the available data and the need to reflect the impact of changes in agriculture. Currently, the Tool runs on an annual timestep and at a 1 km² grid scale. We demonstrate that the P Indicators Tool works in principle and that its modular structure provides a means of accounting for P loss from one layer to the next, and ultimately to receiving waters. Trial runs of the Tool suggest that modelled P delivery to water approximates measured water quality records. The transparency of the structure of the P Indicators Tool means that identification of poorly performing coefficients is possible, and further refinements of the Tool can be made to ensure it is better calibrated and subsequently validated against empirical data, as it becomes available.
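The layer-to-layer accounting described above can be sketched as a chain of transfer coefficients applied on the annual timestep. The layer names and coefficient values below are hypothetical placeholders, not calibrated values from the Tool:

```python
# Illustrative sketch of the Tool's modular accounting: P mobilised on
# agricultural land is attenuated layer by layer before reaching the
# stream. Layers and coefficients are invented for illustration.

def annual_p_delivery(source_kg_per_km2, transfer_coefficients):
    """Pass an annual P load through successive layers; each layer
    keeps an audit entry, mirroring the Tool's layer-to-layer accounts
    that make poorly performing coefficients easy to spot."""
    load = source_kg_per_km2
    audit = []
    for layer, coeff in transfer_coefficients:
        load *= coeff
        audit.append((layer, round(load, 3)))
    return load, audit

delivered, audit = annual_p_delivery(
    100.0,  # kg P per km2 mobilised in one year (hypothetical)
    [("field runoff", 0.4), ("drainage network", 0.7), ("in-stream", 0.9)],
)
print(audit)                 # per-layer accounts
print(round(delivered, 3))   # kg P per km2 reaching receiving waters
```

The audit list is the point of the modular structure: a coefficient that performs poorly against measured water quality shows up in one specific layer's account.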

Relevance: 80.00%

Abstract:

This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first is a questionnaire study which examines the respondents' familiarity with concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module works in two steps. In the first step, a few trigger questions allow the activation of a stereotype that includes a "body" and an "inference component". The body is the representation of the body of knowledge that a class of users is expected to know, along with the probability that the knowledge is known. In the inference component, the learning process of concepts is represented as a belief network. Hence, in the second step the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions on those concepts where it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example of the system in operation is presented, and advantages and limitations of the approach are discussed.
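The two-step scheme can be illustrated with a toy probabilistic update. The concepts, prior probabilities, and the single-node Bayes rule below are invented stand-ins for EPIAIM's stereotype bodies and belief network, which the paper describes in full:

```python
# Sketch of the two-step user model: a stereotype "body" holds default
# probabilities that a user class knows each concept (step 1); answers
# to a few probe questions are then propagated to revise them (step 2).
# Concepts, priors, and error rates here are illustrative only.

def revise(prior, answered_correctly, hit=0.9, false_alarm=0.2):
    """Bayes update of P(user knows concept) from one probe question,
    given hit/false-alarm rates for users who do/do not know it."""
    if answered_correctly:
        num = hit * prior
        den = hit * prior + false_alarm * (1 - prior)
    else:
        num = (1 - hit) * prior
        den = (1 - hit) * prior + (1 - false_alarm) * (1 - prior)
    return num / den

stereotype = {"odds ratio": 0.5, "confounding": 0.7}  # step 1: defaults
stereotype["odds ratio"] = revise(stereotype["odds ratio"], True)  # step 2
print(round(stereotype["odds ratio"], 3))
```

In the real system the revision propagates through a belief network, so evidence about one concept also shifts belief about related concepts, not just the one probed.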

Relevance: 80.00%

Abstract:

Previous work has established the value of goal-oriented approaches to requirements engineering. Achieving clarity and agreement about stakeholders’ goals and assumptions is critical for building successful software systems and managing their subsequent evolution. In general, this decision-making process requires stakeholders to understand the implications of decisions outside the domains of their own expertise. Hence it is important to support goal negotiation and decision making with description languages that are both precise and expressive, yet easy to grasp. This paper presents work in progress to develop a pattern language for describing goal refinement graphs. The language has a simple graphical notation, which is supported by a prototype editor tool, and a symbolic notation based on modal logic.
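A goal refinement graph of the kind the paper describes can be sketched as a simple data structure. The AND/OR node types and the satisfaction-propagation rule below are generic goal-modelling conventions, not the paper's own pattern language or notation:

```python
# Minimal sketch of a goal refinement graph: parent goals are refined
# into subgoals, and satisfaction propagates up through AND/OR nodes.
# The example goals are invented for illustration.

class Goal:
    def __init__(self, name, mode="AND", children=(), satisfied=None):
        self.name, self.mode = name, mode
        self.children = list(children)
        self.satisfied = satisfied  # leaves carry explicit truth values

    def holds(self):
        """A leaf holds if marked satisfied; an AND node needs all
        subgoals, an OR node needs at least one."""
        if not self.children:
            return bool(self.satisfied)
        results = [c.holds() for c in self.children]
        return all(results) if self.mode == "AND" else any(results)

safe = Goal("data kept safe", "AND", [
    Goal("backups taken", satisfied=True),
    Goal("access controlled", "OR", [
        Goal("passwords enforced", satisfied=False),
        Goal("smart cards issued", satisfied=True),
    ]),
])
print(safe.holds())
```

A structure like this is what both the graphical notation and the modal-logic notation would describe: stakeholders negotiate over which refinements are adequate, then the logic checks the refinement is sound.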

Relevance: 80.00%

Abstract:

Standardisation of microsatellite allele profiles between laboratories is of fundamental importance to the transferability of genetic fingerprint data and the identification of clonal individuals held at multiple sites. Here we describe two methods of standardisation applied to the microsatellite fingerprinting of 429 Theobroma cacao L. trees representing 345 accessions held in the world's largest Cocoa Intermediate Quarantine facility: the use of a partial allelic ladder through the production of 46 cloned and sequenced allelic standards (AJ748464 to AJ748509), and the use of standard genotypes selected to display a diverse allelic range. Until now a lack of accurate and transferable identification information has impeded efforts to genetically improve the cocoa crop. To address this need, a global initiative to fingerprint all international cocoa germplasm collections using a common set of 15 microsatellite markers is in progress. Data reported here have been deposited with the International Cocoa Germplasm Database and form the basis of a searchable resource for clonal identification. To our knowledge, this is the first quarantine facility to be completely genotyped using microsatellite markers for the purpose of quality control and clonal identification. Implications of the results for retrospective tracking of labelling errors are briefly explored.
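Once allele calls are standardised between laboratories, clonal identification reduces to comparing allele profiles marker by marker. The sketch below illustrates that comparison; the marker names and allele sizes are invented for illustration (real profiles use the initiative's common set of 15 microsatellite markers):

```python
# Sketch of clonal identification against standardised allele profiles.
# A profile maps marker name -> the diploid allele pair (sizes in bp).
# Markers and sizes below are illustrative, not real genotype data.

def same_clone(profile_a, profile_b):
    """Two trees are putative clones if every shared marker carries an
    identical (unordered) allele pair."""
    shared = profile_a.keys() & profile_b.keys()
    return bool(shared) and all(
        sorted(profile_a[m]) == sorted(profile_b[m]) for m in shared
    )

tree_1 = {"markerA": (134, 140), "markerB": (230, 230)}
tree_2 = {"markerA": (140, 134), "markerB": (230, 230)}  # same alleles, reordered
tree_3 = {"markerA": (134, 142), "markerB": (230, 230)}  # one allele differs
print(same_clone(tree_1, tree_2), same_clone(tree_1, tree_3))
```

The comparison is only meaningful because the allelic ladder and standard genotypes put both laboratories' size calls on the same scale; without that, identical clones can appear to differ by a few base pairs.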

Relevance: 80.00%

Abstract:

A proper method to assess contractor competitiveness is important both for assisting clients in the selection of proper contractors and for assisting contractors in the development of more competitive bidding strategies. Previous studies have identified various indicators for assessing contractor competitiveness, and several assessment methods have been introduced. Nevertheless, these studies are limited because they are unable to tell which indicators are more important in different market environments. This paper identifies the key competitiveness indicators (KCIs) for assessing contractor competitiveness in the Chinese construction market. An index value is used to indicate the relative significance of various competitiveness indicators based on which KCIs are identified. The data applied in this study are from a survey of the construction industry in mainland China. The research findings provide valuable information for both existing businesses and the construction professionals who plan to compete for construction works in the Chinese market. The study provides useful references for further studies that compare the KCIs used in the Chinese construction industry and those used in other construction industries.
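The index-value idea can be sketched as a simple ranking of indicators by normalised mean survey rating. The indicator names, scores, and normalisation below are invented for illustration; the paper derives its index values from its own survey of the mainland Chinese market:

```python
# Sketch of ranking competitiveness indicators by an index value:
# mean survey rating per indicator, normalised to the top-scoring
# indicator, sorted so the key indicators (KCIs) lead the list.
# All names and scores here are hypothetical.

def index_values(ratings):
    means = {k: sum(v) / len(v) for k, v in ratings.items()}
    top = max(means.values())
    return sorted(((k, round(m / top, 3)) for k, m in means.items()),
                  key=lambda kv: kv[1], reverse=True)

survey = {
    "bidding strategy": [4, 5, 4, 5],
    "technical ability": [3, 4, 4, 3],
    "financial strength": [5, 5, 4, 5],
}
print(index_values(survey))
```

A cut-off on the normalised index (e.g. keeping indicators above some threshold) is one simple way a study could separate the KCIs from the rest.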

Relevance: 80.00%

Abstract:

One of the essential requirements for a successful e-Government web application is security. Web application firewalls (WAFs) are the most important tool for securing web applications against today's growing number of web application attacks. WAFs work in different modes depending on the web traffic filtering approach used, such as positive security mode, negative security mode, session-based mode, or mixed modes. The proposed WAF, called HiWAF, is a web application firewall that works in three modes: positive, negative and session-based security modes. The new approach that distinguishes this WAF from other WAFs is that it utilizes concepts from Artificial Intelligence (AI) instead of regular expressions or other traditional pattern-matching techniques as its filtering engine. Both artificial neural networks and fuzzy logic will be used to implement a hybrid intelligent web application firewall that works in all three security modes.
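The three filtering modes named above can be sketched as a layered decision. The allow-list, block-list, and decisions below are trivial illustrative stand-ins; in HiWAF the pattern-matching stage is replaced by the neural/fuzzy engine, which is not reproduced here:

```python
# Sketch of a WAF combining the three modes the abstract names.
# Patterns, paths, and decisions are illustrative only; the AI
# filtering engine of HiWAF is not modelled here.

ALLOWED_PATHS = {"/login", "/search"}                  # positive model
BLOCKED_TOKENS = ("<script", "union select", "../")    # negative model

def inspect(path, payload, session_known):
    if path not in ALLOWED_PATHS:                      # positive security mode
        return "block"
    if any(t in payload.lower() for t in BLOCKED_TOKENS):  # negative mode
        return "block"
    if not session_known:                              # session-based mode
        return "challenge"
    return "allow"

print(inspect("/search", "q=cocoa", session_known=True))
print(inspect("/search", "q=1 UNION SELECT pass", session_known=True))
```

The appeal of an AI engine in place of the middle stage is generalisation: a learned classifier can flag obfuscated variants that a fixed token list like `BLOCKED_TOKENS` would miss.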

Relevance: 80.00%

Abstract:

Purpose – To describe some research done, as part of an EPSRC funded project, to assist engineers working together on collaborative tasks. Design/methodology/approach – Distributed finite state modelling and agent techniques are used successfully in a new hybrid self-organising decision making system applied to collaborative work support. For the particular application, analysis of the tasks involved has been performed and these tasks are modelled. The system then employs a novel generic agent model, where task and domain knowledge are isolated from the support system, which provides relevant information to the engineers. Findings – The method is applied in the despatch of transmission commands within the control room of The National Grid Company Plc (NGC) – tasks are completed significantly faster when the system is utilised. Research limitations/implications – The paper describes a generic approach and it would be interesting to investigate how well it works in other applications. Practical implications – Although only one application has been studied, the methodology could equally be applied to a general class of cooperative work environments. Originality/value – One key part of the work is the novel generic agent model that enables the task and domain knowledge, which are application specific, to be isolated from the support system, and hence allows the method to be applied in other domains.
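The originality claim above, isolating task and domain knowledge from a generic support engine, can be sketched as an engine that walks whatever declarative task model it is handed. The toy command-despatch model below is invented; the paper's actual finite state models of NGC control-room tasks are far richer:

```python
# Sketch of the generic agent model: all task/domain knowledge lives
# in a data structure; the support engine below is domain-agnostic.
# The task model is a toy invented for illustration.

TASK_MODEL = {  # finite-state description: state -> (advice, next state)
    "start":      ("check circuit availability", "checked"),
    "checked":    ("despatch switching command", "despatched"),
    "despatched": ("confirm completion", "done"),
}

def support_agent(task_model, state="start"):
    """Generic engine: walk whatever state model it is given and yield
    the relevant information at each step. No domain knowledge is
    hard-coded here, so swapping the model retargets the system."""
    while state in task_model:
        advice, state = task_model[state]
        yield advice

print(list(support_agent(TASK_MODEL)))
```

Retargeting the system to a different cooperative work environment would then mean authoring a new task model, not rewriting the engine, which is exactly the portability the paper's "research limitations" section asks about.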

Relevance: 80.00%

Abstract:

This study is concerned with the impact of property fund flows on property returns, and with the possibility of a reverse transmission from property returns to property fund flows. In other words, this study investigates whether property returns "cause" fund flow changes, whether fund flow changes "cause" property returns, or whether causality works in both directions.
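The question can be illustrated with a toy lead-lag comparison. The series below are synthetic (flows lead returns by one period by construction) and the lagged correlations are a deliberately minimal stand-in for the formal econometric causality tests such a study would use:

```python
# Toy illustration of the bidirectional question: do past fund flows
# help predict returns, do past returns help predict flows, or both?
# Synthetic data; lagged correlation stands in for formal tests.

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

flows = [1.0, 3.0, 1.0, 3.0, 1.0, 3.0, 1.0, 3.0]
returns = [0.0] + [0.5 * f for f in flows[:-1]]  # returns follow flows

flows_lead = corr(flows[:-1], returns[1:])       # lagged flows vs returns
returns_lead = corr(returns[:-1], flows[1:])     # lagged returns vs flows
print(round(flows_lead, 2), round(returns_lead, 2))
```

Here lagged flows track returns perfectly while lagged returns track flows only loosely, the pattern one would read as causality running from flows to returns in this toy setting.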

Relevance: 80.00%

Abstract:

Motivation: A new method that uses support vector machines (SVMs) to predict protein secondary structure is described and evaluated. The study is designed to develop a reliable prediction method using an alternative technique and to investigate the applicability of SVMs to this type of bioinformatics problem. Methods: Binary SVMs are trained to discriminate between two structural classes. The binary classifiers are combined in several ways to predict multi-class secondary structure. Results: The average three-state prediction accuracy per protein (Q3) is estimated by cross-validation to be 77.07 ± 0.26% with a segment overlap (Sov) score of 73.32 ± 0.39%. The SVM performs similarly to the 'state-of-the-art' PSIPRED prediction method on a non-homologous test set of 121 proteins despite being trained on substantially fewer examples. A simple consensus of the SVM, PSIPRED and PROFsec achieves significantly higher prediction accuracy than the individual methods. Availability: The SVM classifier is available from the authors. Work is in progress to make the method available on-line and to integrate the SVM predictions into the PSIPRED server.
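One way of combining binary classifiers into a three-state prediction is one-vs-one voting, which can be sketched as below. The "classifiers" here are trivial stubs standing in for trained binary SVMs; only the voting scheme, one of the combinations such a study would evaluate, is illustrated:

```python
# Sketch of combining binary SVM decisions into a three-state
# secondary structure prediction (helix H, strand E, coil C).
# The decisions are stub values, not real SVM outputs.

from collections import Counter

def vote(pairwise_decisions):
    """One-vs-one combination: each binary classifier votes for one of
    its two classes; the state with most votes is the prediction."""
    return Counter(pairwise_decisions).most_common(1)[0][0]

# Stub decisions for one residue from the H/E, H/C and E/C machines:
decisions = ["H", "H", "C"]  # H-vs-E says H, H-vs-C says H, E-vs-C says C
print(vote(decisions))
```

Other combination schemes (one-vs-rest with decision values, or a tree of binary splits) fit the same framework; which performs best is an empirical question of the kind the evaluation addresses.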

Relevance: 80.00%

Abstract:

This is the first ‘Science and Medicine’ chapter in The Year’s Work in Critical and Cultural Theory. It is synchronised with the journal’s other chapters in its reviewing of works published in 2010, while it also reads these works in the broader context of rapidly expanding interdisciplinary areas of research. Scientific and medical vocabularies are (to use a scientific metaphor) cross-pollinating within many areas of scholarship in the humanities, and this current period is bringing many exciting developments. This chapter concentrates on literary studies, while forthcoming chapters will also look more squarely at cultural studies.

Relevance: 80.00%

Abstract:

The starting point for these outputs is a large-scale research project, in collaboration with the Zurich University for the Arts and the Kunstmuseum Thun, looking at a redefinition of Social Sculpture (Joseph Beuys/Bazon Brock, 1970) as a functional device re-deployed to expand the art discourse into a societal discourse. Although Beuys' version of a social sculpture involved notions of abstruse mysticism and reformulations of a national identity, these were nevertheless part of a social transformation that shifted and re-arranged power relations. Following Laclau and Mouffe in their contention that democracy is a fundamentally antagonistic process, and contesting Grant Kester's understanding of an ethically based relational practice, this work aligns itself with Hirschhorn's claim to an aesthetic practice within communities, pursuing the possibility of viewing a socially based practice from both ends of the ethics debate, whereby ethical aspects fuel the aesthetic to "create situations that are beautiful because they are ethical and shocking because they are ethical, thus in turn aesthetic because they are ethical" (O'Donnell). This project sets out to engage in activities which interact with surrounding communities and evoke new imaginations of site, thereby understanding site as a catalyst for subjective emergences. Performance is tested as a site for social practice. Archival research into local audio/visual resources, such as the Swiss Radio Archive, the Swiss Military Film Archives and the zoological film archives of Basel Zoo, was instrumental to the navigation of this work, under the themes of crisis, catastrophe, landscape and fallout, in order to create a visual language for an active performance site.
Commissioned by the Kunstmuseum Thun, in collaboration with the University for the Arts in Zurich, as part of a year-long exhibition programme (the other artists are Jeanne Van Heeswijk (NL) and San Keller (CH)), this project brings together a series of different works in a new performance installation. The performance process includes a performance workshop with 30 school children from local Swiss schools and their teachers, which was conducted publicly in the museum spaces. It enabled the children to engage with an unexpected set of tribal and animalistic behaviours, looking at situations of flight and rescue, resulting in a large performance choreography orchestrated without an apparent conductor. The project also includes a collaboration with the renowned Swiss zoologist Prof Klaus Zuberbühler (University of St Andrews) and Colonel General Haldimann, commander of the military base in Thun. The installation included two static video images, shot in and around a spectacular local cave site (the Beatus Caves) and including three children. The project will culminate in an edited edition of the Oncurating Journal (issue no. tbc, 2012), including interviews and essays from project collaborators (Army Commander General, Thun; Jörg Hess; performance script, Timothy Long; and others).