995 results for Project visualization
Abstract:
The Convective Storm Initiation Project (CSIP) is an international project to understand precisely where, when, and how convective clouds form and develop into showers in the mainly maritime environment of southern England. A major aim of CSIP is to compare the results of the very high resolution Met Office weather forecasting model with detailed observations of the early stages of convective clouds and to use the newly gained understanding to improve the predictions of the model. A large array of ground-based instruments plus two instrumented aircraft, from the U.K. National Centre for Atmospheric Science (NCAS) and the German Institute for Meteorology and Climate Research (IMK), Karlsruhe, were deployed in southern England, over an area centered on the meteorological radars at Chilbolton, during the summers of 2004 and 2005. In addition to a variety of ground-based remote-sensing instruments, numerous rawinsondes were released at one- to two-hourly intervals from six closely spaced sites. The Met Office weather radar network and Meteosat satellite imagery were used to provide context for the observations made by the instruments deployed during CSIP. This article presents an overview of the CSIP field campaign and examples from CSIP of the types of convective initiation phenomena that are typical in the United Kingdom. It shows the way in which certain kinds of observational data are able to reveal these phenomena and gives an explanation of how the analyses of data from the field campaign will be used in the development of an improved very high resolution NWP model for operational use.
Abstract:
Planning a project with proper consideration of all necessary factors, and managing it to ensure successful implementation, involves many challenges. The initial stage of planning a project for bidding is costly and time consuming, and the resulting cost and effort predictions are usually of poor accuracy. At the same time, detailed information about previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolio management has been brought into this field with the aim of improving information sharing and management among different projects; however, the amount of information that can be shared is still limited to generic information. In this paper we report a recently developed software system, COBRA, which automatically generates a project plan with estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. (Automated Project Information Sharing and Management System - COBRA) Keywords: project management, product based planning, best practice, PRINCE2
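To illustrate the kind of estimation such a system automates, the sketch below shows a toy analogy-based estimator that prices the products in a new bid's product breakdown structure from records of previously completed projects. It is a minimal illustration of the idea only, not the COBRA implementation; the record fields and product names are invented for the example.

    # Illustrative sketch only: a toy analogy-based estimator, not the COBRA system.
    # It averages recorded effort and cost for matching products in past project data.
    from statistics import mean

    # Hypothetical records harvested from archived, completed projects.
    past_projects = [
        {"product": "requirements specification", "effort_days": 12, "cost": 6000},
        {"product": "requirements specification", "effort_days": 9,  "cost": 4500},
        {"product": "user acceptance test plan",  "effort_days": 5,  "cost": 2500},
    ]

    def estimate(product_name):
        """Estimate effort and cost for a planned product from similar past products."""
        matches = [p for p in past_projects if p["product"] == product_name]
        if not matches:
            return None  # no historical analogue; fall back to expert judgement
        return {
            "effort_days": mean(p["effort_days"] for p in matches),
            "cost": mean(p["cost"] for p in matches),
        }

    # Build a draft plan for a new bid from its product breakdown structure.
    plan = {name: estimate(name) for name in
            ["requirements specification", "user acceptance test plan"]}
    print(plan)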
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission.
Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, preventing it from accumulating on the remote system and allowing the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which keeps client programs simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain.
G-Rex is employed as the middleware for the NERC (Natural Environment Research Council) Cluster Grid, a small grid of high-performance computing clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans.
A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this script is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine.
G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
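The REST design means a model run can be driven and monitored with nothing more than ordinary HTTP requests. The sketch below, in Python using only the standard library, polls a G-Rex-style service and downloads output files as they appear, mirroring feature (1) above; the base URL, endpoint paths and JSON fields are hypothetical placeholders rather than the published G-Rex API, which in practice is wrapped by the GRexRun client.

    # Illustrative sketch only: endpoint paths and JSON fields below are
    # hypothetical placeholders, not the published G-Rex API.
    import json
    import time
    import urllib.request
    from pathlib import Path

    BASE_URL = "http://cluster.example.ac.uk/grex/jobs/1234"  # hypothetical job URL
    OUT_DIR = Path("model_output")
    OUT_DIR.mkdir(exist_ok=True)

    def get_json(url):
        """Fetch a URL and decode the response body as JSON."""
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read().decode("utf-8"))

    downloaded = set()
    while True:
        status = get_json(BASE_URL + "/status")        # e.g. {"state": "RUNNING"}
        for name in get_json(BASE_URL + "/outputs"):   # e.g. ["ocean_0001.nc", ...]
            if name not in downloaded:
                # Pull each new output file back while the run is still in progress,
                # so data never accumulates on the remote cluster.
                urllib.request.urlretrieve(BASE_URL + "/outputs/" + name,
                                           OUT_DIR / name)
                downloaded.add(name)
        if status["state"] in ("FINISHED", "FAILED"):
            break
        time.sleep(30)                                 # poll every 30 seconds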
Abstract:
In the event of a release of toxic gas in the center of London, the emergency services would need to determine quickly the extent of the area contaminated. The transport of pollutants by turbulent flow within the complex street and building architecture of cities is not straightforward, and we might wonder whether it is at all possible to make a scientifically reasoned decision. Here we describe recent progress from a major UK project, ‘Dispersion of Air Pollution and its Penetration into the Local Environment’ (DAPPLE, www.dapple.org.uk). In DAPPLE, we focus on the movement of airborne pollutants in cities by developing a greater understanding of atmospheric flow and dispersion within urban street networks. In particular, we carried out full-scale dispersion experiments in central London (UK) during 2003, 2004, 2007, and 2008 to address the extent of the dispersion of tracers following their release at street level. These measurements complemented previous studies because (i) our focus was on dispersion within the first kilometer from the source, where most of the material was expected to remain within the street network rather than being mixed into the boundary layer aloft, (ii) measurements were made under a wide variety of meteorological conditions, and (iii) central London represents a European, rather than North American, city geometry. Interpretation of the results from the full-scale experiments was supported by extensive numerical and wind tunnel modeling, which allowed more detailed analysis under idealized and controlled conditions. In this article, we review the full-scale DAPPLE methodologies and show early results from the analysis of the 2007 field campaign data.
Abstract:
This paper reports on a new satellite sensor, the Geostationary Earth Radiation Budget (GERB) experiment. GERB is designed to make the first measurements of the Earth's radiation budget from geostationary orbit. Measurements at high absolute accuracy of the reflected sunlight from the Earth, and the thermal radiation emitted by the Earth, are made every 15 min, with a spatial resolution at the subsatellite point of 44.6 km (north–south) by 39.3 km (east–west). With knowledge of the incoming solar constant, this gives the primary forcing and response components of the top-of-atmosphere radiation. The first GERB instrument is an instrument of opportunity on Meteosat-8, a new spin-stabilized spacecraft platform, currently positioned over the equator at 3.5°W, that also carries the Spinning Enhanced Visible and Infrared Imager (SEVIRI). This overview of the project includes a description of the instrument design and its preflight and in-flight calibration. An evaluation of the instrument performance after its first year in orbit, including comparisons with data from the Clouds and the Earth's Radiant Energy System (CERES) satellite sensors and with output from numerical models, is also presented. After a brief summary of the data processing system and data products, some of the scientific studies that are being undertaken using these early data are described. This marks the beginning of a decade or more of observations from GERB, as subsequent models will fly on each of the four Meteosat Second Generation satellites.
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
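Interoperability of this kind is typically achieved through the OGC Web Map Service (WMS) standard, in which a client requests a rendered image of one slice of a dataset via a GetMap call. The sketch below builds such a request in Python as an illustration; the server URL and layer name are hypothetical placeholders, and the TIME and ELEVATION parameters select a single slice of a four-dimensional dataset.

    # Illustrative sketch only: a standard OGC WMS 1.3.0 GetMap request.
    # The server URL and layer name are hypothetical placeholders.
    from urllib.parse import urlencode
    from urllib.request import urlretrieve

    base_url = "http://example.org/ncWMS/wms"     # hypothetical WMS endpoint
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": "OCEAN/sea_water_temperature",  # hypothetical layer name
        "STYLES": "",
        "CRS": "CRS:84",                          # longitude/latitude axis order
        "BBOX": "-180,-90,180,90",
        "WIDTH": "1024",
        "HEIGHT": "512",
        "FORMAT": "image/png",
        "TIME": "2004-07-01T00:00:00Z",           # one time step of the 4-D data
        "ELEVATION": "0",                         # one depth level
    }

    # Download the rendered map image for quick-look visualization.
    urlretrieve(base_url + "?" + urlencode(params), "sea_water_temperature.png")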
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an extension to existing buildings to provide a warehouse, services block and packing line. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and subsystems which are differentiated from each other at decision points. Further to this, the subsystems can be viewed as the interaction of a managing system and an operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficacy of the organizational structure used. The causes of the client's dissatisfaction with the outcome of the project were a lack of integration and the complexity of the managing system. However, there was a high level of satisfaction with the completed project, and this is reflected in the way in which the organizational structure corresponded to the model's propositions.
Abstract:
The practical application of systemic sustainability analysis (SSA; Bell and Morse, 1999), as applied in a project instigated and managed by 'Blue Plan', one of the regional activity centres of the Mediterranean Action Plan, is set out and explained in this paper. The Mediterranean context, primarily Malta, in which SSA was applied and adapted to SPSA (systemic and prospective sustainability analysis) is described. The SSA process is summarized, its extension and linkage to the prospective approach is described, and the comments of stakeholders in the context are added. Some preliminary outcomes are suggested. The particular focus of the paper is on the lessons learned from doing SSA/SPSA within a classic blueprint project framework. It is not assumed that SSA/SPSA is 'finished' or 'definitive'. Rather, we suggest that it is a developing and changing approach that practitioners can adapt and change to meet the specific needs of the circumstances that confront them. Copyright (C) 2004 John Wiley & Sons, Ltd and ERP Environment.