Abstract:
For users of climate services, the ability to quickly determine the datasets that best fit one's needs would be invaluable. The volume, variety and complexity of climate data make this judgment difficult. The ambition of CHARMe ("Characterization of metadata to enable high-quality climate services") is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports or feedback on previous applications of the data. The capture and discovery of this "commentary" information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search and publish user-derived annotations in a way that can be read by both users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator ("CHARMe Maps") and a tool for correlating climate time series with external "significant events" (e.g. instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source, released under a liberal licence, permitting future projects to re-use the source code as they wish.
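CHARMe commentary records are expressed as Linked Data annotations built on open web standards. As a rough sketch only (the dataset and article URIs below are hypothetical placeholders, and the exact vocabulary used by the project is not detailed in this abstract), such an annotation could be represented with the W3C Web Annotation model and serialized as JSON-LD:

import json

# Minimal sketch of a commentary annotation in the W3C Web Annotation
# vocabulary. All URIs are hypothetical placeholders, not CHARMe identifiers.
annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "motivation": "linking",
    # The commentary: e.g. a journal article assessing the dataset.
    "body": "https://doi.org/10.xxxx/hypothetical-validation-paper",
    # The annotated resource: the climate dataset the commentary refers to.
    "target": "https://data.example.org/datasets/sst-reanalysis-v2",
    "creator": {"type": "Person", "name": "A. Data User"},
}

# Serializing as JSON-LD makes the annotation readable by both people and
# automated systems, and publishable alongside the data as Linked Data.
print(json.dumps(annotation, indent=2))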
Abstract:
An efficient and robust method to measure vitamin D (25-hydroxy vitamin D3 (25(OH)D3) and 25-hydroxy vitamin D2 in dried blood spots (DBS) has been developed and applied in the pan-European multi-centre, internet-based, personalised nutrition intervention study Food4Me. The method includes calibration with blood containing endogenous 25(OH)D3, spotted as DBS and corrected for haematocrit content. The methodology was validated following international standards. The performance characteristics did not reach those of the current gold standard liquid chromatography-MS/MS in plasma for all parameters, but were found to be very suitable for status-level determination under field conditions. DBS sample quality was very high, and 3778 measurements of 25(OH)D3 were obtained from 1465 participants. The study centre and the season within the study centre were very good predictors of 25(OH)D3 levels (P<0·001 for each case). Seasonal effects were modelled by fitting a sine function with a minimum 25(OH)D3 level on 20 January and a maximum on 21 July. The seasonal amplitude varied from centre to centre. The largest difference between winter and summer levels was found in Germany and the smallest in Poland. The model was cross-validated to determine the consistency of the predictions and the performance of the DBS method. The Pearson's correlation between the measured values and the predicted values was r 0·65, and the sd of their differences was 21·2 nmol/l. This includes the analytical variation and the biological variation within subjects. Overall, DBS obtained by unsupervised sampling of the participants at home was a viable methodology for obtaining vitamin D status information in a large nutritional study.
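The seasonal model described above is a simple sinusoid with its minimum pinned to 20 January and its maximum roughly half a year later. A minimal sketch of one such parameterization (the centre mean and amplitude values are invented for illustration and are not taken from the study):

import numpy as np

def seasonal_25ohd3(day_of_year, centre_mean, amplitude):
    # Sinusoidal seasonal model for 25(OH)D3 in nmol/l: minimum on day 20
    # (20 January), maximum about half a year later (around 21 July).
    return centre_mean - amplitude * np.cos(2 * np.pi * (day_of_year - 20) / 365.25)

# Illustrative only: hypothetical mean level and seasonal amplitude for one centre.
days = np.array([20, 110, 202, 295])   # 20 Jan, 20 Apr, 21 Jul, 22 Oct
print(seasonal_25ohd3(days, centre_mean=50.0, amplitude=15.0))

In a fitting context the centre-specific mean and amplitude would be estimated from the DBS measurements (e.g. by least squares), which is how centre-to-centre differences in seasonal amplitude such as those reported above would be obtained.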
Abstract:
A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project have been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends for a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea-ice parameters. The eddy-permitting nature of the global reanalyses also allows the eddy kinetic energy to be estimated. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level as well as the eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as one single system regarding its reliability in reproducing climate signals, where both variability and uncertainties are assessed through the ensemble spread and signal-to-noise ratio. The main advantage of having access to several reanalyses differing in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
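As a rough illustration of the ensemble diagnostics mentioned above (the exact definitions used in the intercomparison are not given in the abstract, so this sketch uses one common formulation and entirely synthetic numbers): given a yearly ocean index from each reanalysis, the ensemble mean, ensemble spread and a signal-to-noise ratio can be computed as follows.

import numpy as np

# Synthetic example: an annual-mean ocean index (arbitrary units) from four
# reanalyses over 1993-2011, i.e. an array of shape (members, years).
rng = np.random.default_rng(0)
years = np.arange(1993, 2012)
index = 0.3 * (years - 1993) + rng.normal(0.0, 0.5, size=(4, years.size))

ens_mean = index.mean(axis=0)                 # ensemble-mean signal
ens_spread = index.std(axis=0, ddof=1)        # spread across reanalyses
snr = np.abs(ens_mean - ens_mean.mean()) / ens_spread   # one simple signal-to-noise measure

for y, m, s, r in zip(years, ens_mean, ens_spread, snr):
    print(f"{y}: mean={m:5.2f}  spread={s:4.2f}  S/N={r:4.2f}")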
Abstract:
TIGGE was a major component of the THORPEX (The Observing System Research and Predictability Experiment) research program, whose aim was to accelerate improvements in forecasting high-impact weather. By providing ensemble prediction data from leading operational forecast centers, TIGGE has enhanced collaboration between the research and operational meteorological communities and enabled research studies on a wide range of topics. The paper covers the objective evaluation of the TIGGE data. For a range of forecast parameters, it is shown to be beneficial to combine ensembles from several data providers in a Multi-model Grand Ensemble. Alternative methods to correct systematic errors, including the use of reforecast data, are also discussed. TIGGE data have been used for a range of research studies on predictability and dynamical processes. Tropical cyclones are the most destructive weather systems in the world, and are a focus of multi-model ensemble research. Their extra-tropical transition also has a major impact on the skill of mid-latitude forecasts. We also review how TIGGE has added to our understanding of the dynamics of extra-tropical cyclones and storm tracks. Although TIGGE is a research project, it has proved invaluable for the development of products for future operational forecasting. Examples include the forecasting of tropical cyclone tracks, heavy rainfall, strong winds, and flood prediction through coupling hydrological models to ensembles. Finally, the paper considers the legacy of TIGGE. We discuss the priorities and key issues in predictability and ensemble forecasting, including the new opportunities of convective-scale ensembles, links with ensemble data assimilation methods, and extension of the range of useful forecast skill.
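A minimal sketch of the multi-model grand-ensemble idea mentioned above (the member counts, bias values and the simple mean-bias correction are illustrative assumptions, not the TIGGE implementation): forecasts from several centres are pooled into one larger ensemble, optionally after removing each centre's systematic error estimated from past (re)forecasts.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2 m temperature forecasts (K) for one location and lead time:
# three centres with different ensemble sizes and different systematic biases.
centres = {
    "centre_A": 285.0 + 1.2 + rng.normal(0.0, 1.0, 51),
    "centre_B": 285.0 - 0.8 + rng.normal(0.0, 1.2, 20),
    "centre_C": 285.0 + 0.3 + rng.normal(0.0, 0.9, 33),
}
# Mean errors assumed known from reforecasts, one value per centre (illustrative).
mean_error = {"centre_A": 1.2, "centre_B": -0.8, "centre_C": 0.3}

# Remove each centre's mean bias, then pool all members into a grand ensemble.
grand_ensemble = np.concatenate(
    [members - mean_error[name] for name, members in centres.items()]
)
print(grand_ensemble.size, round(grand_ensemble.mean(), 2), round(grand_ensemble.std(ddof=1), 2))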
Abstract:
The management of a public sector project is analysed using a model developed from systems theory. Linear responsibility analysis is used to identify the primary and key decision structure of the project and to generate quantitative data regarding differentiation and integration of the operating system, the managing system and the client/project team. The environmental context of the project is identified. Conclusions are drawn regarding the project organization structure's ability to cope with the prevailing environmental conditions. It is found that, owing to its complexity, the managing system imposed on the project was unable to cope with these conditions, and this created serious deficiencies in the outcome of the project.
Abstract:
The UK construction industry is in the process of trying to adopt a new culture based on the large-scale take-up of innovative practices. Through the Demonstration Project process, many organizations are implementing changed practices and learning from the experiences of others. This is probably the largest experiment in innovation in any industry in recent times. Long-term success will be measured by how effectively the new practices are embedded in the organization. As yet there is no recognized approach to measuring an organization's receptivity to the innovation process as an indication of the likelihood of long-term development. The development of an appropriate approach is described here. Existing approaches to measuring the take-up of innovation were reviewed and, where appropriate, used as the basis for the development of a questionnaire. The questionnaire can be applied in multi-organizational construction project situations: its output can characterize an individual organization's innovative practices via an innovation scorecard or a project team's approach, or it can be used to survey a wide cross-section of the industry.
Abstract:
Sea surface temperature (SST) measurements are required by operational ocean and atmospheric forecasting systems to constrain modeled upper ocean circulation and thermal structure. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution SST Pilot Project (GHRSST-PP) was initiated to address these needs by coordinating the provision of accurate, high-resolution, SST products for the global domain. The pilot project is now complete, but activities continue within the Group for High Resolution SST (GHRSST). The pilot project focused on harmonizing diverse satellite and in situ data streams that were indexed, processed, quality controlled, analyzed, and documented within a Regional/Global Task Sharing (R/GTS) framework implemented in an internationally distributed manner. Data with meaningful error estimates developed within GHRSST are provided by services within R/GTS. Currently, several terabytes of data are processed at international centers daily, creating more than 25 gigabytes of product. Ensemble SST analyses together with anomaly SST outputs are generated each day, providing confidence in SST analyses via diagnostic outputs. Diagnostic data sets are generated and Web interfaces are provided to monitor the quality of observation and analysis products. GHRSST research and development projects continue to tackle problems of instrument calibration, algorithm development, diurnal variability, skin temperature deviation, and validation/verification of GHRSST products. GHRSST also works closely with applications and users, providing a forum for discussion and feedback between SST users and producers on a regular basis. All data within the GHRSST R/GTS framework are freely available. This paper reviews the progress of GHRSST-PP, highlighting achievements that have been fundamental to the success of the pilot project.
Abstract:
Many construction professionals and policy-makers would agree that client expectations should be accommodated during a building project. However, this aspiration is not easy to deal with, as there may be conflicting interests within a client organization and these may change over time in the course of a project. This research asks why some client interests, and not others, are incorporated into the development of a building project. Actor-Network Theory (ANT) is used to study a single building project on a university campus. The building project is analysed as a series of discussions and negotiations in which actors persuade each other to choose one solution over another. The analysis traces dynamic client engagement in decision-making processes as the available options became increasingly constrained. However, this relative loss of control was countered by clients who retained control over the timing of participants' involvement, and were thus able to impose their interests even at the later stages of the project.
Abstract:
North African dust is important for climate through its direct radiative effect on solar and terrestrial radiation and its role in the biogeochemical system. The Dust Outflow and Deposition to the Ocean project (DODO) aimed to characterize the physical and optical properties of airborne North African dust in two seasons and to use these observations to constrain model simulations, with the ultimate aim of being able to quantify the deposition of iron to the North Atlantic Ocean. The in situ properties of dust from airborne campaigns measured during February and August 2006, based at Dakar, Senegal, are presented here. Average values of the single scattering albedo (0.99, 0.98), mass specific extinction (0.85 m^2 g^-1, 1.14 m^2 g^-1), asymmetry parameter (0.68, 0.68), and refractive index (1.53 − 0.0005i, 1.53 − 0.0014i) for the accumulation mode were found to differ by varying degrees between the dry and wet season, respectively. It is hypothesized that these differences are due to different source regions and transport processes which also differ between the DODO campaigns. Elemental ratios of Ca/Al were found to differ between the dry and wet season (1.1 and 0.5, respectively). Differences in vertical profiles are found between seasons and between land and ocean locations and reflect the different dynamics of the seasons. Using measurements of the coarse mode size distribution and illustrative Mie calculations, the optical properties are found to be very sensitive to the presence and amount of coarse mode of mineral dust, and the importance of accurate measurements of the coarse mode of dust is highlighted.
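For reference, the reported quantities are related to the particle size distribution and the Mie efficiencies by the standard definitions below (a textbook formulation, not reproduced from the paper); because the integrands weight large radii heavily, including or excluding the coarse mode changes the derived single scattering albedo and mass specific extinction substantially.

% n(r): number size distribution; Q_ext, Q_sca: Mie extinction and scattering
% efficiencies at wavelength \lambda for refractive index m; \rho: particle density.
\[
\sigma_{\mathrm{ext}} = \int \pi r^{2} Q_{\mathrm{ext}}(r,\lambda,m)\, n(r)\, \mathrm{d}r,
\qquad
\sigma_{\mathrm{sca}} = \int \pi r^{2} Q_{\mathrm{sca}}(r,\lambda,m)\, n(r)\, \mathrm{d}r,
\]
\[
\omega_{0} = \frac{\sigma_{\mathrm{sca}}}{\sigma_{\mathrm{ext}}},
\qquad
k_{\mathrm{ext}} = \frac{\sigma_{\mathrm{ext}}}{\int \tfrac{4}{3}\pi r^{3}\rho\, n(r)\, \mathrm{d}r}
\ \ [\mathrm{m^{2}\,g^{-1}}].
\]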
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors, and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
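As a minimal illustration of step (3) above (Python is used purely for illustration; the actual workflow scripts are shell scripts, and the real GRexRun command-line syntax is not given here, so the invocation below is only a placeholder for the drop-in substitution the abstract describes):

import subprocess

def launch_model(model_args, use_grex=False):
    # Launch the model either locally with mpirun or remotely via the G-Rex
    # client. Only the idea of replacing "mpirun" with "GRexRun" is taken from
    # the abstract; the argument handling here is a hypothetical sketch.
    launcher = "GRexRun" if use_grex else "mpirun"
    return subprocess.run([launcher, *model_args], check=True)

# e.g. launch_model(["-np", "40", "./nemo.exe"], use_grex=True)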
Abstract:
The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It was working at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change (especially how climate change pertains to extreme events), and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which have been conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.
Abstract:
As part of its Data User Element programme, the European Space Agency funded the GlobMODEL project, which aimed at investigating the scientific, technical, and organizational issues associated with the use and exploitation of remotely-sensed observations, particularly from new sounders. A pilot study was performed as a "demonstrator" of the GlobMODEL idea, based on the use of new data, with a strong European heritage, not yet assimilated operationally. Two parallel assimilation experiments were performed, using either total column ozone or ozone profiles retrieved at the Royal Netherlands Meteorological Institute (KNMI) from the Ozone Monitoring Instrument (OMI). In both cases, the impact of assimilating OMI data in addition to the total ozone columns from the SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) on the European Centre for Medium-Range Weather Forecasts (ECMWF) ozone analyses was assessed by means of independent measurements. We found that the impact of OMI total columns is mainly limited to the region between 20 and 80 hPa, and is particularly important at high latitudes in the Southern hemisphere, where stratospheric ozone transport and chemical depletion are generally difficult to model with accuracy. Furthermore, the assimilation experiments carried out in this work suggest that OMI DOAS (Differential Optical Absorption Spectroscopy) total ozone columns are on average larger than SCIAMACHY total columns by up to 3 DU, while OMI total columns derived from OMI ozone profiles are on average about 8 DU larger than SCIAMACHY total columns. At the same time, the demonstrator brought to light a number of issues related to the assimilation of atmospheric composition profiles, such as the shortcomings arising when the vertical resolution of the instrument is not properly accounted for in the assimilation. The GlobMODEL demonstrator accelerated the scientific and operational utilization of new observations, and its results prompted ECMWF to start the operational assimilation of OMI total column ozone data.
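One standard way of accounting for an instrument's limited vertical resolution when comparing or assimilating retrieved profiles (a general formulation, not necessarily the scheme used in these experiments) is to smooth the model profile with the retrieval's averaging kernel:

\[
\hat{x} = x_{a} + \mathbf{A}\,(x_{\mathrm{model}} - x_{a}),
\]

where \(x_{a}\) is the retrieval a priori profile, \(\mathbf{A}\) the averaging kernel matrix, and \(\hat{x}\) the model profile degraded to the vertical resolution of the instrument; neglecting \(\mathbf{A}\) treats the retrieval as if it resolved the true profile exactly, which is one source of the shortcomings mentioned above.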
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an extension to existing buildings to provide a warehouse, services block and packing line. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and subsystems which are differentiated from each other at decision points. Further to this, the subsystems can be viewed as the interaction of managing system and operating system. Using Walker's model, a systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficacy of the organizational structure used. The causes of the client's dissatisfaction with the outcome of the project were lack of integration and complexity of the managing system. However, there was a high level of satisfaction with the completed project and this is reflected by the way in which the organization structure corresponded to the model's propositions.
Abstract:
The GEFSOC Project developed a system for estimating soil carbon (C) stocks and changes at the national and sub-national scale. As part of the development of the system, the Century ecosystem model was evaluated for its ability to simulate soil organic C (SOC) changes under the environmental conditions of the Indo-Gangetic Plains, India (IGP). Two long-term fertilizer trials (LTFT), with all necessary parameters needed to run Century, were used for this purpose: a jute (Corchorus capsularis L.), rice (Oryza sativa L.) and wheat (Triticum aestivum L.) trial at Barrackpore, West Bengal, and a rice-wheat trial at Ludhiana, Punjab. The trials represent two contrasting climates of the IGP, viz. semi-arid and dry, with mean annual rainfall (MAR) of < 800 mm, and humid, with MAR of > 1600 mm. Both trials involved several different treatments with different organic and inorganic fertilizer inputs. In general, the model tended to overestimate treatment effects by approximately 15%. At the semi-arid site, modelled data simulated actual data reasonably well for all treatments, with the control and chemical N + farm yard manure showing the best agreement (RMSE = 7). At the humid site, Century performed less well. This could have been due to a range of factors, including site history. During the study, Century was calibrated to simulate crop yields for the two sites considered using data from across the Indian IGP. However, further adjustments may improve model performance at these sites and others in the IGP. The availability of more long-term experimental data sets (especially those involving flooded lowland rice and triple cropping systems from the IGP) for testing and validation is critical to the application of the model's predictive capabilities for this area of the Indian sub-continent. (C) 2007 Elsevier B.V. All rights reserved.
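For reference, the RMSE quoted above measures the mismatch between Century-simulated and measured SOC; a minimal sketch of the computation (the values below are invented for illustration, not taken from the trials):

import numpy as np

def rmse(simulated, observed):
    # Root-mean-square error between simulated and measured SOC stocks.
    simulated, observed = np.asarray(simulated), np.asarray(observed)
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

# Hypothetical SOC stocks (t C/ha) for a handful of sampling years.
print(rmse([31.0, 32.5, 34.1, 35.0], [30.2, 33.0, 33.5, 36.1]))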
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an engineering factory and offices. A multi-disciplinary professional practice was used to manage and design the project. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and sub-systems which are differentiated from each other at decision points. Further to this, the sub-systems can be viewed as the interaction of managing system and operating system. A systematic analysis of the relationships between the contributors gives a quantitative assessment of the efficacy of the organizational structure used. There was a high level of satisfaction with the completed project, and this is reflected by the way in which the organization structure corresponded to the model's propositions. However, the project was subject to strong environmental forces which the project organization was not capable of entirely overcoming.