957 results for Project 2002-035-C : Linking Best-Value Procurement Assessment to Outcome Performance Indicators
Abstract:
Soil organic carbon (SOC) plays a vital role in ecosystem function, determining soil fertility, water-holding capacity and susceptibility to land degradation. In addition, SOC is related to atmospheric CO2 levels, with soils having the potential for C release or sequestration, depending on land use, land management and climate. The United Nations Framework Convention on Climate Change and its Kyoto Protocol, together with the United Nations Conventions to Combat Desertification and on Biological Diversity, all recognize the importance of SOC and point to the need for quantification of SOC stocks and changes. An understanding of SOC stocks and changes at the national and regional scale is necessary to further our understanding of the global C cycle, to assess the responses of terrestrial ecosystems to climate change and to aid policy makers in making land use/management decisions. Several studies have considered SOC stocks at the plot scale, but these are site specific and of limited value in making inferences about larger areas. Some studies have used empirical methods to estimate SOC stocks and changes at the regional scale, but such studies are limited in their ability to project future changes, and most have been carried out using temperate data sets. The computational method outlined by the Intergovernmental Panel on Climate Change (IPCC) has been used to estimate SOC stock changes at the regional scale in several studies, including a recent study considering five contrasting ecoregions. This 'one-step' approach fails to account for the dynamic manner in which SOC changes are likely to occur following changes in land use and land management. A dynamic modelling approach allows estimates to be made in a manner that accounts for the underlying processes leading to SOC change. Ecosystem models designed for site-scale applications can be linked to spatial databases, giving spatially explicit results that allow geographic areas of change in SOC stocks to be identified.
Some studies have used variations on this approach to estimate SOC stock changes at the sub-national and national scale for areas of the USA and Europe, and at the watershed scale for areas of Mexico and Cuba. However, a need remained for a national- and regional-scale, spatially explicit system that is generically applicable and can be applied to as wide a range of soil types, climates and land uses as possible. The Global Environment Facility Soil Organic Carbon (GEFSOC) Modelling System was developed in response to this need. The GEFSOC system allows estimates of SOC stocks and changes to be made for diverse conditions, providing essential information for countries wishing to take part in an emerging C market, and bringing us closer to an understanding of the future role of soils in the global C cycle.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond with agility to challenges, create knowledge and skills, and lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy.
The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06)—since emulated internationally—pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals. To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders.
A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
In the Essence project, a 17-member ensemble simulation of climate change in response to the SRES A1B scenario has been carried out using the ECHAM5/MPI-OM climate model. The relatively large size of the ensemble makes it possible to investigate changes in extreme values of climate variables accurately. Here we focus on the annual-maximum 2-m temperature, fit a Generalized Extreme Value (GEV) distribution to the simulated values and investigate the development of the parameters of this distribution. Over most land areas both the location and the scale parameter increase. Consequently, the 100-year return values increase faster than the average temperatures. A comparison of simulated 100-year return values for the present climate with observations (station data and reanalysis) shows that the ECHAM5/MPI-OM model, like other models, overestimates extreme temperature values. After correcting for this bias, the model still shows values in excess of 50°C in Australia, India, the Middle East, North Africa, the Sahel and equatorial and subtropical South America at the end of the century.
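The return-value calculation described above can be sketched in a few lines. This is an illustrative example only: synthetic Gumbel-distributed data stands in for the ensemble's annual maxima, and the 38°C location and 1.5°C scale are invented, not Essence values. Note that SciPy's `genextreme` uses a negated shape-parameter convention relative to the usual GEV notation.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic annual-maximum 2-m temperatures (°C); a stand-in for model output
annual_max = rng.gumbel(loc=38.0, scale=1.5, size=500)

# Fit a GEV distribution (scipy's c is the negated shape parameter)
c, loc, scale = genextreme.fit(annual_max)

# 100-year return value: the level exceeded with probability 1/100 each year
rv100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(round(loc, 1), round(rv100, 1))
```

Tracking how the fitted `loc` and `scale` evolve over successive time windows of the simulation is what reveals that return values rise faster than mean temperatures.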
Abstract:
Development research has responded to a number of charges over the past few decades. For example, when traditional research was accused of being 'top-down', the response was participatory research, linking the 'receptors' to the generators of research. As participatory processes were recognised as producing limited outcomes, the demand-led agenda was born. In response to the alleged failure of research to deliver its products, the 'joined-up' model, which links research with the private sector, has become popular. However, using examples from animal-health research, this article demonstrates that all the aforementioned approaches are seriously limited in their attempts to generate outputs to address the multi-faceted problems facing the poor. The article outlines a new approach to research: the Mosaic Model. By combining different knowledge forms, and focusing on existing gaps, the model aims to bridge basic and applied findings to enhance the efficiency and value of research, past, present, and future.
Abstract:
The crystal structure of 4-phenylbenzaldehyde reveals the presence of a dimer linked by the C=O and C(9)-H groups of adjacent molecules. In the liquid phase, the presence of C-H...O bonded forms is revealed by both vibrational and NMR spectroscopy. A Delta H value of -8.2 +/- 0.5 kJ mol(-1) for the dimerisation equilibrium is established from the temperature-dependent intensities of the bands assigned to the carbonyl-stretching modes. The NMR data suggest the preferential engagement of the C(2,6)-H and C(10/12)/C(11)-H groups as hydrogen-bond donors, instead of the C(9)-H group. While ab initio calculations for the isolated dimers are unable to corroborate these NMR results, the radial distribution functions obtained from molecular dynamics simulations show a preference for C(2,6)-H...O and C(10/12)/C(11)-H...O contacts relative to the C(9)-H...O ones.
Abstract:
The UK construction industry is in the process of trying to adopt a new culture based on the large-scale take-up of innovative practices. Through the Demonstration Project process, many organizations are implementing changed practices and learning from the experiences of others. This is probably the largest experiment in innovation in any industry in recent times. Long-term success will be measured by how effectively the new practices become embedded in the organization. As yet there is no recognized approach to measuring the receptivity of an organization to the innovation process as an indication of the likelihood of long-term development. The development of an appropriate approach is described here. Existing approaches to measuring the take-up of innovation were reviewed and, where appropriate, used as the base for the development of a questionnaire. The questionnaire is applicable to multi-organizational construction project situations: its output could characterize an individual organization's innovative practices via an innovation scorecard, assess a project team's approach, or survey a wide cross-section of the industry.
Abstract:
The constructivist model of 'soft' value management (VM) is contrasted with the VM discourse appropriated by cost consultants who operate from within UK quantity surveying (QS) practices. The enactment of VM by cost consultants is shaped by the institutional context within which they operate and is not necessarily representative of VM practice per se. Opportunities to perform VM during the formative stages of design are further constrained by the positivistic rhetoric that such practitioners use to conceptualize and promote their services. The complex interplay between VM theory and practice is highlighted and analysed from a non-deterministic perspective. Codified models of 'best practice' are seen to be socially constructed and legitimized through human interaction in the context of interorganizational networks. Published methodologies are seen to inform practice in only a loose and indirect manner, with extensive scope for localized improvisation. New insights into the relationship between VM theory and practice are derived from the dramaturgical metaphor. The social reality of VM is seen to be constituted through scripts and performances, both of which are continuously contested across organizational arenas. It is concluded that VM defies universal definition and is conceptualized and enacted differently across different localized contexts.
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using as a starting point a number of current tools and techniques which attempt to obtain 'the value' of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses the issue firstly by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A 'characteristic'-based framework for information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced to the framework to link the characteristics to information value, so that the quality and value of information can be calculated quantitatively. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model's calculations and the training judgements are summarised, and the potential causes are discussed. Finally, several further issues, including challenges to the framework and the implementation of this evaluation method, are raised.
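The idea of linking characteristics to an information-value score can be illustrated with a toy calculation. Everything here is hypothetical: the characteristic names, the probabilities, and the naive independent-parents averaging are stand-ins for the paper's trained Bayesian network, not a reproduction of it.

```python
# P(characteristic is "good") estimated for one document (hypothetical values)
p_good = {"accuracy": 0.9, "relevance": 0.7, "currency": 0.4}

# Conditional probability the document is valuable given each characteristic
# state; a naive model treating each characteristic independently
cpt = {
    "accuracy": {True: 0.8, False: 0.3},
    "relevance": {True: 0.9, False: 0.2},
    "currency": {True: 0.6, False: 0.4},
}

def value_score(p_good, cpt):
    """Average over characteristics of P(valuable), marginalised over
    each characteristic's good/poor states."""
    terms = [p * cpt[name][True] + (1 - p) * cpt[name][False]
             for name, p in p_good.items()]
    return sum(terms) / len(terms)

score = value_score(p_good, cpt)
print(round(score, 3))  # a value between 0 and 1 usable as a retention filter
```

In practice the conditional probability tables would be learned from judgements such as the 60-document training sample described above, rather than set by hand.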
Abstract:
Functional verification (FV) is paramount within the hardware design cycle. With so many new techniques available today to help with FV, which techniques should we really use? The answer is not straightforward and is often confusing and costly. The tools and techniques to be used in a project have to be decided upon early in the design cycle to get the best value from these new verification methods. This paper gives a brief survey of FV, establishes the difference between verification and validation, describes the bottlenecks that appear in the verification process, examines the challenges in FV and presents current FV technologies and trends.
Abstract:
This paper provides a new set of theoretical perspectives on the topic of value management in building procurement. On the evidence of the current literature it is possible to identify two distinct methodologies which are based on different epistemological positions. An argument is developed which sees these two methodologies as complementary. A tentative meta-methodology is then outlined for matching methodologies to different problem situations. It is contended, however, that such a meta-methodology could never provide a prescriptive guide. Its usefulness lies in the way in which it provides the basis for reflective practice. Of central importance is the need to understand the problem context within which value management is to be applied. The distinctions between unitary, pluralistic and coercive situations are seen to be especially significant.
Abstract:
The Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) project has produced a global dataset of cloud and aerosol properties from the Along Track Scanning Radiometer-2 (ATSR-2) instrument, covering the time period 1995-2001. This paper presents the validation of aerosol optical depths (AODs) over the ocean from this product against AERONET sun-photometer measurements, as well as a comparison to the Advanced Very High Resolution Radiometer (AVHRR) optical depth product produced by the Global Aerosol Climatology Project (GACP). The GRAPE AOD over ocean is found to be in good agreement with AERONET measurements, with a Pearson's correlation coefficient of 0.79 and a best-fit slope of 1.0±0.1, but with a positive bias of 0.08±0.04. Although the GRAPE and GACP datasets show reasonable agreement, there are significant differences. These discrepancies are explored, and suggest that the downward trend in AOD reported by GACP may arise from changes in sampling due to the orbital drift of the AVHRR instruments.
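The headline validation statistics quoted above (Pearson correlation, best-fit slope, mean bias) can be computed for any set of collocated retrievals. The matched pairs below are invented for illustration; they are not GRAPE or AERONET data.

```python
import numpy as np

# Hypothetical collocated AOD values: sun-photometer truth vs satellite retrieval
aeronet = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.45])
satellite = np.array([0.12, 0.18, 0.22, 0.30, 0.38, 0.52])

r = np.corrcoef(aeronet, satellite)[0, 1]             # Pearson correlation
slope, intercept = np.polyfit(aeronet, satellite, 1)  # best-fit line
bias = np.mean(satellite - aeronet)                   # mean (positive) bias
print(round(r, 2), round(slope, 2), round(bias, 2))
```

A slope near 1 with a non-zero mean bias, as in this sketch, is exactly the pattern the GRAPE validation reports: good proportional agreement with a constant positive offset.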
Abstract:
There are many published methods available for creating keyphrases for documents. Previous work in the field has shown that in a significant proportion of cases author-selected keyphrases are not appropriate for the document they accompany; automated methods are therefore needed to improve the use of keyphrases. Often the keyphrases are not updated when the focus of a paper changes, or include keyphrases that are more classificatory than explanatory. The published methods are all evaluated using different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of algorithms in future work but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were term frequency, inverse document frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate performance and quality of results, and to provide a future benchmark. It is shown that, with the comparison metric used in this study, term frequency and inverse document frequency were the best-performing algorithms, with the synonym-based approach following them. Further work in the area is required to determine a more appropriate comparison metric.
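The two best-performing baselines above, term frequency and inverse document frequency, can be sketched in a few lines. The toy corpus and single-word "phrases" here are illustrative only; the study's actual methods operate on multi-word candidate keyphrases over six full corpora.

```python
import math
from collections import Counter

# Toy corpus: documents as pre-tokenised lists of candidate terms
corpus = [
    ["soil", "carbon", "model", "carbon"],
    ["carbon", "market", "policy"],
    ["model", "validation", "aerosol"],
]

def tf_scores(doc):
    """Term frequency: proportion of the document each term accounts for."""
    counts = Counter(doc)
    return {t: c / len(doc) for t, c in counts.items()}

def idf_scores(corpus):
    """Inverse document frequency: terms in fewer documents score higher."""
    n = len(corpus)
    vocab = {t for d in corpus for t in d}
    return {t: math.log(n / sum(t in d for d in corpus)) for t in vocab}

doc = corpus[0]
tf = tf_scores(doc)
idf = idf_scores(corpus)
# Rank candidate keyphrases for the first document by combined TF-IDF weight
ranked = sorted(tf, key=lambda t: tf[t] * idf[t], reverse=True)
print(ranked[0])  # → soil ("carbon" is frequent but appears in two documents)
```

Comparing such simple rankings against author-selected or gold-standard keyphrases under a single metric is the kind of common-baseline evaluation the paper performs.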