62 results for Tools and techniques
Abstract:
A review is given of the major conceptual changes that have taken place during the last 50 years in our understanding of the nature of plant conservation and of the principal methodological advances in undertaking conservation assessments and actions, largely through the incorporation of tools and techniques from other disciplines. The interrelationships between conservation and sustainable use are considered, as well as the impact of the development of the discipline of conservation biology, the effects of the general acceptance of the concept of biodiversity and the practical implications of the implementation of the Convention on Biological Diversity. The effects on conservation policy and management of the accelerating loss or conversion of habitats throughout the world, and approaches for combating this loss, are discussed.
Abstract:
Purpose – This paper proposes to assess the context within which integrated logistic support (ILS) can be implemented for the whole-life performance of building services systems. Design/methodology/approach – The use of ILS within a through-life business model (TLBM) provides a better framework for achieving a well-designed, constructed and managed product. However, for ILS to be implemented in a TLBM for building services systems, the practices, tools and techniques require certain contextual prerequisites tailored to suit the construction industry. These contextual prerequisites are discussed. Findings – The case studies conducted reinforced the contextual importance of prime contracting, partnering and team collaboration for the application of ILS techniques. The lack of data was a major hindrance to the full realisation of ILS techniques within the case studies. Originality/value – The paper concludes by recognising the value of these contextual prerequisites for the use of ILS techniques within the building industry.
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Using, as a starting point, a number of the current tools and techniques which attempt to obtain 'the value' of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue firstly by briefly reviewing the information overload problem, the definition of value, and related research work on the value of information in various areas. Then a "characteristic"-based framework of information evaluation is introduced, using the key characteristics identified from related work as an example. A Bayesian Network method is introduced to the framework to build the linkage between the characteristics and information value in order to quantitatively calculate the quality and value of information. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results, and the differences between the model calculations and the training judgements are summarised and their potential causes discussed. Finally, several further issues, including the challenges of the framework and the implementation of this evaluation method, are raised.
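The abstract above describes linking document characteristics to information value through a Bayesian Network trained on engineering documents. A minimal Python sketch of that idea follows, assuming two hypothetical binary characteristics ('accuracy', 'relevance') feeding a binary 'value' node; the node names and probability tables are illustrative assumptions, not figures from the paper.

# Minimal Bayesian-network sketch: two binary document characteristics feed a
# binary 'value' node, and unobserved characteristics are marginalised out.
# Node names and probabilities are illustrative assumptions, not from the paper.
from itertools import product

p_accuracy = {True: 0.7, False: 0.3}                 # P(accuracy)
p_relevance = {True: 0.6, False: 0.4}                # P(relevance)
p_value_high = {(True, True): 0.9, (True, False): 0.5,
                (False, True): 0.4, (False, False): 0.1}  # P(high value | parents)

def p_high_value(evidence):
    """Return P(value = high | evidence); evidence maps node name -> bool."""
    num = den = 0.0
    for acc, rel in product([True, False], repeat=2):
        if evidence.get("accuracy", acc) != acc or evidence.get("relevance", rel) != rel:
            continue
        joint = p_accuracy[acc] * p_relevance[rel]
        num += joint * p_value_high[(acc, rel)]
        den += joint
    return num / den

print(p_high_value({}))                   # prior probability of high value
print(p_high_value({"accuracy": True}))   # posterior given an accurate document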
Abstract:
Pressing global environmental problems highlight the need to develop tools to measure progress towards "sustainability." However, some argue that any such attempt inevitably reflects the views of those creating such tools and only produces highly contested notions of "reality." To explore this tension, we critically assess the Environmental Sustainability Index (ESI), a well-publicized product of the World Economic Forum that is designed to measure 'sustainability' by ranking nations in league tables based on extensive databases of environmental indicators. By recreating this index, and then using statistical tools (principal components analysis) to test relations between various components of the index, we challenge the ways in which countries are ranked in the ESI. Based on this analysis, we suggest (1) that the approach taken to aggregate, interpret and present the ESI creates a misleading impression that Western countries are more sustainable than the developing world; (2) that unaccounted methodological biases allowed the authors of the ESI to over-generalize the relative 'sustainability' of different countries; and (3) that this has resulted in simplistic conclusions on the relation between economic growth and environmental sustainability. This criticism should not be interpreted as a call for the abandonment of efforts to create standardized comparable data. Instead, this paper proposes that indicator selection and data collection should draw on a range of voices, including local stakeholders as well as international experts. We also propose that aggregating data into final league ranking tables is too prone to error and creates the illusion of absolute and categorical interpretations. (c) 2004 Elsevier Ltd. All rights reserved.
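As a hedged illustration of the statistical check described above (principal components analysis applied to country-level indicator data), the Python sketch below uses scikit-learn on invented placeholder indicators; none of the names or numbers come from the ESI.

# Illustrative principal components analysis over country-level environmental
# indicators. Indicator names and values are invented placeholders, not the ESI
# database; the aim is only to show how component loadings can be inspected
# before indicators are aggregated into a single ranking.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
indicators = ["air_quality", "water_stress", "biodiversity", "emissions_per_capita"]
X = rng.normal(size=(40, len(indicators)))      # 40 hypothetical countries

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))

print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("first component loadings:")
for name, loading in zip(indicators, pca.components_[0]):
    print(f"  {name}: {loading:+.2f}")
print("scores for first three countries:", scores[:3].round(2))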
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against the Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time, the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges. Copyright © 2009 Royal Meteorological Society
Abstract:
To address whether seasonal variability exists among Shiga toxin-encoding bacteriophage (Stx phage) numbers on a cattle farm, a conventional plaque assay was performed on water samples collected over a 17 month period. Distinct seasonal variation in bacteriophage numbers was evident, peaking between June and August. Removal of cattle from the pasture precipitated a reduction in bacteriophage numbers, and during the winter months no bacteriophages infecting Escherichia coli were detected, a surprising occurrence considering that 10^31 tailed bacteriophages are estimated to populate the globe. To address this discrepancy a culture-independent method based on quantitative PCR was developed. Primers targeting the Q gene and stx genes were designed that accurately and discriminately quantified artificial mixed lambdoid bacteriophage populations. Application of these primer sets to water samples possessing no detectable phages by plaque assay demonstrated that the number of lambdoid bacteriophages ranged from 4.7 x 10^4 to 6.5 x 10^6 ml^-1, with one in 10^3 free lambdoid bacteriophages carrying a Shiga toxin operon (stx). Specific molecular biological tools and discriminatory gene targets have enabled virus populations in the natural environment to be enumerated, and similar strategies could replace existing propagation-dependent techniques, which grossly underestimate the abundance of viral entities.
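As a hedged illustration of the arithmetic behind such culture-independent quantification, the Python sketch below converts qPCR Ct values to gene copies per ml via a standard curve; the slope, intercept and volumes are invented example numbers, not values reported in the study.

# Back-of-the-envelope quantification from qPCR: convert a Ct value to gene
# copies per ml of water via a standard curve, Ct = slope * log10(copies) + intercept.
# The slope, intercept, reaction volumes and sample volume below are invented
# example numbers, not parameters reported in the study.
def copies_per_ml(ct, slope=-3.32, intercept=38.0,
                  template_ul=5.0, elution_ul=100.0, sample_ml=50.0):
    copies_per_reaction = 10 ** ((ct - intercept) / slope)
    copies_per_extract = copies_per_reaction * (elution_ul / template_ul)
    return copies_per_extract / sample_ml

for ct in (22.1, 28.4, 33.9):
    print(f"Ct {ct}: {copies_per_ml(ct):.2e} copies per ml")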
Abstract:
The effects and influence of the Building Research Establishment's Environmental Assessment Method (BREEAM) on construction professionals are examined. Most discussions of building assessment methods focus on either the formal tool or the finished product. In contrast, BREEAM is analysed here as a social technology using Michel Foucault's theory of governmentality. Interview data are used to explore the effect of BREEAM on visibilities, knowledge, techniques and professional identities. The analysis highlights a number of features of the BREEAM assessment process which generally go unremarked: professional and public understandings of the method, the deployment of different types of knowledge and their implication for the authority and legitimacy of the tool, and the effect of BREEAM on standard practice. The analysis finds that BREEAM's primary effect is through its impact on standard practices. Other effects include the use of assessment methods to defend design decisions, its role in both operationalizing and obscuring the concept of green buildings, and the effect of tensions between project and method requirements on the authority of the tool. A reflection on assessment methods as neo-liberal tools and their adequacy for the promotion of sustainable construction suggests several limitations of lock-in that hinder variation and wider systemic change.
Abstract:
Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools and their adoption is growing. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires the adoption of pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main problem for the integrated preprocessing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of preprocessing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the preprocessing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.
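As a hedged illustration of the kind of FreeSurfer output that such a tool imports, the Python sketch below reads structure volumes from an aseg.stats table; the column positions follow the standard FreeSurfer layout but should be verified for a given FreeSurfer version, and the file path is hypothetical.

# Read structure names and volumes from a FreeSurfer aseg.stats file.
# Assumes the standard layout: comment lines start with '#', data rows are
# whitespace-delimited with Volume_mm3 in column 4 and StructName in column 5
# (1-based). Verify the column order against your FreeSurfer version.
def read_aseg_volumes(path):
    volumes = {}
    with open(path) as fh:
        for line in fh:
            if line.startswith("#") or not line.strip():
                continue
            cols = line.split()
            volumes[cols[4]] = float(cols[3])   # StructName -> Volume_mm3
    return volumes

# Hypothetical usage (the path is an assumption, not from the paper):
# volumes = read_aseg_volumes("subjects/sub-01/stats/aseg.stats")
# print(volumes.get("Left-Hippocampus"))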
Abstract:
The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has, however, been largely overlooked due to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself and the fact that many of its members resist cultivation and are in fact new to science. However, with the emergence of 16S rRNA molecular tools and "post-genomics" high resolution technologies for examining microorganisms as they occur in nature without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review will discuss the application of molecular microbiological tools to study the human gut microbiota in a culture-independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetics or flux of metabolites through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and how it responds to different environmental stimuli, e.g. substrates such as prebiotics, antibiotics and other drugs, and in response to disease. Recently these two culture-independent, high resolution approaches have been combined into a single "transgenomic" approach which allows correlation of changes in metabolite profiles within human biofluids with microbiota compositional metagenomic data. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.
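As a hedged illustration of the "transgenomic" correlation step described above, the Python sketch below rank-correlates metabolite levels with taxon abundances across samples; the taxon and metabolite names and all values are invented placeholders.

# Rank-correlate metabolite levels with microbial taxon abundances across
# samples, a simplified stand-in for the metabonomics-metagenomics correlation
# step. Taxon and metabolite names, and all values, are invented placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_samples = 30
taxa = {"Bacteroides": rng.random(n_samples),
        "Faecalibacterium": rng.random(n_samples)}
metabolites = {"butyrate": rng.random(n_samples),
               "hippurate": rng.random(n_samples)}

for taxon, abundance in taxa.items():
    for metabolite, level in metabolites.items():
        rho, p = spearmanr(abundance, level)
        print(f"{taxon} vs {metabolite}: rho={rho:+.2f}, p={p:.2f}")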
Abstract:
Results from the first Sun-to-Earth coupled numerical model developed at the Center for Integrated Space Weather Modeling are presented. The model simulates physical processes occurring in space spanning from the corona of the Sun to the Earth's ionosphere, and it represents the first step toward creating a physics-based numerical tool for predicting space weather conditions in the near-Earth environment. Two 6- to 7-day intervals, representing different heliospheric conditions in terms of the three-dimensional configuration of the heliospheric current sheet, are chosen for simulations. These conditions lead to drastically different responses of the simulated magnetosphere-ionosphere system, emphasizing, on the one hand, the challenges one encounters in building such forecasting tools and, on the other hand, the successes that can already be achieved even at this initial stage of Sun-to-Earth modeling.
Abstract:
The development of high throughput techniques ('chip' technology) for measurement of gene expression and gene polymorphisms (genomics), and techniques for measuring global protein expression (proteomics) and metabolite profile (metabolomics), are revolutionising life science research, including research in human nutrition. In particular, the ability to undertake large-scale genotyping and to identify gene polymorphisms that determine risk of chronic disease (candidate genes) could enable definition of an individual's risk at an early age. However, the search for candidate genes has proven to be more complex, and their identification more elusive, than previously thought. This is largely due to the fact that much of the variability in risk results from interactions between the genome and environmental exposures. Whilst the former is now very well defined via the Human Genome Project, the latter (e.g. diet, toxins, physical activity) are poorly characterised, resulting in an inability to account for their confounding effects in most large-scale candidate gene studies. The polygenic nature of most chronic diseases offers further complexity, requiring very large studies to disentangle the relatively weak impacts of large numbers of potential 'risk' genes. The efficacy of diet as a preventative strategy could also be considerably increased by better information concerning the gene polymorphisms that determine variability in responsiveness to specific diet and nutrient changes. Much of the limited available data are based on retrospective genotyping using stored samples from previously conducted intervention trials. Prospective studies are now needed to provide data that can be used as the basis for provision of individualised dietary advice and development of food products that optimise disease prevention. Application of the new technologies in nutrition research offers considerable potential for development of new knowledge and could greatly advance the role of diet as a preventative disease strategy in the 21st century. Given the potential economic and social benefits offered, funding for research in this area needs greater recognition, and a stronger strategic focus, than is presently the case. Application of genomics in human health offers considerable ethical and societal as well as scientific challenges. Economic determinants of health care provision are more likely to resolve such issues than scientific developments or altruistic concerns for human health.
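As a hedged illustration of the gene-environment interactions discussed above, the Python sketch below fits a logistic regression with a gene-by-diet interaction term on simulated placeholder data; the variable names and effect sizes are assumptions for illustration only, not results from the review.

# Hedged sketch of testing a gene x diet interaction with logistic regression
# (statsmodels formula API). The simulated data, variable names and effect
# sizes are invented placeholders, not results from the review.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "genotype": rng.integers(0, 3, n),     # risk-allele count (0, 1 or 2)
    "sat_fat": rng.normal(30, 8, n),       # dietary exposure, g/day
})
logit_p = -3 + 0.2 * df.genotype + 0.02 * df.sat_fat + 0.05 * df.genotype * df.sat_fat
df["disease"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("disease ~ genotype * sat_fat", data=df).fit(disp=0)
print(model.summary().tables[1])           # 'genotype:sat_fat' row is the interaction term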
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
An increasing body of evidence has been reported on how consumers could potentially react to the introduction of genetically modified food. Studies typically contain some empirical evidence and some theoretical explanations of the data; however, to date limited effort has been devoted to systematically reviewing the existing evidence and its implications for policy. This paper contributes to the literature by bringing together the published evidence on the behavioural frameworks and on the process leading to the public acceptance of genetically modified (GM) food and organisms (GMOs). In doing so, we employ a set of clearly defined search tools and a limited number of comprehensive key words. The study attempts to gather an understanding of the published findings on the determinants of the valuation of GM food - both in terms of willingness to accept and willingness to pay a premium for non-GM food - trust in information sources on safety and public health, and the ultimate attitudes underpinning such evidence. Furthermore, in the light of such evidence, we formulate some policy strategies to deal with public uncertainty regarding GMOs and, especially, GM food. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The paper examines how European retailers are using private standards for food safety and quality as risk management and competitive tools, and the strategic responses of leading Kenyan and other developing country supplier/exporters to such standards. Despite measures to harmonize a 'single market', the European fresh produce market is very diverse in terms of consumer preferences, structural dynamics, and attention to and enforcement of food safety and other standards. Leading Kenyan fresh produce suppliers have re-positioned themselves at the high end, including 'high care', segments of the market - precisely those that are most demanding in terms of quality assurance and food safety systems. An array of factors has influenced this strategic positioning, including relatively high international freight costs, the emergence of more effective competition in mainstream product lines, relatively low labor costs for produce preparation, and strong market relationships with selected retail chains. To succeed in this demanding market segment, the industry has had to invest substantially in improved production and procurement systems, upgraded pack house facilities, and quality assurance/food safety management systems. (C) 2005 Elsevier Ltd. All rights reserved.