Abstract:
An increasing body of evidence has been reported on how consumers may react to the introduction of genetically modified food. Studies typically contain some empirical evidence and some theoretical explanation of the data; however, to date limited effort has been devoted to systematically reviewing the existing evidence and its implications for policy. This paper contributes to the literature by bringing together the published evidence on the behavioural frameworks and on the process leading to the public acceptance of genetically modified (GM) food and organisms (GMOs). In doing so, we employ a set of clearly defined search tools and a limited number of comprehensive key words. The study attempts to gather an understanding of the published findings on the determinants of the valuation of GM food - both in terms of willingness to accept and willingness to pay a premium for non-GM food - trust in information sources on safety and public health, and the ultimate attitudes underpinning such evidence. Furthermore, in the light of such evidence, we formulate some policy strategies to deal with public uncertainty regarding GMOs and, especially, GM food. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The paper examines how European retailers are using private standards for food safety and quality as risk management and competitive tools, and the strategic responses of leading Kenyan and other developing-country supplier/exporters to such standards. Despite measures to harmonize a 'single market', the European fresh produce market is very diverse in terms of consumer preferences, structural dynamics, and attention to and enforcement of food safety and other standards. Leading Kenyan fresh produce suppliers have re-positioned themselves at the high end, including 'high care' segments, of the market - precisely those that are most demanding in terms of quality assurance and food safety systems. An array of factors has influenced this strategic positioning, including relatively high international freight costs, the emergence of more effective competition in mainstream product lines, relatively low labor costs for produce preparation, and strong market relationships with selected retail chains. To succeed in this demanding market segment, the industry has had to invest substantially in improved production and procurement systems, upgraded packhouse facilities, and quality assurance/food safety management systems. (C) 2005 Elsevier Ltd. All rights reserved.
Abstract:
This review considers microbial inocula used in in vitro systems from the perspective of their ability to degrade or ferment a particular substrate, rather than the microbial species that they contain. By necessity, this required an examination of the bacterial, protozoal and fungal populations of the rumen and hindgut with respect to factors influencing their activity. The potential to manipulate these populations through diet or sampling time is examined, as are inoculum preparation and level. The main alternatives to fresh rumen fluid (i.e., caecal digesta or faeces) are discussed with respect to end-point degradabilities and fermentation dynamics. Although the potential to use rumen contents obtained from donor animals at slaughter offers possibilities, the requirement to store the material and its subsequent loss of activity are limitations. Statistical modelling of data, although still requiring a good deal of developmental work, may offer an alternative approach. Finally, with respect to the range of in vitro methodologies and equipment employed, it is suggested that a degree of uniformity could be obtained through generation of a set of guidelines relating to the host animal, sampling technique and inoculum preparation. It was considered unlikely that any particular system would be accepted as the 'standard' procedure. However, before any protocol can be adopted, additional data are required (e.g., a method to assess inoculum 'quality' with respect to its fermentative and/or degradative activity), preparation/inoculation techniques need to be refined, and a methodology to store inocula without loss of efficacy must be developed. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
The requirement to rapidly and efficiently evaluate ruminant feedstuffs places increased emphasis on in vitro systems. However, despite the developmental work undertaken and the widespread application of such techniques, little attention has been paid to the incubation medium. Considerable research using in vitro systems is conducted in resource-poor developing countries that often face difficulties with technical expertise, sourcing chemicals and/or funding to cover analytical and equipment costs. Such limitations have, to date, restricted vital feed evaluation programmes in these regions. This paper examines the function and relevance of the buffer, nutrient and reducing-solution components within current in vitro media, with the aim of identifying where simplification can be achieved. The review, supported by experimental work, identified no requirement to change the carbonate or phosphate salts, which comprise the main buffer components. The inclusion of microminerals provided few additional nutrients over those already supplied by the rumen fluid and substrate, and so they may be omitted. Nitrogen associated with the inoculum was insufficient to support degradation, and a level of 25 mg N/g substrate is recommended. A sulphur inclusion level of 4-5 mg S/g substrate is proposed, with S levels lowered through omission of sodium sulphide and replacement of magnesium sulphate with magnesium chloride. It was confirmed that a highly reduced medium was not required, provided that anaerobic conditions were rapidly established. This allows sodium sulphide, part of the reducing solution, to be omitted. Further, as gassing with CO2 directly influences the quantity of gas released, it is recommended that minimum CO2 levels be used and that gas flow and duration, together with the volume of medium treated, are detailed in experimental procedures.
It is considered that these simplifications will improve safety and reduce costs and problems associated with sourcing components, while maintaining analytical precision. (c) 2005 Elsevier B.V. All rights reserved.
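The recommended inclusion levels lend themselves to a simple worked calculation. The sketch below scales the abstract's figures of 25 mg N/g substrate and 4-5 mg S/g substrate to a given incubation; the function and variable names are ours, not the paper's.

```python
# Illustrative calculation of the nitrogen and sulphur inclusion levels
# recommended in the abstract (25 mg N/g substrate; 4-5 mg S/g substrate).
# Function and default values are our own illustration of those figures.

def media_supplements(substrate_g, n_mg_per_g=25.0, s_mg_per_g=4.5):
    """Return the N and S (mg) to supply for a given substrate mass (g)."""
    return {
        "nitrogen_mg": substrate_g * n_mg_per_g,
        "sulphur_mg": substrate_g * s_mg_per_g,
    }

# For a typical 1 g in vitro incubation:
print(media_supplements(1.0))  # {'nitrogen_mg': 25.0, 'sulphur_mg': 4.5}
```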
Abstract:
This document provides guidelines for fish stock assessment and fishery management using the software tools and other outputs developed by the United Kingdom's Department for International Development's Fisheries Management Science Programme (FMSP) from 1992 to 2004. It explains some key elements of the precautionary approach to fisheries management and outlines a range of alternative stock assessment approaches that can provide the information needed for such precautionary management. Four FMSP software tools, LFDA (Length Frequency Data Analysis), CEDA (Catch Effort Data Analysis), YIELD and ParFish (Participatory Fisheries Stock Assessment), are described, with which intermediate parameters, performance indicators and reference points may be estimated. The document also contains examples of the assessment and management of multispecies fisheries, the use of Bayesian methodologies, and the use of empirical modelling approaches for estimating yields and for analysing fishery systems, as well as the assessment and management of inland fisheries. It also provides a comparison of length- and age-based stock assessment methods. A CD-ROM with the FMSP software packages CEDA, LFDA, YIELD and ParFish is included.
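Catch-effort analysis of the kind CEDA performs rests on biomass dynamic (surplus-production) models. The sketch below illustrates the classic Schaefer form with invented parameter values; it is a generic textbook illustration, not code from the FMSP software.

```python
# Minimal sketch of Schaefer surplus-production dynamics, the family of
# biomass dynamic models underlying catch-effort assessment tools such
# as CEDA. Parameter values (r, K) are illustrative only.

def schaefer_step(biomass, catch, r=0.5, K=1000.0):
    """One annual update: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
    return biomass + r * biomass * (1.0 - biomass / K) - catch

def project(b0, catches, r=0.5, K=1000.0):
    """Project biomass forward under a series of annual catches."""
    traj = [b0]
    for c in catches:
        traj.append(schaefer_step(traj[-1], c, r, K))
    return traj

# Reference points for the Schaefer model follow directly from r and K:
r, K = 0.5, 1000.0
msy = r * K / 4      # maximum sustainable yield
b_msy = K / 2        # biomass giving MSY
print(msy, b_msy)    # 125.0 500.0
```

A quick check of the model's internal logic: holding the stock at b_msy while removing msy each year leaves the biomass unchanged.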
Abstract:
Building services are worth about 2% of GDP and are essential for the effective and efficient operation of a building. It is increasingly recognised that the value of a building is related to the way it supports the client organisation's ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for the health, well-being, safety and security of occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of building services in use. Coordination between these participants is crucially important for achieving optimum performance, but it is too often neglected, leaving room for serious faults; effective integration is therefore essential. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as for interoperability between systems. Opportunities for a new mode of systems integration are provided by the emergence of PFI/PPP procurement frameworks. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper is therefore to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.
Abstract:
Modern buildings are designed to enhance the match between the environment, spaces and the people carrying out work, so that the well-being and performance of the occupants are in harmony. Building services are systems that facilitate a healthy working environment within which workers' productivity can be optimised. However, the maintenance of these services is fraught with problems and may contribute up to 50% of the total life-cycle cost of the building. Maintenance support is one area that is not usually designed into the system, as this is not common practice in the services industry. Other areas of shortfall for future designs are client requirements, commissioning, facilities management data and post-occupancy evaluation feedback, which need to be adequately planned for so that this information is captured and documented for use in future designs. At the University of Reading an integrated approach has been developed to assemble the multitude of aspects inherent in this field. This means recording required and measured achievements for the benefit of both building owners and practitioners. This integrated approach can be represented in a Through Life Business Model (TLBM) format using the concept of Integrated Logistic Support (ILS). The prototype TLBM developed utilises the tailored tools and techniques of ILS for building services. This TLBM approach will facilitate the successful development of a databank that would be invaluable in capturing essential data (e.g. reliability of components) for enhancing future building services designs, life-cycle costing and decision making by practitioners, in particular facilities managers.
Abstract:
This paper presents a study on applying integrated Global Positioning System (GPS) and Geographic Information System (GIS) technology to the reduction of construction waste. A prototype system is developed around an automatic data capture system, such as barcoding, for on-site construction material and equipment (M&E) management, with the integrated GPS and GIS technology linked to the M&E system over a Wide Area Network (WAN). A case study is then conducted to demonstrate the deployment of the system. Experimental results indicate that the proposed system can minimise the amount of on-site material wastage.
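As a rough illustration of what an integrated barcode/GPS capture might record for M&E management, the sketch below invents a minimal site-event schema; none of the field names or identifiers come from the prototype described in the abstract.

```python
# Hypothetical sketch of a record an integrated barcode/GPS/GIS materials-
# and-equipment (M&E) system might capture on site. The schema, field
# names and example identifiers are our own invention for illustration.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class MaterialEvent:
    barcode: str      # scanned item identifier
    event: str        # e.g. "delivered", "issued", "returned"
    lat: float        # GPS latitude at the point of scan
    lon: float        # GPS longitude at the point of scan
    quantity: float   # units moved in this event
    timestamp: str    # ISO-8601 capture time

def record_scan(barcode, event, lat, lon, quantity):
    """Build one timestamped M&E event from a barcode scan plus a GPS fix."""
    return MaterialEvent(barcode, event, lat, lon, quantity,
                         datetime.now(timezone.utc).isoformat())

evt = record_scan("MAT-00042", "issued", 22.28, 114.16, 50.0)
print(asdict(evt)["event"])  # issued
```

Aggregating such events per barcode would let the GIS layer show where materials were issued against where they were consumed, which is the waste signal the paper targets.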
Abstract:
The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has, however, been largely overlooked due to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself, and the fact that many of its members resist cultivation and are in fact new to science. However, with the emergence of 16S rRNA molecular tools and "post-genomics" high-resolution technologies for examining microorganisms as they occur in nature, without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review will discuss the application of molecular microbiological tools to study the human gut microbiota in a culture-independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetics or flux of metabolites through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and how it responds to different environmental stimuli, e.g. substrates such as prebiotics, antibiotics and other drugs, and in response to disease. Recently these two culture-independent, high-resolution approaches have been combined into a single "transgenomic" approach which allows correlation of changes in metabolite profiles within human biofluids with microbiota compositional metagenomic data. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.
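One common summary applied to the compositional data such 16S rRNA surveys generate is the Shannon diversity index over taxon counts. The sketch below uses invented read counts and is only illustrative of the kind of analysis these culture-independent tools enable.

```python
# Toy illustration of a compositional summary for 16S rRNA survey data:
# the Shannon diversity index over per-taxon read counts. The counts
# below are invented, not from any real sample.

import math

def shannon(counts):
    """Shannon diversity H = -sum(p * ln p) over nonzero taxon proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

gut_sample = [120, 80, 40, 10]  # reads assigned to four taxa (illustrative)
print(round(shannon(gut_sample), 3))
```

As a sanity check, a perfectly even community of n taxa gives H = ln(n), the maximum possible for that richness.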
Abstract:
The capability of a feature model of immediate memory (Nairne, 1990; Neath, 2000) to predict and account for a relationship between absolute and proportion scoring of immediate serial recall when memory load is varied (the list-length effect, LLE) is examined. The model correctly predicts the novel finding of an LLE in immediate serial order memory similar to that observed with free recall and previously assumed to be attributable to the long-term memory component of that procedure (Glanzer, 1972). The usefulness of formal models as predictive tools and the continuity between short-term serial order and longer term item memory are considered.
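The distinction between absolute and proportion scoring, and the list-length pattern at stake, can be made concrete with a small numerical example; the recall counts below are invented for illustration and are not data from the study.

```python
# Toy illustration of absolute vs. proportion scoring of immediate serial
# recall, the two measures whose relationship under varied memory load
# defines the list-length effect (LLE). Recall counts are invented.

def scores(items_correct, list_length):
    """Score one trial both ways: items recalled, and fraction of the list."""
    return {
        "absolute": items_correct,
        "proportion": items_correct / list_length,
    }

short = scores(5, 6)    # 5 of 6 items correct on a short list
long_ = scores(7, 10)   # 7 of 10 items correct on a long list

# Longer lists yield more items in absolute terms but a smaller
# proportion of the list -- the cross-over pattern at issue.
print(short, long_)
```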
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important that not only the application but also the underlying middleware and the operating system are analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder to understand. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
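The unification step behind a system like Slogger can be sketched in miniature: heterogeneous log lines from different layers are normalised into common subject-predicate-object triples, in the spirit of Semantic Web data models, and then queried uniformly. The line formats and predicate names below are invented, not taken from Slogger.

```python
# Miniature sketch of log unification in the spirit of Slogger: parse
# heterogeneous log lines into a shared subject-predicate-object store,
# then query across layers. Formats and predicate names are invented.

import re

TRIPLES = []  # the "common data store": (subject, predicate, object)

def ingest(layer, line):
    """Normalise one '<timestamp> <LEVEL> <message>' line into triples."""
    m = re.match(r"(?P<ts>\S+)\s+(?P<level>\w+)\s+(?P<msg>.*)", line)
    if not m:
        return  # unparseable lines are skipped in this sketch
    event = f"{layer}:{len(TRIPLES)}"  # crude unique event id
    TRIPLES.append((event, "layer", layer))
    TRIPLES.append((event, "timestamp", m.group("ts")))
    TRIPLES.append((event, "level", m.group("level")))
    TRIPLES.append((event, "message", m.group("msg")))

ingest("middleware", "2009-01-01T12:00:00 ERROR broker unreachable")
ingest("os", "2009-01-01T12:00:01 WARN swap usage high")

# Once unified, one query spans every layer, e.g. all ERROR events:
errors = [s for (s, p, o) in TRIPLES if p == "level" and o == "ERROR"]
print(errors)  # ['middleware:0']
```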
Abstract:
Functional verification (FV) is paramount within the hardware design cycle. With so many new techniques available today to help with FV, which techniques should we really use? The answer is not straightforward, and the choice is often confusing and costly. The tools and techniques to be used in a project have to be decided upon early in the design cycle to get the best value from these new verification methods. This paper gives a quick survey in the form of an overview of FV, establishes the difference between verification and validation, describes the bottlenecks that appear in the verification process, examines the challenges in FV, and presents current FV technologies and trends.
Abstract:
Identity issues are under-explored in construction management. We provide a brief introduction to the organization studies literature on subjectively construed identities, focusing on discourse, agency, relations of power and identity work. The construction management literature is investigated in order to examine identity concerns as they relate to construction managers centred on (1) professionalism; (2) ethics; (3) relational aspects of self-identity; (4) competence, knowledge and tools; and (5) national culture. Identity, we argue, is a key performance issue, and needs to be accounted for in explanations of the success and failure of projects. Our overriding concern is to raise identity issues in order to demonstrate their importance to researchers in construction management and to spark debate. The purpose of this work is not to provide answers or to propose prescriptive models, but to explore ideas, raise awareness and to generate questions for further programmatic research. To this end, we promote empirical work and theorizing by outlining elements of a research agenda which argues that 'identity' is a potentially generative theme for scholars in construction management.
Abstract:
There are a number of challenges associated with managing knowledge and information in construction organizations delivering major capital assets. These include the ever-increasing volumes of information, losing people to retirement or to competitors, the continuously changing nature of information, the lack of methods for eliciting useful knowledge, the development of new information technologies, and changes in management and innovation practices. Existing tools and methodologies for valuing intangible assets in fields such as engineering, project management, finance and accounting do not fully address the issues associated with the valuation of information and knowledge. Information is rarely recorded in a way that allows a document to be valued, either when produced or when subsequently retrieved and re-used. In addition, there is a wealth of tacit personal knowledge which, if codified into documentary information, may prove very valuable to operators of the finished asset or to future designers. This paper addresses the problem of information overload and identifies the differences between data, information and knowledge. An exploratory study was conducted with a leading construction consultant, examining three perspectives (business, project management and document management) through structured interviews, and specifically how to value information in practical terms. Major challenges in information management are identified. A through-life Information Evaluation Methodology (IEM) is presented to reduce information overload and to make the information more valuable in the future.
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. 
Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges. Copyright © 2009 Royal Meteorological Society