40 results for Technical indexes
Abstract:
Classical risk assessment approaches for animal diseases are based on the probability of release, the probability of exposure and the consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, the potential risks of exposure and infection, both for animals and for people, extend through a chain of economic activities related to producing, buying and selling animals and animal products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems and to devise effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed, so that practical and cost-effective risk management options can be agreed upon by the actors in those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which appears to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.
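To make the release-exposure-consequence structure described above concrete, here is a minimal Python sketch of a simple multiplicative risk chain. The probabilities and cost figure are hypothetical illustration values chosen for this sketch, not numbers taken from the paper.

    # Minimal sketch of a classical release -> exposure -> consequence risk chain.
    # All numbers are hypothetical illustration values, not from the abstract.
    p_release = 0.02          # P(pathogen is released into domestic livestock in a year)
    p_exposure = 0.40         # P(animals/people are exposed and infected, given release)
    consequence_cost = 5.0e6  # assumed economic loss per outbreak (arbitrary currency units)

    # Expected annual loss under this simple multiplicative model
    expected_annual_loss = p_release * p_exposure * consequence_cost
    print(f"Expected annual loss: {expected_annual_loss:,.0f}")

In practice each factor would itself be estimated along the value chain (markets, traders, processors), which is where the value chain analysis advocated in the abstract feeds into the risk assessment.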
Abstract:
The past decade has witnessed a sharp increase in published research on energy and buildings. This paper takes stock of work in this area, with a particular focus on construction research and the analysis of non-technical dimensions. While there is widespread recognition of the importance of non-technical dimensions, research tends to be limited to individualistic studies of occupants and occupant behavior. In contrast, publications in the mainstream social science literature display a broader range of interests, including policy developments, structural constraints on the diffusion and use of new technologies, and the construction process itself. The growing interest of more generalist scholars in energy and buildings provides an opportunity for construction research to engage a wider audience. This would enrich the current research agenda, helping to address unanswered questions concerning the relatively weak impact of policy mechanisms and new technologies and the seeming recalcitrance of occupants. It would also help to promote the academic status of construction research as a field. This, in turn, depends on greater engagement with interpretivist types of analysis and theory building, thereby challenging deeply ingrained views on the nature and role of academic research in construction.
Abstract:
Housing in the UK accounts for 30.5% of all energy consumed and is responsible for 25% of all carbon emissions. The UK Government's Code for Sustainable Homes requires all new homes to be zero carbon by 2016. The development and widespread diffusion of low and zero carbon (LZC) technologies is recognised as a key means for housing developers to deliver against this zero-carbon agenda. The innovation challenge of designing and incorporating these technologies into housing developers' standard design and production templates will introduce significant technical and commercial risks. In this paper we report early results from an ongoing Engineering and Physical Sciences Research Council project examining the innovation logic and trajectory of LZC technologies in new housing. The principal theoretical lens for the research is the socio-technical network approach, which considers actors' interests and the interpretative flexibility of technologies, and how actors negotiate and reproduce 'acting spaces' to shape, in this case, the selection and adoption of LZC technologies. The initial findings reveal that the technology networks forming around new housing developments are highly complex in form and operation, involving a range of actors and viewpoints that vary for each housing development.
Abstract:
A simple procedure was developed for packing PicoFrit HPLC columns with chromatographic stationary phase using a reservoir fabricated from standard laboratory HPLC fittings. Packed columns were mounted onto a stainless steel ultra-low volume precolumn filter assembly containing a 0.5 µm pore size steel frit. This format provided a conduit for applying the nanospray voltage and protected the column from obstruction by sample material. The system was characterised and its operational performance assessed by analysis of a range of peptide standards (n = 9).
Abstract:
The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications to make the cumulative distribution function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the normal-score transform. This paper discusses some of the problems caused by small sample sizes when applying the NQT in flood forecasting systems and outlines a novel way to solve them by combining extreme value analysis and non-parametric regression methods. The method is illustrated with examples of hydrological stream-flow forecasts.
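As a rough illustration of the NQT / normal-score idea described above, the following minimal Python sketch maps a sample to standard-normal space through its empirical CDF. The Weibull plotting positions i/(n+1) and the function name are assumptions made for this sketch, not choices prescribed by the paper.

    import numpy as np
    from scipy.stats import norm

    def normal_quantile_transform(values):
        # Rank the sample, convert ranks to empirical non-exceedance
        # probabilities with Weibull plotting positions i/(n+1), and map
        # them through the inverse standard-normal CDF.
        values = np.asarray(values, dtype=float)
        n = values.size
        ranks = values.argsort().argsort() + 1   # 1-based ranks
        probs = ranks / (n + 1.0)                # empirical CDF estimates
        return norm.ppf(probs)                   # standard-normal quantiles

    # Example: a small synthetic discharge sample (m^3/s)
    discharge = np.array([12.0, 35.0, 8.0, 50.0, 20.0, 300.0])
    print(normal_quantile_transform(discharge))

The sketch also shows where small samples bite: with six values the largest transformed quantile is norm.ppf(6/7) ≈ 1.07, so a forecast flood larger than anything in the sample cannot be transformed back without extrapolating the tail, which is the gap the paper addresses by combining extreme value analysis with non-parametric regression.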