914 results for Using Lean tools
Abstract:
Recent molecular-based investigations have confirmed the species diversity and metabolic complexity of the human gut microbiota. It is also increasingly clear that the human gut microbiota plays a crucial role in host health, both as a source of infection and environmental insult and, conversely, in protection against disease and maintenance of gut function. Although little is known about the health impact of the dominant groups of gut bacteria, it is generally accepted that bifidobacteria and lactobacilli are important components of what might be termed the beneficial gut microbiota. The microbiota management tools of probiotics, prebiotics and synbiotics have been developed, and indeed commercialized, over the past few decades with the express purpose of increasing numbers of bifidobacteria and/or lactobacilli within the gastrointestinal tract.
Abstract:
It is well established that brain ischemia can cause neuronal death via different signaling cascades. The relative importance and interrelationships between these pathways, however, remain poorly understood. This review presents an overview of studies using oxygen-glucose deprivation of organotypic hippocampal slice cultures to investigate the molecular mechanisms involved in ischemia. The culturing techniques, setup of the oxygen-glucose deprivation model, and analytical tools are reviewed. The authors focus on SUMOylation, a posttranslational protein modification recently implicated in ischemia by whole-animal studies, as an example of how these powerful tools can be applied to investigate the molecular pathways underlying ischemic cell death.
Abstract:
In situ analysis has become increasingly important for contaminated land investigation and remediation. At present, portable techniques are used mainly as scanning tools to assess the spread and magnitude of the contamination, and are an adjunct to conventional laboratory analyses. A site in Cornwall, containing naturally occurring radioactive material (NORM), provided an opportunity for Reading University PhD student Anna Kutner to compare analytical data collected in situ with data generated by laboratory-based methods. The preliminary results in this paper extend the author's poster presentation at last September's GeoSpec2010 conference held in Lancaster.
Abstract:
The objective of this study was to determine the potential of mid-infrared spectroscopy coupled with multidimensional statistical analysis for the prediction of processed cheese instrumental texture and meltability attributes. Processed cheeses (n = 32) of varying composition were manufactured in a pilot plant. Following two and four weeks' storage at 4 °C, samples were analysed using texture profile analysis, two meltability tests (computer vision; Olson and Price) and mid-infrared spectroscopy (4000–640 cm⁻¹). Partial least squares regression was used to develop predictive models for all measured attributes. Five attributes were successfully modelled with varying degrees of accuracy. The computer vision meltability model allowed for discrimination between high and low melt values (R² = 0.64). The hardness and springiness models gave approximate quantitative results (R² = 0.77), while the cohesiveness (R² = 0.81) and Olson and Price meltability (R² = 0.88) models gave good prediction results.
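As a rough illustration of the modelling step this abstract describes, the Python sketch below fits a partial least squares model to synthetic "spectra" and reports R². The sample count, spectral resolution, train/test split and number of latent variables are invented placeholders, and scikit-learn's PLSRegression stands in for whatever implementation the study actually used.

```python
# Hedged sketch: PLS regression on mid-infrared-style spectra to predict a
# texture attribute. All data here are synthetic stand-ins for the study's
# cheese spectra (4000-640 cm^-1) and texture measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 32, 840      # hypothetical: 32 cheeses, 840 spectral points
X = rng.normal(size=(n_samples, n_wavenumbers))                    # "absorbance" spectra
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=n_samples)   # e.g. hardness

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=8)     # latent-variable count is a tuning choice
pls.fit(X_train, y_train)
print("R2 =", r2_score(y_test, pls.predict(X_test).ravel()))
```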
Abstract:
There are approximately 29,000 ha of grass buffer strips in the UK under Agri-Environment Schemes; however, typically they are floristically poor and as such are of limited biodiversity value. Introducing a sown wildflower component has the potential to increase dramatically the value of these buffer strips for a suite of native species, including butterflies. This study investigates management practices aiming to promote the establishment and maintenance of wildflowers in existing buffer strips. The effectiveness of two methods used to increase the establishment of wildflowers for the benefit of native butterfly species was tested, both individually and in combination. The management practices were: (1) the application of a selective graminicide (fluazifop-P-butyl) which reduces the dominance of competitive grasses; and (2) scarification of the soil which creates germination niches for sown wildflower seeds. A wildflower seed mix consisting of nine species was sown in conjunction with the scarification treatment. Responses of wildflowers and butterflies were monitored for two years after establishment. Results indicate that the combined scarification and graminicide treatment produced the greatest cover and species richness of sown wildflowers. Butterfly abundance, species richness and diversity were positively correlated with sown wildflower species richness, with the highest values in the combined scarification and graminicide treatment. These findings have confirmed the importance of both scarification as a means of introducing wildflower seed into existing buffer strips, and subsequent management using graminicides, for the benefit of butterflies. Application of this approach could provide tools to help butterfly conservation on farmland in the future.
Abstract:
This research establishes the feasibility of using a network-centric technology, Jini, to provide a grid framework on which to perform parallel video encoding. A solution was implemented using Jini and achieved real-time, on-demand encoding of a 480 HD video stream. A projection is also made concerning the encoding of 1080 HD video in real time, as the current grid was not powerful enough to achieve this above 15 fps. The research found that Jini is able to provide a number of tools and services highly applicable in a grid environment. It is also suitable in terms of performance and responds well to a varying number of grid nodes. The main performance limiter was found to be the network bandwidth allocation, which, when loaded with a large number of grid nodes, was unable to handle the traffic.
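The chunk-parallel design behind such a grid can be approximated in miniature. The hedged Python sketch below splits a stream into chunks, encodes them in parallel across worker processes (standing in for Jini grid nodes, which the original work would have discovered as network services), and measures throughput in frames per second; encode_chunk is a hypothetical stub, not a real encoder.

```python
# Conceptual sketch of chunk-parallel video encoding: multiprocessing workers
# play the role of grid nodes. encode_chunk is a placeholder for real codec work.
import time
from multiprocessing import Pool

FRAMES_PER_CHUNK = 120

def encode_chunk(chunk_id: int) -> int:
    """Pretend to encode one chunk of frames; returns frames processed."""
    time.sleep(0.05)                    # stands in for actual encoding work
    return FRAMES_PER_CHUNK

if __name__ == "__main__":
    n_chunks = 40                       # a 4800-frame stream split into 40 chunks
    start = time.perf_counter()
    with Pool(processes=8) as pool:     # 8 "grid nodes"
        frames = sum(pool.map(encode_chunk, range(n_chunks)))
    elapsed = time.perf_counter() - start
    # Real-time encoding requires throughput >= the stream's frame rate.
    print(f"throughput: {frames / elapsed:.0f} fps")
```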
Abstract:
The development of an Artificial Neural Network model of UK domestic appliance energy consumption is presented. The model uses diary-style appliance use data and a survey questionnaire collected from 51 households during the summer of 2010. It also incorporates measured energy data and is sensitive to socioeconomic, physical dwelling and temperature variables. A prototype model is constructed in MATLAB using a two-layer feed-forward network with backpropagation training and has a 12:10:24 architecture. Model outputs include appliance load profiles which can be applied to the fields of energy planning (micro renewables and smart grids), building simulation tools and energy policy.
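To make the 12:10:24 architecture concrete, here is a minimal NumPy sketch of a feed-forward network with one 10-unit hidden layer trained by backpropagation. The data, activation choice, learning rate and epoch count are invented placeholders rather than the study's MATLAB configuration.

```python
# Minimal sketch of a 12:10:24 feed-forward network trained by backpropagation.
# Synthetic inputs/targets stand in for the survey variables and load profiles.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(51, 12))                          # 51 households, 12 inputs
Y = np.tanh(X @ rng.normal(scale=0.5, size=(12, 24)))  # 24 targets, e.g. hourly loads

W1 = rng.normal(scale=0.1, size=(12, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.1, size=(10, 24)); b2 = np.zeros(24)
lr = 0.05

for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)            # hidden layer (10 units)
    Y_hat = H @ W2 + b2                 # linear output layer (24 units)
    err = Y_hat - Y                     # gradient of MSE w.r.t. Y_hat (up to a constant)
    # Backpropagation of the error through both layers
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)      # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final MSE:", float((err**2).mean()))
```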
Abstract:
Motivation: The ability of a simple method (MODCHECK) to determine the sequence–structure compatibility of a set of structural models generated by fold recognition is tested in a thorough benchmark analysis. Four Model Quality Assessment Programs (MQAPs) were tested on 188 targets from the latest LiveBench-9 automated structure evaluation experiment. We systematically test and evaluate whether the MQAP methods can successfully detect native-like models. Results: We show that, compared with the other three methods tested, MODCHECK is the most reliable method for consistently performing the best top-model selection and for ranking the models. In addition, we show that the choice of model similarity score used to assess a model's similarity to the experimental structure can influence the overall performance of these tools. Although these MQAP methods fail to improve the model selection performance for methods that already incorporate three-dimensional (3D) protein structural information, an improvement is observed for methods that are purely sequence-based, including the best profile–profile methods. This suggests that even the best sequence-based fold recognition methods can still be improved by taking into account 3D structural information.
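The core of an MQAP benchmark of this kind reduces to a small loop: score a set of candidate models, pick the top-ranked one, and measure both the top-1 selection loss and the rank correlation against the models' true similarity to the experimental structure. The sketch below uses synthetic scores, and the similarity measure is a generic stand-in for GDT/TM-type scores, not the study's specific choice.

```python
# Hedged sketch of MQAP evaluation: top-1 selection loss and rank correlation
# between predicted quality and true model-to-native similarity (synthetic data).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_models = 50
true_similarity = rng.uniform(0, 1, n_models)           # e.g. a GDT/TM-like score
mqap_score = true_similarity + rng.normal(scale=0.2, size=n_models)  # noisy predictor

top_pick = int(np.argmax(mqap_score))                   # model the MQAP ranks first
selection_loss = true_similarity.max() - true_similarity[top_pick]
rho, _ = spearmanr(mqap_score, true_similarity)         # ranking quality
print(f"top-1 loss = {selection_loss:.3f}, Spearman rho = {rho:.2f}")
```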
Abstract:
Following a workshop exercise, two models, an individual-based landscape model (IBLM) and a non-spatial life-history model, were used to assess the impact of a fictitious insecticide on populations of skylarks in the UK. The chosen population endpoints were abundance, population growth rate, and the chances of population persistence. Both models used the same life-history descriptors and toxicity profiles as the basis for their parameter inputs. The models differed in that exposure was a pre-determined parameter in the life-history model but an emergent property of the IBLM, and the IBLM required a landscape structure as an input. The model outputs were qualitatively similar between the two models. Under conditions dominated by winter wheat, both models predicted a population decline that was worsened by the use of the insecticide. Under broader habitat conditions, population declines were only predicted for the scenarios where the insecticide was added. Inputs to the models are very different, with the IBLM requiring a large volume of data in order to achieve the flexibility of being able to integrate a range of environmental and behavioural factors. The life-history model has very few explicit data inputs, but some of these relied on extensive prior modelling needing additional data, as described in Roelofs et al. (2005, this volume). Both models have strengths and weaknesses; hence the ideal approach is to combine the use of both simple and comprehensive modelling tools.
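To make the life-history side of this comparison concrete, here is a toy stage-structured projection in which insecticide exposure lowers juvenile survival; all vital rates are invented, and the dominant eigenvalue of the projection matrix gives the population growth rate used as an endpoint.

```python
# Toy 2-stage (juvenile/adult) matrix model in the spirit of the non-spatial
# life-history approach. All rates are placeholders, not skylark parameters.
import numpy as np

def growth_rate(fecundity, s_juv, s_adult):
    A = np.array([[0.0,   fecundity],   # juveniles produced per adult per year
                  [s_juv, s_adult]])    # survival into/within the adult stage
    return max(abs(np.linalg.eigvals(A)))   # dominant eigenvalue = lambda

baseline = growth_rate(fecundity=1.2, s_juv=0.35, s_adult=0.6)
exposed  = growth_rate(fecundity=1.2, s_juv=0.25, s_adult=0.6)  # insecticide effect
print(f"lambda baseline = {baseline:.3f}, with insecticide = {exposed:.3f}")
# lambda < 1 corresponds to the predicted population decline.
```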
Abstract:
Objective: To describe the training undertaken by pharmacists employed in a pharmacist-led, information technology-based intervention study to reduce medication errors in primary care (PINCER Trial), to evaluate the pharmacists' assessment of the training, and to assess the time implications of undertaking it. Methods: Six pharmacists received training, which included training on root cause analysis and educational outreach, to enable them to deliver the PINCER Trial intervention. This was evaluated using self-report questionnaires at the end of each training session. The time taken to complete each session was recorded. Data from the evaluation forms were entered onto a Microsoft Excel spreadsheet, independently checked, and the summary of results further verified. Frequencies were calculated for responses to the three-point Likert scale questions. Free-text comments from the evaluation forms and pharmacists' diaries were analysed thematically. Key findings: All six pharmacists received 22 hours of training over five sessions. In four out of the five sessions, the pharmacists who completed an evaluation form (27 out of 30 were completed) stated they were satisfied or very satisfied with the various elements of the training package. Analysis of free-text comments and the pharmacists' diaries showed that the principles of root cause analysis and educational outreach were viewed as useful tools to help pharmacists conduct pharmaceutical interventions in both the study and other pharmacy roles that they undertook. The opportunity to undertake role play was a valuable part of the training received. Conclusions: Findings presented in this paper suggest that providing the PINCER pharmacists with training in root cause analysis and educational outreach contributed to the successful delivery of PINCER interventions and could potentially be utilised by other pharmacists based in general practice to deliver pharmaceutical interventions to improve patient safety.
Abstract:
Earth system models are increasing in complexity and incorporating more processes than their predecessors, making them important tools for studying the global carbon cycle. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes, with coupled climate-carbon cycle models that represent land-use change simulating total land carbon stores by 2100 that vary by as much as 600 Pg C given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous model evaluation methodologies. Here we assess the state-of-the-art with respect to evaluation of Earth system models, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeo data and (ii) metrics for evaluation, and discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute towards the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but it is also a challenge, as more knowledge about data uncertainties is required in order to determine robust evaluation methodologies that move the field of ESM evaluation from a "beauty contest" towards the development of useful constraints on model behaviour.
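One simple example of the system-level metrics discussed here is a normalised RMSE of each model's simulated field against an observational product. The sketch below ranks a hypothetical three-member ensemble this way, with random arrays standing in for gridded land-carbon data; the metric choice is illustrative, not the paper's prescription.

```python
# Sketch of a system-level benchmark metric: normalised RMSE of each model's
# simulated land-carbon field against observations (synthetic placeholder arrays).
import numpy as np

rng = np.random.default_rng(3)
obs = rng.uniform(0, 20, size=(45, 72))   # e.g. kgC m-2 on a coarse grid
models = {f"ESM{i}": obs + rng.normal(scale=s, size=obs.shape)
          for i, s in enumerate([1.0, 2.5, 4.0], start=1)}

def nrmse(sim):
    return np.sqrt(((sim - obs) ** 2).mean()) / obs.std()

for name, sim in sorted(models.items(), key=lambda kv: nrmse(kv[1])):
    print(f"{name}: NRMSE = {nrmse(sim):.2f}")   # lower = closer to observations
```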
Abstract:
To calculate the potential wind loading on a tall building in an urban area, an accurate representation of the wind speed profile is required. However, due to a lack of observations, wind engineers typically estimate the characteristics of the urban boundary layer by translating the measurements from a nearby reference rural site. This study presents wind speed profile data obtained from a Doppler lidar in central London, UK, during an 8-month observation period. In conjunction with wind speed data measured at a nearby airport, these observations have been used to assess the accuracy of the predictions made by the wind engineering tools currently available. When applied to multiple changes in surface roughness identified from morphological parameters, the non-equilibrium wind speed profile model developed by Deaves (1981) provides a good representation of the urban wind speed profile. For heights below 500 m, the predicted wind speed remains within the 95% confidence interval of the measured data. However, when the surface roughness is estimated using land use as a proxy, the model tends to overestimate the wind speed, particularly for very high wind speed periods. These results highlight the importance of a detailed assessment of the nature of the surface when estimating the wind speed above an urban surface.
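For context, the equilibrium logarithmic profile underlying engineering models such as Deaves (1981) is u(z) = (u*/κ) ln((z − d)/z₀), with friction velocity u*, von Kármán constant κ, roughness length z₀ and displacement height d. The sketch below evaluates it for illustrative rural and urban roughness parameters; it is not the full non-equilibrium model tested in the study.

```python
# Minimal sketch of the equilibrium logarithmic wind profile (not the full
# Deaves non-equilibrium model). Roughness parameters are illustrative only.
import numpy as np

KAPPA = 0.40            # von Karman constant

def log_profile(z, u_star, z0, d=0.0):
    """Mean wind speed u(z) = (u*/kappa) * ln((z - d) / z0)."""
    return (u_star / KAPPA) * np.log((z - d) / z0)

z = np.array([50.0, 100.0, 200.0, 500.0])           # heights above ground (m)
rural = log_profile(z, u_star=0.3, z0=0.03)         # open-country roughness
urban = log_profile(z, u_star=0.6, z0=1.0, d=15.0)  # rough city centre with
                                                    # displacement height d
print("rural:", np.round(rural, 1), "urban:", np.round(urban, 1))
```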
Abstract:
Cities, which are now inhabited by a majority of the world's population, are not only an important source of global environmental and resource depletion problems, but can also act as important centres of technological innovation and social learning in the continuing quest for a low carbon future. Planning and managing large-scale transitions in cities to deal with these pressures require an understanding of urban retrofitting at city scale. In this context performative techniques (such as backcasting and roadmapping) can provide valuable tools for helping cities develop a strategic view of the future. However, it is also important to identify ‘disruptive’ and ‘sustaining’ technologies which may contribute to city-based sustainability transitions. This paper presents research findings from the EPSRC Retrofit 2050 project, and explores the relationship between technology roadmaps and transition theory literature, highlighting the research gaps at urban/city level. The paper develops a research methodology to describe the development of three guiding visions for city-regional retrofit futures, and identifies key sustaining and disruptive technologies at city scale within these visions using foresight (horizon scanning) techniques. The implications of the research for city-based transition studies and related methodologies are discussed.
Abstract:
Earth system models (ESMs) are increasing in complexity by incorporating more processes than their predecessors, making them potentially important tools for studying the evolution of climate and associated biogeochemical cycles. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes. For example, coupled climate–carbon cycle models that represent land-use change simulate total land carbon stores at 2100 that vary by as much as 600 Pg C, given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous methods of model evaluation. Here we assess the state-of-the-art in evaluation of ESMs, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeodata and (ii) metrics for evaluation. We note that the practice of averaging results from many models is unreliable and no substitute for proper evaluation of individual models. We discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute to the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but also presents a challenge. Improved knowledge of data uncertainties is still necessary to move the field of ESM evaluation away from a "beauty contest" towards the development of useful constraints on model outcomes.
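The emergent-constraint strategy mentioned above reduces, in its simplest form, to a regression across the model ensemble between an observable quantity and a projected one, evaluated at the observed value. The sketch below uses entirely synthetic ensemble values purely to show the mechanics; the variables and numbers are not from any real ESM ensemble.

```python
# Sketch of an emergent constraint: across-model regression of a projected
# quantity on an observable, read off at the observed value (synthetic data).
import numpy as np

rng = np.random.default_rng(4)
observable = rng.uniform(0.5, 1.5, 12)   # e.g. a present-day, measurable quantity
projection = 300 + 200 * observable + rng.normal(scale=20, size=12)  # e.g. 2100 value

slope, intercept = np.polyfit(observable, projection, 1)  # ensemble regression
x_obs = 0.9                                               # hypothetical observed value
print(f"unconstrained ensemble mean = {projection.mean():.0f}")
print(f"constrained estimate at x_obs = {slope * x_obs + intercept:.0f}")
```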