Abstract:
Orlistat is an anti-obesity treatment commonly associated with several gastrointestinal (GI) side-effects in the initial stages of therapy. There is no physiological explanation as to why two-thirds of those who take the drug experience one or more side-effects. It has been hypothesized that the GI microbiota may protect from, or contribute to, these GI disturbances. Using in vitro batch culture and human gut model systems, studies were conducted to determine whether increased availability of dietary lipids and/or orlistat affect the composition and/or activity of the faecal microbiota. Results from 24-h batch culture fermentation experiments demonstrated no effect of orlistat, in the presence or absence of a dietary lipid (olive oil), on the composition of bacterial communities [as determined by fluorescence in situ hybridization (FISH) and denaturing gradient gel electrophoresis (DGGE) analyses], but did show great variability in the lipolytic activities of individuals' microbiotas, as determined by gas chromatography analysis of long-chain fatty acids in samples. Subsequent studies focused on the effect of orlistat in the presence and absence of lipid in in vitro human gut model systems. Systems were run for 14 days with gut model medium (GMM) only (to steady state, SS), then fed at 12-h intervals with 50 mg orlistat, 2 g olive oil or a mixture of both for 14 days. FISH and DGGE were used to monitor changes in bacterial populations. Bacteria were cultivated from the GMM-only (control) systems at SS. All strains isolated were screened for lipolytic activity using tributyrin agar. FISH and DGGE demonstrated that none of the compounds (singly or in combination) added to the systems had any notable effect on microbial population dynamics for any of the donors, although Subdoligranulum populations appeared to be inhibited by orlistat in the presence or absence of lipid. Orlistat had little or no effect on the metabolism of indigenous and added lipids in the fermentation systems, but there was great variability in the ability of the donors' faecal microbiotas to degrade added lipids. This variability in lipid degradation could be correlated with the number and activity of isolated lipolytic bacteria. The mechanism by which orlistat and the GI microbiota cause side-effects in individuals is unknown, but several hypotheses have been proposed to account for their manifestation. The demonstration of great variability in the ability of microbiotas to degrade lipids led to a large-scale cultivation-based study of lipolytic/lipase-positive bacteria present in the human faecal microbiota. Of 4,000 colonies isolated from 15 donors using five different agars, 378 strains were identified that had lipase activity. Molecular identification of strains isolated from five donors demonstrated that lipase activity is more prevalent in the human GI microbiota than previously thought, with members of the phyla Firmicutes, Bacteroidetes and Actinobacteria identified. Molecular identification and characterization of the substrate specificities of the strains will be carried out as part of ongoing work.
Abstract:
Developing and implementing a technology for Facilities Management (FM) can be a complex process, particularly when the technology impacts the organisation as a whole. There are often a number of relevant actors, internal and external to FM, who should be engaged. This engagement is guided by the strategy of the organisation, which is led by top management decisions; indeed, it is top management who make the final decision to implement a technology. Top management and other relevant actors will each have their own discourses toward the implementation of the technology, based on how they foresee the technology appropriately benefitting the organisation. This paper examines actors who play a relevant and necessary part in supporting and implementing a technology in FM, and how an actor's discourse toward the project inhibits or accelerates the technology's implementation. The methods used for this paper are based on a two-year case study in an FM department, in which a technology development was observed and interviews with key participants were conducted. Critical discourse analysis is used to analyse the data. Prominent discourses that emerge from the data are emphasised during the process of introducing the technology. This research moves beyond focusing purely on project successes to examine the difficulties and hurdles that must be overcome to reach a successful technology implementation.
Abstract:
Quadratic programming techniques were applied to household food consumption data in England and Wales to estimate likely changes in diet under healthy eating guidelines, and the consequences these would have on agriculture and land use in England and Wales. The first step entailed imposing nutrient restrictions on food consumption following dietary recommendations suggested by the UK Department of Health. The resulting diet was used, in a second step, as a proxy for demand in agricultural commodities, to test the impact of such a scenario on food production and land use in England and Wales and the impacts of this on agricultural landscapes. Results of the diet optimisation indicated a large drop in consumption of foods rich in saturated fats and sugar, essentially cheese and sugar-based products, along with smaller reductions in fat and meat products. Conversely, consumption of fruit and vegetables, cereals, and flour would increase to meet dietary fibre recommendations. Such a shift in demand would dramatically affect production patterns: the financial net margin of England and Wales agriculture would rise, owing to increased production of high market value and high economic margin crops. Some regions would, however, be negatively affected, mostly those dependent on beef cattle and sheep production that could not benefit from an increased demand for cereals and horticultural crops. The effects of these changes would also be felt in upstream industries, such as animal feed suppliers. While arable-dominated landscapes would be little affected, pastoral landscapes would suffer through loss of grazing management and, possibly, land abandonment, especially in upland areas.
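To illustrate the kind of diet-optimisation step described above, the following is a minimal sketch of a quadratic-programming diet model: it finds the diet closest (in a least-squares sense) to observed consumption while satisfying nutrient constraints. The food list, nutrient coefficients, and limits are invented for illustration, and scipy's SLSQP solver stands in for whatever solver the study actually used.

```python
# Hypothetical sketch of a quadratic-programming diet adjustment: minimise the
# squared deviation from the observed diet subject to nutrient restrictions.
# All foods, coefficients and limits below are illustrative, not study data.
import numpy as np
from scipy.optimize import minimize

foods = ["cheese", "sugar products", "vegetables", "cereals"]
x0 = np.array([120.0, 200.0, 250.0, 300.0])  # observed consumption, g/day (illustrative)

# Nutrient content per gram of food: row 0 = saturated fat (g), row 1 = fibre (g)
N = np.array([
    [0.21, 0.00, 0.000, 0.01],
    [0.00, 0.00, 0.025, 0.09],
])
sat_fat_max = 25.0  # upper limit on saturated fat, g/day (illustrative)
fibre_min = 30.0    # lower limit on dietary fibre, g/day (illustrative)

def objective(x):
    # Quadratic distance from the observed diet
    return np.sum((x - x0) ** 2)

constraints = [
    {"type": "ineq", "fun": lambda x: sat_fat_max - N[0] @ x},  # fat below cap
    {"type": "ineq", "fun": lambda x: N[1] @ x - fibre_min},    # fibre above floor
]
bounds = [(0, None)] * len(foods)  # consumption cannot be negative

res = minimize(objective, x0, method="SLSQP", bounds=bounds, constraints=constraints)
for food, old, new in zip(foods, x0, res.x):
    print(f"{food:15s} {old:7.1f} -> {new:7.1f} g/day")
```

As in the abstract, a binding saturated-fat cap pushes down fat-rich foods such as cheese, while the fibre floor pulls up vegetables and cereals.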
Abstract:
Techniques for the coherent generation and detection of electromagnetic radiation in the far infrared, or terahertz, region of the electromagnetic spectrum have recently developed rapidly and may soon be applied for in vivo medical imaging. Both continuous wave and pulsed imaging systems are under development, with terahertz pulsed imaging being the more common method. Typically a pump–probe technique is used, with picosecond pulses of terahertz radiation generated from femtosecond infrared laser pulses, using an antenna or nonlinear crystal. After interaction with the subject, either by transmission or reflection, coherent detection is achieved when the terahertz beam is combined with the probe laser beam. Raster scanning of the subject leads to an image data set comprising a time series representing the pulse at each pixel. A set of parametric images may be calculated, mapping the values of various parameters calculated from the shape of the pulses. A safety analysis has been performed, based on current guidelines for skin exposure to radiation of wavelengths 2.6 µm–20 mm (15 GHz–115 THz), to determine the maximum permissible exposure (MPE) for such a terahertz imaging system. The international guidelines for this range of wavelengths are drawn from two U.S. standards documents. The method for this analysis was taken from the American National Standard for the Safe Use of Lasers (ANSI Z136.1), and to ensure a conservative analysis, parameters were drawn from both this standard and from the IEEE Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields (C95.1). The calculated maximum permissible average beam power was 3 mW, indicating that typical terahertz imaging systems are safe according to the current guidelines. Further developments may, however, result in systems that exceed the calculated limit. Furthermore, the published MPEs for pulsed exposures are based on measurements at shorter wavelengths and with pulses of longer duration than those used in terahertz pulsed imaging systems, so the results should be treated with caution.
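The arithmetic behind converting an MPE into a maximum average beam power is straightforward: multiply the permitted irradiance by the beam area at the skin. The sketch below is a hedged illustration of that step only; the 0.1 W/cm² irradiance figure and the beam diameter are assumptions for demonstration, not values taken from ANSI Z136.1 or IEEE C95.1.

```python
# Illustrative MPE-style power check: maximum average power = MPE irradiance
# multiplied by beam area at the skin. Both numbers below are assumed values
# chosen for illustration, not figures from the standards documents.
import math

mpe_irradiance_w_cm2 = 0.1  # assumed long-duration skin MPE, W/cm^2
beam_diameter_cm = 0.2      # assumed beam diameter at the skin, cm

beam_area_cm2 = math.pi * (beam_diameter_cm / 2) ** 2
max_avg_power_w = mpe_irradiance_w_cm2 * beam_area_cm2

print(f"Beam area: {beam_area_cm2:.4f} cm^2")
print(f"Maximum permissible average power: {max_avg_power_w * 1e3:.2f} mW")
```

With these assumed inputs the limit comes out near 3 mW, the same order as the figure reported in the abstract, but the actual analysis draws its parameters from the two standards cited.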
Abstract:
Satellite data are used to quantify and examine the bias in the outgoing long-wave (LW) radiation over North Africa during May–July simulated by a range of climate models and the Met Office global numerical weather prediction (NWP) model. Simulations from an ensemble-mean of multiple climate models overestimate outgoing clear-sky long-wave radiation (LWc) by more than 20 W m−2 relative to observations from Clouds and the Earth's Radiant Energy System (CERES) for May–July 2000 over parts of the west Sahara, and by 9 W m−2 for the North Africa region (20°W–30°E, 10–40°N). Experiments with the atmosphere-only version of the High-resolution Hadley Centre Global Environment Model (HiGEM) suggest that including mineral dust radiative effects removes this bias. Furthermore, only by reducing surface temperature and emissivity by unrealistic amounts is it possible to explain the magnitude of the bias. Comparing simulations from the Met Office NWP model with satellite observations from Geostationary Earth Radiation Budget (GERB) instruments suggests that the model overestimates the LW by 20–40 W m−2 during the North African summer. The bias declines over the period 2003–2008, although this is likely to relate to improvements in the model and inhomogeneity in the satellite time series. The bias in LWc coincides with high aerosol dust loading estimated from the Ozone Monitoring Instrument (OMI), including during the GERBILS field campaign (18–28 June 2007), during which model overestimates of LWc greater than 20 W m−2 coincide with OMI-estimated aerosol optical depth (AOD) greater than 0.8 around 20°N, 0–20°W. A model-minus-GERB LW bias of around 30 W m−2 coincides with high AOD during the period 18–21 June 2007, although differences in cloud cover also impact the model–GERB differences.
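The regional bias figures quoted above are area-weighted means of model-minus-observation differences over a latitude–longitude box. As a hedged illustration of that computation, the sketch below calculates a cosine-latitude-weighted mean clear-sky OLR bias over the North Africa box (20°W–30°E, 10–40°N); the gridded fields are random placeholders standing in for model and CERES data.

```python
# Illustrative area-weighted regional-mean bias (model minus observations)
# over the North Africa box (20W-30E, 10-40N). The OLR arrays are random
# placeholders, not real model or CERES fields.
import numpy as np

lats = np.arange(-89.5, 90.0, 1.0)   # grid-cell centre latitudes, 1 deg grid
lons = np.arange(-179.5, 180.0, 1.0) # grid-cell centre longitudes
rng = np.random.default_rng(0)
model_olr = 280.0 + rng.normal(0, 5, (lats.size, lons.size))  # placeholder, W m-2
ceres_olr = 275.0 + rng.normal(0, 5, (lats.size, lons.size))  # placeholder, W m-2

# Select the region and weight by cos(latitude) to account for cell area
lat_mask = (lats >= 10.0) & (lats <= 40.0)
lon_mask = (lons >= -20.0) & (lons <= 30.0)
bias = (model_olr - ceres_olr)[np.ix_(lat_mask, lon_mask)]
weights = np.cos(np.deg2rad(lats[lat_mask]))[:, None] * np.ones(lon_mask.sum())

regional_bias = np.average(bias, weights=weights)
print(f"Area-weighted regional-mean bias: {regional_bias:.2f} W m-2")
```

The cosine-latitude weighting matters even for a 10–40°N box, since a 1° grid cell at 40°N covers roughly three-quarters of the area of one at 10°N.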
Abstract:
The volume–volatility relationship during the dissemination stages of information flow is examined by analyzing various theories relating volume and volatility as complementary rather than competing models. The mixture of distributions hypothesis, the sequential arrival of information hypothesis, the dispersion of beliefs hypothesis, and the noise trader hypothesis all add to the understanding of how volume and volatility interact for different types of futures traders. An integrated picture of the volume–volatility relationship is provided by investigating the dynamic linear and nonlinear associations between volatility and the volume of informed (institutional) and uninformed (general public) traders. In particular, the trading behavior explanation for the persistence of futures volatility, the effect of the timing of private information arrival, and the response of institutional traders to excess noise trading risk are examined.
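As a hedged sketch of how linear and nonlinear volume–volatility associations might be estimated by trader type, the example below regresses a volatility proxy on log volume for informed and uninformed traders, with a quadratic term capturing a simple nonlinearity. The data are simulated placeholders, and the trader-type split and functional form are assumptions, not the paper's specification.

```python
# Illustrative regression of a daily futures-volatility proxy on the trading
# volume of informed (institutional) and uninformed (general-public) traders.
# All series are simulated placeholders; the functional form is assumed.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
informed = rng.lognormal(10, 0.3, n)   # placeholder institutional volume
uninformed = rng.lognormal(9, 0.5, n)  # placeholder public volume
volatility = (0.2 * np.log(informed) + 0.4 * np.log(uninformed)
              + rng.normal(0, 0.5, n)) # placeholder volatility proxy

X = np.column_stack([
    np.log(informed),
    np.log(uninformed),
    np.log(uninformed) ** 2,  # quadratic term for a nonlinear association
])
X = sm.add_constant(X)

fit = sm.OLS(volatility, X).fit()
print(fit.summary(xname=["const", "ln_informed", "ln_uninformed", "ln_uninformed_sq"]))
```

In this framing, differing coefficients on the informed and uninformed volume terms would indicate that the two trader types contribute differently to volatility, which is the kind of contrast the complementary hypotheses above are meant to explain.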