Abstract:
Arbuscular mycorrhizal (AM) fungi have a variety of effects on foliar-feeding insects, with the majority of these being positive, although reports of negative and null effects also exist. Virtually all previous experiments have used mobile insects confined in cages and have studied the effects of one, or at most two, species of mycorrhizae on one species of insect. The purpose of this study was to introduce a greater level of realism into insect-mycorrhizal experiments, by studying the responses of different insect feeding guilds to a variety of AM fungi. We conducted two experiments involving three species of relatively immobile insects (a leaf-mining and two seed-feeding flies) reared in natural conditions on a host (Leucanthemum vulgare). In a field study, natural levels of AM colonization were reduced, while in a phytometer trial, we experimentally colonized host plants with all possible combinations of three known mycorrhizal associates of L. vulgare. In general, AM fungi increased the stature (height and leaf number) and nitrogen content of plants. However, these effects changed through the season and were dependent on the identity of the fungi in the root system. AM fungi increased host acceptance of all three insects and larval performance of the leaf miner, but these effects were also season- and AM species-dependent. We suggest that the mycorrhizal effect on the performance of the leaf miner is due to fungal-induced changes in host-plant nitrogen content, detected by the adult fly. However, variability in the effect was apparent, because not all AM species increased plant N content. Meanwhile, positive effects of mycorrhizae were found on flower number and flower size, and these appeared to result in enhanced infestation levels by the seed-feeding insects. The results show that AM fungi exhibit ecological specificity, in that different species have different effects on host-plant growth and chemistry and the performance of foliar-feeding insects. Future studies need to conduct experiments that use ecologically realistic combinations of plants and fungi and allow insects to be reared in natural conditions.
In vitro cumulative gas production techniques: History, methodological considerations and challenges
Abstract:
Methodology used to measure in vitro gas production is reviewed to determine impacts of sources of variation on resultant gas production profiles (GPP). Current methods include measurement of gas production at constant pressure (e.g., use of gas tight syringes), a system that is inexpensive, but may be less sensitive than others thereby affecting its suitability in some situations. Automated systems that measure gas production at constant volume allow pressure to accumulate in the bottle, which is recorded at different times to produce a GPP, and may result in sufficiently high pressure that solubility of evolved gases in the medium is affected, thereby resulting in a recorded volume of gas that is lower than that predicted from stoichiometric calculations. Several other methods measure gas production at constant pressure and volume with either pressure transducers or sensors, and these may be manual, semi-automated or fully automated in operation. In these systems, gas is released as pressure increases, and vented gas is recorded. Agitating the medium does not consistently produce more gas with automated systems, and little or no effect of agitation was observed with manual systems. The apparatus affects GPP, but mathematical manipulation may enable effects of apparatus to be removed. The amount of substrate affects the volume of gas produced, but not rate of gas production, provided there is sufficient buffering capacity in the medium. Systems that use a very small amount of substrate are prone to experimental error in sample weighing. Effect of sample preparation on GPP has been found to be important, but further research is required to determine the optimum preparation that mimics animal chewing. Inoculum is the single largest source of variation in measuring GPP, as rumen fluid is variable and sampling schedules, diets fed to donor animals and ratios of rumen fluid/medium must be selected such that microbial activity is sufficiently high that it does not affect rate and extent of fermentation. Species of donor animal may also cause differences in GPP. End point measures can be mathematically manipulated to account for species differences, but rates of fermentation are not related. Other sources of inocula that have been used include caecal fluid (primarily for investigating hindgut fermentation in monogastrics), effluent from simulated rumen fermentation (e.g., 'Rusitec', which was as variable as rumen fluid), faeces, and frozen or freeze-dried rumen fluid (which were both less active than fresh rumen fluid). Use of mixtures of cell-free enzymes, or pure cultures of bacteria, may be a way of increasing GPP reproducibility, while reducing reliance on surgically modified animals. However, more research is required to develop these inocula. A number of media have been developed which buffer the incubation and provide relevant micro-nutrients to the microorganisms. To date, little research has been completed on relationships between the composition of the medium and measured GPP. However, comparing GPP from media that are either rich in N or N-free allows assessment of the contributions of N-containing compounds in the sample. (c) 2005 Published by Elsevier B.V.
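As an illustration of how a GPP is typically summarised, the sketch below fits a single-pool exponential-with-lag model, G(t) = A(1 − exp(−c(t − L))), to a gas production curve. The model choice, the incubation times and the gas volumes are all assumptions made for illustration; the review itself covers a range of apparatus and curve-fitting approaches.

```python
# Hypothetical GPP fit: G(t) = A * (1 - exp(-c * (t - L))) for t > L, where
# A is the asymptotic gas volume (mL), c the fractional rate (/h) and L the
# lag (h). Times and volumes below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def gpp(t, A, c, L):
    # clip at zero so no gas is produced before the lag ends
    return A * (1.0 - np.exp(-c * np.clip(t - L, 0.0, None)))

t = np.array([2, 4, 8, 12, 24, 48, 72, 96], dtype=float)    # incubation time (h)
gas = np.array([3, 8, 20, 31, 52, 68, 72, 73], dtype=float) # cumulative gas (mL)

(A, c, L), _ = curve_fit(gpp, t, gas, p0=[75.0, 0.05, 1.0])
print(f"A = {A:.1f} mL, c = {c:.3f} /h, lag = {L:.1f} h")
```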
Abstract:
A sequential study design generally makes more efficient use of available information than a fixed sample counterpart of equal power. This feature is gradually being exploited by researchers in genetic and epidemiological investigations that utilize banked biological resources and in studies where time, cost and ethics are prominent considerations. Recent work in this area has focussed on the sequential analysis of matched case-control studies with a dichotomous trait. In this paper, we extend the sequential approach to a comparison of the associations within two independent groups of paired continuous observations. Such a comparison is particularly relevant in familial studies of phenotypic correlation using twins. We develop a sequential twin method based on the intraclass correlation and show that use of sequential methodology can lead to a substantial reduction in the number of observations without compromising the study error rates. Additionally, our approach permits straightforward allowance for other explanatory factors in the analysis. We illustrate our method in a sequential heritability study of dysplasia that allows for the effect of body mass index and compares monozygotes with pairs of singleton sisters. Copyright (c) 2006 John Wiley & Sons, Ltd.
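As a rough illustration of the comparison underlying this approach, the sketch below estimates the intraclass correlation within two independent groups of pairs and compares them after a Fisher-z transformation, using a fixed-sample test rather than the paper's sequential procedure. The simulated data, the one-way ANOVA estimator and the variance approximation var(z) ≈ 1/(n − 3/2) are standard choices, not details taken from the paper.

```python
# Fixed-sample comparison of two intraclass correlations (ICCs); the paper's
# sequential method would instead examine the test statistic repeatedly as
# pairs accrue. All data here are simulated.
import numpy as np
from scipy.stats import norm

def icc_pairs(pairs):
    """One-way ANOVA intraclass correlation for an (n, 2) array of pairs."""
    pairs = np.asarray(pairs, dtype=float)
    grand = pairs.mean()
    msb = 2.0 * ((pairs.mean(axis=1) - grand) ** 2).sum() / (len(pairs) - 1)
    msw = ((pairs - pairs.mean(axis=1, keepdims=True)) ** 2).sum() / len(pairs)
    return (msb - msw) / (msb + msw)

def compare_icc(pairs1, pairs2):
    # Fisher-z transform each ICC; var(z) ~ 1/(n - 3/2) for n pairs
    z1, z2 = np.arctanh(icc_pairs(pairs1)), np.arctanh(icc_pairs(pairs2))
    se = np.sqrt(1.0 / (len(pairs1) - 1.5) + 1.0 / (len(pairs2) - 1.5))
    stat = (z1 - z2) / se
    return stat, 2 * norm.sf(abs(stat))

rng = np.random.default_rng(0)
mz = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=60)   # "twins"
sis = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=60)  # "sisters"
print(compare_icc(mz, sis))
```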
Abstract:
The paper considers meta-analysis of diagnostic studies that use a continuous score for classification of study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden's index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel-Haenszel estimator as a summary measure of the overall misclassification error is suggested, which adjusts for a potential study effect. The measure of the misclassification error based on Youden's index is advantageous in that it easily allows an extension to a likelihood approach, which is then able to cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example from a diagnostic meta-analysis on duplex Doppler ultrasound for stroke prevention, with angiography as the standard.
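For a single study with sensitivity Se and specificity Sp, Youden's index is J = Se + Sp − 1, the difference between the true-positive rate among the diseased and the false-positive rate among the healthy. Since J is a rate difference, a Mantel-Haenszel-type pooled estimate across studies can be written down directly; the sketch below uses the generic Mantel-Haenszel risk-difference estimator with invented counts, and is not necessarily the exact estimator proposed in the paper.

```python
# Generic Mantel-Haenszel pooling of Youden's index across studies.
# Per-study counts: (TP, FN, FP, TN); all numbers are invented.
studies = [(45, 5, 8, 42), (30, 10, 6, 54), (52, 8, 12, 48)]

num = den = 0.0
for tp, fn, fp, tn in studies:
    n_dis, n_hea = tp + fn, fp + tn          # diseased / healthy group sizes
    n = n_dis + n_hea
    num += (tp * n_hea - fp * n_dis) / n     # MH numerator for the rate difference
    den += n_dis * n_hea / n                 # MH weight
    print(f"study J = {tp / n_dis + tn / n_hea - 1:.3f}")

print(f"pooled Mantel-Haenszel J = {num / den:.3f}")
```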
Abstract:
Estimation of population size with a missing zero-class is an important problem that is encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimation of the population size based on this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) has proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable for count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and thereby using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In the search for a more robust estimator, we focused on three models that use all clusters with exactly one case, those with exactly two cases and those with exactly three cases to estimate the probability of the zero-class, and thereby use the data collected on all the clusters in the Horvitz-Thompson estimator of population size. Loss in efficiency associated with gain in robustness was examined based on a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice considering the estimates from the three models, robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
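The Horvitz-Thompson step common to these estimators is simple: estimate the probability p0 of the unobserved zero-class and scale the n observed units up by 1/(1 − p0). The sketch below does this with a zero-truncated Poisson fitted by maximum likelihood, alongside Zelterman's robust estimate λ = 2·f2/f1 built from the frequencies of ones and twos; the counts are invented, and the clustered extensions discussed in the paper are not reproduced here.

```python
# Horvitz-Thompson population-size estimation with a missing zero-class,
# using invented unclustered counts.
import numpy as np
from scipy.optimize import brentq

counts = np.array([1]*62 + [2]*21 + [3]*8 + [4]*3)   # observed non-zero counts
n, xbar = len(counts), counts.mean()

# MLE of lambda under a zero-truncated Poisson: xbar = lam / (1 - exp(-lam))
lam_mle = brentq(lambda lam: lam / (1 - np.exp(-lam)) - xbar, 1e-6, 50.0)

# Zelterman's robust estimator from the frequencies of ones (f1) and twos (f2)
f1, f2 = (counts == 1).sum(), (counts == 2).sum()
lam_z = 2.0 * f2 / f1

for name, lam in [("MLE", lam_mle), ("Zelterman", lam_z)]:
    N_hat = n / (1 - np.exp(-lam))   # scale up by the estimated inclusion probability
    print(f"{name}: lambda = {lam:.3f}, N-hat = {N_hat:.1f}")
```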
Abstract:
OBJECTIVES: To evaluate the evidence for strategies to prevent falls or fractures in residents in care homes and hospital inpatients and to investigate the effect of dementia and cognitive impairment. DESIGN: Systematic review and meta-analyses of studies grouped by intervention and setting (hospital or care home). Meta-regression to investigate the effects of dementia and of study quality and design. DATA SOURCES: Medline, CINAHL, Embase, PsychInfo, Cochrane Database, Clinical Trials Register, and hand searching of references from reviews and guidelines to January 2005. RESULTS: 1207 references were identified, including 115 systematic reviews, expert reviews, or guidelines. Of the 92 full papers inspected, 43 were included. Meta-analysis for multifaceted interventions in hospital (13 studies) showed a rate ratio of 0.82 (95% confidence interval 0.68 to 0.997) for falls but no significant effect on the number of fallers or fractures. For hip protectors in care homes (11 studies) the rate ratio for hip fractures was 0.67 (0.46 to 0.98), but there was no significant effect on falls and not enough studies on fallers. For all other interventions (multifaceted interventions in care homes; removal of physical restraints in either setting; fall alarm devices in either setting; exercise in care homes; calcium/vitamin D in care homes; changes in the physical environment in either setting; medication review in hospital) meta-analysis was either unsuitable because of insufficient studies or showed no significant effect on falls, fallers, or fractures, despite strongly positive results in some individual studies. Meta-regression showed no significant association between effect size and prevalence of dementia or cognitive impairment. CONCLUSION: There is some evidence that multifaceted interventions in hospital reduce the number of falls and that use of hip protectors in care homes prevents hip fractures. There is insufficient evidence, however, for the effectiveness of other single interventions in hospitals or care homes or multifaceted interventions in care homes.
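Pooled effects of the kind quoted above (e.g., a rate ratio of 0.82, 95% CI 0.68 to 0.997) typically come from inverse-variance meta-analysis on the log scale. The sketch below shows the mechanics with invented per-study rate ratios and standard errors; it is a generic fixed-effect calculation, not the review's actual analysis.

```python
# Fixed-effect inverse-variance meta-analysis of rate ratios on the log scale,
# with invented per-study inputs.
import numpy as np

log_rr = np.log([0.75, 0.90, 0.80, 0.85])   # per-study log rate ratios
se = np.array([0.15, 0.12, 0.20, 0.10])     # their standard errors

w = 1.0 / se**2                             # inverse-variance weights
pooled = (w * log_rr).sum() / w.sum()
se_pooled = np.sqrt(1.0 / w.sum())
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")
```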
Abstract:
Unlike other positive-stranded RNA viruses that use either a 5'-cap structure or an internal ribosome entry site to direct translation of their messenger RNA, calicivirus translation is dependent on the presence of a protein covalently linked to the 5' end of the viral genome (VPg). We have shown a direct interaction of the calicivirus VPg with the cap-binding protein eIF4E. This interaction is required for calicivirus mRNA translation, as sequestration of eIF4E by 4E-BP1 inhibits translation. Functional analysis has shown that VPg does not interfere with the interaction between eIF4E and the cap structure or 4E-BP1, suggesting that VPg binds to eIF4E at a different site from both cap and 4E-BP1. This work lends support to the idea that calicivirus VPg acts as a novel 'cap substitute' during initiation of translation on virus mRNA.
Recent developments in genetic data analysis: what can they tell us about human demographic history?
Abstract:
Over the last decade, a number of new methods of population genetic analysis based on likelihood have been introduced. This review describes and explains the general statistical techniques that have recently been used, and discusses the underlying population genetic models. Experimental papers that use these methods to infer human demographic and phylogeographic history are reviewed. It appears that the use of likelihood has hitherto had little impact in the field of human population genetics, which is still primarily driven by more traditional approaches. However, with the current uncertainty about the effects of natural selection, population structure and ascertainment of single-nucleotide polymorphism markers, it is suggested that likelihood-based methods may have a greater impact in the future.
Abstract:
Background: Robot-mediated therapies offer entirely new approaches to neurorehabilitation. In this paper we present the results obtained from trialling the GENTLE/S neurorehabilitation system, assessed using the upper limb section of the Fugl-Meyer (FM) outcome measure. Methods: We demonstrate the design of our clinical trial and its results analysed using a novel statistical approach based on a multivariate analytical model. This paper provides the rationale for using multivariate models in robot-mediated clinical trials and draws conclusions from the clinical data gathered during the GENTLE/S study. Results: The FM outcome measures recorded during the baseline (8 sessions), robot-mediated therapy (9 sessions) and sling-suspension (9 sessions) phases were analysed using a multiple regression model. The results indicate positive but modest recovery trends favouring both interventions used in the GENTLE/S clinical trial. The modest recovery shown occurred at a time late after stroke when changes are not clinically anticipated. Conclusion: This study has applied a new method for analysing clinical data obtained from rehabilitation robotics studies. While the data obtained during the clinical trial are multivariate, multipoint and progressive in nature, the multiple regression model used showed great potential for drawing conclusions from this study. An important conclusion is that the intervention and control phases both produced changes over a period of 9 sessions in comparison to the baseline. This might indicate that use of new challenging and motivational therapies can influence the outcome of therapies at a point when clinical changes are not expected. Further work is required to investigate the effects arising from early intervention, longer exposure and intensity of the therapies. Finally, more function-oriented robot-mediated therapies or sling-suspension therapies are needed to clarify the effects resulting from each intervention for stroke recovery.
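A minimal sketch of the kind of regression described, under assumptions of our own: repeated FM scores are modelled as a function of session number and trial phase (baseline, robot-mediated, sling-suspension), with simulated patients standing in for the trial data. The paper's actual multivariate model may differ in structure.

```python
# Invented repeated-measures data: 8 baseline, 9 robot, 9 sling sessions per
# patient, with a small phase effect added on top of a slow recovery trend.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
phases = ["baseline"] * 8 + ["robot"] * 9 + ["sling"] * 9
effect = {"baseline": 0.0, "robot": 1.5, "sling": 1.0}   # assumed phase effects

rows = []
for patient in range(8):
    start = rng.uniform(20, 40)                          # initial FM score
    for session, phase in enumerate(phases):
        fm = start + 0.15 * session + effect[phase] + rng.normal(0, 1.5)
        rows.append({"patient": patient, "session": session,
                     "phase": phase, "fm": fm})
df = pd.DataFrame(rows)

# regress FM on session number and phase, with baseline as the reference level
model = smf.ols("fm ~ session + C(phase, Treatment('baseline'))", data=df).fit()
print(model.params)
```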
Abstract:
A range of forecasts of global oil production made between 1956 and the present day are listed. For the majority of these, the methodology used to generate the forecast is described. The paper distinguishes between three types of forecast: group 1, quantitative analyses which predict that global oil production will reach a resource-limited peak in the near term, and certainly before the year 2020; group 2, forecasts that use quantitative methods but which see no production peak within the forecast's time horizon (typically 2020 or 2030); group 3, non-quantitative analyses that rule out a resource-limited oil peak within the foreseeable future. The paper analyses these forecast types and suggests that group 1 forecasts are the most realistic.
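Group 1 forecasts are typically Hubbert-style curve fits, in which cumulative production Q(t) follows a logistic bounded by the ultimately recoverable resource (URR), so that annual production dQ/dt peaks when Q = URR/2. The sketch below shows the mechanics with invented round-number parameters; it is not a reconstruction of any forecast discussed in the paper.

```python
# Hubbert logistic: Q(t) = URR / (1 + exp(-b (t - t_peak))), production = dQ/dt.
# URR, b and t_peak below are illustrative, not fitted values.
import numpy as np

URR, b, t_peak = 2000.0, 0.06, 2010.0   # Gb, /yr, peak year (assumptions)

def production(t):
    q = URR / (1.0 + np.exp(-b * (t - t_peak)))   # logistic cumulative output
    return b * q * (1.0 - q / URR)                # dQ/dt, in Gb/yr

for year in np.arange(1960, 2061, 20):
    print(int(year), round(production(year), 1))
```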
An empirical study of process-related attributes in segmented software cost-estimation relationships
Abstract:
Parametric software effort estimation models consisting of a single mathematical relationship suffer from poor adjustment and predictive characteristics when the historical database considered contains data coming from projects of a heterogeneous nature. The segmentation of the input domain according to clusters obtained from the database of historical projects serves as a tool for more realistic models that use several local estimation relationships. Nonetheless, it may be hypothesized that using clustering algorithms without previous consideration of the influence of well-known project attributes misses the opportunity to obtain more realistic segments. In this paper, we describe the results of an empirical study using the ISBSG-8 database and the EM clustering algorithm that studies the influence of two process-related attributes as drivers of the clustering process: the use of engineering methodologies and the use of CASE tools. The results provide evidence that considering these attributes significantly conditions the final model obtained, even though the resulting predictive quality is of a similar magnitude.
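A sketch of the segmented-estimation idea under stated assumptions: cluster historical projects with an EM-fitted Gaussian mixture, then fit one local effort relationship per cluster. Synthetic features stand in for the ISBSG-8 attributes (size, methodology use, CASE-tool use); this is not the study's actual pipeline or parameterisation.

```python
# EM clustering (Gaussian mixture) of synthetic projects, followed by one
# local log-log effort regression per cluster.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
size = rng.lognormal(5, 1, 300)                  # project size (function points)
methodology = rng.integers(0, 2, 300)            # engineering methodology used?
case_tools = rng.integers(0, 2, 300)             # CASE tools used?
effort = 8 * size**0.9 * (0.8 + 0.4 * methodology) * rng.lognormal(0, 0.3, 300)

X = np.column_stack([np.log(size), methodology, case_tools])
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)

for k in range(3):
    m = labels == k
    local = LinearRegression().fit(np.log(size[m]).reshape(-1, 1),
                                   np.log(effort[m]))
    print(f"cluster {k}: n = {m.sum()}, log-log slope = {local.coef_[0]:.2f}")
```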
Abstract:
The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
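As a purely hypothetical illustration of the wrapping idea (not the Styx protocol or its API), the sketch below exposes a command-line program over HTTP so that a remote caller can run it as if it were local. The wrapped command and the single-shot response are assumptions; the real system streams data between services.

```python
# Hypothetical HTTP wrapper around a command-line program ("sort" here, purely
# as an example). POST a body, the wrapped program runs on it, and its stdout
# is returned. A real streaming system would send output incrementally.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

COMMAND = ["sort"]   # the wrapped program: an assumption for this sketch

class WrappedProgram(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        proc = subprocess.Popen(COMMAND, stdin=subprocess.PIPE,
                                stdout=subprocess.PIPE)
        out, _ = proc.communicate(body)   # run the program on the posted data
        self.send_response(200)
        self.end_headers()
        self.wfile.write(out)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), WrappedProgram).serve_forever()
```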
Abstract:
Enhanced release of CO2 to the atmosphere from soil organic carbon as a result of increased temperatures may lead to a positive feedback between climate change and the carbon cycle, resulting in much higher CO2 levels and accelerated global warming. However, the magnitude of this effect is uncertain and critically dependent on how the decomposition of soil organic C (heterotrophic respiration) responds to changes in climate. Previous studies with the Hadley Centre's coupled climate–carbon cycle general circulation model (GCM) (HadCM3LC) used a simple, single-pool soil carbon model to simulate the response. Here we present results from numerical simulations that use the more sophisticated 'RothC' multipool soil carbon model, driven with the same climate data. The results show strong similarities in the behaviour of the two models, although RothC tends to simulate slightly smaller changes in global soil carbon stocks for the same forcing. RothC simulates global soil carbon stocks decreasing by 54 GtC by 2100 in a climate change simulation, compared with an 80 GtC decrease in HadCM3LC. The multipool carbon dynamics of RothC cause it to exhibit a slower transient response to both increased organic carbon inputs and changes in climate. We conclude that the projection of a positive feedback between climate and the carbon cycle is robust, but the magnitude of the feedback is dependent on the structure of the soil carbon model.
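The contrast between single-pool and multipool behaviour can be illustrated with first-order decay: each pool loses carbon at rate k·f(T), so fast pools respond quickly to warming while slow pools dominate the long-term stock. In the sketch below the pool names and rate constants are illustrative (loosely in the spirit of RothC's DPM/RPM/BIO/HUM pools), and a simple Q10 factor stands in for the model's actual rate modifiers.

```python
# Multipool first-order decay with a warming-dependent rate multiplier.
# Stocks and rate constants are illustrative, not RothC's calibrated values.
import numpy as np

# (initial stock, decay rate k per year) for each pool
pools = {"DPM": (1.0, 10.0), "RPM": (10.0, 0.3),
         "BIO": (2.0, 0.66), "HUM": (60.0, 0.02)}

def total_stock(years, dT, q10=2.0):
    """Total carbon left after `years` of warming dT, with no fresh inputs."""
    f = q10 ** (dT / 10.0)            # warming multiplies every decay rate
    return sum(c * np.exp(-k * f * years) for c, k in pools.values())

print("after 100 yr, no warming:", round(total_stock(100, 0.0), 1))
print("after 100 yr, +3 K      :", round(total_stock(100, 3.0), 1))
```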
Abstract:
Carsberg (2002) suggested that the periodic valuation accuracy studies undertaken by, amongst others, IPD/Drivers Jonas (2003) should be undertaken every year and be sponsored by the RICS, which acts as the self-regulating body for valuations in the UK. This paper does not address the wider issues concerning the nature of properties which are sold and whether the sale prices are influenced by prior valuations, but considers solely the technical issues concerning the timing of the valuation and sales data. This study uses valuations and sales data from the Investment Property Databank UK Monthly Index to attempt to identify the date that sale data are divulged to valuers. This information will inform accuracy studies that use a cut-off date, based on the closeness of valuations to the sales completion date, as a yardstick for excluding data from the analysis. It will also, assuming valuers are informed quickly of any agreed sales, help to determine the actual date the sale was agreed rather than the completion date, which includes a period of due diligence between when the sale is agreed and its completion. Valuations should be updated to this date, rather than the formal completion date, if a reliable measure of valuation accuracy is to be determined. An accuracy study is then undertaken using a variety of updating periods and the differences between the results are examined. The paper concludes that the sale only becomes known to valuers in the month prior to the sale taking place, which suggests either that sales due diligence procedures are shortening or that valuers are not told quickly of agreed sale prices. Studies that adopt a four-month cut-off date for any valuations compared to sales completion dates are therefore over-cautious, and this could be reduced to two months without compromising the data.
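A sketch of the cut-off comparison under invented data: pair each sale with the last valuation made at least a given number of months before completion and summarise the percentage differences. The column names, records and two-property sample are purely illustrative of the mechanics, not the IPD data.

```python
# Compare valuation accuracy under different cut-off periods, with made-up
# sales and valuation records.
import pandas as pd

sales = pd.DataFrame({
    "property": ["A", "B"],
    "completion": pd.to_datetime(["2003-06-15", "2003-09-01"]),
    "price": [1_050_000, 730_000],
})
valuations = pd.DataFrame({
    "property": ["A", "A", "B", "B"],
    "date": pd.to_datetime(["2003-01-31", "2003-04-10",
                            "2003-03-31", "2003-07-31"]),
    "value": [980_000, 1_010_000, 690_000, 725_000],
})

for cutoff in (2, 4):   # months between valuation date and completion date
    diffs = []
    for _, s in sales.iterrows():
        limit = s["completion"] - pd.DateOffset(months=cutoff)
        v = (valuations[(valuations["property"] == s["property"])
                        & (valuations["date"] <= limit)]
             .sort_values("date").iloc[-1])   # latest admissible valuation
        diffs.append(s["price"] / v["value"] - 1.0)
    print(f"{cutoff}-month cut-off: mean difference = "
          f"{pd.Series(diffs).mean():.1%}")
```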
Abstract:
The water vapour continuum is characterised by absorption that varies smoothly with wavelength, from the visible to the microwave. It is present within the rotational and vibrational–rotational bands of water vapour, which consist of large numbers of narrow spectral lines, and in the many ‘windows’ between these bands. The continuum absorption in the window regions is of particular importance for the Earth’s radiation budget and for remote-sensing techniques that exploit these windows. Historically, most attention has focused on the 8–12 μm (mid-infrared) atmospheric window, where the continuum is relatively well-characterised, but there have been many fewer measurements within bands and in other window regions. In addition, the causes of the continuum remain a subject of controversy. This paper provides a brief historical overview of the development of understanding of the continuum and then reviews recent developments, with a focus on the near-infrared spectral region. Recent laboratory measurements in near-infrared windows, which reveal absorption typically an order of magnitude stronger than in widely used continuum models, are shown to have important consequences for remote-sensing techniques that use these windows for retrieving cloud properties.