135 results for "omics" approaches

in CentAUR: Central Archive, University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has, however, been largely overlooked due to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself and the fact that many of its members resist cultivation and are in fact new to science. However, with the emergence of 16S rRNA molecular tools and "post-genomics" high-resolution technologies for examining microorganisms as they occur in nature without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review will discuss the application of molecular microbiological tools to study the human gut microbiota in a culture-independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetics or flux of metabolites through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and how it responds to different environmental stimuli, e.g. substrates such as prebiotics, antibiotics and other drugs, and to disease. Recently these two culture-independent, high-resolution approaches have been combined into a single "transgenomic" approach, which allows changes in metabolite profiles within human biofluids to be correlated with compositional metagenomic data on the microbiota. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.
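
As a rough illustration of the "transgenomic" idea described in this abstract, the sketch below correlates a metabolite-profile matrix with a taxon-abundance matrix using Spearman rank correlation. The synthetic data, array shapes and choice of correlation measure are assumptions for demonstration only, not the authors' actual pipeline.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic example: 30 subjects, 5 biofluid metabolites, 8 bacterial taxa
metabolites = rng.lognormal(mean=0.0, sigma=1.0, size=(30, 5))  # e.g. NMR peak intensities
taxa = rng.dirichlet(np.ones(8), size=30)                       # relative abundances from 16S/metagenomics

# Pairwise Spearman correlations between every metabolite and every taxon
corr = np.zeros((5, 8))
pval = np.zeros((5, 8))
for i in range(5):
    for j in range(8):
        corr[i, j], pval[i, j] = spearmanr(metabolites[:, i], taxa[:, j])

print("metabolite x taxon correlation matrix:")
print(corr.round(2))
```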

Relevance:

20.00%

Publisher:

Abstract:

We discuss and test the potential usefulness of single-column models (SCMs) for the testing of stochastic physics schemes that have been proposed for use in general circulation models (GCMs). We argue that although single-column tests cannot be definitive in exposing the full behaviour of a stochastic method in the full GCM, and although there are differences between SCM testing of deterministic and stochastic methods, SCM testing nonetheless remains a useful tool. It is necessary to consider an ensemble of SCM runs produced by the stochastic method. These can be usefully compared to deterministic ensembles describing initial-condition uncertainty and also to combinations of these (with structural model changes) into poor man's ensembles. The proposed methodology is demonstrated using an SCM experiment recently developed by the GCSS (GEWEX Cloud System Study) community, simulating transitions between active and suppressed periods of tropical convection.
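
A minimal sketch of the kind of ensemble comparison described in this abstract, using a toy single-column "model" in place of a real SCM; the relaxation model, perturbation amplitudes and spread diagnostic are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_scm(initial_temp, stochastic_amp=0.0, n_steps=240):
    """Toy single-column temperature evolution with optional multiplicative stochastic forcing."""
    temp = np.empty(n_steps)
    temp[0] = initial_temp
    for t in range(1, n_steps):
        tendency = -0.01 * (temp[t - 1] - 300.0)              # relax towards 300 K
        noise = 1.0 + stochastic_amp * rng.standard_normal()  # perturb the physics tendency
        temp[t] = temp[t - 1] + tendency * noise
    return temp

n_members = 20

# Ensemble 1: stochastic physics, identical initial conditions
stoch = np.array([toy_scm(302.0, stochastic_amp=0.5) for _ in range(n_members)])

# Ensemble 2: deterministic physics, perturbed initial conditions
init = np.array([toy_scm(302.0 + 0.5 * rng.standard_normal()) for _ in range(n_members)])

# Compare ensemble spread at the final step as a simple diagnostic
print(f"stochastic-physics spread: {stoch[:, -1].std():.3f} K")
print(f"initial-condition spread:  {init[:, -1].std():.3f} K")
```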

Relevance:

20.00%

Publisher:

Abstract:

In an immersive virtual environment, observers fail to notice the expansion of a room around them and consequently make gross errors when comparing the size of objects. This result is difficult to explain if the visual system continuously generates a 3-D model of the scene based on known baseline information from interocular separation or proprioception as the observer walks. An alternative is that observers use view-based methods to guide their actions and to represent the spatial layout of the scene. In this case, they may have an expectation of the images they will receive but be insensitive to the rate at which images arrive as they walk. We describe the way in which the eye movement strategy of animals simplifies motion processing if their goal is to move towards a desired image and discuss dorsal and ventral stream processing of moving images in that context. Although many questions about view-based approaches to scene representation remain unanswered, the solutions are likely to be highly relevant to understanding biological 3-D vision.

Relevance:

20.00%

Publisher:

Abstract:

This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice.

Relevance:

20.00%

Publisher:

Abstract:

A wide variety of exposure models are currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the following types of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with any inherent strengths or weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary in nature. Specific organisations with exposure assessment responsibilities tend to use a limited range of models. The modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source. In fact, different organisations may be using different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make understanding the exposure assessment process more complex, can lead to inconsistency between organisations in how critical modelling issues are addressed (e.g. variability and uncertainty), and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, into a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, collection of better input data, probabilistic modelling, validation of model inputs and outputs, and a closer working relationship between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues. The result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
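
To make the distinction drawn above between aggregate and cumulative models concrete, here is a toy aggregate-exposure calculation summing a single chemical's intake across several pathways. The pathway names, intake values and body weight are invented for illustration and do not come from any of the models surveyed.

```python
# Toy aggregate exposure: one chemical, several exposure pathways, dose in mg/kg bw/day
body_weight_kg = 70.0

# Hypothetical daily intakes of the same chemical by pathway (mg/day)
pathway_intake_mg = {
    "dietary": 0.12,
    "consumer_product": 0.03,
    "environmental_air": 0.008,
    "occupational": 0.05,
}

# An aggregate model sums across pathways for a single chemical
aggregate_dose = sum(pathway_intake_mg.values()) / body_weight_kg
print(f"aggregate dose: {aggregate_dose:.4f} mg/kg bw/day")

# A cumulative model would repeat this over multiple chemicals,
# e.g. weighting each by a relative potency factor before summing.
```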

Relevance:

20.00%

Publisher:

Abstract:

In this paper we consider the estimation of population size from one-source capture–recapture data, that is, a list in which individuals can potentially be found repeatedly and where the question is how many individuals are missed by the list. As a typical example, we provide data from a drug user study in Bangkok from 2001 where the list consists of drug users who repeatedly contact treatment institutions. Drug users with 1, 2, 3, ... contacts occur, but drug users with zero contacts are not present, requiring the size of this group to be estimated. Statistically, these data can be considered as stemming from a zero-truncated count distribution. We revisit an estimator for the population size suggested by Zelterman that is known to be robust under potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a locally truncated Poisson likelihood which is equivalent to a binomial likelihood. This result allows the extension of the Zelterman estimator by means of logistic regression to include observed heterogeneity in the form of covariates. We also review an estimator proposed by Chao and explain why we are not able to obtain similar results for this estimator. The Zelterman estimator is applied in two case studies, the first a drug user study from Bangkok, the second an illegal immigrant study in the Netherlands. Our results suggest the new estimator should be used, in particular, if substantial unobserved heterogeneity is present.
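
For orientation, the following sketch computes the standard Zelterman estimator (and, for comparison, Chao's lower-bound estimator) from the frequency-of-frequencies of a zero-truncated count sample. The toy contact counts are invented, and the code does not include the logistic-regression extension with covariates proposed in the paper.

```python
import math
from collections import Counter

# Hypothetical zero-truncated contact counts for observed individuals (no zeros by construction)
contacts = [1, 1, 1, 1, 1, 2, 2, 2, 1, 3, 1, 2, 1, 4, 1, 1, 2, 1, 3, 1]

freq = Counter(contacts)      # frequency of frequencies
n = len(contacts)             # individuals observed at least once
f1, f2 = freq[1], freq[2]     # observed exactly once / exactly twice

# Zelterman estimator: Poisson rate estimated from f1 and f2 only, then
# inflate n by the estimated probability of being observed at least once.
lam = 2.0 * f2 / f1
N_zelterman = n / (1.0 - math.exp(-lam))

# Chao's lower-bound estimator, shown for comparison
N_chao = n + f1 ** 2 / (2.0 * f2)

print(f"n = {n}, f1 = {f1}, f2 = {f2}")
print(f"Zelterman estimate: {N_zelterman:.1f}")
print(f"Chao lower bound:   {N_chao:.1f}")
```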

Relevance:

20.00%

Publisher:

Abstract:

Transgenic crops are now grown commercially on several million hectares, principally in North America. To date, the predominant crops are maize (corn), soybean, cotton, and potatoes. In addition, there have been field trials of transgenics from at least 52 species including all the major field crops, vegetables, and several herbaceous and woody species. This review summarizes recent data relating to such trials, particularly in terms of the trends away from simple, single gene traits such as herbicide and insect resistance towards more complex agronomic traits such as growth rate and increased photosynthetic efficiency. Much of the recent information is derived from inspection of patent databases, a useful source of information on commercial priorities. The review also discusses the time scale for the introduction of these transgenes into breeding populations and their eventual release as new varieties.

Relevance:

20.00%

Publisher:

Abstract:

Fundamental nutrition seeks to describe the complex biochemical reactions involved in assimilation and processing of nutrients by various tissues and organs, and to quantify nutrient movement (flux) through those processes. Over the last 25 yr, considerable progress has been made in increasing our understanding of metabolism in dairy cattle. Major advances have been made at all levels of biological organization, including the whole animal, organ systems, tissues, cells, and molecules. At the whole-animal level, progress has been made in delineating metabolism during late pregnancy and the transition to lactation, as well as in whole-body use of energy-yielding substrates and amino acids for growth in young calves. An explosion of research using multicatheterization techniques has led to better quantitative descriptions of nutrient use by tissues of the portal-drained viscera (digestive tract, pancreas, and associated adipose tissues) and liver. Isolated tissue preparations have provided important information on the interrelationships among glucose, fatty acid, and amino acid metabolism in liver, adipose tissue, and mammary gland, as well as the regulation of these pathways during different physiological states. Finally, the last 25 yr has witnessed the birth of "molecular biology" approaches to understanding fundamental nutrition. Although measurements of mRNA abundance for proteins of interest already have provided new insights into regulation of metabolism, the next 25 yr will likely see remarkable advances as these techniques continue to be applied to problems of dairy cattle biology. Integration of the "omics" technologies (functional genomics, proteomics, and metabolomics) with measurements of tissue metabolism obtained by other methods is a particularly exciting prospect for the future. The result should be improved animal health and well being, more efficient dairy production, and better models to predict nutritional requirements and provide rations to meet those requirements.

Relevance:

20.00%

Publisher:

Abstract:

Previous attempts to apply statistical models, which correlate nutrient intake with methane production, have been of limited value where predictions are obtained for nutrient intakes and diet types outside those used in model construction. Dynamic mechanistic models have proved more suitable for extrapolation, but they remain computationally expensive and are not applied easily in practical situations. The first objective of this research focused on employing conventional techniques to generate statistical models of methane production appropriate to United Kingdom dairy systems. The second objective was to evaluate these models and a model published previously using both United Kingdom and North American data sets. Thirdly, nonlinear models were considered as alternatives to the conventional linear regressions. The United Kingdom calorimetry data used to construct the linear models also were used to develop the three nonlinear alternatives, which were all of modified Mitscherlich (monomolecular) form. Of the linear models tested, an equation from the literature proved most reliable across the full range of evaluation data (root mean square prediction error = 21.3%). However, the Mitscherlich models demonstrated the greatest degree of adaptability across diet types and intake levels. The most successful model for simulating the independent data was a modified Mitscherlich equation with the steepness parameter set to represent the dietary starch-to-ADF ratio (root mean square prediction error = 20.6%). However, when such data were unavailable, simpler Mitscherlich forms relating dry matter or metabolizable energy intake to methane production remained better alternatives relative to their linear counterparts.
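
As a rough illustration of fitting a monomolecular (Mitscherlich) curve of the kind discussed above, the sketch below fits y = a·(1 − exp(−c·x)) to invented methane-versus-intake data and reports the root mean square prediction error as a percentage of the observed mean. The functional form shown is a generic monomolecular curve, not the authors' specific parameterisation using the starch-to-ADF ratio.

```python
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, c):
    """Generic monomolecular form: asymptote a, steepness c."""
    return a * (1.0 - np.exp(-c * x))

# Invented data: dry matter intake (kg/d) vs methane production (MJ/d)
dmi = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0, 20.0])
ch4 = np.array([14.5, 17.0, 19.2, 20.8, 22.0, 22.9, 23.5])

params, _ = curve_fit(mitscherlich, dmi, ch4, p0=[25.0, 0.1])
pred = mitscherlich(dmi, *params)

# Root mean square prediction error as a percentage of the observed mean
rmspe = 100.0 * np.sqrt(np.mean((ch4 - pred) ** 2)) / ch4.mean()
print(f"a = {params[0]:.2f}, c = {params[1]:.3f}, RMSPE = {rmspe:.1f}%")
```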

Relevance:

20.00%

Publisher:

Abstract:

Feed samples received by commercial analytical laboratories are often undefined or mixed varieties of forages, originate from various agronomic or geographical areas of the world, are mixtures (e.g., total mixed rations) and are often described incompletely or not at all. Six unified single-equation approaches to predict the metabolizable energy (ME) value of feeds determined in sheep fed at maintenance ME intake were evaluated utilizing 78 individual feeds representing 17 different forages, grains, protein meals and by-product feedstuffs. The predictive approaches evaluated were two each from the National Research Council [National Research Council (NRC), Nutrient Requirements of Dairy Cattle, seventh revised ed. National Academy Press, Washington, DC, USA, 2001], the University of California at Davis (UC Davis) and ADAS (Stratford, UK). Slopes and intercepts for the two ADAS approaches, which utilized in vitro digestibility of organic matter and either measured gross energy (GE) or a prediction of GE from component assays, and one UC Davis approach, based upon in vitro gas production and some component assays, differed from unity and zero, respectively, while this was not the case for the two NRC approaches and one UC Davis approach. However, within these latter three approaches, the goodness of fit (r²) increased from the NRC approach utilizing lignin (0.61) to the NRC approach utilizing 48 h in vitro digestion of neutral detergent fibre (NDF; 0.72) and to the UC Davis approach utilizing a 30 h in vitro digestion of NDF (0.84). The reason for the difference in precision between the NRC procedures was the failure of assayed lignin values to accurately predict 48 h in vitro digestion of NDF. However, differences among the six predictive approaches in the number of supporting assays, and their costs, as well as the fact that the NRC approach is actually three related equations requiring categorical description of feeds (making them unsuitable for mixed feeds) while the ADAS and UC Davis approaches are single equations, suggest that the procedure of choice will vary dependent upon local conditions, specific objectives and the feedstuffs to be evaluated. In contrast to the evaluation of the procedures among feedstuffs, no procedure was able to consistently discriminate the ME values of individual feeds within feedstuffs determined in vivo, suggesting that an accurate and precise ME predictive approach among and within feeds may remain to be identified.
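
The evaluation described above rests on regressing observed ME values on predicted ones and asking whether the slope and intercept differ from unity and zero, alongside the goodness of fit (r²). A minimal sketch of that check, on invented observed/predicted pairs, is given below.

```python
import numpy as np
from scipy import stats

# Invented observed vs predicted ME values (MJ/kg DM) for a handful of feeds
observed  = np.array([11.2, 12.5, 9.8, 13.1, 10.4, 12.0, 8.9, 11.7])
predicted = np.array([11.0, 12.9, 9.5, 13.4, 10.1, 12.3, 9.2, 11.5])

# Regress observed on predicted; an unbiased predictor has slope ~1 and intercept ~0
res = stats.linregress(predicted, observed)
print(f"slope = {res.slope:.2f}, intercept = {res.intercept:.2f}, r^2 = {res.rvalue**2:.2f}")

# Simple check on whether the slope differs from unity, using the slope's standard error
t_slope = (res.slope - 1.0) / res.stderr
print(f"t statistic for slope vs 1: {t_slope:.2f}")
```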