995 results for Van Schaack, Peter, 1747-1832.


Relevance: 100.00%

Abstract:

The gut microbiota enhances the host's metabolic capacity for processing nutrients and drugs and modulates the activities of multiple pathways in a variety of organ systems. We have probed the systemic metabolic adaptation to gut colonization for 20 days following exposure of axenic mice (n = 35) to a typical environmental microbial background, using high-resolution (1)H nuclear magnetic resonance (NMR) spectroscopy to analyze urine, plasma, liver, kidney, and colon (5 time points) metabolic profiles. Acquisition of the gut microbiota was associated with a rapid increase in body weight (4%) over the first 5 days of colonization, with parallel changes in multiple pathways in all compartments analyzed. The colonization process stimulated glycogenesis in the liver prior to triggering increases in hepatic triglyceride synthesis. These changes were associated with modifications of hepatic Cyp8b1 expression and the subsequent alteration of bile acid metabolites, including taurocholate and tauromuricholate, which are essential regulators of lipid absorption. Expression and activity of major drug-metabolizing enzymes (Cyp3a11 and Cyp2c29) were also significantly stimulated. Remarkably, statistical modeling of the interactions between hepatic metabolic profiles and microbial composition analyzed by 16S rRNA gene pyrosequencing revealed strong associations of the Coriobacteriaceae family with hepatic triglyceride, glucose, and glycogen levels and with the metabolism of xenobiotics. These data demonstrate the importance of microbial activity in metabolic phenotype development, indicating that microbiota manipulation is a useful tool for beneficially modulating xenobiotic metabolism and pharmacokinetics in personalized health care. IMPORTANCE: Gut bacteria have been associated with various essential biological functions in humans, such as energy harvest and regulation of blood pressure. Furthermore, gut microbial colonization occurs after birth in parallel with other critical processes such as immune and cognitive development. Thus, it is essential to understand the bidirectional interaction between the host metabolism and its symbionts. Here, we describe the first evidence of an in vivo association between a family of bacteria and hepatic lipid metabolism. These results provide new insights into the fundamental mechanisms that regulate host-gut microbiota interactions and are thus of wide interest to the microbiological, nutrition, metabolic, systems biology, and pharmaceutical research communities. This work will also contribute to developing novel strategies for altering host-gut microbiota relationships, which can in turn beneficially modulate host metabolism.
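
The taxon-metabolite association analysis mentioned above can be illustrated, in a much reduced form, by a simple rank correlation; the paper's own statistical modelling is more sophisticated, and the values and variable names below are hypothetical:

```python
# Minimal illustration (not the authors' statistical model): rank-correlate the
# relative abundance of one bacterial family with a hepatic metabolite level
# measured in the same animals. All values and names below are hypothetical.
import numpy as np
from scipy.stats import spearmanr

coriobacteriaceae_abundance = np.array([0.02, 0.05, 0.11, 0.08, 0.15, 0.04])
hepatic_triglyceride = np.array([1.1, 1.4, 2.2, 1.9, 2.6, 1.3])

rho, p = spearmanr(coriobacteriaceae_abundance, hepatic_triglyceride)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```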

Relevance: 100.00%

Abstract:

Non-Gaussian/non-linear data assimilation is becoming an increasingly important area of research in the Geosciences as the resolution and non-linearity of models are increased and more and more non-linear observation operators are being used. In this study, we look at the effect of relaxing the assumption of a Gaussian prior on the impact of observations within the data assimilation system. Three different measures of observation impact are studied: the sensitivity of the posterior mean to the observations, mutual information and relative entropy. The sensitivity of the posterior mean is derived analytically when the prior is modelled by a simplified Gaussian mixture and the observation errors are Gaussian. It is found that the sensitivity is a strong function of the value of the observation and proportional to the posterior variance. Similarly, relative entropy is found to be a strong function of the value of the observation. However, the errors in estimating these two measures using a Gaussian approximation to the prior can differ significantly. This hampers conclusions about the effect of the non-Gaussian prior on observation impact. Mutual information does not depend on the value of the observation and is seen to be close to its Gaussian approximation. These findings are illustrated with the particle filter applied to the Lorenz ’63 system. The article concludes with a discussion of the appropriateness of these measures of observation impact for different situations.
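
For orientation, in the fully Gaussian scalar case the three observation-impact measures named above have simple closed forms; the sketch below states these textbook expressions under assumed notation (prior N(mu_b, sigma_b^2), direct observation with error variance sigma_o^2, posterior N(mu_a, sigma_a^2)) and is background rather than a result of the paper.

```latex
% Scalar Gaussian reference case: prior N(\mu_b,\sigma_b^2), observation
% y = x + \epsilon with \epsilon \sim N(0,\sigma_o^2), posterior N(\mu_a,\sigma_a^2).
\[
\sigma_a^2 = \bigl(\sigma_b^{-2} + \sigma_o^{-2}\bigr)^{-1},
\qquad
\mu_a = \mu_b + \frac{\sigma_a^2}{\sigma_o^2}\,(y - \mu_b),
\qquad
\frac{\partial \mu_a}{\partial y} = \frac{\sigma_a^2}{\sigma_o^2}.
\]
% Mutual information and relative entropy (posterior relative to prior):
\[
MI = \tfrac{1}{2}\ln\frac{\sigma_b^2}{\sigma_a^2},
\qquad
RE = \tfrac{1}{2}\Bigl[\ln\frac{\sigma_b^2}{\sigma_a^2}
  + \frac{\sigma_a^2}{\sigma_b^2}
  + \frac{(\mu_a-\mu_b)^2}{\sigma_b^2} - 1\Bigr].
\]
```

Note that the sensitivity is proportional to the posterior variance, consistent with the statement in the abstract; the Gaussian-mixture prior modifies these expressions.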

Relevance: 100.00%

Abstract:

Objective: Proper interactions between the intestinal mucosa, gut microbiota and nutrient flow are required to establish homoeostasis of the host. Since the proximal part of the small intestine is the first region where these interactions occur, and since most of the nutrient absorption occurs in the jejunum, it is important to understand the dynamics of metabolic responses of the mucosa in this intestinal region. Design: Germ-free mice aged 8-10 weeks were conventionalised with faecal microbiota, and responses of the jejunal mucosa to bacterial colonisation were followed over a 30-day time course. Combined transcriptome, histology, (1)H NMR metabonomics and microbiota phylogenetic profiling analyses were used. Results: The jejunal mucosa showed a two-phase response to the colonising microbiota. The acute-phase response, which had already started 1 day after conventionalisation, involved repression of the cell cycle and parts of the basal metabolism. The secondary-phase response, which was consolidated during conventionalisation (days 4-30), was characterised by a metabolic shift from an oxidative energy supply to anabolic metabolism, as inferred from the tissue transcriptome and metabonome changes. Detailed transcriptome analysis identified tissue transcriptional signatures for the dynamic control of the metabolic reorientation in the jejunum. The molecular components identified in the response signatures have known roles in human metabolic disorders, including insulin sensitivity and type 2 diabetes mellitus. Conclusion: This study elucidates the dynamic jejunal response to the microbiota and supports a prominent role for the jejunum in metabolic control, including glucose and energy homoeostasis. The molecular signatures of this process may, for instance, help to identify risk markers for the declining insulin sensitivity seen in human type 2 diabetes mellitus.

Relevance: 100.00%

Abstract:

We investigate the error dynamics for cycled data assimilation systems, in which the inverse problem of state determination is solved at times t_k, k = 1, 2, 3, ..., with a first guess given by the state propagated via a dynamical system model from time t_{k-1} to time t_k. In particular, for nonlinear dynamical systems that are Lipschitz continuous with respect to their initial states, we provide deterministic estimates for the development over time of the error ||e_k|| := ||x_k^(a) − x_k^(t)|| between the estimated state x^(a) and the true state x^(t). Clearly, an observation error of size δ > 0 leads to an estimation error in every assimilation step. These errors can accumulate if they are not (a) controlled in the reconstruction and (b) damped by the dynamical system under consideration. A data assimilation method is called stable if the error in the estimate is bounded in time by some constant C. The key task of this work is to provide estimates for the error ||e_k||, depending on the size δ of the observation error, the reconstruction operator R_α, the observation operator H, and the Lipschitz constants K^(1) and K^(2) on the lower and higher modes, which control the damping behaviour of the dynamics. We show that systems can be stabilized by choosing α sufficiently small, but the bound C will then depend on the data error δ in the form c||R_α||δ for some constant c. Since ||R_α|| → ∞ as α → 0, this constant might be large. Numerical examples of this behaviour in the nonlinear case are provided using the (low-dimensional) Lorenz '63 system.
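
A minimal sketch of the kind of estimate described above, under the simplifying (and here assumed) condition that each propagation-plus-analysis cycle contracts the error by a factor Λ < 1; the constants are illustrative, not the paper's:

```latex
% One-step error recursion (sketch): propagation amplifies the previous error
% by at most a factor \Lambda, and each analysis adds a data-error term.
\[
\|e_k\| \;\le\; \Lambda\,\|e_{k-1}\| \;+\; c\,\|R_\alpha\|\,\delta .
\]
% Iterating this recursion (geometric series) gives, for \Lambda < 1,
\[
\|e_k\| \;\le\; \Lambda^{k}\,\|e_0\| \;+\; \frac{c\,\|R_\alpha\|\,\delta}{1-\Lambda},
\]
% so the error stays bounded by a constant of the form
% C = c\,\|R_\alpha\|\,\delta/(1-\Lambda) plus a decaying transient; the bound
% grows as \alpha \to 0 because \|R_\alpha\| \to \infty.
```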

Relevance: 100.00%

Abstract:

The interplay between dietary nutrients, gut microbiota and mammalian host tissues of the gastrointestinal tract is recognised as highly relevant for host health. Combined transcriptome, metabonome and microbial profiling tools were employed to analyse the dynamic responses of germ-free mouse colonic mucosa to colonisation by normal mouse microbiota (conventionalisation) at different time points over 16 days. The colonising microbiota showed a shift from early (days 1 and 2) to later colonisers (days 8 and 16). The dynamic changes in the microbial community were rapidly reflected by the urine metabolic profiles (day 1) and at later stages (day 4 onward) by the colon mucosa transcriptome and metabolic profiles. Correlations of host transcriptomes, metabolite patterns and microbiota composition revealed associations between the abundances of Bacilli and Proteobacteria and the differential expression of host genes involved in energy and anabolic metabolism. Differential gene expression correlated with scyllo- and myo-inositol, glutamine, glycine and alanine levels in colonic tissues over the time span of conventionalisation. Our combined time-resolved analyses may help to expand the understanding of host-microbe molecular interactions during microbial establishment.

Relevance: 100.00%

Abstract:

A potential problem with the Ensemble Kalman Filter is the implicit Gaussian assumption at analysis times. Here we explore the performance of a recently proposed fully nonlinear particle filter, which makes no Gaussian assumption, on a high-dimensional but simplified ocean model. The model simulates the evolution of the vorticity field in time, described by the barotropic vorticity equation, in a highly nonlinear flow regime. While common knowledge is that particle filters are inefficient and need large numbers of model runs to avoid degeneracy, the newly developed particle filter needs only of the order of 10-100 particles on large-scale problems. The crucial new ingredient is that the proposal density can be used not only to ensure that all particles end up in high-probability regions of state space as defined by the observations, but also to ensure that most of the particles have similar weights. Using identical twin experiments we found that the ensemble mean follows the truth reliably, and the difference from the truth is captured by the ensemble spread. A rank histogram is used to show that the truth run is indistinguishable from any of the particles, demonstrating statistical consistency of the method.
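
The rank-histogram diagnostic mentioned at the end can be computed as in the following generic sketch; the ensemble size and the placeholder data are assumptions, not the study's configuration:

```python
# Generic rank-histogram sketch: for each verification time, count the rank of
# the truth within the sorted ensemble. A flat histogram indicates that the
# truth is statistically indistinguishable from the particles.
import numpy as np

rng = np.random.default_rng(0)
n_times, n_particles = 1000, 24

# Placeholder data: here the "truth" is drawn from the same distribution as
# the ensemble, so the histogram should come out approximately flat.
ensemble = rng.normal(size=(n_times, n_particles))
truth = rng.normal(size=n_times)

ranks = (ensemble < truth[:, None]).sum(axis=1)        # rank in 0..n_particles
hist = np.bincount(ranks, minlength=n_particles + 1)
print(hist)
```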

Relevance: 100.00%

Abstract:

Data assimilation methods which avoid the assumption of Gaussian error statistics are being developed for geoscience applications. We investigate how the relaxation of the Gaussian assumption affects the impact observations have within the assimilation process. The effect of non-Gaussian observation error (described by the likelihood) is compared to previously published work studying the effect of a non-Gaussian prior. The observation impact is measured in three ways: the sensitivity of the analysis to the observations, the mutual information, and the relative entropy. These three measures have all been studied in the case of Gaussian data assimilation and, in this case, have a known analytical form. It is shown that the analysis sensitivity can also be derived analytically when at least one of the prior or likelihood is Gaussian. This derivation shows an interesting asymmetry in the relationship between analysis sensitivity and analysis error covariance when the two different sources of non-Gaussian structure are considered (likelihood vs. prior). This is illustrated for a simple scalar case and used to infer the effect of the non-Gaussian structure on mutual information and relative entropy, which are more natural choices of metric in non-Gaussian data assimilation. It is concluded that approximating non-Gaussian error distributions as Gaussian can give significantly erroneous estimates of observation impact. The degree of the error depends not only on the nature of the non-Gaussian structure, but also on the metric used to measure the observation impact and the source of the non-Gaussian structure.
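
The "simple scalar case" can be reproduced numerically on a grid, as in this rough sketch; the prior, the Gaussian-mixture likelihood and all parameter values are arbitrary illustrations rather than those used in the paper:

```python
# Scalar illustration: Gaussian prior, Gaussian-mixture observation error.
# Compute the posterior on a grid, then the sensitivity of the posterior mean
# to the observation (finite differences) and the relative entropy of the
# posterior with respect to the prior. Parameter values are arbitrary.
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gauss(z, mu, sig):
    return np.exp(-0.5 * ((z - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

prior = gauss(x, 0.0, 1.0)

def posterior(y):
    # Observation error modelled as a two-component Gaussian mixture.
    lik = 0.5 * gauss(y, x - 1.0, 0.5) + 0.5 * gauss(y, x + 1.0, 0.5)
    post = prior * lik
    return post / (post.sum() * dx)

y0 = 1.5
post = posterior(y0)
mean0 = (x * post).sum() * dx
mean1 = (x * posterior(y0 + 1e-3)).sum() * dx
sensitivity = (mean1 - mean0) / 1e-3                   # d(posterior mean)/dy
rel_entropy = (post * np.log(post / prior)).sum() * dx
print(f"sensitivity = {sensitivity:.3f}, relative entropy = {rel_entropy:.3f}")
```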

Relevance: 100.00%

Abstract:

The analysis step of the (ensemble) Kalman filter is optimal when (1) the distribution of the background is Gaussian, (2) state variables and observations are related via a linear operator, and (3) the observational error is of additive nature and has Gaussian distribution. When these conditions are largely violated, a pre-processing step known as Gaussian anamorphosis (GA) can be applied. The objective of this procedure is to obtain state variables and observations that better fulfil the Gaussianity conditions in some sense. In this work we analyse GA from a joint perspective, paying attention to the effects of transformations in the joint state-variable/observation space. First, we study transformations for state variables and observations that are independent of each other. Then, we introduce a targeted joint transformation with the objective of obtaining joint Gaussianity in the transformed space. We focus primarily on the univariate case, and briefly comment on the multivariate one. A key point of this paper is that, when (1)-(3) are violated, using the analysis step of the EnKF will not recover the exact posterior density, regardless of the transformations one may perform. These transformations, however, provide approximations of different quality to the Bayesian solution of the problem. Using an example in which the Bayesian posterior can be analytically computed, we assess the quality of the analysis distributions generated after applying the EnKF analysis step in conjunction with different GA options. The value of the targeted joint transformation is particularly clear for the case when the prior is Gaussian, the marginal density for the observations is close to Gaussian, and the likelihood is a Gaussian mixture.
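
A common way to construct the marginal transformations discussed above is an empirical quantile mapping to a standard normal; the sketch below shows only that basic variant (the paper's targeted joint transformation is more involved, and this is not the authors' code):

```python
# Minimal empirical Gaussian anamorphosis (marginal version only): map each
# sample through its empirical CDF and then through the inverse standard-normal
# CDF, so the transformed sample is approximately N(0, 1).
import numpy as np
from scipy.stats import norm, rankdata

def anamorphosis(sample):
    n = len(sample)
    # Empirical CDF values in (0, 1); the (n + 1) denominator avoids 0 and 1.
    u = rankdata(sample) / (n + 1.0)
    return norm.ppf(u)

rng = np.random.default_rng(1)
skewed = rng.lognormal(mean=0.0, sigma=1.0, size=500)  # clearly non-Gaussian
transformed = anamorphosis(skewed)

def skewness(v):
    return ((v - v.mean()) ** 3).mean() / v.std() ** 3

print(f"skewness before: {skewness(skewed):.2f}")
print(f"skewness after:  {skewness(transformed):.2f}")
```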

Relevance: 100.00%

Abstract:

Current European Union regulatory risk assessment allows application of pesticides provided that recovery of nontarget arthropods in-crop occurs within a year. Despite the long-established theory of source-sink dynamics, risk assessment ignores depletion of surrounding populations and typical field trials are restricted to plot-scale experiments. In the present study, the authors used agent-based modeling of 2 contrasting invertebrates, a spider and a beetle, to assess how the area of pesticide application and environmental half-life affect the assessment of recovery at the plot scale and impact the population at the landscape scale. Small-scale plot experiments were simulated for pesticides with different application rates and environmental half-lives. The same pesticides were then evaluated at the landscape scale (10 km × 10 km) assuming continuous year-on-year usage. The authors' results show that recovery time estimated from plot experiments is a poor indicator of long-term population impact at the landscape level and that the spatial scale of pesticide application strongly determines population-level impact. This raises serious doubts as to the utility of plot-recovery experiments in pesticide regulatory risk assessment for population-level protection. Predictions from the model are supported by empirical evidence from a series of studies carried out in the decade starting in 1988. The issues raised then can now be addressed using simulation. Prediction of impacts at landscape scales should be more widely used in assessing the risks posed by environmental stressors.
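
The scale argument above (plot-scale recovery masking landscape-scale depletion through source-sink dynamics) can be illustrated with a deliberately crude toy model; it is not the authors' agent-based model, and every parameter below is an arbitrary assumption:

```python
# Toy source-sink sketch (not the authors' agent-based model): a ring of
# habitat cells with logistic growth and dispersal to neighbouring cells.
# A pesticide is applied once per year to the treated cells, killing a fixed
# fraction of the population there. With a small treated area, immigration
# from untreated surroundings speeds in-crop recovery and the landscape mean
# stays near carrying capacity; landscape-wide use depresses the population.
import numpy as np

def run(treated_fraction, years=10, n_cells=100, steps_per_year=10,
        growth=0.1, dispersal=0.2, kill=0.5):
    pop = np.ones(n_cells)                              # start at carrying capacity
    treated = np.arange(n_cells) < int(round(treated_fraction * n_cells))
    for _ in range(years):
        pop[treated] *= 1.0 - kill                      # annual application
        for _ in range(steps_per_year):
            pop += growth * pop * (1.0 - pop)           # logistic growth
            moved = dispersal * pop                     # dispersal to neighbours
            pop += 0.5 * (np.roll(moved, 1) + np.roll(moved, -1)) - moved
    return pop[treated].mean(), pop.mean()

for frac in (0.05, 1.0):
    in_crop, landscape = run(frac)
    print(f"treated fraction {frac:.0%}: in-crop mean {in_crop:.2f}, "
          f"landscape mean {landscape:.2f}")
```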

Relevance: 100.00%

Abstract:

This article shows how one can formulate the representation problem starting from Bayes’ theorem. The purpose of this article is to raise awareness of the formal solutions, so that approximations can be placed in a proper context. The representation errors appear in the likelihood, and the different possibilities for the representation of reality in model and observations are discussed, including nonlinear representation probability density functions. Specifically, the assumptions needed in the usual procedure of adding a representation error covariance to the error covariance of the observations are discussed, and it is shown that, when several sub-grid observations are present, their mean still has a representation error; so-called ‘superobbing’ does not resolve the issue. A connection is made to the off-line or on-line retrieval problem, providing a new, simple proof of the equivalence of assimilating linear retrievals and original observations. Furthermore, it is shown how nonlinear retrievals can be assimilated without loss of information. Finally, we discuss how errors in the observation operator model can be treated consistently in the Bayesian framework, connecting to previous work in this area.
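
The "usual procedure" referred to above can be written compactly in standard textbook form; the notation below is generic and assumed, not taken from the article:

```latex
% Observation modelled as an operator on the resolved state plus independent,
% zero-mean Gaussian instrument error and representation error:
\[
y = H(x) + \epsilon_o + \epsilon_r,
\qquad
\epsilon_o \sim N(0, R_o), \quad \epsilon_r \sim N(0, R_r).
\]
% Under these assumptions the likelihood used in the assimilation is
\[
p(y \mid x) = N\bigl(y - H(x);\, 0,\; R_o + R_r\bigr),
\]
% i.e. the representation error covariance R_r is simply added to the
% instrument error covariance R_o.
```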

Relevance: 100.00%

Abstract:

We systematically compare the performance of ETKF-4DVAR, 4DVAR-BEN and 4DENVAR with respect to two traditional methods (4DVAR and ETKF) and an ensemble transform Kalman smoother (ETKS) on the Lorenz 1963 model. We specifically investigated this performance with increasing nonlinearity, using a quasi-static variational assimilation algorithm as a comparison. Using the analysis root mean square error (RMSE) as a metric, these methods have been compared considering (1) assimilation window length and observation interval size and (2) ensemble size, in order to investigate the influence of hybrid background error covariance matrices and nonlinearity on the performance of the methods. For short assimilation windows with close to linear dynamics, it has been shown that all hybrid methods show an improvement in RMSE compared to the traditional methods. For long assimilation window lengths, in which nonlinear dynamics are substantial, the variational framework can have difficulties finding the global minimum of the cost function, so we explore a quasi-static variational assimilation (QSVA) framework. Among the hybrid methods, it is seen that, for certain parameter settings, those which do not use a climatological background error covariance do not need QSVA to perform accurately. Generally, results show that the ETKS and the hybrid methods that do not use a climatological background error covariance matrix, when combined with QSVA, outperform all other methods due to the full flow dependency of the background error covariance matrix, which also allows for the most nonlinearity.
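
For context, the Lorenz 1963 benchmark and the analysis RMSE metric are straightforward to reproduce; the sketch below uses the standard model parameters, and the "analysis" trajectory is a placeholder for the output of whichever method is being evaluated:

```python
# Standard Lorenz '63 model (sigma=10, rho=28, beta=8/3) integrated with RK4,
# plus the analysis RMSE metric used to compare assimilation methods.
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(state, dt):
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt, n_steps = 0.01, 2000
truth = np.empty((n_steps, 3))
truth[0] = np.array([1.0, 1.0, 1.0])
for k in range(1, n_steps):
    truth[k] = rk4_step(truth[k - 1], dt)

# Placeholder "analysis": the truth plus noise, standing in for the output of
# any of the assimilation methods being compared.
analysis = truth + np.random.default_rng(2).normal(scale=0.5, size=truth.shape)
rmse = np.sqrt(np.mean((analysis - truth) ** 2))
print(f"analysis RMSE = {rmse:.3f}")
```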

Relevance: 100.00%

Abstract:

In general, particle filters need large numbers of model runs in order to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme is explored in a high-dimensional (65,500-dimensional), simplified ocean model. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the performance of the scheme to the chosen parameter values, and the effect of using different model error parameters in the truth compared with the ensemble model runs.
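
A heavily simplified scalar illustration of the first ingredient described above, a relaxation (nudging) proposal density with importance weights corrected for it, is sketched below; it omits the equivalent-weights step entirely and uses arbitrary parameter values, so it illustrates proposal-density particle filtering in general rather than the full scheme:

```python
# Scalar sketch of a proposal-density particle filter: particles are nudged
# towards the observation at propagation time, and the weights are corrected
# by the ratio of model transition density to proposal density. This shows
# only the relaxation proposal, not the equivalent-weights step.
import numpy as np

rng = np.random.default_rng(3)
n_particles, n_steps = 50, 40
q_std, obs_std, tau = 0.5, 0.3, 0.4            # model noise, obs noise, nudging

def model(x):
    return 0.9 * x                             # placeholder linear model

def logpdf(v, std):
    return -0.5 * (v / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))

truth, particles = 0.0, rng.normal(size=n_particles)
logw = np.zeros(n_particles)
for _ in range(n_steps):
    truth = model(truth) + rng.normal(scale=q_std)
    y = truth + rng.normal(scale=obs_std)
    # Nudged proposal: pull each particle towards the observation.
    mean_prop = model(particles) + tau * (y - model(particles))
    new = mean_prop + rng.normal(scale=q_std, size=n_particles)
    # Weight update: likelihood * transition density / proposal density.
    logw += (logpdf(y - new, obs_std)
             + logpdf(new - model(particles), q_std)
             - logpdf(new - mean_prop, q_std))
    particles = new
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Resample when the effective ensemble size gets small.
    if 1.0 / np.sum(w ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        logw = np.zeros(n_particles)
        w = np.full(n_particles, 1.0 / n_particles)

print(f"posterior mean estimate: {np.sum(w * particles):.3f}, truth: {truth:.3f}")
```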

Relevance: 100.00%

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [the posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various mechanisms they ensure initial conditions that are predominantly in linear balance, so that spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes in generating predominantly balanced initial conditions is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for application in real high-dimensional geophysical applications.
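
Conclusion (i) can be phrased as a simple linear-algebra observation; the sketch below is a paraphrase under the assumption that balance is expressed as a linear constraint, not a derivation taken from the paper:

```latex
% Let L be a linear balance operator, so that an increment \delta x is
% (geostrophically) balanced when L \delta x = 0. If the model error covariance
% Q generates only balanced perturbations, i.e. L Q = 0, then any extra term of
% the form \delta x = Q v (the relaxation and equivalent-weights terms act
% through Q) satisfies
\[
L\,\delta x \;=\; L\,Q\,v \;=\; 0,
\]
% so the additional terms inherit the balance imposed by Q.
```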

Relevance: 100.00%

Abstract:

Accurate and reliable rain rate estimates are important for various hydrometeorological applications. Consequently, rain sensors of different types have been deployed in many regions. In this work, measurements from different instruments, namely, rain gauge, weather radar, and microwave link, are combined for the first time to estimate with greater accuracy the spatial distribution and intensity of rainfall. The objective is to retrieve the rain rate that is consistent with all these measurements while incorporating the uncertainty associated with the different sources of information. Assuming the problem is not strongly nonlinear, a variational approach is implemented and the Gauss–Newton method is used to minimize the cost function containing proper error estimates from all sensors. Furthermore, the method can be flexibly adapted to additional data sources. The proposed approach is tested using data from 14 rain gauges and 14 operational microwave links located in the Zürich area (Switzerland) to correct the prior rain rate provided by the operational radar rain product from the Swiss meteorological service (MeteoSwiss). A cross-validation approach demonstrates the improvement of rain rate estimates when assimilating rain gauge and microwave link information.
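
The multi-sensor variational formulation described above typically takes the following generic form; the notation is standard and illustrative, not copied from the article:

```latex
% Generic multi-sensor variational cost function: x is the rain-rate field,
% x_b the radar-based prior with error covariance B, and each sensor s
% (gauges, microwave links, ...) contributes observations y_s, a forward
% operator H_s and an error covariance R_s.
\[
J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathsf T} B^{-1} (x - x_b)
  + \tfrac{1}{2}\sum_{s} \bigl(y_s - H_s(x)\bigr)^{\mathsf T}
    R_s^{-1}\bigl(y_s - H_s(x)\bigr).
\]
% Gauss--Newton iteration with Jacobians H'_s = \partial H_s/\partial x:
\[
x^{(i+1)} = x^{(i)}
  + \Bigl(B^{-1} + \textstyle\sum_s {H'_s}^{\mathsf T} R_s^{-1} H'_s\Bigr)^{-1}
    \Bigl(B^{-1}\bigl(x_b - x^{(i)}\bigr)
      + \textstyle\sum_s {H'_s}^{\mathsf T} R_s^{-1}\bigl(y_s - H_s(x^{(i)})\bigr)\Bigr).
\]
```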