51 results for mathematical modeling of PTO
at Université de Lausanne, Switzerland
Abstract:
Despite their limited proliferation capacity, regulatory T cells (Tregs) constitute a population maintained over the entire lifetime of a human organism. The means by which Tregs sustain a stable pool in vivo are controversial. Using a mathematical model, we address this issue by evaluating several biological scenarios of the origins and the proliferation capacity of two subsets of Tregs: precursor CD4⁺CD25⁺CD45RO⁻ and mature CD4⁺CD25⁺CD45RO⁺ cells. The lifelong dynamics of Tregs are described by a set of ordinary differential equations, driven by a stochastic process representing the major immune reactions involving these cells. The model dynamics are validated using data from human donors of different ages. Analysis of the data led to the identification of two properties of the dynamics: (1) the equilibrium in the CD4⁺CD25⁺FoxP3⁺ Treg population is maintained over both the precursor and mature Treg pools together, and (2) the ratio between precursor and mature Tregs is inverted in the early years of adulthood. Using the model, we then identified three biologically relevant scenarios that have the above properties: (1) the only source of mature Tregs is the antigen-driven differentiation of precursors that acquire the mature profile in the periphery, and the proliferation of Tregs is essential for the development and maintenance of the pool; alternatively, there exist other sources of mature Tregs, such as (2) a homeostatic density-dependent regulation or (3) thymus- or effector-derived Tregs, and in both of these cases antigen-induced proliferation is not necessary for the development of a stable pool of Tregs. This is the first time a mathematical model built to describe the in vivo dynamics of regulatory T cells has been validated using human data. The application of this model provides a valuable tool for estimating the number of regulatory T cells as a function of time in the blood of patients who have received a solid organ transplant or who suffer from an autoimmune disease.
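As a purely illustrative companion to the abstract above, the sketch below sets up a two-population ODE system in the spirit of scenario (1), with precursor Tregs differentiating into mature Tregs in the periphery; the equations, parameter names and values are hypothetical placeholders, not the model published by the authors.

```python
# Minimal sketch (assumed form, not the published equations): precursor Tregs (P)
# differentiate into mature Tregs (M); both pools proliferate under a shared
# carrying capacity and die at constant rates. All parameters are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

def treg_dynamics(t, y, thymic_input, diff_rate, prolif_p, prolif_m, death_p, death_m, capacity):
    P, M = y
    crowding = 1.0 - (P + M) / capacity          # simple density-dependent brake
    dP = thymic_input + prolif_p * P * crowding - diff_rate * P - death_p * P
    dM = diff_rate * P + prolif_m * M * crowding - death_m * M
    return [dP, dM]

params = (1.0, 0.05, 0.02, 0.03, 0.01, 0.02, 500.0)   # hypothetical rates (per day) and capacity
sol = solve_ivp(treg_dynamics, (0, 80 * 365), [10.0, 1.0], args=params)
print(sol.y[:, -1])   # precursor and mature pool sizes at the end of the simulated lifespan
```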
Abstract:
Recent years have seen a surge in mathematical modeling of the various aspects of neuron-astrocyte interactions, and the field of brain energy metabolism is no exception in that regard. Despite the advent of biophysical models in the field, the long-lasting debate on the role of lactate in brain energy metabolism is still unresolved. Quite the contrary, it has been ported to the world of differential equations. Here, we summarize the present state of this discussion from the modeler's point of view and bring some crucial points to the attention of the non-mathematically proficient reader.
Abstract:
A human in vivo toxicokinetic model was built to allow a better understanding of the toxicokinetics of the fungicide folpet and its key ring biomarkers of exposure: phthalimide (PI), phthalamic acid (PAA) and phthalic acid (PA). Both PI and the sum of ring metabolites, expressed as PA equivalents (PAeq), may be used as biomarkers of exposure. The conceptual representation of the model was based on the analysis of the time course of these biomarkers in volunteers exposed orally and dermally to folpet. In the model, compartments were also used to represent the body burden of folpet and of the experimentally relevant PI, PAA and PA ring metabolites in blood and in key tissues as well as in excreta, namely urine and feces. The time evolution of these biomarkers in each compartment of the model was then described mathematically by a system of coupled differential equations. The mathematical parameters of the model were then determined from best fits to the time courses of PI and PAeq in the blood and urine of five volunteers administered 1 mg kg⁻¹ of folpet orally and 10 mg kg⁻¹ dermally. In the case of oral administration, the mean elimination half-life of PI from blood (through feces, urine or metabolism) was found to be 39.9 h, compared with 28.0 h for PAeq. In the case of dermal application, the mean elimination half-lives of PI and PAeq were estimated to be 34.3 and 29.3 h, respectively. The average final fractions of the administered dose recovered in urine as PI over the 0-96 h period were 0.030 and 0.002% for oral and dermal exposure, respectively. The corresponding values for PAeq were 24.5 and 1.83%, respectively. Finally, the average clearance rate of PI from blood calculated from the oral and dermal data was 0.09 ± 0.03 and 0.13 ± 0.05 ml h⁻¹, while the volume of distribution was 4.30 ± 1.12 and 6.05 ± 2.22 l, respectively. It was not possible to obtain the corresponding values from the PAeq data owing to the lack of blood time-course data.
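For readers less familiar with the kinetic quantities quoted above (elimination half-life, clearance, volume of distribution), the snippet below shows the standard one-compartment, first-order relations; it is only an illustration, not the paper's multi-compartment folpet model, and the 70 kg body weight used to convert the 1 mg kg⁻¹ oral dose is an assumption.

```python
# One-compartment, first-order kinetics illustration (not the paper's model).
import numpy as np

def elimination_rate_constant(half_life_h):
    # k = ln(2) / t_half for first-order elimination
    return np.log(2) / half_life_h

def blood_concentration(t_h, dose_mg, volume_of_distribution_l, half_life_h):
    # C(t) = (dose / Vd) * exp(-k * t), assuming instantaneous absorption/distribution
    k = elimination_rate_constant(half_life_h)
    return (dose_mg / volume_of_distribution_l) * np.exp(-k * t_h)

t = np.arange(0, 97, 1.0)                 # the 0-96 h sampling window used in the study
c = blood_concentration(t, dose_mg=70.0,               # 1 mg/kg for an assumed 70 kg adult
                        volume_of_distribution_l=4.3,  # value reported above for PI (oral)
                        half_life_h=39.9)              # value reported above for PI (oral)
print(c[0], c[-1])                        # concentration at 0 h and 96 h (mg/L)
```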
Abstract:
Advancements in high-throughput technologies to measure increasingly complex biological phenomena at the genomic level are rapidly changing the face of biological research from the single-gene single-protein experimental approach to studying the behavior of a gene in the context of the entire genome (and proteome). This shift in research methodologies has resulted in a new field of network biology that deals with modeling cellular behavior in terms of network structures such as signaling pathways and gene regulatory networks. In these networks, different biological entities such as genes, proteins, and metabolites interact with each other, giving rise to a dynamical system. Even though there exists a mature field of dynamical systems theory to model such network structures, some technical challenges are unique to biology, such as the inability to measure precise kinetic information on gene-gene or gene-protein interactions and the need to model increasingly large networks comprising thousands of nodes. These challenges have renewed interest in developing new computational techniques for modeling complex biological systems. This chapter presents a modeling framework based on Boolean algebra and finite-state machines that is reminiscent of the approach used for digital circuit synthesis and simulation in the field of very-large-scale integration (VLSI). The proposed formalism provides a common mathematical framework for developing computational techniques to model different aspects of regulatory networks, such as steady-state behavior, stochasticity, and gene perturbation experiments.
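To make the finite-state-machine view concrete, here is a toy three-gene Boolean network updated synchronously; the genes and rules are hypothetical and the snippet is not taken from the chapter's VLSI-style formalism, it only illustrates how a regulatory network defines a state machine.

```python
# Toy synchronous Boolean network: the network state is a tuple of gene values,
# and one update of all rules is one transition of the finite-state machine.
# Gene names and update rules are hypothetical.
from itertools import product

def update(state):
    a, b, c = state
    return (
        b and not c,   # gene A: activated by B, repressed by C
        a,             # gene B: follows A
        a or b,        # gene C: activated by A or B
    )

# Enumerate the complete state-transition graph (2^3 = 8 states).
for state in product([False, True], repeat=3):
    print(state, "->", update(state))
```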
Abstract:
The Mont Collon mafic complex is one of the best preserved examples of the Early Permian magmatism in the Central Alps, related to the intra-continental collapse of the Variscan belt. It mostly consists (> 95 vol.%) of ol+hy-normative plagioclase-wehrlites, olivine- and cpx-gabbros with cumulitic structures, crosscut by acid dikes. Pegmatitic gabbros, troctolites and anorthosites outcrop locally. A well-preserved cumulative sequence is exposed in the Dents de Bertol area (center of the intrusion). PT-calculations indicate that this layered magma chamber was emplaced at mid-crustal levels at about 0.5 GPa and 1100 degrees C. The Mont Collon cumulitic rocks record little magmatic differentiation, as illustrated by the restricted range of clinopyroxene mg-number (Mg#(cpx) = 83-89). Whole-rock incompatible trace-element contents (e.g. Nb, Zr, Ba) vary widely and without correlation with major-element composition. These features are characteristic of an in-situ crystallization process with variable amounts of interstitial liquid L trapped between the cumulus mineral phases. LA-ICPMS measurements show that trace-element distribution in the latter is homogeneous, pointing to subsolidus re-equilibration between crystals and interstitial melts. A quantitative model based on Langmuir's in-situ crystallization equation successfully reproduced the REE concentrations in cumulitic minerals of all rock facies of the intrusion. The calculated amounts of interstitial liquid L vary between 0 and 35% for degrees of differentiation F of 0 to 20%, relative to the least evolved facies of the intrusion. L values are well correlated with the modal proportions of interstitial amphibole and with the whole-rock incompatible trace-element concentrations (e.g. Zr, Nb) of the tested samples. However, the in-situ crystallization model reaches its limitations with rocks containing a high modal content of REE-bearing minerals (i.e. zircon), such as pegmatitic gabbros. Dikes of anorthositic composition, locally crosscutting the layered lithologies, provide evidence that the Mont Collon rocks evolved in an open system, with mixing of intercumulus liquids of different origins and possibly contrasting compositions. The proposed model is not able to resolve these complex open systems, but migrating liquids could be partly responsible for the observed dispersion of points in some correlation diagrams. The absence of significant differentiation across recurrent lithologies in the cumulitic pile of Dents de Bertol points to an efficiently convective magma chamber, with possible periodic replenishment. (c) 2005 Elsevier B.V. All rights reserved.
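As a simplified illustration of why the trapped-liquid fraction L controls whole-rock incompatible-element contents, the sketch below uses a basic cumulus + trapped-liquid mass balance; this is not Langmuir's full in-situ crystallization equation used in the paper, and the partition coefficient and melt concentration are hypothetical.

```python
# Cumulus + trapped-liquid mass balance (illustrative only):
# C_WR = (1 - L) * D * C_L + L * C_L, where D is the bulk solid/melt partition
# coefficient and L the trapped interstitial-liquid fraction.
def whole_rock_concentration(c_liquid, bulk_d, trapped_liquid_fraction):
    L = trapped_liquid_fraction
    return (1.0 - L) * bulk_d * c_liquid + L * c_liquid

# For a highly incompatible element (D << 1), the whole-rock content scales
# almost linearly with L, which is why Zr or Nb track the amount of trapped melt.
for L in (0.0, 0.10, 0.20, 0.35):
    print(L, whole_rock_concentration(c_liquid=10.0, bulk_d=0.01, trapped_liquid_fraction=L))
```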
Abstract:
PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use has widened to address different tectonic-geomorphic problems. This paper describes several major recent improvements to the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, the prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end by describing some developments that are currently underway or should take place in the foreseeable future. (C) 2012 Elsevier B.V. All rights reserved.
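For reference, the equation named in the abstract (heat production-diffusion-advection) has the general form below; this is the textbook formulation for rock advected at velocity v, not necessarily the exact discretization implemented in PECUBE.

```latex
\rho c \left( \frac{\partial T}{\partial t} + \mathbf{v} \cdot \nabla T \right)
  = \nabla \cdot \left( k \, \nabla T \right) + \rho H
```

Here T is temperature, ρ density, c specific heat capacity, k thermal conductivity, v the rock-particle velocity field (tectonic advection), and H the radiogenic heat production per unit mass; the temporally varying surface boundary condition enters through the temperature imposed at the evolving topographic surface.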
Abstract:
Computational modeling has become a widely used tool for unraveling the mechanisms of higher-level cooperative cell behavior during vascular morphogenesis. However, experimenting with published simulation models or adding new assumptions to those models can be daunting for novice and even experienced computational scientists. Here, we present a step-by-step, practical tutorial for building cell-based simulations of vascular morphogenesis using the Tissue Simulation Toolkit (TST). The TST is a freely available, open-source C++ library for developing simulations with the two-dimensional cellular Potts model, a stochastic, agent-based framework to simulate collective cell behavior. We show the basic use of the TST to simulate and experiment with published simulations of vascular network formation. We then present step-by-step instructions and explanations for building a recent simulation model of tumor angiogenesis. Demonstrated mechanisms include cell-cell adhesion, chemotaxis, cell elongation, haptotaxis, and haptokinesis.
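For orientation, the energy (Hamiltonian) minimized by the basic two-dimensional cellular Potts model underlying such simulations typically takes the generic form below (adhesion term plus an area constraint); this is the standard textbook form, not a transcription of the TST source code, and extensions such as chemotaxis are usually added as extra terms in ΔH.

```latex
H = \sum_{(i,j)\,\text{neighbours}} J\big(\tau(\sigma_i), \tau(\sigma_j)\big)\big(1 - \delta_{\sigma_i,\sigma_j}\big)
  \; + \; \lambda \sum_{\sigma} \big(a_\sigma - A_\sigma\big)^2
```

Here σ_i is the cell index occupying lattice site i, τ its cell type, J the type-dependent adhesion energy, and a_σ and A_σ the actual and target cell areas; copy attempts between neighbouring sites are accepted with a Boltzmann probability based on the resulting change ΔH.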
Multimodel inference and multimodel averaging in empirical modeling of occupational exposure levels.
Abstract:
Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables influencing exposure levels. The Akaike information criterion is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one model over another, perform multimodel prediction, estimate the relative influence of the potential predictors and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and permits evaluation, to a certain extent, of model selection uncertainty, which is seldom mentioned in current practice.
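The core of the Burnham-Anderson procedure described above reduces to a few lines of arithmetic; the sketch below computes Akaike weights and a model-averaged prediction from a set of AIC values, which are hypothetical placeholders rather than results from the paper's exposure data set.

```python
# Akaike weights and multimodel averaging (illustrative values, not the paper's results).
import numpy as np

aic = np.array([102.3, 103.1, 107.8, 110.2])   # AIC of each candidate model
delta = aic - aic.min()                        # AIC differences Delta_i
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                       # Akaike weights w_i, summing to 1

predictions = np.array([1.8, 2.1, 1.5, 2.4])   # each model's prediction (e.g. log exposure)
model_averaged = float(np.sum(weights * predictions))

print(np.round(weights, 3), round(model_averaged, 2))
```

The weights can also be summed over all models containing a given predictor, which is the usual way of estimating the relative influence of the potential predictors mentioned above.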
Abstract:
The purpose of this study was to develop a two-compartment metabolic model of brain metabolism to assess oxidative metabolism from [1-¹¹C]acetate radiotracer experiments, using an approach previously applied in ¹³C magnetic resonance spectroscopy (MRS), and to compare it with a one-tissue compartment model previously used in brain [1-¹¹C]acetate studies. Compared with ¹³C MRS studies, ¹¹C radiotracer measurements provide a single uptake curve representing the sum of all labeled metabolites, without chemical differentiation, but with higher temporal resolution. The reliability of the adjusted metabolic fluxes was analyzed with Monte Carlo simulations using synthetic ¹¹C uptake curves, based on a typical arterial input function and previously published values of the neuroglial fluxes V_TCA^g, V_x, V_NT, and V_TCA^n measured in dynamic ¹³C MRS experiments. Assuming V_x^g = 10 × V_TCA^g and V_x^n = V_TCA^n, it was possible to assess the composite glial tricarboxylic acid (TCA) cycle flux V_gt^g = V_x^g × V_TCA^g / (V_x^g + V_TCA^g) and the neurotransmission flux V_NT from ¹¹C tissue-activity curves obtained within 30 minutes in the rat cortex with a beta-probe after a bolus infusion of [1-¹¹C]acetate (n=9), resulting in V_gt^g = 0.136 ± 0.042 and V_NT = 0.170 ± 0.103 μmol/g per minute (mean ± s.d. of the group), in good agreement with ¹³C MRS measurements.
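As a quick check of what the stated assumption implies, substituting V_x^g = 10 × V_TCA^g into the composite-flux definition gives:

```latex
V_{gt}^{g} \;=\; \frac{V_x^{g}\,V_{TCA}^{g}}{V_x^{g} + V_{TCA}^{g}}
         \;=\; \frac{10\,V_{TCA}^{g}\cdot V_{TCA}^{g}}{11\,V_{TCA}^{g}}
         \;=\; \frac{10}{11}\,V_{TCA}^{g} \;\approx\; 0.91\,V_{TCA}^{g}
```

so under this assumption the composite glial flux is essentially a rescaled glial TCA cycle flux.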
Abstract:
MOTIVATION: In silico modeling of gene regulatory networks has gained momentum recently due to increased interest in analyzing the dynamics of biological systems. This has been further facilitated by the increasing availability of experimental data on gene-gene, protein-protein and gene-protein interactions. The two dynamical properties that are often experimentally testable are perturbations and stable steady states. Although a lot of work has been done on the identification of steady states, little work has been reported on in silico modeling of cellular differentiation processes. RESULTS: In this manuscript, we provide algorithms based on reduced ordered binary decision diagrams (ROBDDs) for Boolean modeling of gene regulatory networks. Algorithms for synchronous and asynchronous transition models have been proposed and their corresponding computational properties analyzed. These algorithms allow users to compute cyclic attractors of large networks that are currently not feasible using existing software. This provides a framework for analyzing the effect of multiple gene perturbation protocols on cell differentiation processes. The algorithms were validated on the T-helper model, correctly identifying its steady states and the Th1-Th2 cellular differentiation process. AVAILABILITY: The software binaries for Windows and Linux platforms can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
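As a conceptual illustration of attractor (steady state and cycle) detection in a synchronous Boolean network, the brute-force sketch below enumerates trajectories of a hypothetical three-gene network; the paper's contribution is precisely to avoid such enumeration by using ROBDD-based algorithms that scale to large networks, so this is not the GenYsis implementation.

```python
# Brute-force attractor search for a toy synchronous Boolean network
# (illustration only; the paper uses ROBDD-based symbolic algorithms instead).
from itertools import product

def step(state):
    a, b, c = state
    return (not c, a and c, b)            # hypothetical synchronous update rules

attractors = set()
for start in product([False, True], repeat=3):
    seen = {}                             # state -> position along the trajectory
    state = start
    while state not in seen:
        seen[state] = len(seen)
        state = step(state)
    cycle_start = seen[state]             # first revisited state closes the cycle
    cycle = [s for s, i in sorted(seen.items(), key=lambda kv: kv[1]) if i >= cycle_start]
    attractors.add(tuple(sorted(cycle)))  # canonical representation of the attractor

for att in attractors:
    print(att)                            # fixed points appear as one-state cycles
```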
Abstract:
Retroelements are important evolutionary forces but can be deleterious if left uncontrolled. Members of the human APOBEC3 family of cytidine deaminases can inhibit a wide range of endogenous, as well as exogenous, retroelements. These enzymes are structurally organized in one or two domains comprising a zinc-coordinating motif. APOBEC3G contains two such domains, only the C-terminal domain of which is endowed with editing activity, while its N-terminal counterpart binds RNA, promotes homo-oligomerization, and is necessary for packaging into human immunodeficiency virus type 1 (HIV-1) virions. Here, we performed a large-scale mutagenesis-based analysis of the APOBEC3G N terminus, testing mutants for (i) inhibition of vif-defective HIV-1 infection and Alu retrotransposition, (ii) RNA binding, and (iii) oligomerization. Furthermore, in the absence of structural information on this domain, we used homology modeling to examine the positions of functionally important residues and of residues found to be under positive selection by phylogenetic analyses of primate APOBEC3G genes. Our results reveal the importance of a predicted RNA-binding dimerization interface both for packaging into HIV-1 virions and for inhibition of both HIV-1 infection and Alu transposition. We further found that the HIV-1-blocking activity of APOBEC3G N-terminal mutants defective for packaging can be almost entirely rescued if their virion incorporation is forced by fusion with Vpr, indicating that the corresponding region of APOBEC3G plays little role in other aspects of its action against this pathogen. Interestingly, residues forming the APOBEC3G dimer interface are highly conserved, contrasting with the rapid evolution of two neighboring surface-exposed amino acid patches, one targeted by the Vif protein of primate lentiviruses and the other of as-yet-undefined function.