158 results for Stochastic processes--Computer simulation.
Abstract:
Changes in intracellular Na(+) concentration underlie essential neurobiological processes, but few reliable tools exist for their measurement. Here we characterize a new synthetic Na(+)-sensitive fluorescent dye, Asante Natrium Green (ANG), with unique properties. This indicator was excitable in the visible spectrum and by two-photon illumination, suffered little photobleaching and localized to the cytosol, where it remained for long durations without noticeable unwanted effects on basic cell properties. When used in brain tissue, ANG yielded a bright fluorescent signal during physiological Na(+) responses in both neurons and astrocytes. Synchronous electrophysiological and fluorometric recordings showed that ANG produced accurate Na(+) measurements in situ. This new Na(+) indicator opens innovative ways of probing neuronal circuits.
Abstract:
Human-induced habitat fragmentation constitutes a major threat to biodiversity. Both genetic and demographic factors combine to drive small and isolated populations into extinction vortices. Nevertheless, the deleterious effects of inbreeding and drift load may depend on population structure, migration patterns, and mating systems and are difficult to predict in the absence of crossing experiments. We performed stochastic individual-based simulations aimed at predicting the effects of deleterious mutations on population fitness (offspring viability and median time to extinction) under a variety of settings (landscape configurations, migration models, and mating systems) on the basis of easy-to-collect demographic and genetic information. Pooling all simulations, a large part (70%) of variance in offspring viability was explained by a combination of genetic structure (F(ST)) and within-deme heterozygosity (H(S)). A similar part of variance in median time to extinction was explained by a combination of local population size (N) and heterozygosity (H(S)). In both cases the predictive power increased above 80% when information on mating systems was available. These results provide robust predictive models to evaluate the viability prospects of fragmented populations.
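As a point of reference for the two predictors highlighted above, the following minimal Python sketch (not taken from the study) computes within-deme heterozygosity H(S) and genetic structure F(ST) from per-deme allele frequencies at a single biallelic locus; the frequencies in the example are made up.

import numpy as np

def hs_fst(allele_freqs):
    """allele_freqs: frequency of one allele in each deme (biallelic locus)."""
    p = np.asarray(allele_freqs, dtype=float)
    hs_per_deme = 2.0 * p * (1.0 - p)        # expected heterozygosity within each deme
    hs = hs_per_deme.mean()                  # H_S: mean within-deme heterozygosity
    p_bar = p.mean()
    ht = 2.0 * p_bar * (1.0 - p_bar)         # H_T: total expected heterozygosity
    fst = (ht - hs) / ht if ht > 0 else 0.0  # F_ST = (H_T - H_S) / H_T
    return hs, fst

# Example: four demes with moderately diverged allele frequencies.
print(hs_fst([0.1, 0.4, 0.6, 0.9]))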
Abstract:
Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. In particular, CTF/NFI uses a flexible DNA binding mode that allows for variations in binding site length. From the experimental data, we derived a novel prediction method that uses a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor for natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
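For contrast with the generalised-profile approach described above, here is a minimal sketch of the simpler position-weight-matrix (log-odds) scoring that the abstract reports to be insufficient for CTF/NFI; the count matrix, motif length and example sequence are hypothetical.

import math

# Hypothetical count matrix for a 4-bp motif (one column of counts per position).
counts = {
    'A': [12,  1,  0, 10],
    'C': [ 2,  1, 15,  2],
    'G': [ 3, 16,  1,  3],
    'T': [ 3,  2,  4,  5],
}
background = 0.25  # uniform background base frequency

def pwm_score(site):
    """Log-odds score of a candidate site of the same length as the matrix."""
    score = 0.0
    for pos, base in enumerate(site):
        col_total = sum(counts[b][pos] for b in "ACGT")
        freq = (counts[base][pos] + 1) / (col_total + 4)  # +1 pseudocount
        score += math.log2(freq / background)
    return score

# Slide the matrix along a longer sequence and report the best-scoring window.
seq = "TTAGGCAGCATAGC"
best = max((pwm_score(seq[i:i+4]), i) for i in range(len(seq) - 3))
print(best)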
Abstract:
How have changes in communications technology affected the way that misinformation spreads through a population and persists? To what extent do differences in the architecture of social networks affect the spread of misinformation, relative to the rates and rules by which individuals transmit or eliminate different pieces of information (cultural traits)? Here, we use analytical models and individual-based simulations to study how a 'cultural load' of misinformation can be maintained in a population under a balance between social transmission and selective elimination of cultural traits with low intrinsic value. While considerable research has explored how network architecture affects percolation processes, we find that the relative rates at which individuals transmit or eliminate traits can have a much more profound impact on the cultural load than differences in network architecture. In particular, the cultural load is insensitive to correlations between an individual's network degree and rate of elimination when these quantities vary among individuals. Taken together, these results suggest that changes in communications technology may have influenced cultural evolution more strongly through changes in the amount of information flow than through the details of who is connected to whom.
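A minimal individual-based sketch of the balance described above, with all parameter values and the random-network construction assumed rather than taken from the paper: a low-value trait is transmitted along network edges and selectively eliminated by its carriers, and the equilibrium fraction of carriers plays the role of the 'cultural load'.

import random

random.seed(1)
N, k = 200, 6            # number of individuals, mean degree of the random graph
beta, delta = 0.05, 0.2  # per-contact transmission rate, per-step elimination rate

# Erdos-Renyi-style random network stored as adjacency sets.
neighbors = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < k / (N - 1):
            neighbors[i].add(j)
            neighbors[j].add(i)

carries = [random.random() < 0.5 for _ in range(N)]  # who currently holds the trait
for step in range(500):
    new = carries[:]
    for i in range(N):
        if carries[i]:
            if random.random() < delta:                       # selective elimination
                new[i] = False
        else:
            exposed = sum(carries[j] for j in neighbors[i])
            if random.random() < 1 - (1 - beta) ** exposed:   # social transmission
                new[i] = True
    carries = new

print("equilibrium cultural load:", sum(carries) / N)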
Abstract:
We present a novel and straightforward method for estimating recent migration rates between discrete populations using multilocus genotype data. The approach builds upon a two-step sampling design, in which individual genotypes are sampled before and after dispersal. We develop a model that estimates all pairwise backwards migration rates (m(ij), the probability that an individual sampled in population i is a migrant from population j) between a set of populations. The method is validated with simulated data and compared with the methods of BayesAss and Structure. First, we use data for an island model, and then we consider more realistic data simulations for a metapopulation of the greater white-toothed shrew (Crocidura russula). We show that the precision and bias of estimates depend primarily upon the proportion of individuals sampled in each population. Weak sampling designs may particularly affect the quality of the coverage provided by 95% highest posterior density intervals. We further show that the method is relatively insensitive to the number of loci sampled and to the overall strength of genetic structure. The method can easily be extended and makes fewer assumptions about the underlying demographic and genetic processes than currently available methods. It allows backwards migration rates to be estimated across a wide range of realistic conditions.
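The following sketch only illustrates the underlying idea, not the Bayesian estimator developed in the paper: post-dispersal individuals are probabilistically assigned to a population of origin from (hypothetical) pre-dispersal allele frequencies, and the averaged assignment probabilities give a rough backwards migration rate m(ij).

import numpy as np

# Hypothetical pre-dispersal frequencies of one allele per locus
# (rows: populations, columns: loci), assuming Hardy-Weinberg proportions.
freqs = np.array([[0.8, 0.7, 0.9],
                  [0.2, 0.3, 0.1]])

def genotype_likelihood(genotype, p):
    """genotype: allele counts (0, 1 or 2) per locus; p: allele frequencies."""
    like = np.where(genotype == 2, p**2,
            np.where(genotype == 1, 2*p*(1-p), (1-p)**2))
    return like.prod()

def origin_posterior(genotype):
    """Posterior probability of each source population, uniform prior."""
    likes = np.array([genotype_likelihood(genotype, p) for p in freqs])
    return likes / likes.sum()

# Individuals sampled in population 0 after dispersal; the average posterior of
# having originated in population 1 approximates the backwards rate m(01).
sampled_in_0 = np.array([[2, 2, 2], [2, 1, 2], [0, 0, 1]])
m_01 = np.mean([origin_posterior(g)[1] for g in sampled_in_0])
print("rough estimate of m(01):", round(m_01, 3))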
Abstract:
Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since concentrations outside the therapeutic range might lead to toxic reactions, treatment discontinuation or inefficacy. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target population of patients and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach has been used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the framework of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome P450 (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens in relation to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. The quantification and identification of the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
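To illustrate the kind of model-based simulation mentioned above, here is a minimal sketch of a standard one-compartment model with first-order absorption and between-subject variability on clearance; the parameter values, dose and target trough concentration are hypothetical, not the published NVP estimates.

import numpy as np

rng = np.random.default_rng(0)
n_patients = 1000
dose, tau, F = 200.0, 12.0, 0.93     # mg, dosing interval (h), bioavailability
CL_pop, V_pop, ka = 3.0, 100.0, 1.6  # typical clearance (L/h), volume (L), absorption rate (1/h)
omega_CL = 0.30                      # between-subject variability on clearance (~30% CV)

CL = CL_pop * np.exp(rng.normal(0.0, omega_CL, n_patients))  # individual clearances
ke = CL / V_pop                                              # elimination rate constants

def conc_ss(t, ke_i):
    """Steady-state concentration at time t after a dose (superposition formula)."""
    return (F * dose * ka / (V_pop * (ka - ke_i))) * (
        np.exp(-ke_i * t) / (1 - np.exp(-ke_i * tau))
        - np.exp(-ka * t) / (1 - np.exp(-ka * tau)))

trough = conc_ss(tau, ke)  # concentration just before the next dose
target = 3.0               # hypothetical efficacy threshold (mg/L)
print("fraction of patients above target trough:", np.mean(trough > target))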
Abstract:
Mouse NK cells express MHC class I-specific inhibitory Ly49 receptors. Since these receptors display distinct ligand specificities and are clonally distributed, their expression generates a diverse NK cell receptor repertoire specific for MHC class I molecules. We have previously found that the Dd (or Dk)-specific Ly49A receptor is usually expressed from a single allele. However, a small fraction of short-term NK cell clones expressed both Ly49A alleles, suggesting that the two Ly49A alleles are independently and randomly expressed. Here we show that the genes for two additional Ly49 receptors (Ly49C and Ly49G2) are also expressed in a (predominantly) mono-allelic fashion. Since single NK cells can co-express multiple Ly49 receptors, we also investigated whether mono-allelic expression from within the tightly linked Ly49 gene cluster is coordinate or independent. Our clonal analysis suggests that the expression of alleles of distinct Ly49 genes is not coordinate. Thus Ly49 alleles are apparently independently and randomly chosen for stable expression, a process that directly restricts the number of Ly49 receptors expressed per single NK cell. We propose that the Ly49 receptor repertoire specific for MHC class I is generated by an allele-specific, stochastic gene expression process that acts on the entire Ly49 gene cluster.
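A toy simulation of the proposed stochastic, allele-specific expression model, with a hypothetical per-allele activation probability: each allele of each Ly49 gene is switched on independently, so expressing cells are predominantly mono-allelic and each cell carries only a few receptors.

import random

random.seed(0)
genes = ["Ly49A", "Ly49C", "Ly49G2"]
p_on = 0.2  # hypothetical per-allele probability of stable expression

def simulate_cell():
    expressed, biallelic = [], []
    for gene in genes:
        alleles_on = sum(random.random() < p_on for _ in range(2))  # two alleles, chosen independently
        if alleles_on:
            expressed.append(gene)
        if alleles_on == 2:
            biallelic.append(gene)
    return expressed, biallelic

cells = [simulate_cell() for _ in range(100000)]
mean_receptors = sum(len(e) for e, _ in cells) / len(cells)
frac_biallelic = sum(bool(b) for _, b in cells) / len(cells)
print("mean receptors per cell:", mean_receptors)
print("fraction of cells with any bi-allelic gene:", frac_biallelic)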
Abstract:
quantiNemo is an individual-based, genetically explicit stochastic simulation program. It was developed to investigate the effects of selection, mutation, recombination and drift on quantitative traits with varying architectures in structured populations connected by migration and located in a heterogeneous habitat. quantiNemo is highly flexible at various levels: population, selection, trait(s) architecture, genetic map for QTL and/or markers, environment, demography, mating system, etc. quantiNemo is coded in C++ using an object-oriented approach and runs on any computer platform. Availability: Executables for several platforms, user's manual, and source code are freely available under the GNU General Public License at http://www2.unil.ch/popgen/softwares/quantinemo.
Abstract:
The aim of this computerized simulation model is to provide an estimate of the number of beds used by a population, taking into account the important determining factors. These factors are the demographic data of the population served, hospitalization rates, hospital case-mix and length of stay; these parameters can be taken either from observed data or from scenarios. As an example, the projected evolution of the number of beds in the Canton of Vaud for the period 1893-2010 is presented.
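The arithmetic behind such a model can be sketched as follows (all numbers are illustrative, not the data used for the Canton of Vaud projection): expected bed use is the sum, over case-mix groups, of population x hospitalization rate x mean length of stay, converted to a daily census and divided by a target occupancy rate.

population = 600_000       # hypothetical population served
occupancy_target = 0.85    # hypothetical target occupancy rate

# Hypothetical case-mix groups: (admissions per 1000 inhabitants per year, mean stay in days)
case_mix = {
    "surgery":    (40, 6.5),
    "medicine":   (55, 9.0),
    "obstetrics": (12, 4.5),
}

# Total patient-days per year, then average daily census, then beds at target occupancy.
patient_days = sum(population * rate / 1000 * los for rate, los in case_mix.values())
beds_needed = patient_days / 365 / occupancy_target
print(f"estimated beds needed: {beds_needed:.0f}")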
Abstract:
When decommissioning a nuclear facility, it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries to within slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is supposed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, if sample density is taken into account as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that, if the sample is assumed to be homogeneously contaminated, activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot. This demonstrates the usefulness of being able to complement experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be directly measured because of a lack of available material or because of specific geometries.
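One possible reading of the 'relative' use of the simulation described above is sketched below with hypothetical numbers: a calibration factor measured for a reference nuclide in the geometry of interest is scaled by the ratio of simulated detection efficiencies to obtain the factor for another nuclide in the same geometry.

# Measured with a reference source in the geometry of interest (Bq per count/s).
cf_reference_measured = 8.2

# Detection efficiencies from Monte Carlo runs of the same geometry (counts per decay).
eff_sim_reference = 0.031   # reference nuclide
eff_sim_target = 0.024      # nuclide to be cleared

# Relative use of the simulation: scale the measured factor by the efficiency ratio.
cf_target = cf_reference_measured * eff_sim_reference / eff_sim_target
activity = cf_target * 125.0   # estimated activity (Bq) for a measured 125 counts/s
print(round(cf_target, 2), round(activity, 1))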
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides presented by class I major histocompatibility complexes (MHCs) is the determining event in the specific cellular immune response against virus-infected cells or tumor cells. It is of great interest, therefore, to elucidate the molecular principles upon which the selectivity of a TCR is based. These principles can in turn be used to design therapeutic approaches, such as peptide-based immunotherapies of cancer. In this study, free energy simulation methods are used to analyze the difference in binding free energy of a particular TCR (A6) for a wild-type peptide (Tax) and a mutant peptide (Tax P6A), both presented in HLA A2. The computed free energy difference is 2.9 kcal/mol, in good agreement with the experimental value. This agreement makes it possible to use the simulation results to understand the origin of the free energy difference, which was not accessible from the experimental results. A free energy component analysis decomposes the difference in binding free energy between the wild-type and mutant peptides into its individual contributions. Of particular interest is the fact that better solvation of the mutant peptide when bound to the MHC molecule is an important contribution to the greater affinity of the TCR for the latter. The results make it possible to identify the residues of the TCR that are important for the selectivity. This provides an understanding of the molecular principles that govern the recognition. The possibility of using free energy simulations in designing peptide derivatives for cancer immunotherapy is briefly discussed.
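As background to the method, the following sketch shows the exponential-averaging (Zwanzig) estimator that underlies free energy perturbation calculations; the energy samples are synthetic, and the protocol of the actual TCR/peptide/MHC study may differ.

import numpy as np

kT = 0.593  # kcal/mol at roughly 298 K

rng = np.random.default_rng(42)
# Hypothetical per-snapshot energy differences U_mutant - U_wildtype (kcal/mol),
# evaluated on configurations sampled from the wild-type ensemble.
delta_U = rng.normal(loc=1.5, scale=0.8, size=5000)

# Zwanzig estimator: Delta G = -kT * ln < exp(-Delta U / kT) >_wildtype
delta_G = -kT * np.log(np.mean(np.exp(-delta_U / kT)))
print(f"estimated Delta G: {delta_G:.2f} kcal/mol")

# For a binding free energy *difference*, the same transformation is evaluated in
# the bound and the unbound state and the results subtracted (thermodynamic cycle):
# DeltaDeltaG = DeltaG_bound - DeltaG_free.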
Abstract:
Children who sustain a prenatal or perinatal brain injury in the form of a stroke develop remarkably normal cognitive functions in certain areas, with a particular strength in language skills. A dominant explanation for this is that brain regions from the contralesional hemisphere "take over" their functions, whereas the damaged areas and other ipsilesional regions play much less of a role. However, it is difficult to tease apart whether changes in neural activity after early brain injury are due to damage caused by the lesion or to processes related to postinjury reorganization. We sought to differentiate between these two causes by investigating the functional connectivity (FC) of brain areas during the resting state in human children with early brain injury using a computational model. We simulated a large-scale network consisting of realistic models of local brain areas coupled through anatomical connectivity information from healthy and injured participants. We then compared the resulting simulated FC values of healthy and injured participants with the empirical ones. We found that the empirical connectivity values, especially those of the damaged areas, correlated better with the simulated values of a healthy brain than with those of an injured brain. This result indicates that the structural damage caused by an early brain injury is unlikely to have an adverse and sustained impact on the functional connections, albeit during the resting state, of the damaged areas. Therefore, these areas could continue to play a role in the development of near-normal function in certain domains, such as language, in these children.
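A greatly simplified sketch of the comparison described above, using a linear noise-diffusion model instead of realistic local-area models and random matrices in place of real structural and empirical connectivity data: a simulated FC matrix is generated from a structural connectivity (SC) matrix and then correlated, entry by entry, with an empirical FC matrix.

import numpy as np

rng = np.random.default_rng(3)
n = 20  # number of brain regions

# Hypothetical symmetric SC matrix, normalized so its largest eigenvalue is 1
# (this keeps the linear model below stable).
sc = rng.random((n, n))
sc = (sc + sc.T) / 2
np.fill_diagonal(sc, 0)
sc /= np.linalg.eigvalsh(sc).max()

def simulated_fc(sc, coupling=0.5, steps=20000, dt=0.1):
    """Euler-Maruyama simulation of x' = -x + coupling*SC.x + noise; returns corr(x)."""
    x = np.zeros(n)
    samples = np.empty((steps, n))
    for t in range(steps):
        x = x + dt * (-x + coupling * sc @ x) + np.sqrt(dt) * rng.normal(size=n)
        samples[t] = x
    return np.corrcoef(samples.T)

fc_sim = simulated_fc(sc)
fc_emp = np.corrcoef(rng.normal(size=(n, 200)))  # placeholder for a measured resting-state FC

iu = np.triu_indices(n, k=1)                     # compare off-diagonal entries only
fit = np.corrcoef(fc_sim[iu], fc_emp[iu])[0, 1]
print("simulated-vs-empirical FC correlation:", round(fit, 3))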
Abstract:
A first exercise had proposed a grouping of diagnoses for bed planning. That grouping had been established empirically from a database of the Vaud district hospitals (1983-1984). When it came to applying this grid to the Centre Hospitalier Universitaire Vaudois (CHUV), it quickly became apparent that the patient population of such a hospital made a revision of the descriptive grid indispensable. That revision is the subject of the present report... [Authors]
Abstract:
Limited dispersal may favor the evolution of helping behaviors between relatives as it increases their relatedness, and it may inhibit such evolution as it increases local competition between these relatives. Here, we explore one way out of this dilemma: if the helping behavior allows groups to expand in size, then the kin-competition pressure opposing its evolution can be greatly reduced. We explore the effects of two kinds of stochasticity allowing for such deme expansion. First, we study the evolution of helping under environmental stochasticity that may induce complete patch extinction. Helping evolves if it results in a decrease in the probability of extinction or if it enhances the rate of patch recolonization through propagules formed by fission of nonextinct groups. This mode of dispersal is indeed commonly found in social species. Second, we consider the evolution of helping in the presence of demographic stochasticity. When fecundity is below its value maximizing deme size (undersaturation), helping evolves, but under stringent conditions unless positive density dependence (Allee effect) interferes with demographic stochasticity. When fecundity is above its value maximizing deme size (oversaturation), helping may also evolve, but only if it reduces negative density-dependent competition.
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk.
OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies.
DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model.
DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008).
TARGET POPULATION: U.S. population age 35 to 85 years.
TIME HORIZON: 2010 to 2040.
PERSPECTIVE: Health care system.
INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins).
OUTCOME MEASURE: Incremental cost-effectiveness.
RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings.
RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day.
LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes.
FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
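The incremental cost-effectiveness arithmetic reported above reduces to a simple ratio; in the sketch below the cohort totals are hypothetical and are chosen only so that the increments land near the reported $42,000 per QALY.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in dollars per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical totals for a cohort under a comparator strategy and under full
# adherence to the ATP III guidelines; the differences (about $3.6 billion and
# about 85,700 QALYs) yield a ratio of roughly $42,000 per QALY.
print(icer(cost_new=12.6e9, qaly_new=1_585_700,
           cost_old=9.0e9, qaly_old=1_500_000))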