132 results for Terminals (Transportation) Computer simulation


Relevance: 100.00%

Publisher:

Abstract:

Protein-ligand docking has made important progress during the last decade and has become a powerful tool for drug development, opening the way to virtual high-throughput screening and in silico structure-based ligand design. Despite the flattering picture that has been drawn, recent publications have shown that the docking problem is far from solved, and that further developments are still needed to achieve high prediction success rates and accuracy. Introducing an accurate description of the solvation effect upon binding is thought to be essential to achieving this goal. In particular, EADock uses the Generalized Born Molecular Volume 2 (GBMV2) solvent model, which has been shown to accurately reproduce the desolvation energies obtained by solving the Poisson equation. Here, the implementation of the Fast Analytical Continuum Treatment of Solvation (FACTS) as an implicit solvation model in small-molecule docking calculations has been assessed using the EADock docking program. Our results strongly support the use of FACTS for docking. The success rates of EADock/FACTS and EADock/GBMV2 are similar, i.e. around 75% for local docking and 65% for blind docking. However, these results come at a much lower computational cost: FACTS is 10 times faster than GBMV2 at calculating the total electrostatic energy, and speeds up EADock overall by a factor of 4. This study also supports the EADock development strategy of relying on the CHARMM package for energy calculations, which enables straightforward implementation and testing of the latest developments in the field of molecular modeling.
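The quoted timings imply how much of EADock's runtime is spent in the electrostatics. A short sketch using Amdahl's law makes the inference explicit; note that this back-of-the-envelope step is ours, not stated in the abstract:

```python
# Amdahl's law: if a fraction p of the runtime is sped up by factor s,
# the overall speedup is 1 / ((1 - p) + p / s).
# A 10x faster electrostatics term yielding a 4x overall speedup implies
# p = 5/6, i.e. roughly 83% of the runtime was electrostatics.

def overall_speedup(p: float, component_speedup: float) -> float:
    """Overall speedup when a fraction p of runtime is accelerated."""
    return 1.0 / ((1.0 - p) + p / component_speedup)

p = 5.0 / 6.0
print(overall_speedup(p, 10.0))   # ≈ 4.0
```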

Relevance: 100.00%

Publisher:

Abstract:

To test whether quantitative traits are under directional or homogenizing selection, it is common practice to compare population differentiation estimates at molecular markers (F(ST)) and quantitative traits (Q(ST)). If the trait is neutral and its genetic basis is additive, then theory predicts that Q(ST) = F(ST), while Q(ST) > F(ST) is predicted under directional selection for different local optima, and Q(ST) < F(ST) under homogenizing selection. However, nonadditive effects can alter these predictions. Here, we investigate the influence of dominance on the relation between Q(ST) and F(ST) for neutral traits. Using analytical results and computer simulations, we show that dominance generally deflates Q(ST) relative to F(ST). Under inbreeding, the effect of dominance vanishes, and we show that for selfing species, a better estimate of Q(ST) is obtained from selfed families than from half-sib families. We also compare several sampling designs and find that it is always best to sample many populations (>20) with few families (five) rather than few populations with many families. Provided that estimates of Q(ST) are derived from individuals originating from many populations, we conclude that the pattern Q(ST) > F(ST), and hence the inference of directional selection for different local optima, is robust to the effect of nonadditive gene action.
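The comparison above rests on the standard variance-component estimator of Q(ST) for an additive, outcrossing trait. A minimal sketch, with illustrative variance values chosen so that the neutral expectation Q(ST) = F(ST) holds:

```python
# Q_ST = V_between / (V_between + 2 * V_within), where V_between is the
# between-population additive genetic variance and V_within the additive
# variance within populations. The numbers below are illustrative only.

def q_st(v_between: float, v_within: float) -> float:
    """Q_ST for an additive, outcrossing trait."""
    return v_between / (v_between + 2.0 * v_within)

f_st = 0.20
# V_between = 0.5, V_within = 1.0 gives Q_ST = 0.5 / 2.5 = 0.20,
# matching F_ST as expected for a neutral additive trait:
print(q_st(0.5, 1.0))   # -> 0.2
```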

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: The increasing use of erythropoietins with long half-lives and the tendency to lengthen the administration interval to monthly injections call for raising awareness of the pharmacokinetics and risks of new erythropoietin stimulating agents (ESA). Their pharmacodynamic complexity and individual variability limit the possibility of attaining comprehensive clinical experience. In order to help physicians acquire prescription skills, we have built a prescription computer model to be used both as a simulator and as an educational tool. METHODS: The pharmacokinetic computer model was developed using Visual Basic in Excel and tested with 3 different ESA half-lives (24, 48 and 138 hours) and 2 administration intervals (weekly vs. monthly). Two groups of 25 nephrologists were exposed to the six randomised combinations of half-life and administration interval. They were asked to achieve and maintain, as precisely as possible, the haemoglobin target of 11-12 g/dL in a simulated naïve patient. Each simulation was repeated twice, with or without randomly generated bleeding episodes. RESULTS: The simulation using an ESA with a half-life of 138 hours, administered monthly, showed an overshooting tendency compared to the other combinations of half-lives and administration intervals (percentage of Hb values > 13 g/dL: 15.8 ± 18.3 vs. 6.9 ± 12.2; P < 0.01), which was quickly corrected with experience. Prescription ability appeared to be optimal with a 24-hour half-life and weekly administration (ability score indexing values in the target range: 1.52 ± 0.70 vs. 1.24 ± 0.37; P < 0.05). The monthly prescription interval, as suggested in the literature, was accompanied by fewer therapeutic adjustments (4.9 ± 2.2 vs. 8.2 ± 4.9; P < 0.001); a direct correlation between haemoglobin variability and number of therapy modifications was found (P < 0.01).
CONCLUSIONS: Computer-based simulations can be a useful tool for improving ESA prescription skills among nephrologists, by raising awareness of the pharmacokinetic characteristics of the various ESAs and by helping them recognize the factors that influence haemoglobin variability.
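The interaction between half-life and dosing interval that drives the overshooting result can be illustrated with a single-compartment, first-order elimination sketch. This is not the authors' Visual Basic model; the dose value is invented, and only the half-lives and intervals mirror those tested in the study:

```python
import math

def trough_level(dose: float, half_life_h: float,
                 interval_h: float, n_doses: int) -> float:
    """Pre-dose (trough) drug level after n_doses repeated administrations,
    assuming simple exponential elimination."""
    k = math.log(2.0) / half_life_h
    frac = math.exp(-k * interval_h)   # fraction remaining at the next dose
    # geometric series: dose * (frac + frac**2 + ... + frac**n)
    return dose * frac * (1 - frac ** n_doses) / (1 - frac)

# A short half-life dosed weekly decays almost fully between doses, so each
# dose acts nearly independently; a 138 h half-life dosed monthly leaves a
# larger carry-over, and the response is dominated by wide swings per dose.
print(trough_level(100, 24, 7 * 24, 10))    # weekly, 24 h half-life
print(trough_level(100, 138, 30 * 24, 10))  # monthly, 138 h half-life
```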

Relevance: 100.00%

Publisher:

Abstract:

In this study, a quantitative approach was used to investigate the role of D142, which belongs to the highly conserved E/DRY sequence, in the activation process of the alpha1B-adrenergic receptor (alpha1B-AR). Experimental and computer-simulated mutagenesis were performed by substituting all possible natural amino acids at the D142 site. The resulting congeneric set of proteins together with the finding that all the receptor mutants show various levels of constitutive (agonist-independent) activity enabled us to quantitatively analyze the relationships between structural/dynamic features and the extent of constitutive activity. Our results suggest that the hydrophobic/hydrophilic character of D142, which could be regulated by protonation/deprotonation of this residue, is an important modulator of the transition between the inactive (R) and active (R*) state of the alpha1B-AR. Our study represents an example of quantitative structure-activity relationship analysis of the activation process of a G protein-coupled receptor.

Relevance: 100.00%

Publisher:

Abstract:

The configuration space available to randomly cyclized polymers is divided into subspaces accessible to individual knot types. A phantom chain utilized in numerical simulations of polymers can explore all subspaces, whereas a real closed chain forming a figure-of-eight knot, for example, is confined to a subspace corresponding to this knot type only. One can conceptually compare the assembly of configuration spaces of various knot types to a complex foam where individual cells delimit the configuration space available to a given knot type. Neighboring cells in the foam harbor knots that can be converted into each other by just one intersegmental passage. Such a segment-segment passage occurring at the level of knotted configurations corresponds to a passage through the interface between neighboring cells in the foamy knot space. Using a DNA topoisomerase-inspired simulation approach we characterize here the effective interface area between neighboring knot spaces as well as the surface-to-volume ratio of individual knot spaces. These results provide a reference system required for better understanding mechanisms of action of various DNA topoisomerases.

Relevance: 100.00%

Publisher:

Abstract:

In Neo-Darwinism, variation and natural selection are the two evolutionary mechanisms which propel biological evolution. Our previous article presented a histogram model [1] consisting of populations of individuals whose numbers changed under the influence of variation and/or fitness, the total population remaining constant. Individuals are classified into bins, and the content of each bin is calculated generation after generation by an Excel spreadsheet. Here, we apply the histogram model to a stable population with fitness F(1)=1.00 in which one or two fitter mutants emerge. In a first scenario, a single mutant emerged in the population whose fitness was greater than 1.00. The simulations ended when the original population was reduced to a single individual. The histogram model was validated by excellent agreement between its predictions and those of a classical continuous function (Eqn. 1) which predicts the number of generations needed for a favorable mutation to spread throughout a population. But in contrast to Eqn. 1, our histogram model is adaptable to more complex scenarios, as demonstrated here. In the second and third scenarios, the original population was present at time zero together with two mutants which differed from the original population by two distinct, higher fitness values. In the fourth scenario, the large original population was present at time zero together with one fitter mutant. After a number of generations, when the mutant offspring had multiplied, a second mutant was introduced whose fitness was even greater. The histogram model also allows Shannon entropy (SE) to be monitored continuously as the information content of the total population decreases or increases. The results of these simulations illustrate, in a graphically didactic manner, the influence of natural selection, operating through relative fitness, in the emergence and dominance of a fitter mutant.
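The histogram idea can be sketched in a few lines: a constant-size population split into fitness classes ("bins"), with bin frequencies reweighted by relative fitness each generation, and Shannon entropy tracked alongside. This is a minimal stand-in, not the authors' Excel spreadsheet, and the fitness values are illustrative:

```python
import math

def next_generation(freqs, fitness):
    """One round of selection: reweight each bin by its fitness, renormalize."""
    weighted = [f * w for f, w in zip(freqs, fitness)]
    total = sum(weighted)
    return [w / total for w in weighted]

def shannon_entropy(freqs):
    """Shannon entropy (bits) of the bin-frequency distribution."""
    return -sum(f * math.log2(f) for f in freqs if f > 0)

# First scenario of the abstract: a single fitter mutant (fitness 1.10)
# invading a resident population (fitness 1.00).
freqs, fitness = [0.99, 0.01], [1.00, 1.10]
for _ in range(200):
    freqs = next_generation(freqs, fitness)
# The mutant approaches fixation; entropy rose during the sweep and
# falls back toward zero as the population becomes uniform again.
print(freqs, shannon_entropy(freqs))
```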

Relevance: 100.00%

Publisher:

Abstract:

Self-categorization theory is a social psychology theory dealing with the relation between the individual and the group. It explains group behaviour through the conception of oneself and others as members of social categories, and through the attribution of those categories' prototypical characteristics to individuals. It is thus a theory of the individual that is intended to explain collective phenomena. Situations in which a large number of individuals interact non-trivially typically generate complex collective behaviours that are difficult to predict from individual behaviour alone. Computer simulation of such systems is a reliable way of systematically exploring the dynamics of the collective behaviour as a function of individual specifications. In this thesis, we present a formal model of a part of self-categorization theory called the metacontrast principle. Given the distribution of a set of individuals on one or several comparison dimensions, the model generates the categories and their associated prototypes. We show that the model behaves coherently with respect to the theory and is able to replicate experimental data concerning various group phenomena, including polarization.

Moreover, it makes it possible to systematically describe the predictions of the theory from which it is derived, notably in previously unexamined situations. At the collective level, several dynamics can be observed, including convergence towards consensus, towards fragmentation, or towards the emergence of extreme attitudes. We also study the effect of the social network on the dynamics and show that, except for the convergence speed, which increases as the mean distances on the network decrease, the types of convergence depend little on the chosen network. We further find that individuals located at the border of groups (whether in the social network or spatially) have a decisive influence on the outcome of the dynamics. In addition, the model can be used as an automatic classification algorithm: it identifies prototypes around which groups are built. The prototypes are positioned so as to accentuate the typical characteristics of the groups, and are not necessarily central. Finally, if the pixels of an image are treated as individuals in a three-dimensional colour space, the model provides a filter that can attenuate noise, assist object detection, and simulate perceptual biases such as chromatic induction.
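The metacontrast principle at the core of the model is commonly operationalized as a metacontrast ratio: a partition of positions is favoured when differences between categories are large relative to differences within them. A hedged one-dimensional sketch (the grouping and prototype machinery of the thesis is not reproduced, and the positions are invented):

```python
# MCR = (mean inter-category distance) / (mean intra-category distance).
# Higher MCR means the partition better satisfies the metacontrast principle.

def metacontrast_ratio(group_a, group_b):
    inter = [abs(x - y) for x in group_a for y in group_b]
    intra = ([abs(x - y) for i, x in enumerate(group_a) for y in group_a[i+1:]] +
             [abs(x - y) for i, x in enumerate(group_b) for y in group_b[i+1:]])
    return (sum(inter) / len(inter)) / (sum(intra) / len(intra))

# A bimodal distribution yields a high MCR for its "natural" split:
print(metacontrast_ratio([1.0, 1.2, 1.4], [8.0, 8.2, 8.4]))   # ≈ 26.25
```

In the full model, candidate partitions are compared by such a ratio, and the winning partition defines the categories from which prototypes are then derived.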

Relevance: 100.00%

Publisher:

Abstract:

Disturbances affect metapopulations directly through reductions in population size and indirectly through habitat modification. We consider how metapopulation persistence is affected by different disturbance regimes and by the way in which disturbances spread, when metapopulations are compact or elongated, using a stochastic spatially explicit model that includes both metapopulation and habitat dynamics. We find that the risk of population extinction is larger for spatially aggregated disturbances than for spatially random disturbances. By changing the spatial configuration of the patches in the system (leading to different proportions of edge and interior patches), we demonstrate that the probability of metapopulation extinction is smaller when the metapopulation is more compact. Both of these results become more pronounced when colonization connectivity decreases. Our results have important management implications, as edge patches, which are invariably considered less important, may play an important role as disturbance refugia.
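The class of model used here can be illustrated with a minimal stochastic patch-occupancy sketch. This is a simplified stand-in for the paper's spatially explicit model: the ring topology, the parameter values, and the neighbour-limited colonization rule are all illustrative assumptions:

```python
import random

def step(occupied, e, c, rng):
    """One generation: occupied patches go extinct with probability e;
    empty patches are colonized in proportion to occupied neighbours."""
    n = len(occupied)
    nxt = occupied[:]
    for i in range(n):
        neigh = (occupied[(i - 1) % n] + occupied[(i + 1) % n]) / 2.0
        if occupied[i]:
            nxt[i] = 0 if rng.random() < e else 1
        else:
            nxt[i] = 1 if rng.random() < c * neigh else 0
    return nxt

rng = random.Random(42)
patches = [1] * 20          # a fully occupied, compact metapopulation
for _ in range(100):
    patches = step(patches, e=0.1, c=0.8, rng=rng)
print(sum(patches), "patches occupied after 100 generations")
```

Aggregated versus random disturbance regimes can be compared in such a framework by emptying either a contiguous run of patches or the same number of randomly chosen patches, then measuring extinction frequency over many replicates.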

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE: To develop a breath-hold method for black-blood viability imaging of the heart that may facilitate identifying the endocardial border. MATERIALS AND METHODS: Three stimulated-echo acquisition mode (STEAM) images were obtained almost simultaneously during the same acquisition using three different demodulation values. Two of the three images were used to construct a black-blood image of the heart. The third image was a T(1)-weighted viability image that enabled detection of hyperintense infarcted myocardium after contrast agent administration. The three STEAM images were combined into one composite black-blood viability image of the heart. The composite STEAM images were compared to conventional inversion-recovery (IR) delayed hyperenhanced (DHE) images in nine human subjects studied on a 3T MRI scanner. RESULTS: STEAM images showed black-blood characteristics and a significant improvement in the blood-infarct signal-difference-to-noise ratio (SDNR) when compared to the IR-DHE images (34 +/- 4.1 vs. 10 +/- 2.9, mean +/- standard deviation (SD), P < 0.002). There was sufficient myocardium-infarct SDNR in the STEAM images to accurately delineate infarcted regions. The extracted infarcts demonstrated good agreement with the IR-DHE images. CONCLUSION: The STEAM black-blood property allows for better delineation of the blood-infarct border, which would enhance the fast and accurate measurement of infarct size.
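The comparison metric can be stated compactly. The definition below is the common one for a signal-difference-to-noise ratio; the exact computation used in the paper is not given here, and the signal values are invented, not the study's measurements:

```python
def sdnr(signal_a: float, signal_b: float, noise_sd: float) -> float:
    """Signal-difference-to-noise ratio between two tissue signals."""
    return abs(signal_a - signal_b) / noise_sd

# e.g. blood vs. infarct signal with a given background noise level:
print(sdnr(120.0, 50.0, 7.0))   # -> 10.0
```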

Relevance: 100.00%

Publisher:

Abstract:

Attempts to use a stimulated echo acquisition mode (STEAM) in cardiac imaging are impeded by imaging artifacts that result in signal attenuation and nulling of the cardiac tissue. In this work, we present a method to reduce this artifact by acquiring two sets of stimulated echo images with two different demodulations. The resulting two images are combined to recover the signal loss and weighted to compensate for possible deformation-dependent intensity variation. Numerical simulations were used to validate the theory. Also, the proposed correction method was applied to in vivo imaging of normal volunteers (n = 6) and animal models with induced infarction (n = 3). The results show the ability of the method to recover the lost myocardial signal and generate artifact-free black-blood cardiac images.

Relevance: 100.00%

Publisher:

Abstract:

As a result of sex chromosome differentiation from ancestral autosomes, male mammalian cells only contain one X chromosome. It has long been hypothesized that X-linked gene expression levels have become doubled in males to restore the original transcriptional output, and that the resulting X overexpression in females then drove the evolution of X inactivation (XCI). However, this model has never been directly tested and patterns and mechanisms of dosage compensation across different mammals and birds generally remain little understood. Here we trace the evolution of dosage compensation using extensive transcriptome data from males and females representing all major mammalian lineages and birds. Our analyses suggest that the X has become globally upregulated in marsupials, whereas we do not detect a global upregulation of this chromosome in placental mammals. However, we find that a subset of autosomal genes interacting with X-linked genes have become downregulated in placentals upon the emergence of sex chromosomes. Thus, different driving forces may underlie the evolution of XCI and the highly efficient equilibration of X expression levels between the sexes observed for both of these lineages. In the egg-laying monotremes and birds, which have partially homologous sex chromosome systems, partial upregulation of the X (Z in birds) evolved but is largely restricted to the heterogametic sex, which provides an explanation for the partially sex-biased X (Z) expression and lack of global inactivation mechanisms in these lineages. Our findings suggest that dosage reductions imposed by sex chromosome differentiation events in amniotes were resolved in strikingly different ways.

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE: To objectively characterize different heart tissues from functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI. Also, Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 +/- 4% of the pixels defined as infarct in the DE images. In addition, 89 +/- 5% of the pixels defined as infarct in the clustered images were also defined as infarct in DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows for objectively identifying different heart tissues, which would be potentially important for clinical decision-making in patients with MI.
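The first stage of the pipeline, fuzzy c-means, can be sketched in one dimension. The ISODATA stage is not reproduced, and the intensity values, cluster count, and fuzzifier m are illustrative, not taken from the paper:

```python
import random

def fcm(points, n_clusters, m=2.0, n_iter=100, seed=0):
    """Classic fuzzy c-means on 1-D data: alternate between updating
    membership-weighted centers and distance-based soft memberships."""
    rng = random.Random(seed)
    u = [[rng.random() for _ in range(n_clusters)] for _ in points]
    u = [[v / sum(row) for v in row] for row in u]   # rows sum to 1
    centers = [0.0] * n_clusters
    for _ in range(n_iter):
        for k in range(n_clusters):
            num = sum((u[i][k] ** m) * x for i, x in enumerate(points))
            den = sum(u[i][k] ** m for i in range(len(points)))
            centers[k] = num / den                   # weighted mean
        for i, x in enumerate(points):
            d = [abs(x - c) + 1e-12 for c in centers]
            w = [1.0 / dk ** (2.0 / (m - 1.0)) for dk in d]
            s = sum(w)
            u[i] = [wk / s for wk in w]              # soft memberships
    return centers, u

# three well-separated intensity groups (e.g. blood / normal / infarct)
points = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2, 10.0, 10.1, 10.2]
centers, u = fcm(points, 3)
print(sorted(centers))
```

In the paper's pipeline, the soft memberships produced by FCM would then be refined by ISODATA-style splitting and merging of clusters.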

Relevance: 100.00%

Publisher:

Abstract:

Two methods of differential isotopic coding of carboxylic groups have been developed to date. The first approach uses d0- or d3-methanol to convert carboxyl groups into the corresponding methyl esters. The second relies on the incorporation of two 18O atoms into the C-terminal carboxylic group during tryptic digestion of proteins in H(2)18O. However, both methods have limitations, such as chromatographic separation of the 1H and 2H derivatives or overlap of the isotopic distributions of the light and heavy forms due to small mass shifts. Here we present a new tagging approach based on the specific incorporation of sulfanilic acid into carboxylic groups. The reagent was synthesized in a heavy form (13C phenyl ring), showing no chromatographic shift and an optimal isotopic separation with a 6 Da mass shift. Moreover, sulfanilic acid allows for simplified fragmentation in matrix-assisted laser desorption/ionization (MALDI), owing to charge fixation by the sulfonate group at the C-terminus of the peptide. The derivatization is simple, specific, and minimizes the number of sample treatment steps that can strongly alter the sample composition. The quantification is reproducible within an order of magnitude and can be analyzed by either electrospray ionization (ESI) or MALDI. Finally, the method is able to specifically identify the C-terminal peptide of a protein by using GluC as the proteolytic enzyme.
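The quoted ~6 Da shift follows directly from the six ring carbons: replacing the 12C atoms of the phenyl ring with 13C adds six times the 13C-12C mass difference. A quick check using standard isotope masses:

```python
# Standard atomic masses of the carbon isotopes (Da).
MASS_12C = 12.000000   # exactly 12 by definition
MASS_13C = 13.003355

# Six ring carbons swapped from 12C to 13C:
shift = 6 * (MASS_13C - MASS_12C)
print(f"heavy-light mass shift: {shift:.4f} Da")   # ≈ 6.02 Da
```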

Relevance: 100.00%

Publisher:

Abstract:

The ability to determine the location and relative strength of all transcription-factor binding sites in a genome is important both for a comprehensive understanding of gene regulation and for effective promoter engineering in biotechnological applications. Here we present a bioinformatically driven experimental method to accurately define the DNA-binding sequence specificity of transcription factors. A generalized profile was used as a predictive quantitative model for binding sites, and its parameters were estimated from in vitro-selected ligands using standard hidden Markov model training algorithms. Computer simulations showed that several thousand low- to medium-affinity sequences are required to generate a profile of desired accuracy. To produce data on this scale, we applied high-throughput genomics methods to the biochemical problem addressed here. A method combining systematic evolution of ligands by exponential enrichment (SELEX) and serial analysis of gene expression (SAGE) protocols was coupled to an automated quality-controlled sequence extraction procedure based on Phred quality scores. This allowed the sequencing of a database of more than 10,000 potential DNA ligands for the CTF/NFI transcription factor. The resulting binding-site model defines the sequence specificity of this protein with a high degree of accuracy not achieved earlier and thereby makes it possible to identify previously unknown regulatory sequences in genomic DNA. A covariance analysis of the selected sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism.
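The profile idea can be illustrated with a toy position-specific model: estimate per-position base frequencies from aligned sites and score a candidate sequence by its log-odds against a uniform background. The HMM training and the SELEX-SAGE data of the study are not reproduced, and the sites below are invented, not CTF/NFI ligands:

```python
import math

sites = ["TTGGC", "TTGGC", "TAGGC", "TTGGA"]   # invented aligned sites

def build_pwm(seqs, pseudo=0.5):
    """Per-position base frequencies with a pseudocount for unseen bases."""
    pwm = []
    for i in range(len(seqs[0])):
        counts = {b: pseudo for b in "ACGT"}
        for s in seqs:
            counts[s[i]] += 1
        total = sum(counts.values())
        pwm.append({b: counts[b] / total for b in "ACGT"})
    return pwm

def log_odds(pwm, seq, background=0.25):
    """Score of seq under the profile, in bits, vs. a uniform background."""
    return sum(math.log2(pwm[i][b] / background) for i, b in enumerate(seq))

pwm = build_pwm(sites)
print(log_odds(pwm, "TTGGC"))   # consensus-like site scores high
print(log_odds(pwm, "ACTAG"))   # unrelated sequence scores low
```

The generalized profiles of the study extend this scheme with position-specific gap penalties and are trained on thousands of selected ligands rather than a handful.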

Relevance: 100.00%

Publisher:

Abstract:

Functional connectivity affects demography and gene dynamics in fragmented populations. Besides species-specific dispersal ability, the connectivity between local populations is affected by the landscape elements encountered during dispersal. Documenting these effects is thus a central issue for the conservation and management of fragmented populations. In this study, we compare the power and accuracy of three methods (partial correlations, regressions and Approximate Bayesian Computation) that use genetic distances to infer the effect of landscape upon dispersal. We use stochastic individual-based simulations of fragmented populations surrounded by landscape elements that differ in their permeability to dispersal. The power and accuracy of all three methods are good when there is a strong contrast between the permeability of different landscape elements. The power and accuracy can be further improved by restricting analyses to adjacent pairs of populations. Landscape elements that strongly impede dispersal are the easiest to identify. However, power and accuracy decrease drastically when landscape complexity increases and the contrast between the permeability of landscape elements decreases. We provide guidelines for future studies and underline the need to evaluate or develop more powerful approaches.
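The first of the three compared approaches, partial correlation, can be sketched directly: correlate pairwise genetic distance with a landscape-resistance measure while controlling for geographic distance. The data vectors below are invented, and real analyses on distance matrices would add a permutation (Mantel-type) test for significance:

```python
import math

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """Correlation of x and y, controlling for z (first-order partial)."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

genetic    = [0.10, 0.22, 0.31, 0.18, 0.40]   # pairwise genetic distances
resistance = [1.0, 2.2, 3.1, 1.9, 4.2]        # landscape resistance
geographic = [5.0, 6.0, 9.0, 5.5, 8.0]        # geographic distances
print(partial_corr(genetic, resistance, geographic))
```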