133 results for Computer Simulation, Adaptive Simulations
Abstract:
We recently showed that, at isotopic steady state, ¹³C NMR can provide a direct measurement of glycogen concentration changes, but that glycogen turnover was not accessible with this protocol. The aim of the present study was to design, implement and apply a novel dual-tracer infusion protocol to measure glycogen concentration and turnover simultaneously. After reaching isotopic steady state for glycogen C1 using [1-¹³C]glucose administration, [1,6-¹³C₂]glucose was infused such that isotopic steady state was maintained at the C1 position, while the C6 position reflected ¹³C label incorporation. To overcome the large chemical shift displacement error between the C1 and C6 resonances of glycogen, we implemented 2D gradient-based localization using the Fourier series window approach, in conjunction with time-domain analysis of the resulting FIDs using jMRUI. The glycogen concentration of 5.1 ± 1.6 mM measured from the C1 position was in excellent agreement with concomitant biochemical determinations. Glycogen turnover, measured from the rate of label incorporation into the C6 position of glycogen in the α-chloralose-anesthetized rat, was 0.7 µmol/g/h.
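As a minimal illustration of how a turnover rate is obtained from the rate of label incorporation, the sketch below fits a straight line to a hypothetical C6 label time course; all values are placeholders, not data from this study.

```python
import numpy as np

# Hypothetical time course of labeled glucosyl units measured at the glycogen
# C6 position after the start of the [1,6-13C2]glucose infusion
# (placeholder values, not data from the study).
time_h = np.array([0.0, 1.0, 2.0, 3.0, 4.0])          # hours
c6_umol_per_g = np.array([0.0, 0.7, 1.5, 2.1, 2.9])    # micromol labeled units per g tissue

# Under the protocol described above, the C1 signal stays at steady state while
# label accumulates at C6; the slope of the C6 time course estimates turnover.
slope, intercept = np.polyfit(time_h, c6_umol_per_g, 1)
print(f"Estimated glycogen turnover: {slope:.2f} micromol/g/h")
```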
Abstract:
Protein-ligand docking has made important progress during the last decade and has become a powerful tool for drug development, opening the way to virtual high-throughput screening and in silico structure-based ligand design. Despite this progress, recent publications have shown that the docking problem is far from solved and that further developments are still needed to achieve high prediction success rates and accuracy. Introducing an accurate description of the solvation effect upon binding is thought to be essential to achieve this goal. In particular, EADock uses the Generalized Born Molecular Volume 2 (GBMV2) solvent model, which has been shown to accurately reproduce the desolvation energies obtained by solving the Poisson equation. Here, the implementation of the Fast Analytical Continuum Treatment of Solvation (FACTS) as an implicit solvation model for small-molecule docking calculations has been assessed using the EADock docking program. Our results strongly support the use of FACTS for docking. The success rates of EADock/FACTS and EADock/GBMV2 are similar, i.e., around 75% for local docking and 65% for blind docking. However, these results come at a much lower computational cost: FACTS is 10 times faster than GBMV2 at calculating the total electrostatic energy and speeds up EADock by a factor of 4. This study also supports the EADock development strategy of relying on the CHARMM package for energy calculations, which enables straightforward implementation and testing of the latest developments in the field of molecular modeling.
Abstract:
In this study, a quantitative approach was used to investigate the role of D142, which belongs to the highly conserved E/DRY sequence, in the activation process of the α1B-adrenergic receptor (α1B-AR). Experimental and computer-simulated mutagenesis were performed by substituting all possible natural amino acids at the D142 site. The resulting congeneric set of proteins, together with the finding that all the receptor mutants show various levels of constitutive (agonist-independent) activity, enabled us to quantitatively analyze the relationships between structural/dynamic features and the extent of constitutive activity. Our results suggest that the hydrophobic/hydrophilic character of D142, which could be regulated by protonation/deprotonation of this residue, is an important modulator of the transition between the inactive (R) and active (R*) states of the α1B-AR. Our study represents an example of quantitative structure-activity relationship analysis of the activation process of a G protein-coupled receptor.
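A minimal sketch of the kind of quantitative structure-activity analysis described above, regressing constitutive activity against the hydrophobicity (Kyte-Doolittle scale) of the residue substituted at position 142; the activity values below are placeholders for illustration only, not the published measurements.

```python
import numpy as np

# Kyte-Doolittle hydrophobicity for a subset of substitutions at position 142.
kyte_doolittle = {"D": -3.5, "N": -3.5, "E": -3.5, "Q": -3.5, "K": -3.9,
                  "A": 1.8, "L": 3.8, "I": 4.5, "V": 4.2, "F": 2.8}
# Placeholder constitutive-activity values (fold over wild type), illustration only.
activity = {"D": 1.0, "N": 2.5, "E": 1.2, "Q": 2.3, "K": 1.8,
            "A": 3.0, "L": 4.1, "I": 4.4, "V": 3.9, "F": 3.6}

residues = sorted(kyte_doolittle)
x = np.array([kyte_doolittle[r] for r in residues])
y = np.array([activity[r] for r in residues])

slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]
print(f"activity ~ {slope:.2f} * hydrophobicity + {intercept:.2f} (r = {r:.2f})")
```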
Abstract:
Self-categorization theory is a social psychology theory dealing with the relation between the individual and the group. It explains group behaviour through the conception of self and others as members of social categories, and through the attribution of the prototypical characteristics of these categories to individuals. Hence, it is a theory of the individual that is intended to explain collective phenomena. Situations involving a large number of non-trivially interacting individuals typically generate complex collective behaviours, which are difficult to anticipate on the basis of individual behaviour. Computer simulation of such systems is a reliable way of systematically exploring the dynamics of the collective behaviour as a function of individual specifications.
In this thesis, we present a formal model of a part of self-categorization theory called the metacontrast principle. Given the distribution of a set of individuals on one or several comparison dimensions, the model generates the categories and their associated prototypes. We show that the model behaves coherently with respect to the theory and is able to replicate experimental data concerning various group phenomena, including polarization. Moreover, it makes it possible to systematically describe the predictions of the theory from which it is derived, notably in novel situations. At the collective level, several dynamics can be observed, including convergence towards consensus, towards fragmentation, or towards the emergence of extreme attitudes. We also study the effect of the social network on the dynamics and show that, except for the convergence speed, which increases as the mean distances on the network decrease, the observed types of convergence depend little on the chosen network. We further note that individuals located at the border of the groups (whether in the social network or spatially) have a decisive influence on the outcome of the dynamics. In addition, the model can be used as an automatic classification algorithm. It identifies prototypes around which groups are built. Prototypes are positioned so as to accentuate the groups' typical characteristics and are not necessarily central. Finally, if the pixels of an image are considered as individuals in a three-dimensional color space, the model provides a filter that can attenuate noise, assist object detection, and simulate perception biases such as chromatic induction.
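A minimal sketch of the metacontrast ratio as it is commonly formulated in self-categorization theory (not necessarily the exact formalization used in this thesis): a member's prototypicality is its mean distance to outgroup members divided by its mean distance to ingroup members, and the category prototype is the member maximizing this ratio.

```python
import numpy as np

def metacontrast_ratios(positions, labels):
    """Prototypicality of each individual: mean inter-category distance divided
    by mean intra-category distance, on a single comparison dimension."""
    positions = np.asarray(positions, dtype=float)
    labels = np.asarray(labels)
    ratios = np.empty(len(positions))
    for i, (x, g) in enumerate(zip(positions, labels)):
        same = positions[labels == g]
        other = positions[labels != g]
        intra = np.abs(same - x)
        intra = intra[intra > 0]          # exclude distance to self
        ratios[i] = np.abs(other - x).mean() / intra.mean()
    return ratios

# Two groups of attitudes on one dimension; the prototype of each group is the
# member with the highest metacontrast ratio, typically shifted away from the
# outgroup rather than central (illustrating polarization).
positions = [0.1, 0.2, 0.3, 0.4, 0.7, 0.8, 0.9]
labels    = ["A", "A", "A", "A", "B", "B", "B"]
mcr = metacontrast_ratios(positions, labels)
for g in ("A", "B"):
    idx = [i for i, l in enumerate(labels) if l == g]
    proto = max(idx, key=lambda i: mcr[i])
    print(f"group {g}: prototype at position {positions[proto]}")
```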
Abstract:
Disturbances affect metapopulations directly through reductions in population size and indirectly through habitat modification. We consider how metapopulation persistence is affected by different disturbance regimes and by the way in which disturbances spread, when metapopulations are compact or elongated, using a stochastic, spatially explicit model that includes both metapopulation and habitat dynamics. We find that the risk of population extinction is larger for spatially aggregated disturbances than for spatially random disturbances. By changing the spatial configuration of the patches in the system, leading to different proportions of edge and interior patches, we demonstrate that the probability of metapopulation extinction is smaller when the metapopulation is more compact. Both of these results become more pronounced when colonization connectivity decreases. Our results have important management implications, as edge patches, which are often considered less important, may play an important role as disturbance refugia.
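A minimal sketch of a stochastic patch-occupancy model with habitat dynamics, contrasting spatially aggregated and spatially random disturbances on a ring of patches; this is an illustrative toy model with placeholder parameters, not the authors' model.

```python
import numpy as np

def simulate_metapopulation(n=60, steps=300, colonize=0.4, extinct=0.05,
                            n_disturb=2, disturb_extent=5, aggregated=True,
                            recover=0.1, seed=0):
    """Toy stochastic patch-occupancy model on a ring of habitat patches.
    Each step: (1) disturbances destroy habitat, either as spatially
    aggregated blocks or as the same number of randomly scattered patches;
    (2) destroyed habitat recovers with a fixed probability; (3) occupied
    patches go locally extinct; (4) empty suitable patches are colonized by
    occupied neighbours. Returns True if the metapopulation survives."""
    rng = np.random.default_rng(seed)
    habitat = np.ones(n, dtype=bool)
    occupied = np.ones(n, dtype=bool)
    for _ in range(steps):
        # 1) disturbance
        if aggregated:
            for _ in range(n_disturb):
                start = rng.integers(0, n - disturb_extent)
                habitat[start:start + disturb_extent] = False
        else:
            hit = rng.choice(n, size=n_disturb * disturb_extent, replace=False)
            habitat[hit] = False
        occupied &= habitat
        # 2) habitat recovery
        habitat |= rng.random(n) < recover
        # 3) local extinction
        occupied &= rng.random(n) > extinct
        # 4) colonization from nearest neighbours
        neighbours = np.roll(occupied, 1) | np.roll(occupied, -1)
        occupied |= habitat & neighbours & (rng.random(n) < colonize)
    return occupied.any()

for agg in (True, False):
    survival = np.mean([simulate_metapopulation(aggregated=agg, seed=s)
                        for s in range(200)])
    print(f"aggregated disturbances={agg}: survival probability ~ {survival:.2f}")
```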
Abstract:
PURPOSE: To develop a breath-hold method for black-blood viability imaging of the heart that may facilitate identification of the endocardial border. MATERIALS AND METHODS: Three stimulated-echo acquisition mode (STEAM) images were obtained almost simultaneously during the same acquisition using three different demodulation values. Two of the three images were used to construct a black-blood image of the heart. The third image was a T1-weighted viability image that enabled detection of hyperintense infarcted myocardium after contrast agent administration. The three STEAM images were combined into one composite black-blood viability image of the heart. The composite STEAM images were compared to conventional inversion-recovery (IR) delayed hyperenhanced (DHE) images in nine human subjects studied on a 3T MRI scanner. RESULTS: STEAM images showed black-blood characteristics and a significant improvement in the blood-infarct signal-difference-to-noise ratio (SDNR) compared to the IR-DHE images (34 ± 4.1 vs. 10 ± 2.9, mean ± standard deviation (SD), P < 0.002). There was sufficient myocardium-infarct SDNR in the STEAM images to accurately delineate infarcted regions. The extracted infarcts showed good agreement with the IR-DHE images. CONCLUSION: The STEAM black-blood property allows better delineation of the blood-infarct border, which should facilitate fast and accurate measurement of infarct size.
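A minimal sketch of the signal-difference-to-noise ratio used in the comparison above, computed from region-of-interest pixel values and a background noise region; the numbers are placeholders.

```python
import numpy as np

def sdnr(signal_a, signal_b, noise_region):
    """Signal-difference-to-noise ratio between two regions of interest,
    using the standard deviation of a background/noise region."""
    return abs(np.mean(signal_a) - np.mean(signal_b)) / np.std(noise_region)

# Placeholder pixel intensities for blood-pool, infarct and background ROIs.
blood   = np.array([120.0, 118.0, 125.0, 122.0])
infarct = np.array([310.0, 295.0, 305.0, 300.0])
noise   = np.array([4.0, -3.0, 5.0, -6.0, 2.0, -1.0])

print(f"blood-infarct SDNR ~ {sdnr(blood, infarct, noise):.1f}")
```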
Abstract:
As a result of sex chromosome differentiation from ancestral autosomes, male mammalian cells contain only one X chromosome. It has long been hypothesized that X-linked gene expression levels have become doubled in males to restore the original transcriptional output, and that the resulting X overexpression in females then drove the evolution of X inactivation (XCI). However, this model has never been directly tested, and the patterns and mechanisms of dosage compensation across different mammals and birds generally remain poorly understood. Here we trace the evolution of dosage compensation using extensive transcriptome data from males and females representing all major mammalian lineages and birds. Our analyses suggest that the X has become globally upregulated in marsupials, whereas we do not detect a global upregulation of this chromosome in placental mammals. However, we find that a subset of autosomal genes interacting with X-linked genes has become downregulated in placentals upon the emergence of sex chromosomes. Thus, different driving forces may underlie the evolution of XCI and the highly efficient equilibration of X expression levels between the sexes observed in both of these lineages. In the egg-laying monotremes and birds, which have partially homologous sex chromosome systems, partial upregulation of the X (Z in birds) evolved but is largely restricted to the heterogametic sex, which explains the partially sex-biased X (Z) expression and the lack of global inactivation mechanisms in these lineages. Our findings suggest that the dosage reductions imposed by sex chromosome differentiation events in amniotes were resolved in strikingly different ways.
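A minimal sketch of one common summary statistic used in such analyses, the ratio of median X-linked to median autosomal expression (X:AA); the expression values and chromosome assignments are placeholders, and this is not the authors' full pipeline.

```python
import numpy as np

def x_to_autosome_ratio(expression, chromosome):
    """Median expression of X-linked genes divided by median expression of
    autosomal genes, a simple summary often used to assess X upregulation."""
    expression = np.asarray(expression, dtype=float)
    chromosome = np.asarray(chromosome)
    x_genes = expression[chromosome == "X"]
    auto_genes = expression[np.isin(chromosome, ["X", "Y"], invert=True)]
    return np.median(x_genes) / np.median(auto_genes)

# Placeholder expression values (e.g. RPKM) and chromosome assignments.
expr = [5.0, 7.5, 2.0, 10.0, 6.0, 4.0, 8.0, 3.0]
chrom = ["X", "X", "X", "1", "1", "2", "2", "3"]
print(f"X:AA ratio ~ {x_to_autosome_ratio(expr, chrom):.2f}")
```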
Abstract:
PURPOSE: To objectively characterize different heart tissues from the functional and viability images provided by composite strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on the sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI, and Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 ± 4% of the pixels defined as infarct in the DE images. In addition, 89 ± 5% of the pixels defined as infarct in the clustered images were also defined as infarct in the DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique objectively identifies the different heart tissues, which is potentially important for clinical decision-making in patients with MI.
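A minimal sketch of the first stage of such a pipeline, a plain fuzzy c-means implementation run on toy one-dimensional data; the ISODATA stage and the actual C-SENC image features are omitted, so this is not the authors' implementation.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership
    matrix U (n_samples x c), where U[i, k] is the degree to which sample i
    belongs to cluster k."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy 1-D "intensity" data with three groups; in practice blood, normal and
# infarcted myocardium would be separated on image-derived features.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(loc, 0.2, 50) for loc in (0.0, 1.0, 2.5)])[:, None]
centers, U = fuzzy_c_means(X, c=3)
print("cluster centers:", np.sort(centers.ravel()))
```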
Abstract:
Two methods of differential isotopic coding of carboxylic groups have been developed to date. The first approach uses d0- or d3-methanol to convert carboxyl groups into the corresponding methyl esters. The second relies on the incorporation of two ¹⁸O atoms into the C-terminal carboxylic group during tryptic digestion of proteins in H₂¹⁸O. However, both methods have limitations, such as chromatographic separation of the ¹H and ²H derivatives or overlap of the isotopic distributions of the light and heavy forms due to small mass shifts. Here we present a new tagging approach based on the specific incorporation of sulfanilic acid into carboxylic groups. The reagent was synthesized in a heavy form (¹³C phenyl ring), showing no chromatographic shift and optimal isotopic separation with a 6 Da mass shift. Moreover, sulfanilic acid allows simplified fragmentation in matrix-assisted laser desorption/ionization (MALDI) owing to the fixation of the charge by the sulfonate group at the C-terminus of the peptide. The derivatization is simple, specific and minimizes the number of sample treatment steps that can strongly alter the sample composition. The quantification is reproducible within an order of magnitude and can be analyzed by either electrospray ionization (ESI) or MALDI. Finally, the method is able to specifically identify the C-terminal peptide of a protein by using GluC as the proteolytic enzyme.
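A minimal sketch of light/heavy pair matching for a tag carrying a fully ¹³C-labeled phenyl ring, assuming singly charged MALDI peaks separated by roughly 6 Da per incorporated tag; the peak list is a placeholder, not data from the study.

```python
# Sketch of light/heavy pair matching: a peptide carrying one tag appears as a
# pair of singly charged peaks ~6 Da apart, and the intensity ratio gives the
# relative quantification. Peak values are placeholders.
MASS_SHIFT = 6.0  # Da per incorporated heavy tag (approximate)
TOL = 0.02        # Da matching tolerance

peaks = [  # (m/z, intensity)
    (1021.52, 4.0e4), (1027.52, 3.8e4),
    (1534.78, 1.2e5), (1540.78, 2.4e5),
    (1822.90, 6.0e3),
]

for mz_light, i_light in peaks:
    for mz_heavy, i_heavy in peaks:
        if abs((mz_heavy - mz_light) - MASS_SHIFT) <= TOL:
            ratio = i_light / i_heavy
            print(f"pair {mz_light:.2f}/{mz_heavy:.2f}: light/heavy ratio = {ratio:.2f}")
```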
Abstract:
BACKGROUND: Physician training in smoking cessation counseling has been shown to increase quit success. We assessed the cost-effectiveness of a smoking cessation counseling training programme whose effectiveness had previously been demonstrated in a cluster-randomized controlled trial performed in two Swiss university outpatient clinics, in which residents were randomized to receive either training in smoking interventions or a control educational intervention. DESIGN AND METHODS: We used a Markov simulation model for the effectiveness analysis. This model incorporates the intervention efficacy, the natural quit rate, and the lifetime probability of relapse after 1 year of abstinence. We used previously published results in addition to hospital service and outpatient clinic cost data. The time horizon was 1 year, and we adopted a third-party payer perspective. RESULTS: The incremental cost of the intervention amounted to US$2.58 per consultation with a smoker, translating into a cost per life-year saved of US$25.4 for men and US$35.2 for women. One-way sensitivity analyses yielded a range of US$4.0-107.1 in men and US$9.7-148.6 in women. Variations in the quit rate of the control intervention, the duration of training effectiveness, and the discount rate had moderately large effects on the outcome. Variations in the natural cessation rate, the lifetime probability of relapse, the cost of physician training, the counseling time, the cost per hour of physician time, and the cost of the booklets had little effect on the cost-effectiveness ratio. CONCLUSIONS: Training residents in smoking cessation counseling is a very cost-effective intervention and may be more efficient than currently accepted tobacco control interventions.
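A minimal sketch of a two-arm cohort comparison in the spirit of the model described above; every parameter value below is a placeholder (only the US$2.58 incremental cost is taken from the abstract), so the output does not reproduce the published ratios.

```python
# Minimal two-arm cohort sketch: compare a trained arm with a control arm,
# discount quitters by a lifetime relapse probability, and divide the
# incremental cost by the gained quitters and life-years. Placeholder values.
cohort = 10_000                 # smokers seen in consultation
natural_quit = 0.025            # 1-year quit probability, control arm (placeholder)
intervention_quit = 0.045       # 1-year quit probability, training arm (placeholder)
relapse_after_1y = 0.35         # lifetime relapse probability after 1-year abstinence (placeholder)
life_years_per_quitter = 1.5    # assumed discounted life-years gained per permanent quitter
cost_per_consultation = 2.58    # incremental intervention cost in US$, from the abstract

def permanent_quitters(quit_prob):
    return cohort * quit_prob * (1 - relapse_after_1y)

extra_quitters = permanent_quitters(intervention_quit) - permanent_quitters(natural_quit)
extra_cost = cohort * cost_per_consultation
extra_life_years = extra_quitters * life_years_per_quitter

print(f"incremental cost per permanent quitter: US${extra_cost / extra_quitters:,.0f}")
print(f"incremental cost per life-year saved:   US${extra_cost / extra_life_years:,.0f}")
```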
Abstract:
Changes in intracellular Na⁺ concentration underlie essential neurobiological processes, but few reliable tools exist for their measurement. Here we characterize a new synthetic Na⁺-sensitive fluorescent dye, Asante Natrium Green (ANG), with unique properties. This indicator was excitable in the visible spectrum and by two-photon illumination, suffered little photobleaching, and localized to the cytosol, where it remained for long durations without noticeable unwanted effects on basic cell properties. When used in brain tissue, ANG yielded a bright fluorescent signal during physiological Na⁺ responses in both neurons and astrocytes. Synchronous electrophysiological and fluorometric recordings showed that ANG produced accurate Na⁺ measurements in situ. This new Na⁺ indicator opens innovative ways of probing neuronal circuits.
Abstract:
The projection still uses the SIMULIT simulation program, in its thirteenth version. (...) Only demographic change was taken into account in the projections of the number of beds: none of the other variables liable to change in the future was considered, neither those relating to hospital activity itself (changes in hospitalization rates, lengths of stay, etc.) nor those concerning the health status of the population (changes in the incidence or prevalence of diseases). In other words, this projection shows the effect of demographic change on hospital activity if the characteristics of that activity were to remain those observed in the 1980s. It is therefore not a forecast. [Authors, p. 1]
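As a rough illustration of such a purely demographic projection (not the SIMULIT program itself), the sketch below applies fixed, placeholder age-specific hospitalization rates and lengths of stay to a projected population to obtain a bed requirement.

```python
# Purely demographic bed projection sketch: rates and lengths of stay are held
# fixed at baseline while only the population changes. All numbers are placeholders.
baseline = {                 # age group: (annual hospitalization rate, mean stay in days)
    "0-19":  (0.05, 4.0),
    "20-64": (0.10, 6.0),
    "65+":   (0.30, 12.0),
}
population_projected = {"0-19": 180_000, "20-64": 520_000, "65+": 210_000}
occupancy = 0.85             # assumed target bed occupancy rate

bed_days = sum(population_projected[group] * rate * stay
               for group, (rate, stay) in baseline.items())
beds_needed = bed_days / (365 * occupancy)
print(f"projected beds needed: {beds_needed:,.0f}")
```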
Abstract:
Intensity-modulated radiotherapy (IMRT) treatment plan verification by comparison with measured data requires access to the linear accelerator and is time consuming. In this paper, we propose a method for monitor unit (MU) calculation and plan comparison for step-and-shoot IMRT based on the Monte Carlo code EGSnrc/BEAMnrc. The beamlets of an IMRT treatment plan are individually simulated using Monte Carlo and converted into absorbed dose to water per MU. The dose of the whole treatment can then be expressed through a linear matrix equation of the MUs and the dose per MU of every beamlet. Because both the absorbed dose and the MU values must be positive, this equation is solved for the MU values using a non-negative least-squares (NNLS) optimization algorithm. The Monte Carlo plan is formed by multiplying the Monte Carlo absorbed dose to water per MU by the Monte Carlo/NNLS MUs. Treatment plans for several localizations calculated with a commercial treatment planning system (TPS) are compared with the proposed method for validation. The Monte Carlo/NNLS MUs are close to those calculated by the TPS and lead to a treatment dose distribution that is clinically equivalent to the one calculated by the TPS. This procedure can be used for IMRT quality assurance, and further development could allow the technique to be applied to other radiotherapy techniques such as tomotherapy or volumetric modulated arc therapy.
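A minimal sketch of the MU determination step described above, assuming a placeholder beamlet dose-per-MU matrix rather than actual Monte Carlo output; the non-negative least-squares solve uses scipy.optimize.nnls.

```python
import numpy as np
from scipy.optimize import nnls

# The dose D at a set of points is modeled as D = B @ mu, where column j of B
# is the absorbed dose to water per MU of beamlet j and mu >= 0 are the monitor
# units. The numbers below are placeholders, not simulation output.
rng = np.random.default_rng(0)
n_points, n_beamlets = 200, 12
B = rng.random((n_points, n_beamlets))                # dose per MU for each beamlet
mu_ref = rng.uniform(5, 50, n_beamlets)               # "reference" MUs
D_tps = B @ mu_ref + rng.normal(0, 0.01, n_points)    # dose distribution to reproduce

mu_fit, residual = nnls(B, D_tps)                     # non-negative least-squares fit
D_plan = B @ mu_fit                                   # reconstructed plan dose
print("max relative MU deviation:", np.max(np.abs(mu_fit - mu_ref) / mu_ref))
```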
Abstract:
In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is required for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated diversity management. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the crystal structure itself. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when the five best-ranked clusters are considered, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide to the αVβ3 integrin, starting far away from the binding pocket.
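A minimal sketch of the success criterion used in this validation: the RMSD between a docked pose and the crystallographic pose, with success defined as RMSD below 2 Å. Coordinates are placeholders, matched atom ordering is assumed, and no superposition is applied since both poses share the receptor frame.

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two poses with matched atom ordering,
    both expressed in the receptor coordinate frame (no superposition applied)."""
    diff = np.asarray(coords_a) - np.asarray(coords_b)
    return np.sqrt((diff ** 2).sum(axis=1).mean())

# Placeholder heavy-atom coordinates (Angstrom) for a docked and a crystal pose.
docked  = np.array([[1.0, 0.2, -0.5], [2.1, 1.1, 0.3], [3.2, 0.9, 1.4]])
crystal = np.array([[1.2, 0.0, -0.4], [2.0, 1.3, 0.1], [3.0, 1.1, 1.6]])

r = rmsd(docked, crystal)
print(f"RMSD = {r:.2f} A -> {'success' if r < 2.0 else 'failure'}")
```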
Abstract:
In the vast majority of bottom-up proteomics studies, protein digestion is performed using only mammalian trypsin. Although trypsin is clearly the best enzyme available, its sole use rarely leads to complete sequence coverage, even for abundant proteins. It is commonly assumed that this is because many tryptic peptides are either too short or too long to be identified by RPLC-MS/MS. We show through in silico analysis that 20-30% of the total sequence of three proteomes (Schizosaccharomyces pombe, Saccharomyces cerevisiae, and Homo sapiens) is expected to be covered by Large post-Trypsin Peptides (LpTPs) with Mr above 3000 Da. We then established a size-exclusion chromatography method to fractionate complex yeast tryptic digests into pools of peptides based on size. We found that secondary digestion of LpTPs followed by LC-MS/MS analysis leads to a significant increase in the number of identified proteins and a 32-50% relative increase in average sequence coverage compared with trypsin digestion alone. Application of the developed strategy to the phosphoproteomes of S. pombe and of a human cell line identified a significant fraction of novel phosphosites. Overall, our data indicate that specific targeting of LpTPs can complement standard bottom-up workflows to reveal a largely neglected portion of the proteome.
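A minimal sketch of the in silico step described above, assuming standard trypsin specificity (cleavage after K or R, not before P) and approximate monoisotopic masses; the sequence is a short placeholder fragment, whereas the study analysed whole proteomes.

```python
import re

# Approximate monoisotopic residue masses (Da); a peptide adds one water.
RES = {"G": 57.021, "A": 71.037, "S": 87.032, "P": 97.053, "V": 99.068,
       "T": 101.048, "C": 103.009, "L": 113.084, "I": 113.084, "N": 114.043,
       "D": 115.027, "Q": 128.059, "K": 128.095, "E": 129.043, "M": 131.040,
       "H": 137.059, "F": 147.068, "R": 156.101, "Y": 163.063, "W": 186.079}
WATER = 18.011

def tryptic_peptides(sequence):
    """Cleave C-terminal to K or R, except when the next residue is P."""
    return re.findall(r".*?[KR](?!P)|.+?$", sequence)

def mass(peptide):
    return sum(RES[aa] for aa in peptide) + WATER

# Placeholder protein fragment (illustration only).
seq = ("MKWVTFISLLFLFSSAYSRGVFRDAHKSEVAHRFK"
       "DLGEENFAQYLQQCPFEDHVNEVTEFATCVADESAENCDSLHTLFGDELCTVATLRK")
peptides = tryptic_peptides(seq)
large = [p for p in peptides if mass(p) > 3000]
coverage = sum(len(p) for p in large) / len(seq)
print(f"LpTPs above 3000 Da: {len(large)}, covering {coverage:.0%} of the sequence")
```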