964 results for Computational modelling by homology
Abstract:
As the complexity of parallel applications increases, the performance limitations resulting from computational load imbalance become dominant. Mapping the problem space to the processors of a parallel machine in a manner that balances the workload of each processor will typically reduce the run-time. In many cases the computation time required for a given calculation cannot be predetermined, even at run-time, and so a static partitioning of the problem yields poor performance. For problems in which the computational load across the discretisation is dynamic and inhomogeneous, for example multi-physics problems involving fluid and solid mechanics with phase changes, the workload of a static subdomain will change over the course of a computation and cannot be estimated beforehand. For such applications the mapping of load to processes must change dynamically at run-time in order to maintain reasonable efficiency. The issue of dynamic load balancing is examined in the context of PHYSICA, a three-dimensional unstructured-mesh multi-physics continuum mechanics computational modelling code.
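The dynamic remapping described above ultimately reduces to deciding, from per-process load measurements taken at run-time, how much work to migrate and where. Below is a minimal greedy sketch of such a migration plan; it is not the PHYSICA algorithm, and the tolerance parameter and the list-of-floats representation of load are assumptions for illustration:

```python
def rebalance(loads, tol=0.05):
    """Greedy work-migration plan: return (src, dst, amount) transfers that
    bring every processor's measured load within tol of the mean."""
    mean = sum(loads) / len(loads)
    surplus = [(i, l - mean) for i, l in enumerate(loads) if l > mean * (1 + tol)]
    deficit = [(i, mean - l) for i, l in enumerate(loads) if l < mean * (1 - tol)]
    moves = []
    while surplus and deficit:
        si, s = surplus[-1]
        di, d = deficit[-1]
        amt = min(s, d)                 # move as much as both sides allow
        moves.append((si, di, amt))
        if s - amt <= mean * tol:
            surplus.pop()
        else:
            surplus[-1] = (si, s - amt)
        if d - amt <= mean * tol:
            deficit.pop()
        else:
            deficit[-1] = (di, d - amt)
    return moves
```

In a real mesh code each unit of "amount" corresponds to mesh entities whose migration has a communication cost, so the tolerance trades residual imbalance against data movement.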
Abstract:
Turbulent plasmas inside tokamaks are modeled and studied using guiding center theory, applied to charged test particles, in a Hamiltonian framework. The equations of motion for the guiding center dynamics, under the conditions of a constant and uniform magnetic field and a turbulent electrostatic field, are derived by averaging over the fast gyroangle, to first and second order in the guiding center potential, using invertible changes of coordinates such as Lie transforms. The equations of motion are then made dimensionless, exploiting temporal and spatial periodicities of the model chosen for the electrostatic potential. They are implemented numerically in Python, using the fast Fourier transform and its inverse. Improvements to the original Python scripts are made, notably the introduction of a power-law curve fitting to account for anomalous diffusion, the possibility of integrating the equations in two steps to save computational time by removing trapped trajectories, and the implementation of multicolored stroboscopic plots to distinguish between trapped and untrapped guiding centers. The post-processing of the results is done in MATLAB. The values and ranges of the parameters chosen for the simulations are selected based on numerous simulations used as feedback tools. In particular, a recurring value for the threshold used to detect trapped trajectories is identified. Effects of the Larmor radius, the amplitude of the guiding center potential and the intensity of its second order term are studied by analyzing their diffusive regimes, their stroboscopic plots and the shape of the guiding center potentials. The main result is the identification of cases of anomalous diffusion depending on the values of the parameters (mostly the Larmor radius). The transitions between diffusive regimes are identified. The presence of highways for the super-diffusive trajectories is unveiled. The influence of the charge on these transitions from diffusive to ballistic behaviors is analyzed.
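The power-law fit mentioned above characterizes a diffusive regime through the mean-squared displacement, MSD(τ) ∝ τ^α: α ≈ 1 is normal diffusion, α < 1 sub-diffusion, α > 1 super-diffusion, and α ≈ 2 ballistic motion. A minimal NumPy sketch of such a fit follows; the original Python scripts are not reproduced here, and the trajectory array layout and lag range are assumptions:

```python
import numpy as np

def diffusion_exponent(traj, max_lag=None):
    """Fit MSD(lag) ~ lag**alpha in log-log space and return alpha.
    traj: particle positions, shape (n_steps, n_particles, n_dims)."""
    n_steps = traj.shape[0]
    lags = np.arange(1, max_lag or n_steps // 4)
    msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=-1))
                    for lag in lags])
    alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)  # slope = exponent
    return alpha

rng = np.random.default_rng(1)
# ordinary Brownian motion: alpha should come out near 1
brownian = np.cumsum(rng.standard_normal((400, 200, 2)), axis=0)
# straight-line (ballistic) trajectories: alpha should come out near 2
ballistic = np.arange(400)[:, None, None] * rng.standard_normal((1, 200, 2))
```

The same fit applied to guiding-center trajectories separates sub-diffusive (trapped-dominated) from super-diffusive (highway-dominated) parameter regimes.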
Abstract:
The growth of organs and whole plants depends on both cell growth and cell-cycle progression, but the interaction between the two processes is poorly understood. In plants, the balance between growth and cell-cycle progression requires coordinated regulation of four different processes: macromolecular synthesis (cytoplasmic growth), turgor-driven cell-wall extension, the mitotic cycle, and the endocycle. Potential feedbacks between these processes include a cell-size checkpoint operating before DNA synthesis and a link between DNA content and maximum cell size. In addition, key intercellular signals and growth regulatory genes appear to target cell-cycle and cell-growth functions simultaneously. For example, auxin, gibberellin, and brassinosteroid all have parallel links to cell-cycle progression (through S-phase Cyclin D-CDK and the anaphase-promoting complex) and cell-wall functions (through cell-wall extensibility or microtubule dynamics). Another intercellular signal, mediated by microtubule dynamics, is the mechanical stress caused by the growth of interconnected cells. Superimposed on developmental controls, sugar signalling through the TOR pathway has recently emerged as a central control point linking cytoplasmic growth, cell-cycle and cell-wall functions. Recent progress in quantitative imaging and computational modelling will facilitate analysis of the multiple interconnections between plant cell growth and the cell cycle, and ultimately will be required for the predictive manipulation of plant growth.
Abstract:
Human leukocyte antigen (HLA) haplotypes are frequently evaluated for population history inferences and association studies. However, the available typing techniques for the main HLA loci usually do not allow the determination of the allele phase and thus the constitution of a haplotype, which may otherwise be obtained by a very time-consuming and expensive family-based segregation study. Without a family-based study, computational inference by probabilistic models is necessary to obtain haplotypes. Several authors have used the expectation-maximization (EM) algorithm to determine HLA haplotypes, but high levels of erroneous inference are expected because of the genetic distance among the main HLA loci and the presence of several recombination hotspots. In order to evaluate the efficiency of computational inference methods, 763 unrelated individuals stratified into three different datasets had their haplotypes manually defined in a family-based study of HLA-A, -B, -DRB1 and -DQB1 segregation, and these haplotypes were compared with the data obtained by three methods: the EM and Excoffier-Laval-Balding (ELB) algorithms implemented in the Arlequin 3.11 software, and the PHASE method. When comparing the methods, we observed that all algorithms performed poorly in haplotype reconstruction with distant loci, estimating incorrect haplotypes for 38-57% of the samples across algorithms and datasets. We suggest that computational haplotype inferences involving low-resolution HLA-A, HLA-B, HLA-DRB1 and HLA-DQB1 haplotypes should be considered with caution.
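The expectation-maximization approach evaluated above can be illustrated on a toy two-locus, biallelic case, where phase ambiguity arises only for double heterozygotes. The sketch below is the standard textbook EM for haplotype frequencies, not Arlequin's or PHASE's implementation, and the genotype encoding is an assumption; it shows the E-step weighting of the possible phase resolutions and the M-step frequency update:

```python
from itertools import product

HAPS = [(0, 0), (0, 1), (1, 0), (1, 1)]  # allele carried at (locus 1, locus 2)

def compatible_pairs(g1, g2):
    """All ordered haplotype pairs consistent with genotype (g1, g2),
    where g is the 0/1/2 count of the '1' allele at each locus."""
    return [(h1, h2) for h1, h2 in product(HAPS, repeat=2)
            if h1[0] + h2[0] == g1 and h1[1] + h2[1] == g2]

def em_haplotype_freqs(genotypes, n_iter=100):
    """EM estimate of the four haplotype frequencies from unphased genotypes."""
    p = {h: 0.25 for h in HAPS}                        # uniform start
    for _ in range(n_iter):
        counts = {h: 0.0 for h in HAPS}
        for g1, g2 in genotypes:
            pairs = compatible_pairs(g1, g2)
            w = [p[h1] * p[h2] for h1, h2 in pairs]    # E-step weights
            tot = sum(w)
            for (h1, h2), wi in zip(pairs, w):
                counts[h1] += wi / tot
                counts[h2] += wi / tot
        total = sum(counts.values())
        p = {h: c / total for h, c in counts.items()}  # M-step update
    return p
```

With tightly linked loci, double heterozygotes are resolved toward the phase supported by the unambiguous individuals; with distant loci and recombination this signal weakens, which is precisely the failure mode reported above.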
Abstract:
Doctoral Thesis in Environmental and Molecular Biology
Abstract:
The great expansion in the number of genome sequencing projects has revealed the importance of computational methods to speed up the characterization of unknown genes. These studies have been improved by the use of three-dimensional information on the predicted proteins, generated by molecular modeling techniques. In this work, we disclose the structure-function relationship of a gene product from Leishmania amazonensis by applying molecular modeling and bioinformatics techniques. The analyzed sequence encodes a 159-amino-acid polypeptide (estimated at 18 kDa) and was denoted LaPABP for its high homology with poly-A binding proteins from trypanosomatids. The domain structure, clustering analysis and a three-dimensional model of LaPABP, obtained essentially by homology modeling on the structure of the human poly-A binding protein, are described. Based on the analysis of the electrostatic potential mapped on the model's surface and on the conservation of the intramolecular contacts responsible for fold stabilization, we hypothesize that this protein may have less avidity for RNA than its L. major counterpart but still account for a significant functional activity in the parasite. The model obtained will help in the design of mutagenesis experiments aimed at elucidating the mechanism of gene expression in trypanosomatids, and will serve as a starting point for its exploration as a potential source of targets for rational chemotherapy.
Abstract:
Crystallographic data on T-cell receptor - peptide - major histocompatibility complex class I (TCRpMHC) interactions have revealed extremely diverse TCR binding modes triggering antigen recognition. Understanding the molecular basis that governs TCR orientation over pMHC is still a considerable challenge. We present a simplified rigid approach applied to all non-redundant TCRpMHC crystal structures available. The CHARMM force field in combination with the FACTS implicit solvation model is used to study the role of long-distance interactions between the TCR and pMHC. We demonstrate that the sum of the Coulomb interactions and the electrostatic solvation energies is sufficient to identify two orientations corresponding to energetic minima at 0° and 180° from the native orientation. Interestingly, these results are shown to be robust upon small structural variations of the TCR, such as changes induced by Molecular Dynamics simulations, suggesting that shape complementarity is not required to obtain a reliable signal. Accurate energy minima are also identified when unbound TCR crystal structures are confronted with pMHC. Furthermore, we decompose the electrostatic energy into residue contributions to estimate their role in the overall orientation. Results show that most of the driving force leading to the formation of the complex is defined by CDR1,2/MHC interactions. This long-distance contribution appears to be independent of the binding process itself, since it is reliably identified without considering either short-range energy terms or CDR-induced fit upon binding. Finally, we present an attempt to predict the TCR/pMHC binding mode for a TCR structure obtained by homology modeling. The simplicity of the approach and the absence of any fitted parameters make it easily applicable to other types of macromolecular protein complexes.
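The orientation scan described above can be caricatured with point charges: rigidly rotate one charge set about an axis and record the long-range electrostatic energy at each angle. The sketch below uses bare Coulomb interactions in vacuum (no CHARMM force field, no FACTS solvation), and the geometry and charges in the test are invented for illustration:

```python
import numpy as np

def coulomb_energy(q1, r1, q2, r2):
    """Pairwise Coulomb energy between two charge sets (e^2/angstrom units)."""
    d = np.linalg.norm(r1[:, None, :] - r2[None, :, :], axis=-1)
    return float(np.sum(q1[:, None] * q2[None, :] / d))

def rotation_scan(q_mob, r_mob, q_fix, r_fix, axis=(0.0, 0.0, 1.0), n=72):
    """Energy of the mobile charge set rotated rigidly about `axis`
    through its centroid, sampled at n angles over a full turn."""
    k = np.asarray(axis) / np.linalg.norm(axis)
    c = r_mob.mean(axis=0)
    v = r_mob - c
    angles = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    energies = []
    for th in angles:
        # Rodrigues rotation of each displacement vector v about k
        vr = (v * np.cos(th) + np.cross(k, v) * np.sin(th)
              + np.outer(v @ k, k) * (1 - np.cos(th)))
        energies.append(coulomb_energy(q_mob, vr + c, q_fix, r_fix))
    return angles, np.array(energies)
```

Even this crude model reproduces the qualitative point of the abstract: a smooth, long-range electrostatic signal can single out a preferred orientation without any shape complementarity.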
Abstract:
Transcription activator-like effector nucleases (TALENs) are potential tools for precise genome engineering of laboratory animals. We report the first targeted genomic integration in the rat using TALENs by homology-directed repair (HDR). We assembled TALENs and designed a linear donor insert introducing an A476T mutation, known as GR(dim), into the rat glucocorticoid receptor gene (Nr3c1); this mutation prevents receptor homodimerization in the mouse. TALEN mRNA and the linear double-stranded donor were microinjected into rat one-cell embryos. Overall, we observed targeted genomic modifications in 17% of the offspring, indicating high TALEN cutting efficiency in rat zygotes.
Abstract:
In the field of molecular biology, scientists for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. Integrative thinking had nevertheless been applied at a smaller scale in molecular biology, to understand the processes underlying cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building became necessary to account for systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability of predicting cellular behaviour from system dynamics and system structure. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology produces a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling thus bridges modern biology and computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems: a rigorous characterization of the system structure, simulation techniques, perturbation analysis, etc. Computational biomodels have grown considerably in size in recent years, with major contributions towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating in whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology.
The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the other for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative (reaction-network models, rule-based models and Petri net models), as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
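The fit-preserving character of quantitative model refinement can be shown on a toy reaction model: a species is split into subspecies whose kinetic parameters are chosen so that the refined model reproduces the original model's behaviour exactly. The decay reaction, rate constant and initial split below are invented for illustration; this is not one of the thesis case studies:

```python
import numpy as np

def simulate_decay(rates, y0, t_end=5.0, dt=1e-3):
    """Forward-Euler integration of independent first-order decays
    dy_i/dt = -rates[i] * y_i; returns the final concentrations."""
    y = np.array(y0, dtype=float)
    k = np.array(rates, dtype=float)
    for _ in range(int(t_end / dt)):
        y = y + dt * (-k * y)
    return y

# initial model: a single species A decaying as A -> B with rate k
k = 0.7
a_total = simulate_decay([k], [1.0])[0]

# refinement: A is split into subspecies A1, A2 (e.g. two modification
# states) with the same rate and initial amounts summing to A(0) = 1.0
a1, a2 = simulate_decay([k, k], [0.4, 0.6])
```

Because the refined rates and initial conditions are consistent with the original ones, the sum A1 + A2 tracks A exactly, so the refined model needs no re-fitting against the validation data.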
Abstract:
A variety of models of the decision-making process in various contexts assume that subjects accumulate sensory evidence, constantly sampling and integrating signals for and against alternative hypotheses. Integration continues until the evidence in favour of one of the hypotheses exceeds a decision criterion threshold (the level of proof required to make a decision). Newer models suggest that this decision process is in fact dynamic: the different parameters can vary between trials, and even within a trial, rather than being a static process whose parameters change only between blocks of trials. The goal of this doctoral project is to demonstrate that decisions about reaching movements involve a mechanism of temporal accumulation of sensory information up to a decision threshold. To this end, we designed a decision-making paradigm based on an ambiguous stimulus, in order to determine whether neurons in primary motor cortex (M1), dorsal premotor cortex (PMd) and dorsolateral prefrontal cortex (DLPFc) exhibit neural correlates of this temporal accumulation process. We first tested different versions of the task with human subjects in order to develop a task in which subjects showed the ideal behaviour for testing the working hypothesis. Behavioural data from humans and monkeys show a systematic increase of reaction times and error rates with increasing stimulus ambiguity. These results are consistent with the predictions of diffusion models, as confirmed by computational modelling of the data. We then recorded cells in M1, PMd and DLPFc of two monkeys while they performed the task.
M1 neurons do not appear to be influenced by stimulus ambiguity; instead their discharge correlates with the executed movement. PMd neurons encode the direction of the movement chosen by the monkeys, fairly soon after stimulus presentation. Moreover, the activation of many PMd cells is slower when stimulus ambiguity increases, and takes longer to signal the movement direction. The activity of PMd neurons reflects the animal's choice, whether it is a correct response or an error. This supports a role for PMd in decisions about reaching movements. Finally, we have begun recordings in prefrontal cortex, and the results presented are preliminary. DLPFc neurons appear much more influenced by combinations of colour and spatial-position factors than PMd neurons. Our conclusion is that PMd is involved in evaluating the evidence for or against the spatial position of different potential targets, largely independently of their colour. DLPFc, in contrast, appears responsible for processing the combination of the colour and position of the spatial targets and of the ambiguous stimulus, as required to link the ambiguous stimulus to the corresponding target.
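The diffusion-model account referred to above predicts exactly the observed behavioural pattern: lower drift (standing in for a more ambiguous stimulus) lengthens reaction times and raises error rates. A minimal bounded-accumulation sketch follows; the parameters and the Euler discretization are illustrative, not the fitted model from the thesis:

```python
import numpy as np

def ddm_trial(drift, threshold=1.0, noise=1.0, dt=1e-3, rng=None):
    """One drift-diffusion trial: evidence x accumulates, with additive
    noise, until |x| reaches the threshold. Returns (RT, correct?)."""
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t, x > 0   # correct = the bound in the drift direction

def summarize(drift, n_trials=200, seed=0):
    """Mean reaction time and accuracy over n_trials."""
    rng = np.random.default_rng(seed)
    rts, correct = zip(*(ddm_trial(drift, rng=rng) for _ in range(n_trials)))
    return float(np.mean(rts)), float(np.mean(correct))
```

With these symmetric bounds the analytic accuracy is 1/(1 + exp(-2·threshold·drift/noise²)), roughly 98% at drift 2.0 and 73% at drift 0.5, with correspondingly longer mean RTs for the weaker drift.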
Abstract:
Copolycondensation of N,N'-bis(4-hydroxybutyl)-biphenyl-3,4,3',4'-tetracarboxylic diimide, at 20 and 25 mol%, with bis(4-hydroxybutyl)-2,6-naphthalate produces PBN-based copoly(ester-imide)s that not only crystallise but also form a (smectic) mesophase on cooling from the melt. Incorporation of 25 mol% imide into PBN causes the glass transition temperature (measured by DSC) to rise from 51 to 74 °C, a significant increase relative to PBN itself. Furthermore, increased storage (G'), loss (G'') and elastic (E) moduli are observed for both copoly(ester-imide)s compared to PBN. Structural analysis of the 20 mol% copolymer by X-ray powder and fibre diffraction, interfaced with computational modelling, suggests a crystal structure related to that of α-PBN, in space group P-1, with cell dimensions a = 4.74, b = 6.38, c = 14.45 Å, α = 106.1, β = 122.1, γ = 97.3°, and ρ = 1.37 g cm⁻³.
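The reported unit cell can be sanity-checked: the triclinic cell volume follows from V = abc·sqrt(1 − cos²α − cos²β − cos²γ + 2·cosα·cosβ·cosγ), and combined with ρ = 1.37 g cm⁻³ it gives the mass contained in one cell. The comparison against a single PBN repeat unit (C16H14O4, about 270 g/mol) is our own back-of-envelope assumption, not a claim from the abstract:

```python
import math

def triclinic_volume(a, b, c, alpha, beta, gamma):
    """Unit-cell volume in cubic angstroms; edge lengths in angstroms,
    angles in degrees."""
    ca, cb, cg = (math.cos(math.radians(x)) for x in (alpha, beta, gamma))
    return a * b * c * math.sqrt(
        1 - ca**2 - cb**2 - cg**2 + 2 * ca * cb * cg)

V = triclinic_volume(4.74, 6.38, 14.45, 106.1, 122.1, 97.3)  # ~335 A^3
N_A = 6.02214e23
# mass in one cell, expressed in g/mol, from rho = 1.37 g/cm^3
mass_per_cell = 1.37 * V * 1e-24 * N_A
```

The result, about 276 g/mol per cell, is close to the mass of one PBN repeat unit, consistent with a P-1 cell containing a single chain repeat.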
Abstract:
Estimates of effective elastic thickness (T(e)) for the western portion of the South American Plate using, independently, forward flexural modelling and coherence analysis suggest different thermomechanical properties for the same continental lithosphere. We present a review of these T(e) estimates and carry out a critical reappraisal using a common methodology, a 3-D finite element method that solves the differential equation for the bending of a thin elastic plate. The finite element flexural model incorporates lateral variations of T(e) and the Andes topography as the load. Three T(e) maps for the entire Andes were analysed: Stewart & Watts (1997), Tassara et al. (2007) and Perez-Gussinye et al. (2007). The predicted flexural deformation obtained for each T(e) map was compared with the depth to the base of the foreland basin sequence. Likewise, the gravity effect of flexurally induced crust-mantle deformation was compared with the observed Bouguer gravity. The T(e) estimates from forward flexural modelling by Stewart & Watts (1997) better predict the geological and gravity data for most of the Andean system, particularly in the Central Andes, where T(e) ranges from greater than 70 km in the sub-Andes to less than 15 km under the Andes Cordillera. The misfit between the calculated and observed foreland basin subsidence and the gravity anomaly for the Maranon basin in Peru and the Bermejo basin in Argentina, regardless of the assumed T(e) map, may be due to a dynamic topography component associated with the shallow subduction of the Nazca Plate beneath the Andes at these latitudes.
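For a laterally uniform plate the flexure equation D∇⁴w + Δρ·g·w = q(x, y), with rigidity D = E·Te³/(12(1 − ν²)), can be solved spectrally; it is the lateral variation of Te that forces the 3-D finite-element treatment used in the paper. A constant-Te spectral sketch for contrast (elastic constants and the density contrast are typical textbook values, not the paper's):

```python
import numpy as np

def flexure_uniform_te(load, dx, Te, E=1.0e11, nu=0.25, drho=600.0, g=9.81):
    """Deflection w (m) of a thin elastic plate with CONSTANT elastic
    thickness Te (m) under a surface load q (Pa) on a grid of spacing dx (m),
    solved in the Fourier domain: w_hat = q_hat / (D*k^4 + drho*g)."""
    D = E * Te**3 / (12 * (1 - nu**2))            # flexural rigidity
    ny, nx = load.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    k2 = kx[None, :]**2 + ky[:, None]**2
    w_hat = np.fft.fft2(load) / (D * k2**2 + drho * g)
    return np.real(np.fft.ifft2(w_hat))
```

Here drho is the density contrast restoring the deflected plate (mantle minus basin infill); a stiffer plate (larger Te) spreads the same load into a wider, shallower depression, which is the behaviour the T(e) maps above are trying to capture regionally.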
Abstract:
The rural-urban migration phenomenon is analyzed using an agent-based computational model. Agents are placed on lattices whose dimension varies from d = 2 up to d = 7. The placement of the agents on the lattice is such that their social neighborhood (rural or urban) is not related to their spatial distribution. The effect of the lattice dimension is studied by analyzing the variation of the main parameters that characterize the migratory process. The dynamics displays strong effects, even for around one million sites, in the higher dimensions (d = 6, 7).
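A minimal d-dimensional lattice sketch in the same spirit can be written with periodic neighbour sums. The update rule here (a site adopts the urban state with probability equal to the urban fraction of its 2d nearest neighbours plus a constant migration bias) is our own illustrative choice, not the model of the paper:

```python
import numpy as np

def urban_fraction_series(d=3, L=10, steps=50, bias=0.05, seed=0):
    """Urban population fraction over time on an L^d periodic lattice.
    State 1 = urban, 0 = rural; each step, a site turns urban with
    probability (urban fraction of its 2d nearest neighbours) + bias."""
    rng = np.random.default_rng(seed)
    state = (rng.random((L,) * d) < 0.2).astype(float)   # 20% urban start
    series = [state.mean()]
    for _ in range(steps):
        neigh = sum(np.roll(state, s, axis=ax)
                    for ax in range(d) for s in (1, -1)) / (2 * d)
        p = np.clip(neigh + bias, 0.0, 1.0)
        state = (rng.random(state.shape) < p).astype(float)
        series.append(state.mean())
    return np.array(series)
```

Working with a generic dimension d (the lattice shape is simply (L,) * d and neighbours are rolls along each axis) is what makes scans from d = 2 to d = 7, as in the paper, straightforward.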
Abstract:
The texture of concrete blocks is very important and is often the decisive factor when choosing a product, particularly if the building specifications do not dispense with the high strength of the blocks but aim to reduce finishing costs, therefore favouring exposed blocks with a closer texture. Furthermore, a closer texture, especially on exteriors, may be a vital factor in the building's pathology. However, there is so far no standard for quantifying the texture of a structural block. This article proposes applying the freely available UTHSCSA ImageTool program, developed by the University of Texas Health Science Center at San Antonio, to evaluate the texture of masonry blocks. One aspect that should never be overlooked when studying masonry blocks is compressive strength. Therefore, this work also measures the compressive strength of the blocks with and without the addition of lime. The addition of small quantities of lime proved beneficial for both texture and compressive strength. However, increasing the amount of lime proved worthwhile only for improving texture. © 2012 Taylor & Francis Group.
Abstract:
This paper presents a new technique to model interfaces by means of degenerated solid finite elements, i.e., elements with a very high aspect ratio, the smallest dimension corresponding to the thickness of the interface. It is shown that, as the aspect ratio increases, the element strains also increase, approaching the kinematics of a strong discontinuity. A tensile-damage constitutive relation between strains and stresses is proposed to describe the nonlinear behavior of the interfaces associated with crack opening. To represent crack propagation, pairs of triangular interface elements are introduced between all regular (bulk) elements of the original mesh. With this technique the analyses can be performed entirely within the framework of continuum mechanics, and complex crack patterns involving multiple cracks can be simulated without the need for tracking algorithms. Numerical tests are performed to show the applicability of the proposed technique, also studying aspects related to mesh objectivity.
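A tensile damage relation of the kind described above can be sketched as a one-dimensional traction law: linear elastic up to the tensile strength, then exponential softening whose slope is scaled by the interface thickness h so that the energy dissipated per unit crack area stays close to a prescribed fracture energy Gf. The specific exponential form and the material values are illustrative assumptions, not the paper's calibration:

```python
import math

def interface_stress(eps, E=30e9, ft=3e6, Gf=100.0, h=0.01):
    """Normal stress (Pa) across a thin interface element of thickness h (m)
    at normal strain eps. Damage format sigma = (1 - d) * E * eps, with the
    softening branch written directly in closed form."""
    eps0 = ft / E                      # strain at the tensile strength
    if eps <= eps0:
        return E * eps                 # undamaged branch, d = 0
    # decay constant h*ft/Gf makes the dissipated energy per unit area ~ Gf
    return ft * math.exp(-(eps - eps0) * h * ft / Gf)
```

Written against the crack opening w = eps·h, the softening branch becomes ft·exp(−w·ft/Gf), independent of h: as the element degenerates (h → 0) the stress-versus-opening response and the fracture energy are unchanged, which is how formulations of this type keep the high-aspect-ratio elements mesh objective.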