959 results for Fluid dynamics -- Computer simulation
Abstract:
Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when the entropy is neglected and 0.72 when the entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when the entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed, and some possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of interesting side chains to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes.
The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
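The abstract states only the idea behind LDVE, not its exact formula. A minimal sketch of that idea, assuming each mode's entropy is distributed over atoms in proportion to their squared amplitude in that mode (the paper's actual weighting may differ):

```python
import numpy as np

def ldve_atomic_entropy(modes, mode_entropies):
    # modes: (3N, M) matrix of orthonormal mass-weighted normal-mode
    # eigenvectors (one column per mode, coordinates ordered x1,y1,z1,x2,...).
    # mode_entropies: (M,) per-mode vibrational entropy contributions.
    n_atoms = modes.shape[0] // 3
    # Squared amplitude of each atom in each mode (x, y, z components summed).
    amp2 = (modes.reshape(n_atoms, 3, -1) ** 2).sum(axis=1)      # (N, M)
    # Normalize per mode so each mode's entropy is fully distributed.
    amp2 = amp2 / amp2.sum(axis=0, keepdims=True)
    # Atomic contribution: amplitude-weighted sum over modes.
    return amp2 @ np.asarray(mode_entropies, float)              # (N,)
```

By construction the atomic contributions sum back to the total vibrational entropy, which is what makes a per-side-chain readout cheap compared with repeating the full normal-mode calculation for each mutant.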
Abstract:
To study the interaction of the T cell receptor with its ligand, a complex of a major histocompatibility complex molecule and a peptide, we derived H-2Kd-restricted cytolytic T lymphocyte clones from mice immunized with a Plasmodium berghei circumsporozoite peptide (PbCS) 252-260 (SYIPSAEKI) derivative containing photoreactive Nepsilon-[4-azidobenzoyl] lysine in place of Pro-255. This residue and Lys-259 were essential parts of the epitope recognized by these clones. Most of the clones expressed BV1S1A1-encoded beta chains along with specific complementarity-determining region (CDR) 3beta regions but diverse alpha chain sequences. Surprisingly, all T cell receptors were preferentially photoaffinity labeled on the alpha chain. For a representative T cell receptor, the photoaffinity labeled site was located in the Valpha C-strand. Computer modeling suggested the presence of a hydrophobic pocket, formed by parts of the Valpha/Jalpha C-, F-, and G-strands and adjacent CDR3alpha residues, structured so as to avidly bind the photoreactive ligand side chain. We previously found that a T cell receptor specific for a PbCS peptide derivative containing this photoreactive side chain in position 259 similarly used a hydrophobic pocket located between the junctional CDR3 loops. We propose that nonpolar domains in these locations allow T cell receptors to avidly and specifically bind epitopes containing non-peptidic side chains.
Abstract:
Recent progress in the experimental determination of protein structures allows us to understand, at a very detailed level, the molecular recognition mechanisms that underlie living matter. This level of understanding makes it possible to design rational therapeutic approaches, in which effector molecules are adapted or created de novo to perform a given function. An example of such an approach is drug design, where small inhibitory molecules are designed using in silico simulations and tested in vitro. In this article, we present a similar approach to rationally optimize the sequence of killer T lymphocyte receptors to make them more efficient against melanoma cells. The architecture of this translational research project is presented together with its implications both for basic research and for the clinic.
Abstract:
Hidden Markov models (HMMs) are probabilistic models that are well adapted to many tasks in bioinformatics, for example, predicting the occurrence of specific motifs in biological sequences. MAMOT is a command-line program for Unix-like operating systems, including MacOS X, that we developed to allow scientists to apply HMMs more easily in their research. One can define the architecture and initial parameters of the model in a text file and then use MAMOT for parameter optimization on example data, decoding (e.g., predicting motif occurrence in sequences), and the production of stochastic sequences generated according to the probabilistic model. Two examples for which models are provided are coiled-coil domains in protein sequences and protein binding sites in DNA. Useful features include pseudocounts, state tying, fixing of selected parameters during learning, and the inclusion of prior probabilities in decoding. AVAILABILITY: MAMOT is implemented in C++ and is distributed under the GNU General Public License (GPL). The software, documentation, and example model files can be found at http://bcf.isb-sib.ch/mamot
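MAMOT's own file format and options are not reproduced here; as a hedged illustration of what "decoding" means for an HMM, the following is a generic Viterbi decoder (the standard algorithm such tools rely on) for a fully specified model:

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    # Most likely hidden-state path for an observation sequence (log space).
    # start: (S,) initial probabilities; trans: (S, S); emit: (S, K).
    logd = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(trans)   # scores[i, j]: i -> j
        back.append(scores.argmax(axis=0))       # best predecessor of each j
        logd = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(logd.argmax())]
    for bp in reversed(back):                    # backtrack
        path.append(int(bp[path[-1]]))
    return path[::-1]
```

For motif finding, one state set would model background sequence and another the motif; the decoded path then marks where the motif occurs.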
Abstract:
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Q(st)-F(st)) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2F(st)/(1 - F(st))G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2F(st)/(1 - F(st))] and (ii) that the MANOVA estimates of the mean square matrices between and within populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Q(st)-F(st) comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions.
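The full test uses Flury's CPC framework with a Bartlett adjustment; as a minimal sketch of the expected neutral pattern only, the following compares a naive least-squares proportionality coefficient between D and G with the neutral expectation 2F(st)/(1 - F(st)):

```python
import numpy as np

def neutral_rho(fst):
    # Proportionality coefficient D = rho * G expected under neutrality.
    return 2.0 * fst / (1.0 - fst)

def fitted_rho(D, G):
    # Least-squares scalar c minimizing ||D - c*G||_F (a naive estimate;
    # the paper's test works on MANOVA mean-square matrices instead).
    D, G = np.asarray(D, float), np.asarray(G, float)
    return float((D * G).sum() / (G * G).sum())
```

Under neutrality with F(st) = 0.2, for example, the among-populations matrix should be about half the within-populations matrix (rho = 0.5); a fitted coefficient far from that value, or non-proportional matrices, points to selection.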
Abstract:
Using numerical simulations we investigate shapes of random equilateral open and closed chains, one of the simplest models of freely fluctuating polymers in a solution. We are interested in the 3D density distribution of the modeled polymers where the polymers have been aligned with respect to their three principal axes of inertia. This type of approach was pioneered by Theodorou and Suter in 1985. While individual configurations of the modeled polymers are almost always nonsymmetric, the approach of Theodorou and Suter results in cumulative shapes that are highly symmetric. By taking advantage of asymmetries within the individual configurations, we modify the procedure of aligning independent configurations in a way that shows their asymmetry. This approach reveals, for example, that the 3D density distribution for linear polymers has a bean shape predicted theoretically by Kuhn. The symmetry-breaking approach reveals complementary information to the traditional, symmetrical, 3D density distributions originally introduced by Theodorou and Suter.
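The Theodorou-Suter alignment underlying this analysis can be sketched as follows: center each configuration and rotate it so its principal axes of inertia coincide with the coordinate axes. The paper's symmetry-breaking refinement (choosing axis orientations from each configuration's asymmetry) is not included in this sketch:

```python
import numpy as np

def align_to_principal_axes(coords):
    # Center the chain, then rotate it so the eigenvectors of its gyration
    # tensor lie along x, y, z, with the largest-variance axis first.
    x = np.asarray(coords, float)
    x = x - x.mean(axis=0)
    gyr = x.T @ x / len(x)                 # 3x3 gyration tensor
    vals, vecs = np.linalg.eigh(gyr)       # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]         # largest first
    return x @ vecs[:, order]
```

Accumulating the aligned configurations of many independent chains on a 3D grid then gives the cumulative density distribution discussed in the abstract.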
Abstract:
The tendency for more closely related species to share similar traits and ecological strategies can be explained by their longer shared evolutionary histories and represents phylogenetic conservatism. How strongly species traits co-vary with phylogeny can significantly impact how we analyze cross-species data and can influence our interpretation of assembly rules in the rapidly expanding field of community phylogenetics. Phylogenetic conservatism is typically quantified by analyzing the distribution of species values on the phylogenetic tree that connects them. Many phylogenetic approaches, however, assume a completely sampled phylogeny: while we have good estimates of deeper phylogenetic relationships for many species-rich groups, such as birds and flowering plants, we often lack information on more recent interspecific relationships (i.e., within a genus). A common solution has been to represent these relationships as polytomies on trees using taxonomy as a guide. Here we show that such trees can dramatically inflate estimates of phylogenetic conservatism quantified using S. P. Blomberg et al.'s K statistic. Using simulations, we show that even randomly generated traits can appear to be phylogenetically conserved on poorly resolved trees. We provide a simple rarefaction-based solution that can reliably retrieve unbiased estimates of K, and we illustrate our method using data on first flowering times from Thoreau's woods (Concord, Massachusetts, USA).
Abstract:
The purpose of this study was to test the hypothesis that athletes with slower oxygen uptake (VO2) kinetics would benefit more, in terms of time spent near VO2max, from an increase in the intensity of intermittent running training (IT). After determination of VO2max, vVO2max (i.e., the minimal velocity associated with VO2max in an incremental test) and the time to exhaustion sustained at vVO2max (Tlim), seven well-trained triathletes performed two IT sessions in random order. The two IT sessions comprised 30-s work intervals at either 100% (IT100%) or 105% (IT105%) of vVO2max, with 30-s recovery intervals at 50% of vVO2max between repeats. The parameters of the VO2 kinetics (td1, tau1, A1, td2, tau2, A2, i.e., the time delay, time constant and amplitude of the primary phase and of the slow component, respectively) during the Tlim test were modelled with two exponential functions. The highest VO2 reached was significantly lower (P<0.01) in IT100%, run at 19.8 (0.9) km·h⁻¹ [66.2 (4.6) ml·min⁻¹·kg⁻¹], than in IT105%, run at 20.8 (1.0) km·h⁻¹ [71.1 (4.9) ml·min⁻¹·kg⁻¹], or in the incremental test [71.2 (4.2) ml·min⁻¹·kg⁻¹]. The time sustained above 90% of VO2max in IT105% [338 (149) s] was significantly higher (P<0.05) than in IT100% [168 (131) s]. The average Tlim was 244 (39) s, tau1 was 15.8 (5.9) s and td2 was 96 (13) s. tau1 was correlated with the difference in time spent above 90% of VO2max between IT105% and IT100% (r=0.91; P<0.01). In conclusion, athletes with slower VO2 kinetics in a vVO2max constant-velocity test benefited more from the 5% rise in IT work intensity, exercising for longer above 90% of VO2max when the IT intensity was increased from 100% to 105% of vVO2max.
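The two-exponential model described above (primary phase plus slow component, each delayed) can be sketched as follows; the parameter values used in the test below are illustrative, not the study's individual fits:

```python
import numpy as np

def vo2_kinetics(t, base, A1, td1, tau1, A2, td2, tau2):
    # Primary phase and slow component each rise exponentially toward their
    # amplitude after their own time delay; before it they contribute nothing.
    t = np.asarray(t, float)
    p1 = np.where(t >= td1, A1 * (1.0 - np.exp(-(t - td1) / tau1)), 0.0)
    p2 = np.where(t >= td2, A2 * (1.0 - np.exp(-(t - td2) / tau2)), 0.0)
    return base + p1 + p2
```

In practice the seven parameters are obtained by nonlinear least-squares fitting of breath-by-breath VO2 data to this function; a smaller tau1 means the primary phase reaches its amplitude faster.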
Abstract:
This paper overviews the field of graphical simulators used for AUV development, presents a taxonomy of these applications and proposes a classification. It also presents Neptune, a multi-vehicle, real-time, graphical simulator based on OpenGL that allows hardware-in-the-loop simulations.
Abstract:
A long development time is needed from the design to the implementation of an AUV. During the first steps, simulation plays an important role, since it allows preliminary versions of the control system to be developed and integrated. Once the robot is ready, the control systems are implemented, tuned and tested. The use of a real-time simulator can help close the gap between off-line simulation and real testing on the implemented robot. When properly interfaced with the robot hardware, a real-time graphical simulation with a "hardware in the loop" configuration allows the implemented control system to be tested while running on the actual robot hardware. Hence, the development time is drastically reduced. This paper overviews the field of graphical simulators used for AUV development and proposes a classification. It also presents NEPTUNE, a multi-vehicle, real-time, graphical simulator based on OpenGL that allows hardware-in-the-loop simulations.
Abstract:
The shortest tube of constant diameter that can form a given knot represents the 'ideal' form of the knot. Ideal knots provide an irreducible representation of the knot, and they have some intriguing mathematical and physical features, including a direct correspondence with the time-averaged shapes of knotted DNA molecules in solution. Here we describe the properties of ideal forms of composite knots: knots obtained by the sequential tying of two or more independent knots (called factor knots) on the same string. We find that the writhe (related to the handedness of crossing points) of composite knots is the sum of that of the ideal forms of the factor knots. By comparing ideal composite knots with simulated configurations of knotted, thermally fluctuating DNA, we conclude that the additivity of writhe applies also to randomly distorted configurations of composite knots and their corresponding factor knots. We show that composite knots with several factor knots may possess distinct structural isomers that can be interconverted only by loosening the knot.
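The writhe discussed here is the Gauss double integral over a closed curve. A crude midpoint-rule sketch for a closed polygon is shown below; exact evaluation for polygons normally uses the Klenin-Langowski segment-pair formula instead, and this naive version is inaccurate for nearly touching segments:

```python
import numpy as np

def polygonal_writhe(vertices):
    # Midpoint-rule estimate of the Gauss writhe integral for a closed
    # polygon (vertices listed once, in order; closure is implicit).
    r = np.asarray(vertices, float)
    seg = np.roll(r, -1, axis=0) - r       # edge vectors
    mid = r + 0.5 * seg                    # edge midpoints
    wr = 0.0
    n = len(r)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = mid[i] - mid[j]
            wr += np.dot(np.cross(seg[i], seg[j]), d) / np.linalg.norm(d) ** 3
    return wr / (4.0 * np.pi)
```

A planar curve has zero writhe (every integrand term vanishes), while a chiral knot conformation gives a nonzero value whose sign reflects the handedness of its crossings, which is the quantity whose additivity over factor knots the abstract reports.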
Abstract:
The purpose of this paper is to describe the method used to construct a "Case Mix" system which, based on DRGs, no longer describes the hospital population solely in terms of principal diagnoses, but also in terms of recorded comorbidities or complications and the surgical procedures performed.
Abstract:
The objective of this project is to carry out the analysis, design and implementation of a new tool for analyzing the differences between the paper currently being produced and reference samples, improving on the results obtained with the previous prototype and making the results easier for the company's operators to interpret. The starting point is two scanned images, called the pattern and the sample, which correspond respectively to the reference image and the production sample.
Abstract:
This project was developed within the Geometry and Graphics Group of the UdG, which works on 3D urban development projects. The objective of the project is to build an application that simulates the evolution of cities by expanding their streets over time. The application will be developed within the urbanEngine project, adding the ability to expand cities as an extension of it. In addition, a graphical user interface is to be designed to facilitate the configuration and supervision of the system.