976 results for SEQUENCE EVOLUTION
Abstract:
BACKGROUND A cost-effective strategy to increase the density of available markers within a population is to sequence a small proportion of the population and impute whole-genome sequence data for the remaining population. Increased densities of typed markers are advantageous for genome-wide association studies (GWAS) and genomic predictions. METHODS We obtained genotypes for 54 602 SNPs (single nucleotide polymorphisms) in 1077 Franches-Montagnes (FM) horses and Illumina paired-end whole-genome sequencing data for 30 FM horses and 14 Warmblood horses. After variant calling, the sequence-derived SNP genotypes (~13 million SNPs) were used for genotype imputation with the software programs Beagle, Impute2 and FImpute. RESULTS The mean imputation accuracy of FM horses using Impute2 was 92.0%. Imputation accuracy using Beagle and FImpute was 74.3% and 77.2%, respectively. In addition, for Impute2 we determined the imputation accuracy of all individual horses in the validation population, which ranged from 85.7% to 99.8%. The subsequent inclusion of Warmblood sequence data further increased the correlation between true and imputed genotypes for most horses, especially for horses with a high level of admixture. The final imputation accuracy of the horses ranged from 91.2% to 99.5%. CONCLUSIONS Using Impute2, the imputation accuracy was higher than 91% for all horses in the validation population, which indicates that direct imputation of 50k SNP-chip data to sequence level genotypes is feasible in the FM population. The individual imputation accuracy depended mainly on the applied software and the level of admixture.
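The per-horse accuracies quoted above are correlations between true and imputed genotypes. A minimal sketch of that kind of accuracy metric, assuming genotypes coded as allele dosages 0/1/2 (the toy matrices and function name are illustrative, not the study's pipeline):

```python
import numpy as np

def imputation_accuracy(true_genotypes, imputed_dosages):
    """Pearson correlation between true and imputed allele dosages (0/1/2),
    computed per individual across SNPs, as one common accuracy metric."""
    true_genotypes = np.asarray(true_genotypes, dtype=float)
    imputed_dosages = np.asarray(imputed_dosages, dtype=float)
    accuracies = []
    for true_row, imp_row in zip(true_genotypes, imputed_dosages):
        mask = ~np.isnan(true_row)                 # ignore untyped markers
        r = np.corrcoef(true_row[mask], imp_row[mask])[0, 1]
        accuracies.append(r)
    return np.array(accuracies)

# toy example: 3 individuals x 6 SNPs, dosages coded 0/1/2
true_g = [[0, 1, 2, 1, 0, 2], [2, 2, 1, 0, 0, 1], [1, 0, 0, 2, 1, 1]]
imputed = [[0.1, 1.0, 1.9, 1.2, 0.0, 2.0],
           [1.8, 2.0, 0.9, 0.2, 0.1, 1.1],
           [1.0, 0.3, 0.0, 1.7, 1.0, 0.8]]
print(imputation_accuracy(true_g, imputed))   # one correlation per horse
```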
Abstract:
Staphylococcus aureus is globally one of the most important pathogens causing contagious mastitis in cattle. Previous studies using ribosomal spacer (RS)-PCR, however, demonstrated in Swiss cows that Staph. aureus isolates from bovine intramammary infections are genetically heterogeneous, with Staph. aureus genotype B (GTB) and GTC being the most prominent genotypes. Furthermore, Staph. aureus GTB was found to be contagious, whereas Staph. aureus GTC and all the remaining genotypes were involved in individual cow disease. In addition to RS-PCR, other methods for subtyping Staph. aureus are known, including spa typing and multilocus sequence typing (MLST), which are based on sequencing the spa gene and various housekeeping genes, respectively. The aim of the present study was to compare the 3 analytic methods using 456 strains of Staph. aureus isolated from milk of bovine intramammary infections and from bulk tanks in 12 European countries. Furthermore, the phylogeny of animal Staph. aureus was inferred and the zoonotic transfer of Staph. aureus between cattle and humans was studied. The analyzed strains could be grouped into 6 genotypic clusters, with CLB, CLC, and CLR being the most prominent ones. Comparing the 3 subtyping methods, RS-PCR showed the highest resolution, followed by spa typing and MLST. We found associations among the methods, but apart from CLB and CLC they were in many cases unsatisfactory. Cluster CLB was positive for clonal complex (CC)8 in 99% of cases and typically positive for t2953; it is the cattle-adapted form of CC8. Cluster CLC was always positive for t529 and typically positive for CC705. For CLR and the remaining subtypes, links among the 3 methods were generally poor. Bovine Staph. aureus is highly clonal and a few clones predominate. Animal Staph. aureus strains always evolve from human strains, such that every human strain may be the ancestor of a novel animal-adapted strain. The zoonotic transfer of IMI- and milk-associated strains of Staph. aureus between cattle and humans seems to be very limited, and the different hosts are not considered a source of mutual, spontaneous infections. Spillover events, however, may happen.
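The relative resolution of typing methods is commonly summarized with Simpson's index of diversity (the Hunter-Gaston discriminatory index); the abstract does not state which measure was used, so the following is only a hedged sketch with invented type counts:

```python
from collections import Counter

def simpsons_diversity(type_assignments):
    """Hunter-Gaston discriminatory index:
    D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)), where n_j is the size of type j."""
    counts = Counter(type_assignments).values()
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# invented example: the finer the partition, the higher the index
rs_pcr = ["GTB"] * 20 + ["GTC"] * 15 + ["GTR"] * 5 + ["GTF", "GTI", "GTS"]
mlst   = ["CC8"] * 25 + ["CC705"] * 15 + ["CC97"] * 3
print(f"RS-PCR D = {simpsons_diversity(rs_pcr):.3f}")
print(f"MLST   D = {simpsons_diversity(mlst):.3f}")
```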
Abstract:
The evolution of landscapes crucially depends on the climate history. This is particularly evident in South America, where landscape responses to orbital climate shifts have been well documented. However, while most studies have focused on inferring temperature variations from paleoclimate proxy data, estimates of water budget changes have been complicated by a lack of adequate physical information. Here, we present a methodology and related results that allowed us to extract water discharge values from the sedimentary record of the 40 Ka-old fluvial terrace deposits in the Pisco valley, western Peru. In particular, this valley hosts a Quaternary cut-and-fill succession that we used, in combination with beryllium-10 (10Be)-based sediment flux, gauging records, channel geometries and grain size measurements, to quantitatively assess sediment and water discharge values c. 40 Ka ago in relation to present-day conditions. We compare these discharge estimates to the discharge regime of the modern Pisco River and find that the water discharge of the paleo-Pisco River during the Minchin pluvial period, c. 40 Ka ago, was c. 7–8 times greater than that of the modern Pisco River when considering the mean and the maximum water discharge. In addition, the calculations show that the inferred water discharge estimates depend mainly on channel gradients and grain size values, and to a lesser extent on channel width measurements. Finally, we found that the c. 40 Ka-old Minchin terrace material was more poorly sorted than the modern deposits, which might indicate that sediment transport during this past period was characterized by a larger divergence from equal mobility than under the modern situation. In summary, the differences in grain size distribution and inferred water discharge between the modern and the paleo-Pisco River suggest that the 40 Ka-old Minchin period was characterized by a wetter climate and more powerful flood events.
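The abstract notes that the discharge estimates depend mainly on channel gradient and grain size. As a rough illustration of how such quantities enter a discharge calculation, here is a minimal Manning-type estimate with a Strickler-type roughness derived from grain size; the channel dimensions, gradients and grain sizes are invented and this is not the authors' actual reconstruction scheme:

```python
def manning_discharge(width_m, depth_m, slope, d50_m):
    """Rough Manning-type discharge estimate for a wide rectangular channel.
    Roughness n from the Strickler relation n = d50^(1/6) / 21.1 (d50 in metres)."""
    n = d50_m ** (1.0 / 6.0) / 21.1
    area = width_m * depth_m
    wetted_perimeter = width_m + 2.0 * depth_m
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# invented channel geometries, gradients and grain sizes, for illustration only
modern = manning_discharge(width_m=40.0, depth_m=1.0, slope=0.02, d50_m=0.08)
paleo  = manning_discharge(width_m=45.0, depth_m=3.5, slope=0.018, d50_m=0.10)
print(f"modern ~ {modern:.0f} m3/s, paleo ~ {paleo:.0f} m3/s, "
      f"ratio ~ {paleo / modern:.1f}")
```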
Abstract:
The basis for the recent transition of Enterococcus faecium from a primarily commensal organism to one of the leading causes of hospital-acquired infections in the United States is not yet understood. To address this, the first part of my project assessed isolates from early outbreaks in the USA and South America using sequence analysis, colony hybridizations, and minimal inhibitory concentrations (MICs), which showed that clinical isolates possess virulence and antibiotic resistance determinants that are less abundant or lacking in community isolates. I also revealed that the level of ampicillin resistance increased over time in clinical strains. By sequencing the pbp5 gene, I demonstrated an ~5% difference in the pbp5 gene between strains with MICs <4 µg/ml and those with MICs >4 µg/ml, but no specific sequence changes correlated with increases in MICs within the latter group. A 3-10% nucleotide difference was also seen in three other genes analyzed, which suggested the existence of two distinct subpopulations of E. faecium. This led to the second part of my project, analyzing concatenated core gene sequences, SNPs, the 16S rRNA, and phylogenetics of 21 E. faecium genomes, which confirmed two distinct clades: a community-associated (CA) clade and a hospital-associated (HA) clade. Molecular clock calculations indicate that these two clades likely diverged ~300,000 to >1 million years ago, long before the modern antibiotic era. Genomic analysis also showed that, in addition to core genomic differences, HA E. faecium harbor specific accessory genetic elements that may confer selection advantages over CA E. faecium. The third part of my project discovered 6 E. faecium genes with the newly identified “WxL” domain. My analyses, using RT-PCR, western blots, patient sera, whole-cell ELISA, and immunogold electron microscopy, indicated that E. faecium WxL genes exist in operons and encode bacterial cell surface-localized proteins, and that WxL proteins are antigenic in humans and more exposed on the surface of clinical isolates than of community isolates (even though they are ubiquitous in both clades). ELISAs and BIAcore analyses also showed that proteins encoded by these operons bind several different host extracellular matrix proteins, as well as each other, suggesting a novel cell-surface complex. In summary, my studies provide new insights into the evolution of E. faecium by showing that there are two distantly related clades, one of which is more successful in the hospital setting. My studies also identified operons encoding WxL proteins whose characteristics could contribute to colonization and virulence within this species.
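Both the ~5% pbp5 difference and the molecular-clock divergence estimate rest on simple pairwise sequence arithmetic. A minimal sketch, with invented toy fragments and an assumed substitution rate purely for illustration:

```python
def pairwise_difference(seq_a, seq_b):
    """Per cent nucleotide difference between two aligned sequences (gaps ignored)."""
    pairs = [(a, b) for a, b in zip(seq_a, seq_b) if a != "-" and b != "-"]
    diffs = sum(a != b for a, b in pairs)
    return 100.0 * diffs / len(pairs)

def divergence_time(percent_diff, rate_per_site_per_year):
    """Naive molecular-clock estimate: time = d / (2 * rate) for two diverging lineages."""
    d = percent_diff / 100.0
    return d / (2.0 * rate_per_site_per_year)

# toy aligned fragments and an assumed illustrative rate, not real data
seq_ca = "ATGGCTAAGGTTCTGGATCCAGTTAACGGAACTGTTAGCA"
seq_ha = "ATGGCCAAGGTTCTGGATCCAGTCAACGGAACTGTTAGCA"
d = pairwise_difference(seq_ca, seq_ha)
print(f"difference = {d:.1f}%")
print(f"divergence ~ {divergence_time(d, 1e-7):,.0f} years ago (assumed rate 1e-7/site/yr)")
```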
Abstract:
During ODP Leg 166, the recovery of cores from a transect of drill sites across the Bahamas margin, from marginal to deep basin environments, was an essential requirement for the study of the response of the sedimentary systems to sea-level changes. A detailed biostratigraphy based on planktonic foraminifera was performed on ODP Hole 1006A for accurate stratigraphic control. The investigated late middle Miocene-early Pliocene sequence spans the interval from about 12.5 Ma (Biozone N12) to approximately 4.5 Ma (Biozone N19). Several bioevents calibrated with the time scale of Berggren et al. (1995a,b) were identified. The ODP Site 1006 benthic oxygen isotope stratigraphy can be correlated to the corresponding deep-water benthic oxygen isotope curve from ODP Site 846 in the Eastern Equatorial Pacific (Shackleton et al., 1995. Proc. ODP Sci. Res. 138, 337-356), which was orbitally tuned for the entire Pliocene and into the latest Miocene at 6.0 Ma. The approximate stratigraphic match of the isotopic signals from both records between 4.5 and 6.0 Ma implies that the paleoceanographic signal from the Bahamas is not simply a record of regional variations but, indeed, represents glacio-eustatic fluctuations. The ODP Site 1006 oxygen and carbon isotope record, based on benthic and planktonic foraminifera, was used to define paleoceanographic changes on the margin, which could be tied to lithostratigraphic events on the Bahamas carbonate platform using seismic sequence stratigraphy. The oxygen isotope values show a general cooling trend from the middle to late Miocene, which was interrupted by a significant trend towards warmer sea-surface temperatures (SST) and an associated sea-level rise with decreased ice volume during the latest Miocene. This warming trend reached a maximum coincident with the Miocene/Pliocene boundary, continued into the earliest Pliocene, and was then followed by an abrupt cooling in the early Pliocene. The late Miocene paleoceanographic evolution along the Bahamas margin can be observed in the ODP Site 1006 δ13C values, which support other evidence for the beginning of the closure of the Panama gateway at 8 Ma, followed by a reduced supply of intermediate water from the Pacific into the Caribbean at about 5 Ma. A general correlation of lower sedimentation rates with the major seismic sequence boundaries (SSBs) was observed. Additionally, the SSBs are associated with transitions towards more positive oxygen isotope excursions. This correspondence implies that the presence of a SSB, representing a density impedance contrast in the sedimentary sequence, may reflect changes in the character of the sediment deposited during highstands versus lowstands. However, not all of the recorded oxygen isotope excursions correspond to SSBs. The absence of a SSB in association with an oxygen isotope excursion indicates that not all oxygen isotope sea-level events affect the carbonate margin to the same extent, or perhaps that they do not all represent equivalent sea-level fluctuations. Thus, it can be tentatively concluded that SSBs produced on carbonate margins do record sea-level fluctuations, but not every sea-level fluctuation is represented by a SSB in the sequence stratigraphic record.
Abstract:
The main objective of this article is to analyze the evolution of teaching techniques, ranging from the use of blackboard and chalk in traditional classes, through slides and overhead projectors in the eighties and presentation software in the nineties, to video, electronic boards and network resources today. Furthermore, all of the above is viewed in light of the different mentalities through which the teacher conditions the student with each new teaching technique, improving soft skills but possibly leading either to encouragement or to disinterest, and sometimes to a lack of consolidation of educational knowledge at the scientific, technological and subject-specific levels. In the same way, we study the process of adaptation required of teachers, the differences in the processes of information transfer and education towards the student, and even the existence of teachers who are no longer engaged by their work, which has become much simpler owing to new technologies and the greater ease of preparing classes under the criteria described in the new degree programs adopted by the European Higher Education Area. Moreover, we also intend to understand the evolution of students' profiles from the eighties to the present, in order to understand certain attitudes, behaviours, accomplishments and acknowledgements acquired over the semesters of the degree programs. As an Educational Innovation Group, another key question also arises: what will the learning techniques of the future be, and how will these evolving matters affect, both positively and negatively, the mentality, attitude, behaviour, learning, achievement of goals and satisfaction levels of all elements involved in university education? Clearly, this evolution from chalk to the electronic board, with the three-dimensional view of our work and its sequence, greatly facilitates understanding and later adaptation to the business world, but it does not answer the unknowns regarding knowledge and the full development of achievement indicators for the basic skills of a degree. This is the underlying question which steers the roots of the presented research.
Abstract:
For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and accident response. Nuclear data are one source of uncertainty, entering into neutronics, fuel depletion and activation calculations. These calculations predict critical response functions during operation and in the event of an accident, such as the decay heat after reactor shutdown and the neutron multiplication factor. Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties, and it is also necessary to understand the current status of nuclear data and their uncertainties in order to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes that are important for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using these data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of the nuclear data with uncertainties. Different approaches are presented depending on the cross-section energy-group structure: one-group, one-group with correlated sampling, and multi-group; their differences and applicability criteria are discussed. Sequences have been developed for using nuclear data libraries stored in different formats: ENDF-6 (for evaluated libraries), COVERX (for the multi-group libraries of SCALE) and EAF (for activation libraries). A review of the state of the art of fission yield data reveals a lack of uncertainty information, in particular of complete covariance matrices. Furthermore, the international community has expressed a renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) Subgroup 37 (SG37), which is dedicated to assessing the needs for improved nuclear data. This motivated a review of the state of the art of methodologies for generating covariance data for fission yields, from which a Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to address the lack of complete covariance matrices. Once the Hybrid Method had been implemented, developed and extended, along with the fission yield covariance generation capability, different nuclear applications were studied. The fission pulse decay heat problem is tackled first because of its importance for any event after reactor shutdown and because it is a clean exercise for showing the impact and importance of decay data and fission yield uncertainties in conjunction with the new covariance matrices. Two fuel cycles of advanced reactors are then studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), for which the impact of nuclear data uncertainties on response functions such as isotopic composition, decay heat and radiotoxicity is addressed. Different nuclear data libraries are used and compared. These applications also serve as frameworks for comparing the different approaches of the Hybrid Method, and for comparing it with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
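The Hybrid Method is based on Monte Carlo sampling of nuclear data with uncertainties. A minimal sketch of that idea, sampling correlated toy data (a decay constant and a fission yield) from an assumed covariance matrix and propagating them through a one-nuclide decay-heat response; the numbers are illustrative and this is not the ACAB/Hybrid Method implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "nuclear data": decay constant and cumulative fission yield with relative
# uncertainties and an assumed correlation; values are illustrative, not evaluated data.
mean = np.array([1.0e-5, 0.06])          # lambda [1/s], cumulative yield [-]
rel_sigma = np.array([0.02, 0.05])       # 2 % and 5 % relative standard deviations
corr = np.array([[1.0, 0.3],
                 [0.3, 1.0]])
cov = np.outer(mean * rel_sigma, mean * rel_sigma) * corr

def decay_heat(params, n_fissions=1.0e18, e_per_decay_mev=1.5, t=3600.0):
    """Decay power [W] of a single fission product at cooling time t (toy model)."""
    lam, y = params
    n0 = y * n_fissions                       # atoms produced by the fission pulse
    activity = lam * n0 * np.exp(-lam * t)    # decays per second
    return activity * e_per_decay_mev * 1.602e-13

# Monte Carlo propagation: sample nuclear data, evaluate the response each time
samples = rng.multivariate_normal(mean, cov, size=5000)
responses = np.array([decay_heat(s) for s in samples])
print(f"decay heat = {responses.mean():.3e} W "
      f"+/- {100 * responses.std() / responses.mean():.1f} %")
```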
Abstract:
Evolution of HIV-1 env sequences was studied in 15 seroconverting injection drug users selected for differences in the extent of CD4 T cell decline. The rates of increase of either sequence diversity at a given visit or divergence from the first seropositive visit were both higher in progressors than in nonprogressors. Viral evolution in individuals with rapid or moderate disease progression showed selection favoring nonsynonymous mutations, while nonprogressors with low viral loads selected against the nonsynonymous mutations that might have resulted in viruses with higher levels of replication. For 10 of the 15 subjects no single variant predominated over time. Evolution away from a dominant variant was followed frequently at a later time point by return to dominance of strains closely related to that variant. The observed evolutionary pattern is consistent with either selection against only the predominant virus or independent evolution occurring in different environments within the host. Differences in the level to which CD4 T cells fall in a given time period reflect not only quantitative differences in accumulation of mutations, but differences in the types of mutations that provide the best adaptation to the host environment.
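Diversity here is the mean pairwise distance among sequences sampled at one visit, and divergence is the mean distance from the first seropositive visit. A minimal sketch of both measures with invented toy env fragments:

```python
from itertools import combinations

def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def diversity(seqs):
    """Mean pairwise distance among sequences sampled at one visit."""
    pairs = list(combinations(seqs, 2))
    return sum(p_distance(a, b) for a, b in pairs) / len(pairs)

def divergence(seqs, baseline_seqs):
    """Mean distance of a visit's sequences from the first seropositive visit."""
    dists = [p_distance(a, b) for a in seqs for b in baseline_seqs]
    return sum(dists) / len(dists)

# toy env fragments: baseline visit vs a later visit
visit0 = ["ATGGCAACTT", "ATGGCAACTT", "ATGGCGACTT"]
visit1 = ["ATGACAACTT", "ATGACATCTT", "ATGGCATCTA"]
print(f"diversity(visit1)  = {diversity(visit1):.3f}")
print(f"divergence(visit1) = {divergence(visit1, visit0):.3f}")
```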
Abstract:
Unmethylated CpG dinucleotides in particular base contexts (CpG-S motifs) are relatively common in bacterial DNA but are rare in vertebrate DNA. B cells and monocytes have the ability to detect such CpG-S motifs that trigger innate immune defenses with production of Th1-like cytokines. Despite comparable levels of unmethylated CpG dinucleotides, DNA from serotype 12 adenovirus is immune-stimulatory, but serotype 2 is nonstimulatory and can even inhibit activation by bacterial DNA. In type 12 genomes, the distribution of CpG-flanking bases is similar to that predicted by chance. However, in type 2 adenoviral DNA the immune stimulatory CpG-S motifs are outnumbered by a 15- to 30-fold excess of CpG dinucleotides in clusters of direct repeats or with a C on the 5′ side or a G on the 3′ side. Synthetic oligodeoxynucleotides containing these putative neutralizing (CpG-N) motifs block immune activation by CpG-S motifs in vitro and in vivo. Eliminating 52 of the 134 CpG-N motifs present in a DNA vaccine markedly enhanced its Th1-like function in vivo, which was increased further by the addition of CpG-S motifs. Thus, depending on the CpG motif, prokaryotic DNA can be either immune-stimulatory or neutralizing. These results have important implications for understanding microbial pathogenesis and molecular evolution and for the clinical development of DNA vaccines and gene therapy vectors.
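A rough sketch of how stimulatory-like and neutralizing-like CpG contexts can be counted in a sequence; the motif definitions below (purine-purine-CG-pyrimidine-pyrimidine for CpG-S, and a 5′ C or 3′ G flank for CpG-N) paraphrase the abstract and are not the authors' exact criteria:

```python
import re

def count_cpg_motifs(seq):
    """Count CpG dinucleotides whose flanking bases match a stimulatory-like
    context (purine-purine-CG-pyrimidine-pyrimidine) or a neutralizing-like
    context (C immediately 5' of the CpG, or G immediately 3' of it)."""
    seq = seq.upper()
    stimulatory = len(re.findall(r"(?=[AG][AG]CG[CT][CT])", seq))
    neutralizing = len(re.findall(r"(?=CCG)", seq)) + len(re.findall(r"(?=CGG)", seq))
    return stimulatory, neutralizing

# toy fragment for illustration
s, n = count_cpg_motifs("TGACGTTAACCGGTTCCGAAGGAACGTTCGG")
print(f"CpG-S-like: {s}, CpG-N-like: {n}")
```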
Abstract:
Attempts to calibrate bacterial evolution have relied on the assumption that rates of molecular sequence divergence in bacteria are similar to those of higher eukaryotes, or to those of the few bacterial taxa for which ancestors can be reliably dated from ecological or geological evidence. Despite similarities in the substitution rates estimated for some lineages, comparisons of the relative rates of evolution at different classes of nucleotide sites indicate no basis for their universal application to all bacteria. However, there is evidence that bacteria have a constant genome-wide mutation rate on an evolutionary time scale but that this rate differs dramatically from the rate estimated by experimental methods.
Abstract:
Recombination of genes is essential to the evolution of genetic diversity, the segregation of chromosomes during cell division, and certain DNA repair processes. The Holliday junction, a four-arm, four-strand branched DNA crossover structure, is formed as a transient intermediate during genetic recombination and repair processes in the cell. The recognition and subsequent resolution of Holliday junctions into parental or recombined products appear to be critically dependent on their three-dimensional structure. Complementary NMR and time-resolved fluorescence resonance energy transfer experiments on immobilized four-arm DNA junctions reported here indicate that the Holliday junction cannot be viewed as a static structure but rather as an equilibrium mixture of two conformational isomers. Furthermore, the distribution between the two possible crossover isomers was found to depend on the sequence in a manner that was not anticipated on the basis of previous low-resolution experiments.
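The time-averaged FRET signal of such an equilibrium mixture can be decomposed into populations of the two crossover isomers with known limiting efficiencies. A minimal sketch using the standard Förster relation; all distances and the observed efficiency are invented:

```python
def fret_efficiency(r, r0):
    """Förster transfer efficiency for donor-acceptor distance r and Förster radius r0."""
    return 1.0 / (1.0 + (r / r0) ** 6)

def isomer_fraction(e_observed, e_iso1, e_iso2):
    """Fraction of isomer 1 in a two-state mixture from the averaged efficiency:
    E_obs = f * E_iso1 + (1 - f) * E_iso2."""
    return (e_observed - e_iso2) / (e_iso1 - e_iso2)

# invented distances (in angstroms) for one dye pair in the two stacked conformers
e1 = fret_efficiency(r=45.0, r0=55.0)   # labelled arms close in isomer 1
e2 = fret_efficiency(r=75.0, r0=55.0)   # labelled arms far apart in isomer 2
f = isomer_fraction(e_observed=0.55, e_iso1=e1, e_iso2=e2)
print(f"E1={e1:.2f}, E2={e2:.2f}, fraction of isomer 1 ~ {f:.2f}")
```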
Abstract:
Biological speciation ultimately results in prezygotic isolation, the inability of incipient species to mate with one another, but little is understood about the selection pressures and genetic changes that generate this outcome. The genus Chlamydomonas comprises numerous species of unicellular green algae, including many geographic isolates of the species C. reinhardtii. This diverse collection has allowed us to analyze the evolution of two sex-related genes: the mid gene of C. reinhardtii, which determines whether a gamete is mating-type plus or minus, and the fus1 gene, which dictates a cell surface glycoprotein utilized by C. reinhardtii plus gametes to recognize minus gametes. Low-stringency Southern analyses failed to detect any fus1 homologs in other Chlamydomonas species and detected only one mid homolog, documenting that both genes have diverged extensively during the evolution of the lineage. The one mid homolog was found in C. incerta, the species in culture that is most closely related to C. reinhardtii. Its mid gene carries numerous nonsynonymous and synonymous codon changes compared with the C. reinhardtii mid gene. In contrast, very high sequence conservation of both the mid and fus1 sequences is found in natural isolates of C. reinhardtii, indicating that the genes are not free to drift within a species but do diverge dramatically between species. Striking divergence of sex determination and mate recognition genes has also been encountered in a number of other eukaryotic phyla, suggesting that unique, and as yet unidentified, selection pressures act on these classes of genes during the speciation process.
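Classifying codon changes between two aligned coding sequences as synonymous or nonsynonymous is straightforward; a hedged sketch, assuming Biopython is available and using invented mid-like fragments rather than the real genes:

```python
from Bio.Seq import Seq

def classify_codon_changes(cds_a, cds_b):
    """Count synonymous and nonsynonymous codon differences between two
    aligned, in-frame coding sequences (toy classification, codon by codon)."""
    syn = nonsyn = 0
    for i in range(0, len(cds_a) - 2, 3):
        codon_a, codon_b = cds_a[i:i + 3], cds_b[i:i + 3]
        if codon_a == codon_b:
            continue
        if str(Seq(codon_a).translate()) == str(Seq(codon_b).translate()):
            syn += 1
        else:
            nonsyn += 1
    return syn, nonsyn

# invented fragments for illustration only
mid_reinhardtii = "ATGGCTCGTTTGGAAGACACC"
mid_incerta     = "ATGGCCCGTATGGAAGATACA"
print(classify_codon_changes(mid_reinhardtii, mid_incerta))  # (synonymous, nonsynonymous)
```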
Abstract:
The integrin family of cell surface receptors is strongly conserved in higher animals, but the evolutionary history of integrins is obscure. We have identified and sequenced cDNAs encoding integrin β subunits from a coral (phylum Cnidaria) and a sponge (Porifera), indicating that these proteins existed in the earliest stages of metazoan evolution. The coral βCn1 and, especially, the sponge βPo1 sequences are the most divergent of the “β1-class” integrins and share a number of features not found in any other vertebrate or invertebrate integrins. Perhaps the greatest difference from other β subunits is found in the third and fourth repeats of the cysteine-rich stalk, where the generally conserved spacings between cysteines are highly variable, but not similar, in βCn1 and βPo1. Alternatively spliced cDNAs, containing a stop codon about midway through the full-length translated sequence, were isolated from the sponge library. These cDNAs appear to define a boundary between functional domains, as they would encode a protein that includes the globular ligand-binding head but would be missing the stalk, transmembrane, and cytoplasmic domains. These and other sequence comparisons with vertebrate integrins are discussed with respect to models of integrin structure and function.
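The cysteine spacings in the stalk repeats can be compared with simple positional arithmetic on the sequences. A small sketch with invented repeat fragments (not the actual βCn1 or βPo1 sequences):

```python
def cysteine_spacings(repeat_seq):
    """Return the gaps (in residues) between successive cysteines of one repeat."""
    positions = [i for i, aa in enumerate(repeat_seq) if aa == "C"]
    return [b - a for a, b in zip(positions, positions[1:])]

# invented stalk-repeat fragments, for illustration only
beta1_like  = "CTAEVGKCGGHAECSQCLRSDPALGGSRCE"
betaPo1_like = "CGENAQCTSPLCGGRGDCVATEPGSRQCAQ"
print(cysteine_spacings(beta1_like))
print(cysteine_spacings(betaPo1_like))
```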
Abstract:
An evolutionary process is simulated with a simple spin-glass-like model of proteins to examine the origin of folding ability. At each generation, sequences are randomly mutated and subjected to a simulation of the folding process based on the model. According to the frequency of local configurations at the active sites, sequences are selected and passed to the next generation. After a few hundred generations, a sequence capable of folding globally into a native conformation emerges. Moreover, the selected sequence has a distinct energy minimum and an anisotropic funnel on the energy surface, which are the imperative features for fast folding of proteins. The proposed model reveals that the functional selection on the local configurations leads a sequence to fold globally into a conformation at a faster rate.
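The described scheme is an iterative mutate-fold-select loop. A minimal sketch of such a generational loop, with a stand-in scoring function in place of the authors' spin-glass folding simulation (alphabet, target pattern and parameters are all toy choices):

```python
import random

random.seed(0)

ALPHABET = "HP"                      # toy two-letter alphabet (hydrophobic/polar)
TARGET_PATTERN = "HPPHHPPHHPPHHPPH"  # stand-in for "native-like" local configurations

def fitness(seq):
    """Toy selection score: fraction of positions matching the preferred local
    pattern (stands in for the frequency of native local configurations at
    active sites used in the original model)."""
    return sum(a == b for a, b in zip(seq, TARGET_PATTERN)) / len(seq)

def mutate(seq, rate=0.05):
    """Randomly mutate each position with a small per-site probability."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c for c in seq)

population = ["".join(random.choice(ALPHABET) for _ in TARGET_PATTERN) for _ in range(50)]
for generation in range(200):
    population = [mutate(s) for s in population]      # random mutation
    population.sort(key=fitness, reverse=True)        # evaluate the stand-in "folding" score
    population = population[:25] * 2                  # select the best half and repopulate
best = max(population, key=fitness)
print(generation + 1, best, f"fitness={fitness(best):.2f}")
```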