931 results for Recent Structural Models
Abstract:
Recent signaling resolution models of parent–offspring conflict have provided an important framework for theoretical and empirical studies of communication and parental care. According to these models, signaling of need is stabilized by its cost. However, our computer simulations of the evolutionary dynamics of chick begging and parental investment show that in Godfray’s model the signaling equilibrium is evolutionarily unstable: populations that start at the signaling equilibrium quickly depart from it. Furthermore, the signaling and nonsignaling equilibria are linked by a continuum of equilibria in which chicks above a certain condition do not signal, and we show that, contrary to intuition, fitness increases monotonically as the proportion of young that signal decreases. This result forces us to reconsider much of the current literature on signaling of need and highlights the need to investigate the evolutionary stability of signaling equilibria based on the handicap principle.
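To make the simulation approach concrete, below is a minimal two-population replicator-dynamics sketch of a costly-signalling game. The payoff structure, parameter names, and values are illustrative placeholders, not Godfray's model, and this toy is not expected to reproduce the instability result described above.

```python
import numpy as np

# Toy replicator dynamics for a costly-signalling game of need.
# x = frequency of chicks that signal when needy; y = frequency of parents
# that feed only upon seeing a signal. Payoffs are illustrative, NOT
# Godfray's model.
p, b, c = 0.5, 1.0, 0.3   # probability of need, benefit of food, signal cost
r, k = 0.5, 0.2           # relatedness, parental cost of feeding

def step(x, y, dt=0.01):
    f_signal = p * (y * b - c)        # expected payoff to signalling chicks
    f_respond = x * p * (r * b - k)   # inclusive-fitness payoff to responders
    x += dt * x * (1 - x) * f_signal  # non-signallers and ignorers get 0
    y += dt * y * (1 - y) * f_respond
    return x, y

x, y = 0.99, 0.99                     # start near the signalling equilibrium
for _ in range(20_000):
    x, y = step(x, y)
print(f"long-run frequencies: signallers = {x:.3f}, responders = {y:.3f}")
```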
Abstract:
There has been a recent burst of activity in the atmosphere/ocean sciences community in utilizing stable linear Langevin stochastic models for the unresolved degrees of freedom in stochastic climate prediction. Here several idealized models for stochastic climate modeling are introduced and analyzed through unambiguous mathematical theory. This analysis demonstrates the potential need for more sophisticated models beyond stable linear Langevin equations. The new phenomena include the emergence of unstable linear Langevin stochastic models for the climate mean and the need to incorporate suitable nonlinear effects and multiplicative noise in stochastic models under appropriate circumstances. The strategy for stochastic climate modeling that emerges from this analysis is illustrated on an idealized example involving truncated barotropic flow on a beta-plane with topography and a mean flow. In this example, the effect of the original 57 degrees of freedom is well represented by a theoretically predicted stochastic model with only 3 degrees of freedom.
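For reference, the baseline model class discussed here, a stable linear Langevin (Ornstein-Uhlenbeck) equation with additive noise, can be simulated in a few lines. This Euler-Maruyama sketch uses illustrative parameter values; the paper's point is precisely that richer models (unstable drift, nonlinearity, multiplicative noise) are sometimes needed.

```python
import numpy as np

# Euler-Maruyama simulation of a stable linear Langevin model,
#   dx = -gamma * x dt + sigma dW,
# the baseline class whose limitations the analysis above exposes.
rng = np.random.default_rng(0)
gamma, sigma = 1.0, 0.5          # damping rate and additive-noise amplitude
dt, n_steps = 1e-3, 100_000

x = np.empty(n_steps)
x[0] = 0.0
for i in range(1, n_steps):
    x[i] = (x[i - 1] - gamma * x[i - 1] * dt
            + sigma * np.sqrt(dt) * rng.standard_normal())

# The stationary variance should approach sigma**2 / (2 * gamma) = 0.125.
print("empirical variance:", x[10_000:].var())
```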
Abstract:
The b locus encodes a transcription factor that regulates the expression of genes that produce purple anthocyanin pigment. Different b alleles are expressed in distinct tissues, causing tissue-specific anthocyanin production. Understanding how phenotypic diversity is produced and maintained at the b locus should provide models for how other regulatory genes, including those that influence morphological traits and development, evolve. We have investigated how different levels and patterns of pigmentation have evolved by determining the phenotypic and evolutionary relationships between 18 alleles that represent the diversity of b alleles in Zea mays. Although most of these alleles have few phenotypic differences, five alleles have very distinct tissue-specific patterns of pigmentation. Superimposing the phenotypes on the molecular phylogeny reveals that the alleles with strong and distinctive patterns of expression are closely related to alleles with weak expression, implying that the distinctive patterns have arisen recently. We have identified apparent insertions in three of the five phenotypically distinct alleles, and the fourth has unique upstream restriction fragment length polymorphisms relative to closely related alleles. The insertion in B-Peru has been shown to be responsible for its unique expression and, in the other two alleles, the presence of the insertion correlates with the phenotype. These results suggest that major changes in gene expression are probably the result of large-scale changes in DNA sequence and/or structure most likely mediated by transposable elements.
Abstract:
The heroin analogue 1-methyl-4-phenylpyridinium, MPP+, both in vitro and in vivo, produces death of dopaminergic substantia nigral cells by inhibiting the mitochondrial NADH dehydrogenase multienzyme complex, producing a syndrome indistinguishable from Parkinson's disease. Similarly, a fragment of amyloid protein, Aβ1–42, is lethal to hippocampal cells, producing recent memory deficits characteristic of Alzheimer's disease. Here we show that addition of 4 mM d-β-hydroxybutyrate protected cultured mesencephalic neurons from MPP+ toxicity and hippocampal neurons from Aβ1–42 toxicity. Our previous work in heart showed that ketone bodies, normal metabolites, can correct defects in mitochondrial energy generation. The ability of ketone bodies to protect neurons in culture suggests that defects in mitochondrial energy generation contribute to the pathophysiology of both brain diseases. These findings further suggest that ketone bodies may play a therapeutic role in these most common forms of human neurodegeneration.
Abstract:
Bacterial tmRNA mediates a trans-translation reaction, which permits the recycling of stalled ribosomes and probably also contributes to the regulated expression of a subset of genes. Its action results in the addition of a small number of C-terminal amino acids to proteins whose synthesis has stalled, and these constitute a proteolytic recognition tag for the degradation of the incompletely synthesized proteins. Previous work has identified pseudoknots and stem–loops that are widely conserved in divergent bacteria. In the present work, an alignment of tmRNA gene sequences from 13 β-proteobacteria reveals an additional sub-structure specific to this bacterial group. This sub-structure lies in pseudoknot 2 (Pk2) and consists of one or two additional stem–loops capped by stable GNRA tetraloops. Three-dimensional models of Pk2 containing various topological versions of the additional sub-structure suggest that the sub-structures likely point away from the core of the RNA, which contains both the tRNA and the mRNA domains. A putative tertiary interaction has also been identified.
Abstract:
Estimation of evolutionary distances has always been a major issue in the study of molecular evolution because evolutionary distances are required for estimating the rate of evolution in a gene, the divergence dates between genes or organisms, and the relationships among genes or organisms. Other closely related issues are the estimation of the pattern of nucleotide substitution, the estimation of the degree of rate variation among sites in a DNA sequence, and statistical testing of the molecular clock hypothesis. Mathematical treatments of these problems are considerably simplified by the assumption of a stationary process in which the nucleotide compositions of the sequences under study have remained approximately constant over time, and there now exist fairly extensive studies of stationary models of nucleotide substitution, although some problems remain to be solved. Nonstationary models are much more complex, but significant progress has been recently made by the development of the paralinear and LogDet distances. This paper reviews recent studies on the above issues and reports results on correcting the estimation bias of evolutionary distances, the estimation of the pattern of nucleotide substitution, and the estimation of rate variation among the sites in a sequence.
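The paralinear/LogDet distance mentioned above has a closed form: d = -(1/4) ln[ det(J) / sqrt(det(Dx) det(Dy)) ], where J is the 4x4 joint frequency matrix of aligned bases and Dx, Dy are diagonal matrices of the two sequences' base frequencies. A minimal sketch follows; the toy sequences are placeholders, and real use requires long alignments so that J is well estimated.

```python
import numpy as np

# Paralinear/LogDet distance between two aligned DNA sequences
# (Lake 1994; Lockhart et al. 1994). Robust to nonstationary base
# composition, unlike distances that assume a stationary process.
def paralinear_distance(seq_x, seq_y):
    idx = {"A": 0, "C": 1, "G": 2, "T": 3}
    J = np.zeros((4, 4))
    for a, b in zip(seq_x, seq_y):
        J[idx[a], idx[b]] += 1            # joint counts of aligned bases
    J /= J.sum()                          # joint frequency matrix
    dx, dy = J.sum(axis=1), J.sum(axis=0)  # marginal base frequencies
    return -0.25 * (np.log(np.linalg.det(J))
                    - 0.5 * (np.log(dx.prod()) + np.log(dy.prod())))

# Tiny toy example:
x = "ACGTACGTACGTACGTAACCGGTT"
y = "ACGTACGAACGTACCTAACCGGTT"
print(f"paralinear distance: {paralinear_distance(x, y):.4f}")
```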
Abstract:
Self-incompatibility RNases (S-RNases) are an allelic series of style glycoproteins associated with rejection of self-pollen in solanaceous plants. The nucleotide sequences of S-RNase alleles from several genera have been determined, but the structure of the gene products has only been described for those from Nicotiana alata. We report on the N-glycan structures and the disulfide bonding of the S3-RNase from wild tomato (Lycopersicon peruvianum) and use this and other information to construct a model of this molecule. The S3-RNase has a single N-glycosylation site (Asn-28) to which one of three N-glycans is attached. S3-RNase has seven Cys residues; six are involved in disulfide linkages (Cys-16-Cys-21, Cys-46-Cys-91, and Cys-166-Cys-177), and one has a free thiol group (Cys-150). The disulfide-bonding pattern is consistent with that observed in RNase Rh, a related RNase for which X-ray crystallographic information is available. A molecular model of the S3-RNase shows that four of the most variable regions of the S-RNases are clustered on one surface of the molecule. This is discussed in the context of recent experiments that set out to determine the regions of the S-RNase important for recognition during the self-incompatibility response.
Abstract:
Recent experiments have measured the rate of replication of DNA catalyzed by a single enzyme moving along a stretched template strand. The dependence on tension was interpreted as evidence that T7 and related DNA polymerases convert two (n = 2) or more single-stranded template bases to double helix geometry in the polymerization site during each catalytic cycle. However, we find that structural data on the T7 enzyme–template complex indicate n = 1. We also present a model for the “tuning” of replication rate by mechanical tension. This model considers only local interactions in the neighborhood of the enzyme, unlike previous models that use stretching curves for the entire polymer chain. Our results, with n = 1, reconcile force-dependent replication rate studies with structural data on DNA polymerase complexes.
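The "tuning" idea can be illustrated with a generic Bell-type force-dependent rate, rate(F) = k0 * exp(-n F dx / kT): work done against tension F when n template bases change geometry per catalytic cycle suppresses the rate. The functional form and the values of k0 and dx below are illustrative placeholders, not the authors' published local-interaction model.

```python
import numpy as np

# Generic Bell-type sketch of a tension-tuned replication rate:
#   rate(F) = k0 * exp(-n * F * dx / kT)
# with n template bases converted per catalytic cycle. k0 and dx are
# illustrative placeholders, not the authors' fitted values.
kT = 4.1      # thermal energy at room temperature, pN*nm
k0 = 100.0    # zero-tension turnover rate, 1/s (illustrative)
dx = 0.25     # extension change per converted base, nm (illustrative)

def rate(F_pN, n=1):
    return k0 * np.exp(-n * F_pN * dx / kT)

for F in (0.0, 5.0, 10.0, 20.0):          # tensions in piconewtons
    print(f"F = {F:4.1f} pN -> rate ~ {rate(F):7.2f} 1/s")
```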
Abstract:
The epidermal growth factor receptor (EGFR) and p185c-neu proteins associate as dimers to create an efficient signaling assembly. Overexpression of these receptors together enhances their intrinsic kinase activity and concomitantly results in oncogenic cellular transformation. The ectodomain is able to stabilize the dimer, whereas the kinase domain mediates biological activity. Here we analyze potential interactions of the cytoplasmic kinase domains of the EGFR and p185c-neu tyrosine kinases by homology molecular modeling. This analysis indicates that kinase domains can associate as dimers and, based on intermolecular interaction calculations, that heterodimer formation is favored over homodimers. The study also predicts that the self-autophosphorylation sites located within the kinase domains are not likely to interfere with tyrosine kinase activity, but may regulate the selection of substrates, thereby modulating signal transduction. In addition, the models suggest that the kinase domains of EGFR and p185c-neu can undergo higher order aggregation such as the formation of tetramers. Formation of tetrameric complexes may explain some of the experimentally observed features of their ligand affinity and hetero-receptor internalization.
Abstract:
We summarize recent evidence that models of earthquake faults with dynamically unstable friction laws but no externally imposed heterogeneities can exhibit slip complexity. Two models are described here. The first is a one-dimensional model with velocity-weakening stick-slip friction; the second is a two-dimensional elastodynamic model with slip-weakening friction. Both exhibit small-event complexity and chaotic sequences of large characteristic events. The large events in both models are composed of Heaton pulses. We argue that the key ingredients of these models are reasonably accurate representations of the properties of real faults.
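For concreteness, a heavily simplified one-dimensional block-spring (Burridge-Knopoff-style) sketch with a velocity-weakening friction law is given below. The parameters, friction law, and time-stepping are illustrative and do not reproduce either model in the paper; a fuller treatment would aggregate contiguous slipping intervals into discrete events and handle the stick-slip transition more carefully.

```python
import numpy as np

# Toy 1-D chain of blocks pulled by a slow plate through loader springs
# (kp) and coupled to neighbors by springs (kc), with velocity-weakening
# kinetic friction. All values are illustrative.
rng = np.random.default_rng(1)
N = 64
kc, kp, v_plate = 1.0, 0.1, 1e-3
F_static = 1.0                              # sticking threshold

def friction(v):                            # weakens as slip speed grows
    return F_static / (1.0 + 10.0 * v)

u = rng.uniform(-0.1, 0.1, N)               # block displacements (periodic)
v = np.zeros(N)
dt = 0.01
total_slip = []

for step in range(200_000):
    pull = kp * (v_plate * step * dt - u)
    coupling = kc * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    force = pull + coupling
    sliding = (v > 0) | (np.abs(force) > F_static)
    a = np.where(sliding, force - friction(v), 0.0)
    v = np.where(sliding, np.maximum(v + dt * a, 0.0), 0.0)  # no back-slip
    u += dt * v
    total_slip.append(dt * v.sum())

slip = np.array(total_slip)
print(f"fraction of time steps with slip: {(slip > 0).mean():.3f}")
print(f"largest single-step total slip:  {slip.max():.3g}")
```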
Abstract:
Recent developments in multidimensional heteronuclear NMR spectroscopy and large-scale synthesis of uniformly 13C- and 15N-labeled oligonucleotides have greatly improved the prospects for determination of the solution structure of RNA. However, there are circumstances in which it may be advantageous to label only a segment of the entire RNA chain. For example, in a larger RNA molecule the structural question of interest may reside in a localized domain. Labeling only the corresponding nucleotides simplifies the spectrum and resonance assignments because one can filter proton spectra for coupling to 13C and 15N. Another example is in resolving alternative secondary structure models that are indistinguishable in imino proton connectivities. Here we report a general method for enzymatic synthesis of quantities of segmentally labeled RNA molecules required for NMR spectroscopy. We use the method to distinguish definitively two competing secondary structure models for the 5' half of Caenorhabditis elegans spliced leader RNA by comparison of the two-dimensional [15N]1H heteronuclear multiple quantum correlation spectrum of the uniformly labeled sample with that of a segmentally labeled sample. The method requires relatively small samples: solutions in the 200–300 μM concentration range, with a total of 30 nmol (approximately 40 μg) of RNA in approximately 150 μL, give strong NMR signals in a short accumulation time. The method can be adapted to label an internal segment of a larger RNA chain for study of localized structural problems. This definitive approach provides an alternative to the more common enzymatic and chemical footprinting methods for determination of RNA secondary structure.
Abstract:
This paper surveys some of the fundamental problems in natural language (NL) understanding (syntax, semantics, pragmatics, and discourse) and the current approaches to solving them. Some recent developments in NL processing include increased emphasis on corpus-based rather than example- or intuition-based work, attempts to measure the coverage and effectiveness of NL systems, dealing with discourse and dialogue phenomena, and attempts to use both analytic and stochastic knowledge. Critical areas for the future include grammars that are appropriate to processing large amounts of real language; automatic (or at least semi-automatic) methods for deriving models of syntax, semantics, and pragmatics; self-adapting systems; and integration with speech processing. Of particular importance are techniques that can be tuned to such requirements as full versus partial understanding and spoken language versus text. Portability (the ease with which one can configure an NL system for a particular application) is one of the largest barriers to application of this technology.
Abstract:
In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the choice of sample period, the set of endogenous variables, and deterministic terms). For VAR models we use the Granger causality test to verify one variable's ability to forecast another; in the case of cointegration we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small data sets and overfitting problems we use Bayesian VAR models with impulse response functions and variance decomposition to analyze the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating different hypotheses. Three VAR models were used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by evaluating the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); and third, to evaluate the effects of aging on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced by chapter 1, which outlines the context, motivation, and purpose of this research, while the structure and summary, as well as the main results, are set out in the remaining chapters. Chapter 2 examines, using a first-difference VAR model with quarterly Euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results highlight a causal relationship running from the deviation between the growth rates of nominal GDP and target GDP to changes in three-month market interest rates. The same analysis does not appear to confirm the existence of a significant inverse causal relationship from changes in the market interest rate to the deviation between the growth rates of nominal GDP and target GDP. Similar results were obtained when the market interest rate was replaced with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and it raises more general doubts about the applicability of the Taylor rule and all conventional monetary policy rules to the case in question. The results instead appear more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, and more particularly the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015).
These lines of research challenge the simplistic thesis that the scope of monetary policy consists of stabilizing inflation, real GDP, or nominal income around a "natural" equilibrium level. Rather, they suggest that central banks actually pursue a more complex aim, namely the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the relative solvency of economic units. Chapter 3 analyzes the supply of loans, considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate the endogeneity of money in the short and long run with a study of the United States during its two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel using a loan series adjusted for securitization, in order to verify whether the American banking system is encouraged to seek cheaper sources of funding, such as securitization, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship among the variables in levels and evaluate the effects of the money supply by analyzing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitization influences the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as advance cover against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the aging index, and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases. The impulse response functions and variance decomposition highlight a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the aging index to per capita health expenditure. The impact of aging on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely connected with aging may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps improve patient well-being without increasing total health expenditure. However, policies that improve the health status of older people may be needed to lower per capita demand for health and social services.
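A sketch of the chapter 2 exercise using statsmodels: fit a VAR in first differences and run Granger-causality tests in both directions. The series below are simulated placeholders for the actual Euro-area quarterly data, and the variable names are hypothetical.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated stand-ins for the deviation of nominal-GDP growth from target
# and the three-month market rate (toy data, illustrative only).
rng = np.random.default_rng(42)
T = 120
gdp_gap = np.cumsum(rng.normal(0, 1, T))
rate = 0.4 * np.roll(gdp_gap, 1) + rng.normal(0, 1, T)

# VAR in first differences, lag order chosen by AIC.
df = pd.DataFrame({"gdp_gap": gdp_gap, "rate": rate}).diff().dropna()
res = VAR(df).fit(maxlags=4, ic="aic")

# Granger causality in both directions: gap -> rate, then rate -> gap.
print(res.test_causality("rate", ["gdp_gap"], kind="f").summary())
print(res.test_causality("gdp_gap", ["rate"], kind="f").summary())
```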
Abstract:
For more than a decade, price-level control in the Brazilian economy has been carried out within the scope of the Inflation Targeting Regime, which uses macroeconomic models as instruments to guide monetary policy decisions. After a period of relative success (2006-2009), in recent years, despite the efforts of the monetary authorities in applying inflation-containment policies following the precepts of the targeting regime, inflation has proved resistant, prompting a debate about the factors that may be causing this behavior. In the international literature, some studies have credited supply shocks, especially those triggered by changes in commodity prices, with a significant share of inflation, mainly in economies where primary products make up the majority of the export basket. In the Brazilian literature, some studies already point in the same direction. The main objective of the present study was therefore to assess how supply shocks, more specifically shocks originating in commodity prices, have affected Brazilian inflation, and how, and with what efficiency, the country's monetary policy has reacted. To this end, a semi-structural model was estimated containing a Phillips curve, an IS curve, and two versions of the Central Bank's reaction function, in order to verify how monetary policy decisions are made. The estimation method employed was Vector Error Correction (VEC) autoregression in its structural version, which allows a dynamic evaluation of the interdependence relationships among the variables of the proposed model. The estimation of the Phillips curve showed that supply shocks, from commodities as well as from labor productivity and the exchange rate, do not affect inflation immediately, but their relevance grows over time, eventually prevailing over the observed autoregressive (indexation) effect. These shocks also proved important for the behavior of inflation expectations, indicating that their impacts tend to spread to the other sectors of the economy. The results of the IS curve revealed a strong interrelationship between the output gap and the interest rate, indicating that monetary policy, by setting that rate, strongly influences aggregate demand. The estimation of the first reaction function showed a relevant contemporaneous relationship between the deviation of inflation expectations from the target and the Selic rate, whereas the contemporaneous relationship of the output gap with the Selic rate proved small. Finally, the results obtained with the second reaction function confirmed that the monetary authorities react more strongly to inflationary signals in the economy than to movements in economic activity, and showed that an increase in commodity prices does not in itself directly cause an increase in the economy's basic interest rate.
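A sketch of the VEC workflow in statsmodels (cointegration-rank selection via the Johansen trace test, VECM fit, impulse responses). The simulated series and names stand in for the study's actual inflation, output-gap, and Selic data; the structural identification step itself is not shown.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Simulated cointegrated system: three series sharing one stochastic trend
# (placeholders for the study's actual macro series).
rng = np.random.default_rng(7)
T = 200
common = np.cumsum(rng.normal(size=T))
data = pd.DataFrame({
    "inflation":  common + rng.normal(size=T),
    "output_gap": 0.5 * common + rng.normal(size=T),
    "selic":      0.8 * common + rng.normal(size=T),
})

# Johansen trace test for the cointegration rank, then the VECM fit.
rank = select_coint_rank(data, det_order=0, k_ar_diff=2)
model = VECM(data, k_ar_diff=2, coint_rank=rank.rank, deterministic="co")
res = model.fit()

irf = res.irf(10)        # impulse-response functions, 10 periods ahead
print(res.summary())
```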