911 results for Deterministic walkers


Relevance: 10.00%

Abstract:

Widespread interest in producing transgenic organisms is balanced by concern over ecological hazards, such as species extinction if such organisms were to be released into nature. An ecological risk associated with the introduction of a transgenic organism is that the transgene, though rare, can spread in a natural population. An increase in transgene frequency is often assumed to be unlikely because transgenic organisms typically have some viability disadvantage. Reduced viability is assumed to be common because transgenic individuals are best viewed as macromutants that lack any history of selection that could reduce negative fitness effects. However, these arguments ignore the potentially advantageous effects of transgenes on some aspect of fitness, such as mating success. Here, we examine the risk to a natural population after release of a few transgenic individuals when the transgene trait simultaneously increases transgenic male mating success and lowers the viability of transgenic offspring. We obtained relevant life history data using the small cyprinodont fish Japanese medaka (Oryzias latipes) as a model. Our deterministic equations predict that a transgene introduced into a natural population by a small number of transgenic fish will spread as a result of the enhanced mating advantage, but the reduced viability of offspring will cause eventual local extinction of both the transgenic and wild-type populations. Such risks should be evaluated for each new transgenic animal before release.
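The dynamic the deterministic equations capture — a mating advantage pushing the transgene up in frequency while reduced offspring viability drags the whole population down — can be sketched as a toy recursion. This is an illustrative caricature, not the authors' medaka-parameterized model; all parameter values (mating advantage, relative viability, growth rate, carrying capacity) are invented for the sketch:

```python
def step(q, n, mating_adv=3.0, viability=0.45, growth=2.0, capacity=1000.0):
    """One non-overlapping generation: q = transgenic frequency, n = population size."""
    # transgenic males obtain a disproportionate share of matings
    sired = mating_adv * q / (mating_adv * q + (1.0 - q))
    tg = growth * n * sired * viability      # transgenic offspring that survive
    wt = growth * n * (1.0 - sired)          # wild-type offspring that survive
    total = tg + wt
    if total > capacity:                     # crude density ceiling
        tg, wt, total = capacity * tg / total, capacity * wt / total, capacity
    return tg / total, total

q, n = 0.05, 1000.0                          # a few transgenic fish released
for _ in range(200):
    q, n = step(q, n)
```

With these toy numbers the transgene sweeps toward fixation within a few dozen generations, after which the mean viability deficit makes the per-generation growth factor fall below one and the whole population shrinks toward extinction — the outcome the abstract describes.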

Relevance: 10.00%

Abstract:

It has long been assumed that HIV-1 evolution is best described by deterministic evolutionary models because of the large population size. Recently, however, it was suggested that the effective population size (Ne) may be rather small, thereby allowing chance to influence evolution, a situation best described by a stochastic evolutionary model. To gain experimental evidence supporting one of the evolutionary models, we investigated whether the development of resistance to the protease inhibitor ritonavir affected the evolution of the env gene. Sequential serum samples from five patients treated with ritonavir were used for analysis of the protease gene and the V3 domain of the env gene. Multiple reverse transcription–PCR products were cloned, sequenced, and used to construct phylogenetic trees and to calculate the genetic variation and Ne. Genotypic resistance to ritonavir developed in all five patients, but each patient displayed a unique combination of mutations, indicating a stochastic element in the development of ritonavir resistance. Furthermore, development of resistance induced clear bottleneck effects in the env gene. The mean intrasample genetic variation, which ranged from 1.2% to 5.7% before treatment, decreased significantly (P < 0.025) during treatment. In agreement with these findings, Ne was estimated to be very small (500–15,000) compared with the total HIV-1 RNA copy number. This study combines three independent observations, strong population bottlenecking, small Ne, and selection of different combinations of protease-resistance mutations, all of which indicate that HIV-1 evolution is best described by a stochastic evolutionary model.

Relevance: 10.00%

Abstract:

Dispersive wave turbulence is studied numerically for a class of one-dimensional nonlinear wave equations. Both deterministic and random (white noise in time) forcings are studied. Four distinct stable spectra are observed—the direct and inverse cascades of weak turbulence (WT) theory, thermal equilibrium, and a fourth spectrum (MMT; Majda, McLaughlin, Tabak). Each spectrum can describe long-time behavior, and each can be only metastable (with quite diverse lifetimes)—depending on details of nonlinearity, forcing, and dissipation. Cases of a long-lived MMT transient state decaying to a state with WT spectra, and vice versa, are displayed. In the case of freely decaying turbulence, without forcing, both cascades of weak turbulence are observed. These WT states constitute the clearest and most striking numerical observations of WT spectra to date—over four decades of energy scales and three decades of spatial scales. Numerical experiments that study details of the composition, coexistence, and transition between spectra are then discussed, including: (i) for deterministic forcing, sharp distinctions between focusing and defocusing nonlinearities, including the role of long-wavelength instabilities, localized coherent structures, and chaotic behavior; (ii) the role of energy growth in time to monitor the selection of MMT or WT spectra; (iii) a second manifestation of the MMT spectrum as it describes a self-similar evolution of the wave, without temporal averaging; (iv) coherent structures and the evolution of the direct and inverse cascades; and (v) nonlocality (in k-space) in the transferral process.

Relevance: 10.00%

Abstract:

Follicular dendritic cells (FDC) provide a reservoir for HIV type 1 (HIV-1) that may reignite infection if highly active antiretroviral therapy (HAART) is withdrawn before virus on FDC is cleared. To estimate the treatment time required to eliminate HIV-1 on FDC, we develop deterministic and stochastic models for the reversible binding of HIV-1 to FDC via ligand–receptor interactions and examine the consequences of reducing the virus available for binding to FDC. Analysis of these models shows that the rate at which HIV-1 dissociates from FDC during HAART is biphasic, with an initial period of rapid decay followed by a period of slower exponential decay. The speed of the slower second stage of dissociation and the treatment time required to eradicate the FDC reservoir of HIV-1 are insensitive to the number of virions bound and their degree of attachment to FDC before treatment. In contrast, the expected time required for dissociation of an individual virion from FDC varies sensitively with the number of ligands attached to the virion that are available to interact with receptors on FDC. Although most virions may dissociate from FDC on the time scale of days to weeks, virions coupled to a higher-than-average number of ligands may persist on FDC for years. This result suggests that HAART may not be able to clear all HIV-1 trapped on FDC and that, even if clearance is possible, years of treatment will be required.
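The biphasic decay follows naturally from virion-to-virion variation in ligand number: a population whose members each dissociate as single exponentials with very different rates decays, in aggregate, fast at first and slowly later. A minimal mixture-of-exponentials sketch of that point — the Poisson ligand distribution, rate constants, and time units here are all illustrative assumptions, not the paper's fitted values:

```python
import math

def bound_fraction(t, mean_ligands=4.0, base_rate=1.0, per_bond_factor=0.2, kmax=30):
    """Fraction of virions still bound at time t, ligand counts Poisson-distributed."""
    total = 0.0
    for k in range(1, kmax + 1):
        pk = math.exp(-mean_ligands) * mean_ligands ** k / math.factorial(k)
        rate_k = base_rate * per_bond_factor ** (k - 1)  # each extra bond slows escape
        total += pk * math.exp(-rate_k * t)
    p0 = math.exp(-mean_ligands)
    return total / (1.0 - p0)      # condition on carrying at least one ligand

def decay_rate(t, dt=1e-3):
    """Instantaneous dissociation rate -d ln f / dt, by finite differences."""
    return -(math.log(bound_fraction(t + dt)) - math.log(bound_fraction(t))) / dt

early, late = decay_rate(0.1), decay_rate(50.0)
```

Virions carrying many ligands dominate the slow tail, which is why the expected persistence time of an individual virion is so sensitive to its ligand count even though the aggregate second-stage rate is not.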

Relevance: 10.00%

Abstract:

The theory of stochastic transcription termination based on free-energy competition [von Hippel, P. H. & Yager, T. D. (1992) Science 255, 809–812 and von Hippel, P. H. & Yager, T. D. (1991) Proc. Natl. Acad. Sci. USA 88, 2307–2311] requires two or more reaction rates to be delicately balanced over a wide range of physical conditions. A large body of work on glasses and large molecules suggests that this balancing should be impossible in such a large system in the absence of a new organizing principle of matter. We review the experimental literature of termination and find no evidence for such a principle, but do find many troubling inconsistencies, most notably, anomalous memory effects. These effects suggest that termination has a deterministic component and may conceivably not be stochastic at all. We find that a key experiment by Wilson and von Hippel [Wilson, K. S. & von Hippel, P. H. (1994) J. Mol. Biol. 244, 36–51] thought to demonstrate stochastic termination was an incorrectly analyzed regulatory effect of Mg2+ binding.

Relevance: 10.00%

Abstract:

Human activities have greatly reduced the amount of the earth's area available to wild species. As the area they have left declines, so will their rates of speciation. This loss of speciation will occur for two reasons: species with larger geographical ranges speciate faster; and loss of area drives up extinction rates, thus reducing the number of species available for speciation. Theory predicts steady states in species diversity, and fossils suggest that these have typified life for most of the past 500 million years. Modern and fossil evidence indicates that, at the scale of the whole earth and its major biogeographical provinces, those steady states respond linearly, or nearly so, to available area. Hence, a loss of x% of area will produce a loss of about x% of species. Local samples of habitats merely echo the diversity available in the whole province of which they are a part. So, conservation tactics that rely on remnant patches to preserve diversity cannot succeed for long. Instead, diversity will decay to a depauperate steady state in two phases. The first will involve deterministic extinctions, reflecting the loss of all areas in which a species can ordinarily sustain its demographics. The second will be stochastic, reflecting accidents brought on by global warming, new diseases, and commingling the species of the separate bio-provinces. A new kind of conservation effort, reconciliation ecology, can avoid this decay. Reconciliation ecology discovers how to modify and diversify anthropogenic habitats so that they harbor a wide variety of species. It develops management techniques that allow humans to share their geographical range with wild species.

Relevance: 10.00%

Abstract:

Ecological inference is the process of drawing conclusions about individual-level behavior from aggregate-level data. Recent advances involve the combination of statistical and deterministic means to produce such inferences.

Relevance: 10.00%

Abstract:

Symmetries have played an important role in a variety of problems in geology and geophysics. A large fraction of studies in mineralogy are devoted to the symmetry properties of crystals. In this paper, however, the emphasis will be on scale-invariant (fractal) symmetries. The earth’s topography is an example of both statistically self-similar and self-affine fractals. Landforms are also associated with drainage networks, which are statistical fractal trees. A universal feature of drainage networks and other growth networks is side branching. Deterministic space-filling networks with side-branching symmetries are illustrated. It is shown that naturally occurring drainage networks have symmetries similar to diffusion-limited aggregation clusters.

Relevance: 10.00%

Abstract:

Deterministic chaos has been implicated in numerous natural and man-made complex phenomena ranging from quantum to astronomical scales and in disciplines as diverse as meteorology, physiology, ecology, and economics. However, the lack of a definitive test of chaos vs. random noise in experimental time series has led to considerable controversy in many fields. Here we propose a numerical titration procedure as a simple “litmus test” for highly sensitive, specific, and robust detection of chaos in short noisy data without the need for intensive surrogate data testing. We show that the controlled addition of white or colored noise to a signal with a preexisting noise floor results in a titration index that: (i) faithfully tracks the onset of deterministic chaos in all standard bifurcation routes to chaos; and (ii) gives a relative measure of chaos intensity. Such reliable detection and quantification of chaos under severe conditions of relatively low signal-to-noise ratio is of great interest, as it may open potential practical ways of identifying, forecasting, and controlling complex behaviors in a wide variety of physical, biomedical, and socioeconomic systems.
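The titration idea — add noise until a nonlinearity detector stops firing, and read off the noise level as an index of chaos — can be mimicked with a deliberately simple detector. The quadratic-versus-linear one-step predictor below merely stands in for the paper's detector (the authors' actual test, thresholds, and index definition differ); everything here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic_series(n=2000, r=4.0, x0=0.3):
    """Chaotic test signal: the fully developed logistic map."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def nonlinear_detected(x, margin=0.9):
    """True if a quadratic one-step predictor beats a linear one by the margin."""
    past, future = x[:-1], x[1:]
    mse_lin = np.mean((future - np.polyval(np.polyfit(past, future, 1), past)) ** 2)
    mse_quad = np.mean((future - np.polyval(np.polyfit(past, future, 2), past)) ** 2)
    return mse_quad < margin * mse_lin

def titration_index(x, step=0.05, max_sigma=5.0):
    """Smallest added-noise level (in units of signal std) that kills detection."""
    sigma = 0.0
    while sigma <= max_sigma:
        noisy = x + rng.normal(0.0, sigma * x.std() + 1e-12, x.size)
        if not nonlinear_detected(noisy):
            return sigma
        sigma += step
    return max_sigma

chaotic = titration_index(logistic_series())             # chaos tolerates added noise
random_sig = titration_index(rng.standard_normal(2000))  # pure noise titrates at zero
```

The chaotic signal keeps its detectable determinism under substantial added noise, so it earns a positive titration index; white noise fails the detector immediately, giving an index of zero — the relative-intensity reading the abstract describes.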

Relevance: 10.00%

Abstract:

It has become clear that many organisms possess the ability to regulate their mutation rate in response to environmental conditions. So the question of finding an optimal mutation rate must be replaced by that of finding an optimal mutation schedule. We show that this task cannot be accomplished with standard population-dynamic models. We then develop a "hybrid" model for populations experiencing time-dependent mutation that treats population growth as deterministic but the time of first appearance of new variants as stochastic. We show that the hybrid model agrees well with a Monte Carlo simulation. From this model, we derive a deterministic approximation, a "threshold" model, that is similar to standard population dynamic models but differs in the initial rate of generation of new mutants. We use these techniques to model antibody affinity maturation by somatic hypermutation. We had previously shown that the optimal mutation schedule for the deterministic threshold model is phasic, with periods of mutation between intervals of mutation-free growth. To establish the validity of this schedule, we now show that the phasic schedule that optimizes the deterministic threshold model significantly improves upon the best constant-rate schedule for the hybrid and Monte Carlo models.
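The hybrid idea — population growth treated deterministically, the first appearance of a new variant treated stochastically — can be sketched for the simplest case of constant exponential growth and a constant per-capita mutation rate (both simplifying assumptions of this sketch, not the paper's time-dependent mutation schedules):

```python
import math, random

random.seed(1)

def first_mutant_time(n0=10.0, growth=1.0, mu=1e-4):
    """Sample the first-appearance time of a new variant.

    Wild-type growth is deterministic, N(t) = n0 * exp(growth * t); mutants
    arise as an inhomogeneous Poisson process with intensity mu * N(t).  The
    cumulative intensity M(t) = mu * n0 * (exp(growth * t) - 1) / growth is
    inverted against a unit-rate exponential draw.
    """
    e = random.expovariate(1.0)
    return math.log(1.0 + growth * e / (mu * n0)) / growth

samples = [first_mutant_time() for _ in range(20000)]
mean_t = sum(samples) / len(samples)
```

A purely deterministic "threshold" treatment would instead declare mutants present once the expected number mu * ∫N(s)ds reaches one; the spread of the sampled times around that point is precisely the stochastic element the hybrid model adds.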

Relevance: 10.00%

Abstract:

In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including the sampling period, the set of endogenous variables, and deterministic terms). In the VAR case we use the Granger causality test to verify the ability of one variable to predict another; in the cointegration case we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small data sets and overfitting problems we use Bayesian VAR models, with impulse response functions and variance decomposition, to analyze the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating different hypotheses. Three VAR models were used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence on the money endogeneity hypothesis by assessing the effects of bank securitization on the monetary policy transmission mechanism in the United States (paper 2); third, to evaluate the effects of aging on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced by chapter 1, which outlines the context, motivation, and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters.
Chapter 2 examines, using a VAR model in first differences with quarterly Euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results highlight a causal relationship running from the deviation between the growth rates of nominal GDP and target GDP to changes in three-month market interest rates. The same analysis does not seem to confirm the existence of a significant inverse causal relationship from changes in the market interest rate to the deviation between the growth rates of nominal GDP and target GDP. Similar results were obtained by replacing the market interest rate with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and it raises doubts, in more general terms, about the applicability of the Taylor rule and of all conventional monetary policy rules to the case in question. The results instead appear more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory and, more specifically, the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015). These lines of research dispute the simplistic thesis that the scope of monetary policy consists of stabilizing inflation, real GDP, or nominal income around a "natural equilibrium" level. Rather, they suggest that central banks actually pursue a more complex purpose: the regulation of the financial system, with particular reference to the relationships between creditors and debtors and the relative solvency of economic units.
Chapter 3 analyzes the supply of loans, considering the endogeneity of money arising from banks' securitization activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run through a study of the United States during its two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel using the securitization-adjusted loan series, in order to verify whether the American banking system is driven to seek cheaper sources of funding, such as securitization, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analyzing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitization influences the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and highlighting that economic agents are motivated to increase securitization as a preventive hedge against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the aging index, and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data extracted from the OECD and Eurostat databases.
The impulse response functions and the variance decomposition highlight a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the aging index to per capita health expenditure. The impact of aging on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely linked to aging may be the main driver of health expenditure in the short-to-medium term. Good healthcare management helps improve patient well-being without increasing total health expenditure. However, policies that improve the health status of the elderly may be needed to lower the per capita demand for health and social services.
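A reduced-form Granger test of the kind used in chapter 2 boils down to comparing a regression of a variable on its own lags against one that also includes lags of the candidate cause. A self-contained sketch on synthetic data — the variable names, lag length, and data-generating process are ours for illustration, not the thesis's Euro-area series:

```python
import numpy as np

rng = np.random.default_rng(42)

def granger_f(y, x, lags=2):
    """F statistic for H0: lags of x add nothing to an AR model of y."""
    n = len(y)
    Y = y[lags:]
    own = np.column_stack([y[lags - k:n - k] for k in range(1, lags + 1)])
    cross = np.column_stack([x[lags - k:n - k] for k in range(1, lags + 1)])
    ones = np.ones((len(Y), 1))
    X_r = np.hstack([ones, own])            # restricted: own lags only
    X_u = np.hstack([ones, own, cross])     # unrestricted: plus lags of x
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df1, df2 = lags, len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / df1) / (rss_u / df2)

# x drives y with one lag; the reverse direction is null by construction
T = 500
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()

f_xy = granger_f(y, x)   # large: lags of x help predict y
f_yx = granger_f(x, y)   # small: lags of y do not help predict x
```

The asymmetry between the two F statistics is exactly the one-directional causality pattern reported for the nominal-GDP deviation and the market interest rate.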

Relevance: 10.00%

Abstract:

Ebola virus disease is a lethal human and primate disease that requires particular attention from international health authorities, owing to important recent outbreaks in some West African countries and isolated cases in Europe and North America. Given the urgency of this situation, various decision tools, such as mathematical models, were developed to help the authorities focus their efforts on the factors most important for eradicating Ebola. In a previous work, we proposed an original deterministic spatial-temporal model, called Be-CoDiS (Between-Countries Disease Spread), to study the evolution of human diseases within and between countries by taking into consideration the movement of people between geographical areas. This model was validated through numerical experiments on the 2014-16 West African Ebola virus disease epidemic. In this article, we perform a stability analysis of Be-CoDiS. Our first objective is to study the equilibrium states of simplified versions of this model, limited to the cases of one and two countries, and to determine their basic reproduction ratios. Then, in order to give some recommendations on the allocation of resources used to control the disease, we perform a sensitivity analysis of those basic reproduction ratios with respect to the model parameters. Finally, we validate the obtained results through numerical experiments based on data from the 2014-16 West African Ebola virus disease epidemic.
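For intuition about what "basic reproduction ratio" and its "sensitivity analysis" mean here, a single-region skeleton suffices: when removal is the only exit from the infectious class, R0 = beta / gamma, and normalized sensitivities (elasticities) say how a 1% change in each parameter moves R0. This is a deliberately reduced stand-in for Be-CoDiS, which has many more compartments and parameters; the numbers below are illustrative, not fitted:

```python
def r0(params):
    """Basic reproduction ratio of a minimal epidemic skeleton: R0 = beta / gamma."""
    return params["beta"] / params["gamma"]

def normalized_sensitivity(fn, params, name, h=1e-6):
    """Elasticity (p / fn) * d fn / d p, estimated by central differences."""
    up, down = dict(params), dict(params)
    up[name] *= 1.0 + h
    down[name] *= 1.0 - h
    deriv = (fn(up) - fn(down)) / (2.0 * h * params[name])
    return params[name] * deriv / fn(params)

params = {"beta": 0.32, "gamma": 0.18}   # transmission and removal rates (made up)
R0 = r0(params)
s_beta = normalized_sensitivity(r0, params, "beta")    # +1: R0 scales with beta
s_gamma = normalized_sensitivity(r0, params, "gamma")  # -1: R0 falls as removal speeds up
```

Comparing elasticities side by side is what guides resource allocation: in this skeleton, cutting transmission by 10% and speeding removal by 10% move R0 by the same magnitude in opposite directions, and richer models like Be-CoDiS rank many more such levers.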

Relevance: 10.00%

Abstract:

In maritime transportation, decisions are made in a dynamic setting where many aspects of the future are uncertain. However, most academic literature on maritime transportation considers static and deterministic routing and scheduling problems. This work addresses a gap in the literature on dynamic and stochastic maritime routing and scheduling problems, by focusing on the scheduling of departure times. Five simple strategies for setting departure times are considered, as well as a more advanced strategy which involves solving a mixed integer mathematical programming problem. The latter strategy is significantly better than the other methods, while adding only a small computational effort.

Relevance: 10.00%

Abstract:

Jean Piaget's theory of the development of intelligence has been used in the field of computational intelligence as inspiration for proposing models of cognitive agents. Although the proposed models implement important basic aspects of Piaget's theory, such as the structure of the cognitive schema, they do not consider the symbol grounding problem and therefore do not address the aspects of the theory that lead to the autonomous acquisition of the basic semantics for the cognitive organization of the external world, as in the acquisition of the notion of object. In this work we present a computational model of a cognitive schema, inspired by Piaget's theory of sensorimotor intelligence, that develops autonomously by building mechanisms through computational principles guided by the symbol grounding problem. The proposed schema model is based on the classification of sensorimotor situations used for the perception, capture, and storage of deterministic causal relations of the finest granularity. These causalities are then expanded spatio-temporally by more complex structures that make use of the earlier ones and that are also designed to allow other, more complex autonomous computational structures to make use of them. The proposed model is implemented by a feed-forward artificial neural network whose output-layer elements self-organize to generate an objectified sensorimotor graph. Some computational mechanisms already existing in the field of computational intelligence were modified to fit the paradigms of null semantics and autonomous mental development, taken as the basis for dealing with the symbol grounding problem.
The self-organizing sensorimotor graph that implements a schema model inspired by Piaget's theory, proposed in this work together with the computational principles used in its conception, moves toward the autonomous artificial cognitive development of the notion of object.

Relevance: 10.00%

Abstract:

One of the fundamental regulatory aspects of the real estate market in Brazil is the set of limits for obtaining financing through the Housing Finance System (Sistema Financeiro de Habitação). These limits can be set so as to increase or reduce the supply of credit in this market, changing the behavior of its agents and, with that, the market price of properties. In this work, we propose a price formation model for the Brazilian real estate market based on the behavior of the agents that compose it. Seller agents behave heterogeneously and are influenced by historical demand, while buyer agents have their behavior determined by the availability of credit. This credit availability, in turn, is defined by the limits for granting financing in the Housing Finance System. We verify that the Markov process describing the market price converges to a deterministic dynamical system as the number of agents increases, and we analyze the behavior of this dynamical system. We show which family of random variables representing the behavior of the seller agents makes the system exhibit a non-trivial equilibrium price consistent with reality. We further verify that the equilibrium price depends not only on the financing rules of the Housing Finance System, but also on the buyers' reservation price and on the sellers' memory of, and sensitivity to, changes in demand. The sellers' memory and sensitivity can lead to price oscillations above or below the equilibrium price (typical of bubble formation processes), or even to a Neimark-Sacker bifurcation, when the system exhibits stable oscillatory dynamics.
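The qualitative regimes described above — convergence to a non-trivial equilibrium price versus sustained oscillations driven by seller memory and sensitivity — can be caricatured with a one-price deterministic map, in the spirit of the large-agent limit the thesis derives. The functional forms and parameters below are illustrative inventions, not the thesis's model:

```python
def simulate(sensitivity, memory=0.5, p0=80.0, steps=200):
    """Price path when sellers react to an exponentially weighted memory of demand."""
    excess = lambda p: 100.0 - p            # toy excess-demand curve, equilibrium p* = 100
    p, d = p0, excess(p0)
    path = []
    for _ in range(steps):
        d = (1.0 - memory) * excess(p) + memory * d   # sellers' demand memory
        p = max(1.0, p + sensitivity * d)             # price adjustment, kept positive
        path.append(p)
    return path

calm = simulate(sensitivity=0.3)     # gentle reaction: converges to p* = 100
volatile = simulate(sensitivity=7.0)  # overreaction: sustained large oscillations
```

With a gentle reaction the price settles at the equilibrium implied by the demand curve; an overreactive market keeps overshooting it in both directions, the bubble-like oscillatory regime the abstract associates with seller memory and sensitivity.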