913 results for Random walk model


Relevance: 100.00%

Abstract:

Recent work on optimal monetary and fiscal policy in New Keynesian models suggests that it is optimal to allow steady-state debt to follow a random walk. Leith and Wren-Lewis (2012) consider the nature of the time-inconsistency involved in such a policy and its implications for discretionary policy-making. We show that governments are tempted, given inflationary expectations, to utilize their monetary and fiscal instruments in the initial period to change the ultimate debt burden they need to service. We demonstrate that this temptation is only eliminated if, following shocks, the new steady-state debt equals the original (efficient) debt level, even though there is no explicit debt target in the government's objective function. Analytically and in a series of numerical simulations we show that the instrument used to stabilize the debt depends crucially on the degree of nominal inertia and the size of the debt stock. We also show that the welfare consequences of introducing debt are negligible for precommitment policies, but can be significant for discretionary policy. Finally, we assess the credibility of commitment policy by considering a quasi-commitment policy which allows for different probabilities of reneging on past promises. This online Appendix extends the results of the paper.


This paper employs an unobserved component model that incorporates a set of economic fundamentals to obtain the Euro-Dollar permanent equilibrium exchange rates (PEER) for the period 1975Q1 to 2008Q4. The results show that for most of the sample period, the Euro-Dollar exchange rate closely followed the values implied by the PEER. The only significant deviations from the PEER occurred in the years immediately before and after the introduction of the single European currency. The forecasting exercise shows that incorporating economic fundamentals provides a better long-run exchange rate forecasting performance than a random walk process.
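The random-walk benchmark used in such forecasting exercises can be sketched as follows. All series here are synthetic and the "fundamentals" signal is hypothetical; the point is only to show how a candidate model is scored against the naive no-change forecast:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic (hypothetical) log exchange-rate series: a random walk with drift
T = 200
s = np.cumsum(0.001 + 0.01 * rng.standard_normal(T))

# One-step-ahead random-walk forecast: next period's rate equals today's
rw_forecast = s[:-1]

# A hypothetical fundamentals-based forecast: today's rate plus a noisy signal
signal = 0.5 * np.diff(s, prepend=s[0]) + 0.005 * rng.standard_normal(T)
model_forecast = s[:-1] + signal[:-1]

def rmse(actual, forecast):
    """Root-mean-squared forecast error."""
    return np.sqrt(np.mean((actual - forecast) ** 2))

print(rmse(s[1:], rw_forecast), rmse(s[1:], model_forecast))
```

A model "beats the random walk" when its out-of-sample RMSE is lower than the random-walk benchmark's, typically checked with a formal test rather than the raw comparison shown here.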


In this paper, we study the average inter-crossing number between two random walks and two random polygons in three-dimensional space. The random walks and polygons in this paper are the so-called equilateral random walks and polygons, in which each segment of the walk or polygon is of unit length. We show that the mean average inter-crossing number ICN between two equilateral random walks of the same length n is approximately linear in n, and we were able to determine the prefactor of the linear term, which is a = (3 ln 2)/8 ≈ 0.2599. In the case of two random polygons of length n, the mean average inter-crossing number ICN is also linear, but the prefactor of the linear term is different from that of the random walks. These approximations apply when the starting points of the random walks and polygons are a distance p apart and p is small compared to n. We propose a fitting model that would capture the theoretical asymptotic behaviour of the mean average ICN for large values of p. Our simulation results show that the model in fact works very well for the entire range of p. We also study the mean ICN between two equilateral random walks and polygons of different lengths. An interesting result is that even if one random walk (polygon) has a fixed length, the mean average ICN between the two random walks (polygons) would still approach infinity if the length of the other random walk (polygon) approached infinity. The data provided by our simulations match our theoretical predictions very well.
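A minimal sketch of generating an equilateral random walk, assuming the standard construction in which each unit-length step points in a direction drawn uniformly from the sphere:

```python
import numpy as np

def equilateral_random_walk(n, rng):
    """Generate an equilateral random walk of n unit-length steps in 3D:
    each step direction is drawn uniformly from the unit sphere."""
    z = rng.uniform(-1.0, 1.0, n)           # cos(polar angle): uniform z gives a uniform direction
    phi = rng.uniform(0.0, 2.0 * np.pi, n)  # azimuthal angle
    r = np.sqrt(1.0 - z**2)
    steps = np.column_stack((r * np.cos(phi), r * np.sin(phi), z))
    # Prepend the origin and accumulate the steps into vertex positions
    return np.vstack((np.zeros(3), np.cumsum(steps, axis=0)))

rng = np.random.default_rng(42)
walk = equilateral_random_walk(1000, rng)
# Every segment has unit length by construction
lengths = np.linalg.norm(np.diff(walk, axis=0), axis=1)
print(lengths.min(), lengths.max())
```

Estimating the inter-crossing number itself requires projecting two such walks and counting crossings between segments of different walks, averaged over projection directions; that machinery is beyond this sketch.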


We present a model in which particles (or individuals of a biological population) disperse with a rest time between consecutive motions (or migrations) which may take several possible values from a discrete set. Particles (or individuals) may also react (or reproduce). We derive a new equation for the effective rest time T̃ of the random walk. Application to the Neolithic transition in Europe makes it possible to derive more realistic theoretical values for its wavefront speed than those following from the single-delayed framework presented previously [J. Fort and V. Méndez, Phys. Rev. Lett. 82, 867 (1999)]. The new results are consistent with the archaeological observations of this important historical process.
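For context, the single-delayed framework referred to above yields a front speed of the form below (stated here as an assumption about the second-order result, with a the initial reproduction rate, D the diffusivity, and T the rest time):

```latex
\[
v \;=\; \frac{2\sqrt{aD}}{1 + aT/2}
\]
```

With several possible rest times, the single delay T is replaced by the effective rest time T̃ derived in the paper.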


We generalize a previous model of time-delayed reaction–diffusion fronts (Fort and Méndez 1999 Phys. Rev. Lett. 82 867) to allow for a bias in the microscopic random walk of particles or individuals. We also present a second model which takes the time order of events (diffusion and reproduction) into account. As an example, we apply them to the human invasion front across the USA in the 19th century. The corrections relative to the previous model are substantial. Our results are relevant to physical and biological systems with anisotropic fronts, including particle diffusion in disordered lattices, population invasions, the spread of epidemics, etc.


Social, technological, and economic time series are divided by events which are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differs from the Poissonian profile by being long-tailed distributed with resting and active periods interwoven. Understanding mechanisms generating consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to models of nontrivial priority that have been recently proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. We here analyze the intertransaction time intervals of several financial markets. We observe that empirical data describe a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics where the integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched exponential kernel provides a multifractal profile valid for a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.
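The superstatistics idea — waiting times that are exponential at a fixed rate, with the rate itself drawn from a kernel describing the heterogeneity of tasks — can be sketched as follows. The Gamma kernel here is a hypothetical choice (the paper uses a stretched-exponential kernel), picked because mixing exponentials over a Gamma-distributed rate gives a closed-form heavy-tailed (Pareto-type) marginal:

```python
import numpy as np

rng = np.random.default_rng(1)

# Superstatistical pausing times: each waiting time is exponential with a
# rate drawn from a Gamma "task heterogeneity" kernel (hypothetical parameters).
n = 100_000
rates = rng.gamma(shape=1.5, scale=1.0, size=n)
waits = rng.exponential(1.0 / rates)

# A pure Poisson process with the same mean waiting time, for comparison
poisson_waits = rng.exponential(waits.mean(), size=n)

# The mixture's tail is far heavier than the exponential tail
q = 0.999
print(np.quantile(waits, q), np.quantile(poisson_waits, q))
```

This is only the first ingredient of a continuous-time random walk; the multifractal analysis in the paper goes further, characterizing the scaling of moments of the inter-event times.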


This paper considers various asymptotic approximations in the near-integrated first-order autoregressive model with a non-zero initial condition. We first extend the work of Knight and Satchell (1993), who considered the random walk case with a zero initial condition, to derive the expansion of the relevant joint moment generating function in this more general framework. We also consider, as alternative approximations, the stochastic expansion of Phillips (1987c) and the continuous-time approximation of Perron (1991). We assess whether these alternative methods provide an adequate approximation to the finite-sample distribution of the least-squares estimator in a first-order autoregressive model. The results show that, when the initial condition is non-zero, Perron's (1991) continuous-time approximation performs very well, while the others offer improvements only when the initial condition is zero.
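A quick Monte Carlo sketch of the object being approximated — the finite-sample distribution of the least-squares estimator in a near-integrated AR(1) with a non-zero initial condition. All parameter values (local-to-unity parameter c, sample size T, starting value y0) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def ols_rho(y):
    """Least-squares estimate of the AR(1) coefficient (no intercept)."""
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

# y_t = rho * y_{t-1} + e_t with rho = 1 + c/T (near-integrated parameterization)
T, c, y0 = 100, -5.0, 2.0
rho = 1.0 + c / T
reps = 5000
est = np.empty(reps)
for rep in range(reps):
    e = rng.standard_normal(T)
    y = np.empty(T + 1)
    y[0] = y0
    for t in range(T):
        y[t + 1] = rho * y[t] + e[t]
    est[rep] = ols_rho(y)

# The normalized statistic T*(rho_hat - rho) is skewed and downward-biased
# in finite samples, which is what the asymptotic expansions try to capture.
print(np.mean(T * (est - rho)), np.median(T * (est - rho)))
```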



We report numerical results from a study of balance dynamics using a simple model of atmospheric motion that is designed to help address the question of why balance dynamics is so stable. The non-autonomous Hamiltonian model has a chaotic slow degree of freedom (representing vortical modes) coupled to one or two linear fast oscillators (representing inertia-gravity waves). The system is said to be balanced when the fast and slow degrees of freedom are separated. We find adiabatic invariants that drift slowly in time. This drift is consistent with a random-walk behaviour at a speed which qualitatively scales, even for modest time scale separations, as the upper bound given by Neishtadt’s and Nekhoroshev’s theorems. Moreover, a similar type of scaling is observed for solutions obtained using a singular perturbation (‘slaving’) technique in resonant cases where Nekhoroshev’s theorem does not apply. We present evidence that the smaller Lyapunov exponents of the system scale exponentially as well. The results suggest that the observed stability of nearly-slow motion is a consequence of the approximate adiabatic invariance of the fast motion.


We study random walks systems on Z whose general description follows. At time zero, there are N >= 1 particles at each vertex of the positive integers, all inactive except for those placed at vertex one. Each active particle performs a simple random walk on Z and, up to the time it dies, it activates all inactive particles that it meets along its way. An active particle dies at the instant it reaches a certain fixed total of jumps (L >= 1) without activating any particle, so that its lifetime depends strongly on the past of the process. We investigate how the probability of survival of the process depends on L and on the jumping probabilities of the active particles.
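A rough simulation sketch of this process, truncated at a hypothetical rightmost vertex M and using sequential scheduling of the active particles (an assumption for simplicity; the abstract does not specify a scheduling):

```python
import random

def survives(M, N, L, p, rng):
    """One realization of the random walks system, truncated at vertex M.
    N inactive particles sit at each vertex 2..M; N particles at vertex 1
    start active. An active particle dies after L consecutive jumps that
    activate nobody. Returns True if some particle ever reaches vertex M."""
    inactive = {v: N for v in range(2, M + 1)}
    active = [1] * N       # positions of active particles awaiting their turn
    counters = [0] * N     # jumps since each particle's last activation
    while active:
        pos, cnt = active.pop(), counters.pop()
        while cnt < L:     # run this particle until it dies
            pos += 1 if rng.random() < p else -1   # jump right w.p. p
            if pos >= M:
                return True
            cnt += 1
            if inactive.get(pos, 0) > 0:           # wake the particles here
                k = inactive.pop(pos)
                active.extend([pos] * k)
                counters.extend([0] * k)
                cnt = 0                            # activation resets the count
    return False

rng = random.Random(3)
trials = 200
hits = sum(survives(M=20, N=1, L=4, p=0.8, rng=rng) for _ in range(trials))
print(hits / trials)   # estimated survival probability up to the truncation
```

Varying L and the right-jump probability p in this sketch gives a feel for the dependence the paper studies rigorously.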


We consider a random walks system on Z in which each active particle performs a nearest-neighbor random walk and activates all inactive particles it encounters. The movement of an active particle stops when it reaches a certain number of jumps without activating any particle. We prove that if the process relies on efficient particles (i.e. those particles with a small probability of jumping to the left) being placed strategically on Z, then it might survive, having active particles at any time with positive probability. On the other hand, we may construct a process that dies out eventually almost surely, even if it relies on efficient particles. That is, we discuss what happens if particles are initially placed very far away from each other or if their probability of jumping to the right tends to 1 but not fast enough.


We consider the two-dimensional version of a drainage network model introduced in Gangopadhyay, Roy and Sarkar (2004), and show that the appropriately rescaled family of its paths converges in distribution to the Brownian web. We do so by verifying the convergence criteria proposed in Fontes, Isopi, Newman and Ravishankar (2002).
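Families of coalescing random-walk paths, of which drainage-network paths are a variant, can be sketched as follows (starting points and time horizon are arbitrary); the Brownian web is the diffusive scaling limit of such families:

```python
import numpy as np

rng = np.random.default_rng(9)

# Coalescing simple random walks: walkers started at every even site take
# independent +/-1 steps and merge permanently when they land on the same site.
starts = np.arange(0, 40, 2)
pos = starts.astype(float)
for _ in range(500):
    pos = pos + rng.choice([-1.0, 1.0], size=pos.size)
    pos = np.unique(pos)   # walkers occupying the same site coalesce

# Typically far fewer walkers remain than the number we started with
print(pos.size)
```

Because all walkers share the same parity here, two paths cannot cross without meeting, so the `np.unique` merge correctly implements coalescence.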


We consider the time evolution of an exactly solvable cellular automaton with random initial conditions both in the large-scale hydrodynamic limit and on the microscopic level. This model is a version of the totally asymmetric simple exclusion process with sublattice parallel update and thus may serve as a model for studying traffic jams in systems of self-driven particles. We study the emergence of shocks from the microscopic dynamics of the model. In particular, we introduce shock measures whose time evolution we can compute explicitly, both in the thermodynamic limit and for open boundaries where a boundary-induced phase transition driven by the motion of a shock occurs. The motion of the shock, which results from the collective dynamics of the exclusion particles, is a random walk with an internal degree of freedom that determines the jump direction. This type of hopping dynamics is reminiscent of some transport phenomena in biological systems.
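As a caricature of the exclusion dynamics involved (using a random-sequential update on a ring rather than the paper's sublattice-parallel rule with boundaries):

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal TASEP sketch: particles on a ring hop one site to the right
# whenever the target site is empty; no two particles share a site.
L_sites, steps = 200, 20_000
occ = np.zeros(L_sites, dtype=bool)
occ[:L_sites // 2] = True        # step initial condition: left half filled
for _ in range(steps):
    i = rng.integers(L_sites)    # pick a random site
    j = (i + 1) % L_sites
    if occ[i] and not occ[j]:    # exclusion rule: hop only into empty sites
        occ[i], occ[j] = False, True

print(occ.mean())                # particle number is conserved on the ring
```

In the paper's setting, a shock is the sharp interface between high- and low-density regions that such a step initial condition produces, and its position performs the random walk described above.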


This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange-rate forecasting. We review relevant work concluding that macroeconomic models can explain the short-run exchange rate, and also present studies that are skeptical about the ability of macroeconomic variables to predict exchange-rate movements. To contribute to this literature, the paper presents its own evidence by implementing the specification with the best predictive record in Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". We use a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted floating exchange-rate regimes and inflation targeting, but include currencies of both developed and developing countries. Our results corroborate Rogoff and Stavrakeva (2008) in finding that conclusions about exchange-rate predictability depend on the statistical test adopted, so robust and rigorous tests are needed to evaluate the model properly. Having found no evidence that the implemented model produces more accurate forecasts than a random walk, we then assess whether it at least generates "rational", or "consistent", forecasts, using the theoretical and empirical framework defined and implemented by Cheung and Chinn (1998); we conclude that the Taylor-rule forecasts are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged returns predicted by the structural model explain observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.


The aim of this study is to propose the implementation of a statistical model for computing volatility that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models usually employed for risk measurement. Parameters are estimated from daily Ibovespa quotes from January 2009 to December 2014, and the empirical accuracy of the models is assessed through out-of-sample tests comparing the VaR estimates obtained for January to December 2014. Explanatory variables were introduced in an attempt to improve the models; we chose the Ibovespa's American counterpart, the Dow Jones index, because it exhibited properties such as high correlation, Granger causality, and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not work directly with the variance but with its reciprocal, the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of the financial series, and the results favored its use; the model is thus an efficient and parsimonious specification for estimating and forecasting volatility, since it has only one parameter to estimate, which represents a paradigm shift relative to conditional heteroskedasticity models.
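The multiplicative random walk in the precision can be sketched as follows. The Beta-shock parameterization below is one common construction for local-scale-type models and is assumed here, not taken from the study; omega plays the role of the single smoothing parameter:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical local-scale-type sketch: the precision h_t (reciprocal
# variance) follows a multiplicative random walk driven by Beta shocks,
# and returns are conditionally Gaussian given h_t.
T, omega, a = 1000, 0.95, 20.0
h = np.empty(T)
r = np.empty(T)
h_prev = 1.0
for t in range(T):
    eta = rng.beta(omega * a, (1.0 - omega) * a)  # E[eta] = omega
    h_prev = h_prev * eta / omega                 # mean-one multiplicative step
    h[t] = h_prev
    r[t] = rng.normal(0.0, 1.0 / np.sqrt(h_prev))

# Slowly wandering precision produces volatility clustering:
# absolute returns are positively autocorrelated
x = np.abs(r)
ac = np.corrcoef(x[:-1], x[1:])[0, 1]
print(ac)
```

Dividing the Beta shock by omega makes the precision a martingale, which is the "random walk in the reciprocal of the variance" flavor the abstract describes.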