15 results for Continuous-time Markov Chain
in the Digital Repository of Fundação Getúlio Vargas (FGV)
Abstract:
We combine general equilibrium theory and the théorie générale of stochastic processes to derive structural results about equilibrium state prices.
Abstract:
This paper develops a framework to test whether discrete-valued, irregularly spaced financial transactions data follow a subordinated Markov process. For that purpose, we consider a specific optional sampling in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient in that it accommodates not only the irregular spacing of transactions data but also price discreteness. Further, it turns out that, under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed in the sense that the Markov assumption is rejected for three out of the five stocks we analyzed.
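The conditional-independence property described in this abstract lends itself to a simple simulation check. The sketch below is an illustrative toy, not the paper's actual test statistic: it simulates a birth-death continuous-time Markov chain observed at its level crossings and checks that, within each price level, consecutive durations are roughly uncorrelated. All function names and rate choices are hypothetical.

```python
import numpy as np

def simulate_ctmc_durations(rates, n_steps, seed=0):
    """Simulate a birth-death CTMC on integer 'price' levels; record, for
    each transition, the current level and the exponential holding time."""
    rng = np.random.default_rng(seed)
    state = 0
    levels = np.empty(n_steps, dtype=int)
    durations = np.empty(n_steps)
    for i in range(n_steps):
        rate = rates[abs(state) % len(rates)]   # level-dependent intensity
        durations[i] = rng.exponential(1.0 / rate)
        levels[i] = state
        state += rng.choice([-1, 1])            # move one tick up or down
    return levels, durations

def mean_conditional_corr(levels, durations, min_obs=50):
    """Average, over price levels, of the correlation between consecutive
    durations given the current level (the Markov property predicts ~0)."""
    prev, curr, cond = durations[:-1], durations[1:], levels[1:]
    corrs = [np.corrcoef(prev[cond == lvl], curr[cond == lvl])[0, 1]
             for lvl in np.unique(cond)
             if np.sum(cond == lvl) >= min_obs]
    return float(np.mean(corrs))
```

Under the null, the average within-level correlation hovers near zero; a systematic deviation would be evidence against the subordinated Markov hypothesis.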
Abstract:
Chambers (1998) explores the interaction between long memory and aggregation. For continuous-time processes, he takes the aliasing effect into account when studying temporal aggregation; for discrete-time processes, however, he appears not to do so. This note gives the spectral density function of temporally aggregated long-memory discrete-time processes in light of the aliasing effect. The results differ from those in Chambers (1998) and are supported by a small simulation exercise. As a result, the order of integration may not be invariant under temporal aggregation, specifically when d is negative and the aggregation is of the stock type.
Abstract:
Data available on continuous-time diffusions are always sampled discretely in time. In most cases, the likelihood function of the observations is not directly computable. This survey covers a sample of the statistical methods that have been developed to solve this problem. We concentrate on some recent contributions to the literature based on three different approaches to the problem: an improvement of the Euler-Maruyama discretization scheme, the employment of Martingale Estimating Functions, and the application of the Generalized Method of Moments (GMM).
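As a minimal illustration of the first of these approaches, the snippet below applies the plain Euler-Maruyama scheme to an Ornstein-Uhlenbeck diffusion. The parameter values are arbitrary; the improved schemes surveyed in the paper refine exactly this discretization.

```python
import numpy as np

def euler_maruyama_ou(theta, mu, sigma, x0, dt, n_steps, seed=0):
    """Euler-Maruyama discretization of the Ornstein-Uhlenbeck SDE
    dX_t = theta * (mu - X_t) dt + sigma dW_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment over dt
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dw
    return x
```

The scheme's O(dt) discretization bias is precisely what the refinements discussed in the survey aim to reduce.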
Abstract:
This study analyzes, from a quantitative perspective, customer retention during the renegotiation of non-performing loans. The main focus is to understand which variables explain the retention of these customers and, therefore, to improve the collections process of a financial institution in Brazil. The topic is relevant insofar as several factors are making competition harder in the country's credit environment: the banking concentration experienced over the last decade, the increase in credit supply in recent years, the reduction of banking spreads and, finally, the global economic crisis, which affects the financial sector in particular. The research investigates which variables best explain the retention phenomenon. To that end, customers projected as profitable by a Markov chain were singled out. The fit of customer-record and contractual variables to the retention response variable was then tested using two methodologies: the CHAID decision-tree algorithm and stepwise logistic regression. The results indicate that the CHAID method selected 7 variables and the stepwise method 8, some drawn from customer records and others from the renegotiation contract itself. Given that the contract terms influence retention, and therefore customer value, it is suggested that the offer process operationally incorporate the notion of retention into the collections activity.
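The Markov-chain profitability projection mentioned in this abstract can be sketched in a few lines. The states, transition probabilities, and cash flows below are hypothetical placeholders, not figures from the dissertation; the point is only to show how a transition matrix yields an expected discounted customer value per state.

```python
import numpy as np

# Hypothetical collection states: 0 = performing, 1 = renegotiated, 2 = lost (absorbing)
P = np.array([[0.85, 0.10, 0.05],
              [0.30, 0.55, 0.15],
              [0.00, 0.00, 1.00]])       # monthly transition probabilities
cashflow = np.array([100.0, 60.0, 0.0])  # expected monthly margin per state
beta = 0.99                              # monthly discount factor

def expected_customer_value(P, cashflow, beta):
    """Solve the Bellman-style system v = c + beta * P @ v for the
    expected discounted value of a customer in each state."""
    n = len(cashflow)
    return np.linalg.solve(np.eye(n) - beta * P, cashflow)

v = expected_customer_value(P, cashflow, beta)
```

Customers whose state-value `v` exceeds some cutoff would be the "projected as profitable" group that the retention analysis then focuses on.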
Abstract:
Determining the term structure of interest rates is one of the central topics in financial asset management. Considering the great importance of financial assets for the conduct of economic policy, it is essential to understand how this structure is determined. The main objective of this study is to estimate the Brazilian term structure of interest rates together with the short-term interest rate. The term structure is modeled with an affine structure. The estimation includes three latent factors and two macroeconomic variables, using the Bayesian technique of Markov Chain Monte Carlo (MCMC).
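For readers unfamiliar with the estimation technique, the core of any MCMC scheme is a Metropolis-type accept/reject step. The toy sampler below, a generic random-walk Metropolis on a one-dimensional posterior, is far simpler than the multi-factor affine model estimated in the study and is meant only to illustrate the mechanics.

```python
import numpy as np

def metropolis(log_post, x0, n_draws, step, seed=0):
    """Random-walk Metropolis sampler: propose x' = x + step * N(0,1),
    accept with probability min(1, post(x') / post(x))."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    draws = np.empty(n_draws)
    for i in range(n_draws):
        prop = x + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject in log space
            x, lp = prop, lp_prop
        draws[i] = x
    return draws
```

After a burn-in, the retained draws approximate the posterior; in the study, the same accept/reject logic runs over the latent factors and macro loadings of the affine model.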
Abstract:
This paper studies the long-run impact of HIV/AIDS on per capita income and education. We introduce a channel from HIV/AIDS to long-run income that has been overlooked by the literature: the reduction of the incentives to study due to shorter expected longevity. We work with a continuous-time overlapping generations model in which life-cycle features of savings and education decisions play key roles. The simulations predict that the most affected countries in Sub-Saharan Africa will in the future be, on average, a quarter poorer than they would be without AIDS, due only to the direct (human capital reduction) and indirect (decline in savings and investment) effects of life-expectancy reductions. Schooling will decline on average by half. These findings are well above previous results in the literature and indicate that, as pessimistic as they may be, at least in economic terms the worst could be yet to come.
Abstract:
This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only seem to perform quite well in both absolute and relative terms, but also exhibit virtually no systematic exposure to the usual risk factors (namely, the market, size, value, and momentum portfolios).
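The first step, realized factor loadings from intraday returns, reduces to a ratio of realized covariation to realized factor variance. The sketch below is a stylized single-factor version with simulated data; it is not the paper's full two-step estimator, and all parameter values are made up.

```python
import numpy as np

def realized_beta(stock_ret, factor_ret):
    """Realized factor loading over one day: realized covariation of stock
    and factor divided by the realized variance of the factor."""
    return np.sum(stock_ret * factor_ret) / np.sum(factor_ret ** 2)

def daily_alpha(stock_ret, factor_ret):
    """Daily risk-adjusted return: total stock return minus the realized
    beta times the total factor return."""
    b = realized_beta(stock_ret, factor_ret)
    return np.sum(stock_ret) - b * np.sum(factor_ret)

rng = np.random.default_rng(0)
f = rng.normal(0.0, 0.001, 390)              # 390 one-minute factor returns
s = 1.5 * f + rng.normal(0.0, 0.0005, 390)   # stock with true beta = 1.5, zero alpha
```

The paper's second step then smooths these noisy daily alphas by estimating their conditional expectation across days.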
Abstract:
A continuous-time overlapping generations model is constructed. Individuals pass through two distinct periods during their lifetimes. During the first, they work, save, and have a death probability equal to zero. During the second, starting T periods after birth, their probability of death changes to p and they retire. Capital stock and stationary-state income are calculated for two situations: in the first, people live off their accumulated capital after retirement; in the second, they live off a state transfer payment financed through an income tax. To simplify matters, in this preliminary version, it is assumed that there is no population growth and that the instantaneous elasticity of substitution of consumption is unitary.
Abstract:
We present a continuous-time target zone model of speculative attacks. Contrary to most of the literature, which considers the certainty case, i.e., agents know for sure the Central Bank's future behavior, we build uncertainty into the model in two different ways. First, we consider the case in which the level of reserves at which the central bank lets the regime collapse is uncertain. Alternatively, we analyze the case in which, with some probability, the government may change its policy, reducing the initially positive trend in domestic credit. In both cases, contrary to the case of a fixed exchange rate regime, speculators face a cost of launching a tentative attack that may not succeed. Such a cost induces a delay and may even prevent the attack's occurrence. At the time of the tentative attack, the exchange rate moves discretely either up, if the attack succeeds, or down, if it fails. The results are consistent with the fact that, typically, an attack involves substantial profits and losses for the speculators. In particular, if agents believe that the government will control fiscal imbalances in the future, or, alternatively, if they believe the trend in domestic credit to be temporary, the attack is postponed even in the presence of a signal of an imminent collapse. Finally, we also show that the timing of a speculative attack increases with the width of the target zone.
Abstract:
We develop and empirically test a continuous-time equilibrium model for the pricing of oil futures. The model provides a link between no-arbitrage models and expectation-oriented models. It highlights the role of inventories for the identification of different pricing regimes. In an empirical study, the hedging performance of our model is compared with five other one- and two-factor pricing models. The hedging problem considered is related to Metallgesellschaft's strategy of hedging long-term forward commitments with short-term futures. The results show that the downside-risk distribution of our inventory-based model stochastically dominates those of the other models.
Abstract:
The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice empirically determines this quantity and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are obtained indirectly through elicitation of expert quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. These scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that exhibits many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
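The peaks-over-threshold machinery behind the model can be illustrated in a few lines. The snippet below simulates exceedances from a known GPD via its inverse CDF and recovers the shape and scale by the simple method of moments; the paper's Bayesian treatment, with an unknown threshold, is of course richer, and all numbers here are arbitrary.

```python
import numpy as np

def gpd_fit_mom(excess):
    """Method-of-moments estimators for the GPD shape xi and scale sigma,
    valid when xi < 1/2 so that the variance exists:
    xi = (1 - mean^2/var) / 2,  sigma = mean * (1 - xi)."""
    m, v = excess.mean(), excess.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = m * (1.0 - xi)
    return xi, sigma

rng = np.random.default_rng(42)
xi_true, sigma_true = 0.1, 1.0
u = rng.uniform(size=20000)
# GPD inverse CDF: x = sigma/xi * ((1 - u)^(-xi) - 1)
excess = sigma_true / xi_true * ((1.0 - u) ** (-xi_true) - 1.0)
xi_hat, sigma_hat = gpd_fit_mom(excess)
```

In the paper's mixture model, the same GPD parameters are sampled jointly with the threshold and the central-distribution parameters by MCMC rather than estimated from fixed exceedances.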
Abstract:
Our focus is on information in expectation surveys that can now be built on thousands (or millions) of respondents on an almost continuous-time basis (big data), and on continuous macroeconomic surveys with a limited number of respondents. We show that, under standard microeconomic and econometric techniques, survey forecasts are an affine function of the conditional expectation of the target variable. This is true whether or not the survey respondent knows the data-generating process (DGP) of the target variable, or the econometrician knows the respondent's individual loss function. If the econometrician has a mean-squared-error risk function, we show that asymptotically efficient forecasts of the target variable can be built using Hansen's (Econometrica, 1982) generalized method of moments in a panel-data context, when N and T diverge or when T diverges with N fixed. Sequential asymptotic results are obtained using Phillips and Moon's (Econometrica, 1999) framework. Possible extensions are also discussed.
Abstract:
This dissertation presents two papers on how to use simple systemic risk measures to assess portfolio risk characteristics. The first paper deals with the Granger causation of systemic risk indicators based on correlation matrices of stock returns. Special focus is devoted to the Eigenvalue Entropy, as some previous literature indicated strong results, though without considering different macroeconomic scenarios; the Index Cohesion Force and the Absorption Ratio are also considered. For the S&P500, there is no evidence of Granger causation from the Eigenvalue Entropies or the Index Cohesion Force. The Absorption Ratio Granger-caused both the S&P500 and the VIX index, being the only simple measure that passed this test. The second paper develops this measure to capture the regimes underlying the American stock market. New indicators are built using filtering and random matrix theory. The returns of the S&P500 are modelled as a mixture of normal distributions, with the activation of each normal distribution governed by a Markov chain whose transition probabilities are a function of the indicators. The model shows that an indicator based on the Herfindahl-Hirschman Index of the normalized eigenvalues exhibits the best fit to the returns from 1998 to 2013.
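The eigenvalue-concentration measure used in the second paper can be computed directly. The sketch below uses hypothetical simulated data and applies no filtering or random-matrix cleaning, unlike the dissertation's indicators; it only shows the Herfindahl-Hirschman Index of the normalized eigenvalues of a correlation matrix.

```python
import numpy as np

def eigen_hhi(returns):
    """Herfindahl-Hirschman Index of the normalized eigenvalues of the
    sample correlation matrix: the sum of squared eigenvalue shares.
    Values near 1/n mean dispersed risk; values near 1, a dominant mode."""
    corr = np.corrcoef(returns, rowvar=False)
    w = np.linalg.eigvalsh(corr)
    shares = w / w.sum()
    return float(np.sum(shares ** 2))

rng = np.random.default_rng(0)
independent = rng.normal(size=(2000, 20))                 # no common factor
market = rng.normal(size=(2000, 1))
one_factor = market + 0.3 * rng.normal(size=(2000, 20))   # strong common factor
```

A strong common factor concentrates variance in the top eigenvalue and pushes the index up, which is the crisis-like regime the Markov-switching model is meant to pick up.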