9 results for Bayesian smoothing
in the digital repository of Fundação Getúlio Vargas (FGV)
Abstract:
In this article we use factor models to describe a certain class of covariance structure for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions of the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
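A minimal simulation sketch of the covariance structure described here, a single common factor with AR(1) stochastic log-volatility and random-walk loadings, may help fix ideas; all names and parameter values below are illustrative assumptions, not the article's specification or estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
T, p = 500, 3          # time points, number of series (e.g. exchange rates)

# One common factor with stochastic volatility: the log variance follows an AR(1).
phi, mu, sigma_eta = 0.95, -1.0, 0.2
log_h = np.empty(T)
log_h[0] = mu
for t in range(1, T):
    log_h[t] = mu + phi * (log_h[t - 1] - mu) + sigma_eta * rng.normal()
factor = np.exp(0.5 * log_h) * rng.normal(size=T)

# Time-varying loadings: each series' loading drifts as a random walk.
beta = np.empty((T, p))
beta[0] = [1.0, 0.8, 0.5]
beta[1:] = beta[0] + np.cumsum(0.01 * rng.normal(size=(T - 1, p)), axis=0)

# Observed returns: y_t = beta_t * f_t + idiosyncratic noise with variances psi.
psi = 0.1 * np.ones(p)
y = beta * factor[:, None] + rng.normal(scale=np.sqrt(psi), size=(T, p))

# Implied time-varying covariance: Sigma_t = beta_t h_t beta_t' + diag(psi).
cov_t = (beta[:, :, None] * beta[:, None, :]) * np.exp(log_h)[:, None, None] \
        + np.diag(psi)
print(cov_t[-1])       # covariance matrix implied at the last time point
```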
Abstract:
The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country specific and eventually spread across neighboring countries, with the concept of vicinity extending beyond geographic maps and entering contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients, following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We show empirically that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
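For the notion of shift-contagion cited from Forbes and Rigobon, a brief sketch of their heteroskedasticity-adjusted cross-market correlation is shown below; the simulated returns, window choices, and parameter values are illustrative assumptions and do not reproduce the article's factor stochastic volatility approach.

```python
import numpy as np

def adjusted_correlation(source_crisis, target_crisis, source_tranquil):
    """Forbes-Rigobon heteroskedasticity-adjusted correlation (illustrative):
    rho_adj = rho / sqrt(1 + delta * (1 - rho**2)), where delta is the relative
    increase in the source market's variance during the crisis window."""
    rho = np.corrcoef(source_crisis, target_crisis)[0, 1]
    delta = np.var(source_crisis, ddof=1) / np.var(source_tranquil, ddof=1) - 1.0
    return rho / np.sqrt(1.0 + delta * (1.0 - rho ** 2))

# Toy simulated index returns; the dependence is constant by construction,
# so the raw crisis correlation overstates contagion and the adjustment undoes it.
rng = np.random.default_rng(1)
us = rng.normal(size=750)                          # "source" market returns
us[500:] *= 2.0                                    # hypothetical crisis window
latam = 0.5 * us + rng.normal(size=750)            # "target" market returns

raw = np.corrcoef(us[500:], latam[500:])[0, 1]
adj = adjusted_correlation(us[500:], latam[500:], us[:500])
print(f"raw crisis correlation: {raw:.3f}  adjusted: {adj:.3f}")
```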
Abstract:
The study aims to assess the empirical adherence of the permanent income theory and the consumption smoothing view in Latin America. Two present value models are considered, one describing household behavior and the other open economy macroeconomics. Following the methodology developed in Campbell and Shiller (1987), Bivariate Vector Autoregressions are estimated for the saving ratio and the real growth rate of income in the household behavior model, and for the current account and the change in national cash flow in the open economy model. The countries in the sample are considered separately in the estimation process (individual system estimation) as well as jointly (joint system estimation). Ordinary Least Squares (OLS) and Seemingly Unrelated Regressions (SURE) estimates of the coefficients are generated. Wald tests are then conducted to verify whether the VAR coefficient estimates conform to those predicted by the theory. While the empirical results are sensitive to the estimation method and discount factors used, there is only weak evidence in favor of the permanent income theory and consumption smoothing view in the group of countries analyzed.
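As a rough illustration of the Campbell and Shiller testing logic used here, the sketch below estimates a bivariate VAR(1) on simulated data and forms the theoretical saving series implied by the present-value restriction; the data, the discount factor, and the informal comparison at the end are assumptions for illustration, while the paper's formal step is a Wald test on the implied cross-equation restrictions.

```python
import numpy as np

# With z_t = [income growth, saving ratio]' following a VAR(1), the permanent
# income theory implies a "theoretical saving" s*_t = -e1' bA (I - bA)^(-1) z_t
# that should track the actual saving ratio.
rng = np.random.default_rng(2)
T, b = 200, 0.95                          # sample size and discount factor (illustrative)

dy = rng.normal(size=T)                   # simulated real income growth
s = 0.5 * dy + rng.normal(scale=0.5, size=T)   # simulated saving ratio
z = np.column_stack([dy, s])
z = z - z.mean(axis=0)                    # demean; no constant in the VAR

# Estimate the VAR(1) coefficient matrix A by OLS: z_t = A z_{t-1} + error.
X, Y = z[:-1], z[1:]
A = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Theoretical saving implied by the present-value restriction.
e1 = np.array([1.0, 0.0])
proj = -e1 @ (b * A) @ np.linalg.inv(np.eye(2) - b * A)
s_star = z @ proj

# Informal check; a Wald test on the implied coefficient restrictions is the
# formal counterpart used in the study.
print("corr(s, s*):", np.corrcoef(z[:, 1], s_star)[0, 1])
print("var(s*)/var(s):", np.var(s_star) / np.var(z[:, 1]))
```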
Abstract:
We transform a non-cooperative game into a Bayesian decision problem for each player, where the uncertainty faced by a player is the strategy choices of the other players, the priors of the other players on the choices of other players, the priors over priors, and so on. We provide a complete characterization of the relationship between the extent of knowledge about the rationality of players and their ability to successfully eliminate strategies which are not best responses. This paper therefore provides the informational foundations of iteratively undominated strategies and rationalizable strategic behavior (Bernheim (1984) and Pearce (1984)). Moreover, sufficient conditions are also found for Nash equilibrium behavior. We also provide Aumann's (1985) results on correlated equilibria.
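The elimination procedure whose epistemic foundations are characterized here can be illustrated, very loosely, by iterated removal of strictly dominated pure strategies in a finite two-player game; the sketch below is only this cruder finite-game analogue, with an illustrative payoff matrix, and does not capture the hierarchy of priors in the abstract.

```python
import numpy as np

def iterated_elimination(payoff_1, payoff_2):
    """Iteratively remove pure strategies strictly dominated by another pure
    strategy in a two-player finite game; returns the surviving row and
    column strategy indices."""
    rows = list(range(payoff_1.shape[0]))
    cols = list(range(payoff_1.shape[1]))
    changed = True
    while changed:
        changed = False
        # Player 1: drop a row strictly worse than some other row against all remaining columns.
        for r in rows[:]:
            if any(all(payoff_1[o, c] > payoff_1[r, c] for c in cols)
                   for o in rows if o != r):
                rows.remove(r); changed = True
        # Player 2: the same check over columns.
        for c in cols[:]:
            if any(all(payoff_2[r, o] > payoff_2[r, c] for r in rows)
                   for o in cols if o != c):
                cols.remove(c); changed = True
    return rows, cols

# Prisoner's dilemma (0 = cooperate, 1 = defect): only (defect, defect) survives.
u1 = np.array([[3, 0], [5, 1]])
u2 = np.array([[3, 5], [0, 1]])
print(iterated_elimination(u1, u2))   # -> ([1], [1])
```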
Abstract:
The inconsistency between theory and the empirical behavior of agents in the private pension market has proved to be one of the most persistent puzzles in the economics literature. In models of intertemporal optimization of consumption and saving under uncertainty about agents' lifetimes, annuities are dominant assets, eliminating or strongly restricting the demand for assets whose returns are unrelated to the probability of survival. In practice, however, consumers are extremely skeptical of annuities. In contrast to the longevity insurance that annuities provide, claims on these essentially illiquid assets cease upon the holder's death. Accordingly, uninsurable liquidity shocks and the presence of bequest motives have been extensively explored as possible determinants of the low observed demand. Despite these efforts, the puzzle persists. This work extends the theoretical dominance of annuities over non-contingent assets in incomplete markets: full dominance in the absence of bequest motives, and partial dominance when agents care about potential heirs. In line with the literature, numerical simulations show that a considerable share of agents' optimal portfolios would consist of annuities even in the presence of liquidity shocks, bequest motives, and actuarially unfair prices. Regarding an aspect relatively neglected in the academic literature, we show that the optimal timing of converting savings into annuities is directly related to agents' wage profiles. Finally, we show that if agents' preferences are such that optimal consumption declines with age, the demand for annuities becomes quite sensitive to the markup (over the actuarially fair price) charged by the industry, reaching levels far more compatible with the empirical evidence.
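A toy two-period calculation below illustrates the dominance argument and the role of the industry markup discussed in this abstract; all numbers are illustrative assumptions, not the thesis's calibration.

```python
# Toy two-period illustration of why annuities dominate non-contingent bonds
# absent bequest motives; all numbers are illustrative.
R = 1.04        # gross return on a non-contingent bond
p = 0.90        # probability of surviving to the second period
load = 1.10     # industry markup over the actuarially fair annuity price

# An actuarially fair annuity pays R / p per unit invested, conditional on
# survival, because the insurer keeps the balances of those who die.
fair_annuity_return = R / p
loaded_annuity_return = R / (p * load)

print(f"bond return:           {R:.4f}")
print(f"fair annuity return:   {fair_annuity_return:.4f}")   # > R: dominance
print(f"loaded annuity return: {loaded_annuity_return:.4f}")  # still > R here

# An agent with no bequest motive, who values consumption only while alive,
# prefers the higher survival-contingent return; only a load above 1/p (about
# 1.11 here) reverses the ranking, illustrating the sensitivity to markups
# mentioned in the abstract.
```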
Abstract:
Emerging economies face substantial credit constraints compared with developed economies; however, dynamic stochastic general equilibrium (DSGE) models designed for emerging economies have yet to fully address this issue. We propose a DSGE model intended to represent an emerging economy with a banking sector, based on Gerali et al. (2010). Our contribution is to treat a share of expected income as collateral for household loans. We estimate the proposed model for Brazil using Bayesian estimation and find that economies in which households face this collateral constraint tend to feel the impact of monetary shocks more quickly, owing to the banking sector's exposure to changes in expected wages.
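A hedged sketch of the kind of household borrowing constraint described here, in which a fraction of expected wage income supplements housing as collateral, is given below; the functional form, parameter names, and values are assumptions for illustration and are not the paper's equations.

```python
def borrowing_limit(expected_house_value, expected_wage_income,
                    ltv_housing=0.7, ltv_wage=0.3, gross_rate=1.1):
    """Maximum household debt when lenders accept a fraction of the expected
    house value and a fraction of expected wage income as collateral
    (illustrative form only)."""
    housing_collateral = ltv_housing * expected_house_value
    wage_collateral = ltv_wage * expected_wage_income
    return (housing_collateral + wage_collateral) / gross_rate

# If expected wages fall after a monetary tightening, the limit contracts,
# which is the transmission channel the abstract points to.
print(borrowing_limit(100.0, 50.0))   # baseline limit
print(borrowing_limit(100.0, 45.0))   # tighter limit after lower expected wages
```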
Abstract:
If consumption is log-Normal and is decomposed into a linear deterministic trend and a stationary cycle, a surprising result in business-cycle research is that the welfare gains of eliminating uncertainty are relatively small. A possible problem with such calculations is the dichotomy between the trend and the cyclical components of consumption. In this paper, we abandon this dichotomy in two ways. First, we decompose consumption into a deterministic trend, a stochastic trend, and a stationary cyclical component, calculating the welfare gains of cycle smoothing. Calculations are carried forward only after a careful discussion of the limitations of macroeconomic policy. Second, still under the stochastic-trend model, we incorporate a variable slope for consumption depending negatively on the overall volatility in the economy. Results are obtained for a variety of preference parameterizations, parameter values, and different macroeconomic-policy goals. They show that, once the dichotomy in the decomposition in consumption is abandoned, the welfare gains of cycle smoothing may be substantial, especially due to the volatility effect.
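For comparison with the richer trend-cycle decompositions discussed here, the sketch below computes the textbook Lucas-style compensating consumption increase under CRRA utility and log-normal cyclical deviations; the risk-aversion values and cycle volatilities are illustrative assumptions only.

```python
import numpy as np

# Back-of-the-envelope welfare cost of consumption fluctuations: with CRRA
# utility and log-normal cyclical deviations of variance sigma2, the
# compensating consumption increase is lambda = exp(gamma * sigma2 / 2) - 1.
def welfare_gain(gamma, sigma2):
    return np.exp(gamma * sigma2 / 2.0) - 1.0

for gamma in (1.0, 5.0, 10.0):              # relative risk aversion (illustrative)
    for sigma2 in (0.032 ** 2, 0.10 ** 2):  # cycle std. dev. of about 3.2% vs. 10%
        print(f"gamma={gamma:4.1f}, sigma={np.sqrt(sigma2):.3f}: "
              f"gain of about {100 * welfare_gain(gamma, sigma2):.3f}% of consumption")
```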
Abstract:
The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice determines this quantity empirically and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through the elicitation of expert quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that exhibits many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
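A minimal sketch of a body-plus-GPD-tail mixture log-density of the type described here, with a Normal center purely for illustration and the threshold treated as an explicit parameter, is given below; the simulated data and parameter values are assumptions, and the full model would place priors on all parameters (elicited from expert quantiles) and sample them by MCMC.

```python
import numpy as np
from scipy import stats

def mixture_logpdf(x, mu, sd, u, xi, sigma):
    """Log-density of a body-plus-GPD-tail mixture (illustrative form):
    a Normal(mu, sd) center below the threshold u and, above u, a GPD tail
    weighted by the probability mass the center places beyond u."""
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    below = x <= u
    out[below] = stats.norm.logpdf(x[below], mu, sd)
    tail_mass = stats.norm.sf(u, mu, sd)        # P(X > u) under the center
    out[~below] = (np.log(tail_mass)
                   + stats.genpareto.logpdf(x[~below], xi, loc=u, scale=sigma))
    return out

# The log-likelihood over all observations, threshold included as a parameter,
# is what a posterior sampler would target once combined with the priors.
rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(0, 1, 950), 2.0 + rng.pareto(3, 50)])
print(np.sum(mixture_logpdf(data, mu=0.0, sd=1.0, u=2.0, xi=0.3, sigma=0.8)))
```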