11 results for Stochastic volatility components

in the Digital Repository of Fundação Getúlio Vargas (FGV)


Relevance:

100.00%

Publisher:

Abstract:

The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country specific and eventually spread across neighboring countries, with the concept of vicinity extrapolating the geographic maps and entering the contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We empirically show that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
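
As context for the shift-contagion notion cited above, the sketch below shows the standard heteroskedasticity-adjusted cross-market correlation underlying Forbes and Rigobon's test; the function name and arguments are illustrative, not code from the article.

```python
import numpy as np

def adjusted_correlation(source_crisis, target_crisis, source_calm):
    """Heteroskedasticity-adjusted cross-market correlation in the spirit of
    Forbes and Rigobon's shift-contagion test (a hypothetical helper):
    rho* = rho / sqrt(1 + delta * (1 - rho^2))."""
    rho = np.corrcoef(source_crisis, target_crisis)[0, 1]      # raw crisis-period correlation
    delta = np.var(source_crisis) / np.var(source_calm) - 1.0  # relative increase in source-market variance
    return rho / np.sqrt(1.0 + delta * (1.0 - rho ** 2))       # bias-corrected correlation
```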

Relevance:

100.00%

Publisher:

Abstract:

This paper develops a methodology for testing the term structure of volatility forecasts derived from stochastic volatility models, and implements it to analyze models of S&P500 index volatility. Using measurements of the ability of volatility models to hedge and value term structure dependent option positions, we find that hedging tests support the Black-Scholes delta and gamma hedges, but not the simple vega hedge when there is no model of the term structure of volatility. With various models, it is difficult to improve on a simple gamma hedge assuming constant volatility. Of the volatility models, the GARCH components estimate of term structure is preferred. Valuation tests indicate that all the models contain term structure information not incorporated in market prices.
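
Since the hedging tests above compare Black-Scholes delta, gamma and vega hedges, a minimal sketch of those hedge ratios for a European call is given below; it is a standard textbook calculation, not the paper's own code.

```python
from math import log, sqrt
from scipy.stats import norm

def bs_greeks(S, K, T, r, sigma):
    """Black-Scholes delta, gamma and vega of a European call (textbook formulas)."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    delta = norm.cdf(d1)
    gamma = norm.pdf(d1) / (S * sigma * sqrt(T))
    vega = S * norm.pdf(d1) * sqrt(T)
    return delta, gamma, vega
```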

Relevance:

100.00%

Publisher:

Abstract:

In this article we use factor models to describe a certain class of covariance structure for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions to the models exposed here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
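
To make the model class concrete, here is a minimal simulation sketch of a one-factor version with a stochastic volatility factor variance and random-walk (time-varying) loadings; all parameter values are illustrative and are not estimates from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
T, p = 1000, 3                      # time points, number of series
mu, phi, sig_eta = -1.0, 0.95, 0.2  # log-AR(1) stochastic volatility of the common factor

h = np.full(T, mu)                  # factor log-volatility
beta = np.empty((T, p)); beta[0] = rng.normal(1.0, 0.1, p)
y = np.empty((T, p))

for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sig_eta * rng.normal()  # SV recursion
    beta[t] = beta[t - 1] + 0.01 * rng.normal(size=p)           # time-varying loadings
for t in range(T):
    f_t = np.exp(0.5 * h[t]) * rng.normal()                     # common factor draw
    y[t] = beta[t] * f_t + 0.1 * rng.normal(size=p)             # idiosyncratic noise

# Implied time-varying covariance of y_t: beta_t beta_t' exp(h_t) + 0.01 I
```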

Relevance:

90.00%

Publisher:

Abstract:

Asset allocation decisions and value at risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models that incorporate the dynamic structure of volatility and are capable of forecasting future behavior of risk should perform better than constant, rolling window volatility models. For the same asset, the model that is the 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms others over a class of re-sampled time-series data. The test is based on re-sampling the data using the stationary bootstrap. For each re-sample we check the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk-management measure as comparison criteria. No model consistently outperforms the benchmark.
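
The resampling step behind the reality check is the stationary bootstrap; a minimal sketch of one resample is shown below (the function name and default block length are illustrative).

```python
import numpy as np

def stationary_bootstrap(x, avg_block=20, rng=None):
    """One stationary-bootstrap resample of a 1-D series (Politis and Romano):
    blocks of geometric length, wrapping around the sample circularly."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x)
    n = len(x)
    p = 1.0 / avg_block                 # probability of starting a new block
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        idx[t] = rng.integers(n) if rng.random() < p else (idx[t - 1] + 1) % n
    return x[idx]
```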

Relevance:

80.00%

Publisher:

Abstract:

Every month the United States Department of Agriculture (USDA) publishes reports releasing data on crop conditions, global supply and demand, and stock levels, which serve as a reference for all participants in the agricultural commodities market. This market exhibits pronounced volatility around the release dates of these reports. A stochastic volatility model with jumps is used for the price dynamics of corn and soybeans. There is no 'ideal' model for this purpose; each existing model has its advantages and disadvantages. The chosen model is that of Oztukel and Wilmott (1998), an empirical stochastic volatility model augmented with deterministic jumps. We show empirically that a stochastic volatility model can be fitted well to the commodities market, and that a jump-diffusion process can capture the jumps the market exhibits around report releases. Exchange-traded options on agricultural commodities are American-style, so several available methods could be used to price options under the dynamics of the proposed model. Since the chosen model is a multi-factor model, the appropriate pricing method is the least-squares Monte Carlo (LSM) approach proposed by Longstaff and Schwartz (2001). The options priced by the model are used in a hedging strategy for a physical position in corn and soybeans, and the efficiency of this strategy is compared with strategies using instruments available in the market.
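
Since the pricing step relies on the Longstaff and Schwartz (2001) least-squares Monte Carlo method, a minimal sketch of LSM for an American put on pre-simulated paths is given below; the quadratic regression basis and the interface are assumptions for illustration, not the dissertation's implementation.

```python
import numpy as np

def lsm_american_put(paths, K, r, dt):
    """Least-squares Monte Carlo (Longstaff-Schwartz) price of an American put.
    `paths` has shape (n_paths, n_steps + 1) and would come from the
    jump / stochastic-volatility simulation described above."""
    n_steps = paths.shape[1] - 1
    cash = np.maximum(K - paths[:, -1], 0.0)        # exercise value at maturity
    for t in range(n_steps - 1, 0, -1):
        cash *= np.exp(-r * dt)                     # discount continuation values one step
        itm = (K - paths[:, t]) > 0                 # regress only on in-the-money paths
        if itm.any():
            S = paths[itm, t]
            coef = np.polyfit(S, cash[itm], 2)      # quadratic basis for the continuation value
            continuation = np.polyval(coef, S)
            exercise = K - S
            ex_now = exercise > continuation
            cash[np.where(itm)[0][ex_now]] = exercise[ex_now]
    return np.exp(-r * dt) * cash.mean()
```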

Relevance:

80.00%

Publisher:

Abstract:

We investigate the effect of aggregate uncertainty shocks on real variables. More specifically, we introduce a shock in the volatility of productivity in an RBC model with long-run volatility risk and preferences that exhibit generalised disappointment aversion. We find that, when combined with a negative productivity shock, a volatility shock leads to a further decline in real variables, such as output, consumption, hours worked and investment. For instance, out of the 2% decrease in output as a result of both shocks, we attribute 0.25% to the effect of an increase in volatility. We also find that this effect is the same as the one obtained in a model with Epstein-Zin-Weil preferences, but higher than that of a model with expected utility. Moreover, GDA preferences yield superior asset pricing results when compared to both Epstein-Zin-Weil preferences and expected utility.
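
The exogenous driving process the abstract refers to can be sketched as an AR(1) productivity series whose innovation volatility itself follows a persistent process; the parameter values below are purely illustrative, and the real effects in the paper come from solving the full RBC model with GDA preferences, which is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
rho_a, rho_s = 0.95, 0.90                 # persistence of productivity and of its volatility
log_sig_bar, sig_w = np.log(0.01), 0.10   # mean log-volatility and volatility-of-volatility

log_sig = np.full(T, log_sig_bar)
a = np.zeros(T)                           # log productivity
for t in range(1, T):
    log_sig[t] = (1 - rho_s) * log_sig_bar + rho_s * log_sig[t - 1] + sig_w * rng.normal()
    a[t] = rho_a * a[t - 1] + np.exp(log_sig[t]) * rng.normal()
# An uncertainty (volatility) shock is an innovation to log_sig;
# a negative level shock is an innovation to a itself.
```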

Relevance:

80.00%

Publisher:

Abstract:

The use of options in financial markets has gained relevance due to their non-linear payoff and the possibility of altering the shape of a portfolio's return distribution. There are several strategies suited to each scenario the investor believes to be exposed to, but since the set of scenarios forms a return distribution, an adequate measure is needed to work with this type of information. We therefore use the Omega measure, which captures all moments of a distribution given a return threshold. This work develops a methodology to optimize the Omega measure of a portfolio through the use of options on the IBOVESPA. Return distributions are generated by Monte Carlo simulation with jumps and stochastic volatility. Finally, several analyses of the results are carried out in order to compare the optimized strategy with various random strategies, and a backtest is performed to assess the effectiveness of implementing the optimized strategy.
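
For reference, the Omega measure being optimized is the ratio of expected gains above a return threshold to expected losses below it; a minimal sketch on simulated scenarios is shown below (the function is illustrative, not the dissertation's code).

```python
import numpy as np

def omega(returns, threshold=0.0):
    """Omega measure of a return distribution at a given threshold:
    sum of returns above the threshold over the (absolute) sum below it."""
    excess = np.asarray(returns) - threshold
    gains = excess[excess > 0].sum()
    losses = -excess[excess < 0].sum()
    return gains / losses
```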

Relevance:

80.00%

Publisher:

Abstract:

In the first chapter, we test some stochastic volatility models using options on the S&P 500 index. First, we demonstrate the presence of a short time scale, on the order of days, and a long time scale, on the order of months, in the S&P 500 volatility process using the empirical structure function, or variogram. This result is consistent with findings of previous studies. The main contribution of our paper is to estimate the two time scales in the volatility process simultaneously by using a nonlinear weighted least-squares technique. To test the statistical significance of the rates of mean-reversion, we bootstrap pairs of residuals using the circular block bootstrap of Politis and Romano (1992). We choose the block length according to the automatic procedure of Politis and White (2004). After that, we calculate first-order corrections to the Black-Scholes prices of three different kinds: (i) a fast time scale correction; (ii) a slow time scale correction; and (iii) a multiscale (fast and slow) correction. To test the ability of our model to price options, we simulate option prices using five different specifications for the rates of mean-reversion. We did not find any evidence that these asymptotic models perform better, in terms of RMSE, than the Black-Scholes model. In the second chapter, we use Brazilian data to compute monthly idiosyncratic moments (expected skewness, realized skewness, and realized volatility) for equity returns and assess whether they are informative for the cross-section of future stock returns. Since there is evidence that lagged skewness alone does not adequately forecast skewness, we estimate a cross-sectional model of expected skewness that uses additional predictive variables. Then, we sort stocks each month according to their idiosyncratic moments, forming quintile portfolios. We find a negative relationship between higher idiosyncratic moments and next-month stock returns. The trading strategy that sells stocks in the top quintile of expected skewness and buys stocks in the bottom quintile generates a significant monthly return of about 120 basis points. Our results are robust across sample periods, portfolio weightings, and to the Fama and French (1993) risk adjustment factors. Finally, we identify a return reversal of stocks with high idiosyncratic skewness. Specifically, stocks with high idiosyncratic skewness have high contemporaneous returns that tend to reverse, resulting in negative abnormal returns in the following month.
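
The diagnostic used in the first chapter, the empirical structure function (variogram) of a log-volatility proxy, can be sketched as below; the nonlinear weighted least-squares fit of the two mean-reversion rates is not reproduced here.

```python
import numpy as np

def structure_function(log_vol, lags):
    """Empirical structure function (variogram): V(tau) = mean |x_{t+tau} - x_t|^2,
    computed for each lag tau over a log-volatility proxy series."""
    x = np.asarray(log_vol)
    return np.array([np.mean((x[tau:] - x[:-tau]) ** 2) for tau in lags])
```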

Relevance:

30.00%

Publisher:

Abstract:

This paper tackles the problem of aggregate TFP measurement using stochastic frontier analysis (SFA). Data from Penn World Table 6.1 are used to estimate a world production frontier for a sample of 75 countries over a long period (1950-2000), taking advantage of the model offered by Battese and Coelli (1992). We also apply the decomposition of TFP suggested by Bauer (1990) and Kumbhakar (2000) to a smaller sample of 36 countries over the period 1970-2000 in order to evaluate the effects of changes in efficiency (technical and allocative), scale effects and technical change. This allows us to analyze the role of productivity and its components in the economic growth of developed and developing nations, in addition to the importance of factor accumulation. Although not much explored in the study of economic growth, frontier techniques seem to be of particular interest for that purpose, since the separation of efficiency effects and technical change has a direct interpretation in terms of the catch-up debate. The estimated technical efficiency scores reveal the efficiency of nations in the production of non-tradable goods, since the GDP series used is PPP-adjusted. We also provide a second set of efficiency scores, corrected in order to reveal efficiency in the production of tradable goods, and rank them. When compared to the rankings of productivity indexes offered by the non-frontier studies of Hall and Jones (1996) and Islam (1995), our ranking shows a somewhat more intuitive order of countries. Rankings of the technical change and scale effects components of TFP change are also very intuitive. We also show that productivity is responsible for virtually all the differences in performance between developed and developing countries in terms of rates of growth of income per worker. More important, we find that changes in allocative efficiency play a crucial role in explaining differences in the productivity of developed and developing nations, even larger than the one played by the technology gap.
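
For reference, the Battese and Coelli (1992) panel frontier that the estimation relies on is usually written as

```latex
y_{it} = x_{it}'\beta + v_{it} - u_{it}, \qquad
u_{it} = \exp\{-\eta\,(t - T_i)\}\,u_i, \qquad
u_i \sim N^{+}(\mu, \sigma_u^2), \quad v_{it} \sim N(0, \sigma_v^2),
```

where y_it is log output, v_it is statistical noise and u_it >= 0 is technical inefficiency that decays (or grows) over time at rate eta; the TFP-change decomposition then separates technical change, efficiency change, scale effects and allocative effects.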

Relevance:

30.00%

Publisher:

Abstract:

If consumption is log-normal and is decomposed into a linear deterministic trend and a stationary cycle, a surprising result in business-cycle research is that the welfare gains of eliminating uncertainty are relatively small. A possible problem with such calculations is the dichotomy between the trend and the cyclical components of consumption. In this paper, we abandon this dichotomy in two ways. First, we decompose consumption into a deterministic trend, a stochastic trend, and a stationary cyclical component, calculating the welfare gains of cycle smoothing. Calculations are carried forward only after a careful discussion of the limitations of macroeconomic policy. Second, still under the stochastic-trend model, we incorporate a variable slope for consumption depending negatively on the overall volatility in the economy. Results are obtained for a variety of preference parameterizations, parameter values, and different macroeconomic-policy goals. They show that, once the dichotomy in the decomposition of consumption is abandoned, the welfare gains of cycle smoothing may be substantial, especially due to the volatility effect.
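
As a benchmark for the "relatively small" gains mentioned above, the textbook Lucas-style calculation under CRRA utility and log-normal consumption around a deterministic trend gives a compensating consumption increase of approximately

```latex
\lambda \approx \tfrac{1}{2}\,\gamma\,\sigma_c^{2},
```

where gamma is the coefficient of relative risk aversion and sigma_c^2 is the variance of the cyclical component of log consumption; the paper's point is that this number grows substantially once a stochastic trend and a volatility-dependent slope are allowed.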

Relevance:

30.00%

Publisher:

Abstract:

Derivatives markets are viewed with great suspicion by many people. This work analyzes the effect of the introduction of stock options in the Brazilian market, seeking to identify another justification for the existence of these markets: the change in the risk level of the assets underlying these options. The empirical evidence found in this market is in line with results obtained in other markets: the introduction of options is beneficial to the investor, since it reduces the volatility of the underlying asset. There is also a weak indication that volatility becomes more stochastic with the introduction of options.