836 results for earnings volatility
Abstract:
The predominant fear in capital markets is that of a price crash. Commodity markets differ in that there is a fear of both upward and downward jumps, and this results in implied volatility curves displaying distinct shapes compared to equity markets. A novel functional data analysis (FDA) approach provides a framework to produce and interpret functional objects that characterise the underlying dynamics of oil futures options. We use the FDA framework to examine implied volatility, jump risk, and pricing dynamics within crude oil markets. Examining a WTI crude oil sample for the 2007–2013 period, which includes the global financial crisis and the Arab Spring, we find strong evidence of converse jump dynamics during periods of demand-side and supply-side weakness. This is used as the basis for an FDA-derived Merton (1976) jump-diffusion optimised delta hedging strategy, which exhibits superior portfolio management results over traditional methods.
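The hedging strategy above builds on Merton (1976) jump-diffusion pricing. As a rough illustration of the underlying machinery, the sketch below computes a call delta as a Poisson-weighted mixture of Black-Scholes deltas, assuming lognormal jump sizes; all parameter names (lam, mu_j, sig_j) and values are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of a Merton (1976) jump-diffusion call delta with
# lognormal jump sizes; parameter values below are illustrative only.
from math import exp, log, sqrt, factorial
from scipy.stats import norm

def bs_delta(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return norm.cdf(d1)

def merton_call_delta(S, K, T, r, sigma, lam, mu_j, sig_j, n_terms=40):
    """Delta as a Poisson-weighted mixture of Black-Scholes deltas."""
    m = exp(mu_j + 0.5 * sig_j**2) - 1       # expected relative jump size
    lam_p = lam * (1 + m)                    # jump intensity under pricing measure
    delta = 0.0
    for n in range(n_terms):
        sigma_n = sqrt(sigma**2 + n * sig_j**2 / T)   # variance with n jumps
        r_n = r - lam * m + n * log(1 + m) / T        # drift adjusted for jumps
        w = exp(-lam_p * T) * (lam_p * T)**n / factorial(n)
        delta += w * bs_delta(S, K, T, r_n, sigma_n)
    return delta

print(merton_call_delta(S=100, K=100, T=0.5, r=0.02,
                        sigma=0.3, lam=0.8, mu_j=-0.05, sig_j=0.2))
```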
Abstract:
This paper analyses the forecastability of the monthly volatility of stock returns. The forecasts obtained from GARCH and AGARCH models with Normal and Student's t errors are evaluated against proxies for the unobserved volatility obtained by sampling at different frequencies. It is found that aggregating daily multi-step-ahead GARCH-type forecasts provides rather accurate predictions of monthly volatility.
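To make the aggregation idea concrete: a minimal sketch, assuming a GARCH(1,1) already fitted elsewhere, that iterates the multi-step variance recursion and sums the daily forecasts into a monthly figure; all parameter values are illustrative.

```python
# Minimal sketch: aggregate daily GARCH(1,1) variance forecasts to one month.
# omega, alpha, beta and the one-step-ahead variance are assumed to come
# from a previously fitted model; the values below are illustrative only.
import numpy as np

def monthly_variance_forecast(omega, alpha, beta, sigma2_next, horizon=22):
    """Sum the h-step-ahead conditional variances over ~22 trading days."""
    total, s2 = sigma2_next, sigma2_next
    for _ in range(horizon - 1):
        s2 = omega + (alpha + beta) * s2   # E[sigma^2_{t+h} | info_t], h >= 2
        total += s2
    return total  # monthly variance if returns are serially uncorrelated

monthly_vol = np.sqrt(monthly_variance_forecast(
    omega=1e-6, alpha=0.08, beta=0.90, sigma2_next=2e-4))
print(monthly_vol)
```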
Abstract:
This paper provides an empirical study assessing the forecasting performance of a wide range of models for predicting volatility and VaR in the Madrid Stock Exchange. The models' performance was measured using different loss functions and criteria. The results show that FIAPARCH processes capture and forecast the dynamics of IBEX-35 return volatility most accurately. It is also observed that assuming a heavy-tailed distribution does not improve the models' ability to predict volatility. However, when the aim is forecasting VaR, we find evidence that the Student's t FIAPARCH outperforms the models it nests, increasingly so the lower the target quantile.
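When ranking VaR forecasts by target quantile, a standard choice of loss function in this literature is the quantile ("tick") loss. A self-contained sketch, with simulated data standing in for IBEX-35 returns:

```python
# Illustrative sketch of scoring competing VaR forecasts with the quantile
# ("tick") loss; the return series and VaR model here are simulated toys.
import numpy as np

def quantile_loss(returns, var_forecasts, alpha=0.01):
    """Average tick loss; lower is better for an alpha-quantile VaR model."""
    shortfall = returns - var_forecasts
    return np.mean((alpha - (shortfall < 0)) * shortfall)

rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=1000) * 0.01          # toy heavy-tailed returns
var_model = np.full(1000, np.quantile(r, 0.01))     # e.g. historical-simulation VaR
print(quantile_loss(r, var_model, alpha=0.01))
```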
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
I investigate the impact of foreign pre-tax income on the total amount of cash held by companies and on the amount of cash held in companies' foreign subsidiaries. I also investigate the impact of the existence and amount of cash held in companies' foreign subsidiaries on the composition of cash holdings in terms of risk and liquidity. Using a sample of the 100 largest U.S. non-financial, non-utility companies, I find that companies with higher earnings overseas hold higher cash reserves and invest a higher fraction of their cash in risky assets. My evidence suggests that companies have a different optimization strategy for cash held overseas, in which precautionary motives are not the main driver for holding cash.
Abstract:
Several papers document that idiosyncratic volatility is time-varying, and many attempts have been made to determine whether idiosyncratic risk is priced. This research studies the behavior of idiosyncratic volatility around information release dates and its relation with returns after public announcements. The results indicate that when a company discloses specific information to the market, the firm-specific volatility level shifts and short-horizon event-induced volatility varies significantly; however, the category to which the announcement belongs does not affect the magnitude of the change. This event-induced volatility is not small in size and should not be downplayed in event studies. Moreover, this study shows that stocks with higher contemporaneous realized idiosyncratic volatility earn lower returns after public announcements, consistent with the "divergence of opinion" hypothesis. While no significant relation is found between EGARCH-estimated idiosyncratic volatility and returns, or between one-month-lagged idiosyncratic volatility and returns, presumably due to significant jumps around public announcements, both may provide signals about future idiosyncratic volatility through their correlations with contemporaneous realized idiosyncratic volatility. Finally, the study shows that the positive relation between returns and idiosyncratic volatility based on under-diversification is inadequate to explain all scenarios, and that the negative relation after public announcements may provide a useful trading rule.
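For reference, realized idiosyncratic volatility of the kind used above is typically the residual volatility from a factor-model regression over a window around the announcement. A toy sketch with simulated data; the three-factor setup is our assumption, not necessarily the paper's:

```python
# Hypothetical sketch of realized idiosyncratic volatility: the standard
# deviation of residuals from a factor-model regression over an event window.
import numpy as np

def realized_idio_vol(stock_ret, factors):
    """OLS residual volatility of daily returns against factor returns."""
    X = np.column_stack([np.ones(len(stock_ret)), factors])
    beta, *_ = np.linalg.lstsq(X, stock_ret, rcond=None)
    resid = stock_ret - X @ beta
    return resid.std(ddof=X.shape[1])   # multiply by sqrt(252) to annualize

rng = np.random.default_rng(1)
factors = rng.normal(0, 0.01, size=(60, 3))   # e.g. Fama-French-style factors
ret = factors @ np.array([1.0, 0.3, -0.2]) + rng.normal(0, 0.02, 60)
print(realized_idio_vol(ret, factors))
```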
Abstract:
Margin policy is used by regulators to inhibit excessive volatility and stabilize the stock market in the long run. The effect of this policy on the stock market has been widely tested empirically. However, most prior studies are limited in that they investigate the margin requirement for the overall stock market rather than for individual stocks, and the time periods examined are confined to the pre-1974 period, as no change in the margin requirement occurred post-1974 in the U.S. This thesis addresses these limitations by providing a direct examination of the effect of margin requirements on the return, volume, and volatility of individual companies, using more recent data from the Canadian stock market. Using variance ratio tests and an event study with a conditional volatility (EGARCH) model, we find no convincing evidence that changes in margin requirements affect subsequent stock return volatility. We find similar results for returns and trading volume. These empirical findings lead us to conclude that the use of margin policy by regulators fails to achieve the goal of inhibiting speculative activity and stabilizing volatility.
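A minimal sketch of the Lo-MacKinlay variance ratio statistic underlying the variance ratio test mentioned above, applied to a simulated return series; under a random walk the ratio is close to one. Bias corrections and the asymptotic test statistic are omitted here.

```python
# Minimal sketch of a Lo-MacKinlay variance ratio on a simulated series.
import numpy as np

def variance_ratio(returns, q):
    """Var of q-period sums over q times var of 1-period returns (~1 under a RW)."""
    r = np.asarray(returns)
    mu = r.mean()
    var1 = np.sum((r - mu)**2) / (len(r) - 1)
    rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period returns
    varq = np.sum((rq - q * mu)**2) / (len(rq) - 1)
    return varq / (q * var1)

rng = np.random.default_rng(2)
print(variance_ratio(rng.normal(0, 0.01, 1000), q=5))   # ~1 for iid returns
```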
Abstract:
For predicting future volatility, empirical studies report mixed results on two issues: (1) whether model-free implied volatility has more information content than Black-Scholes model-based implied volatility; (2) whether implied volatility outperforms historical volatilities. In this thesis, we address these two issues using Canadian financial data. First, we examine the information content and forecasting power of VIXC, a model-free implied volatility index, and MVX, a model-based implied volatility index. The GARCH in-sample test indicates that VIXC subsumes all information that is reflected in MVX. The out-of-sample examination indicates that VIXC is superior to MVX for predicting realized volatility over the next 1, 5, 10, and 22 trading days. Second, we investigate the relative predictive power of VIXC and alternative volatility forecasts derived from historical index prices. We find that for time horizons shorter than 10 trading days, VIXC provides more accurate forecasts. However, for longer horizons, the historical volatilities, particularly the random walk, provide better forecasts. We conclude that VIXC cannot incorporate all information contained in historical index prices for predicting future volatility.
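A compact sketch of the kind of out-of-sample evaluation described: comparing a forecast series against realized volatility with RMSE and a Mincer-Zarnowitz regression. The series below are simulated stand-ins, not VIXC or MVX data.

```python
# Illustrative forecast evaluation: RMSE plus a Mincer-Zarnowitz regression,
# where an unbiased forecast yields intercept ~0 and slope ~1.
import numpy as np

def rmse(forecast, realized):
    return np.sqrt(np.mean((forecast - realized)**2))

def mincer_zarnowitz(forecast, realized):
    """Regress realized on forecast; returns (intercept, slope)."""
    X = np.column_stack([np.ones(len(forecast)), forecast])
    (a, b), *_ = np.linalg.lstsq(X, realized, rcond=None)
    return a, b

rng = np.random.default_rng(3)
realized = np.abs(rng.normal(0.15, 0.03, 250))      # toy realized volatility
implied = realized + rng.normal(0.01, 0.02, 250)    # biased but informative forecast
print(rmse(implied, realized), mincer_zarnowitz(implied, realized))
```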
Abstract:
We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxicity-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN has a positive association with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, whereby a negative liquidity shock raises VPIN, which, in turn, leads to a further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers when facing toxic information in the high-frequency trading world.
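For orientation, a toy Bulk Volume VPIN computation: buy volume is classified with the normal CDF of standardized price changes, and order imbalance is averaged across buckets. This sketch uses equal-count rather than strictly equal-volume buckets, and all inputs are simulated.

```python
# Toy Bulk Volume VPIN: classify buy volume via the normal CDF of
# standardized price changes, then average order imbalance per bucket.
# Equal-count buckets stand in for equal-volume buckets in this sketch.
import numpy as np
from scipy.stats import norm

def bulk_volume_vpin(price_changes, volumes, sigma, n_buckets=50):
    """VPIN = mean |buy - sell| imbalance over bucket volume."""
    buy_frac = norm.cdf(np.asarray(price_changes) / sigma)
    buy = volumes * buy_frac
    sell = volumes * (1 - buy_frac)
    imb = [abs(b.sum() - s.sum()) / (b.sum() + s.sum())
           for b, s in zip(np.array_split(buy, n_buckets),
                           np.array_split(sell, n_buckets))]
    return np.mean(imb)

rng = np.random.default_rng(4)
dp = rng.normal(0, 0.02, 5000)                       # simulated price changes
vol = rng.integers(100, 1000, 5000).astype(float)    # simulated trade volumes
print(bulk_volume_vpin(dp, vol, sigma=dp.std()))
```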
Abstract:
In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials, respectively. The eigenfunction approach has at least six advantages: i) it is general, since any square-integrable function may be written as a linear combination of the eigenfunctions; ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal components analysis; iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; iv) more importantly, this generates fat tails for the variance and return processes; v) in contrast to popular models, the variance of the variance is a flexible function of the variance; vi) these models are closed under temporal aggregation.
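A compact restatement of the eigenfunction construction, in notation we introduce for illustration (it may differ from the paper's):

\[
\mathbb{E}\bigl[e_i(x_{t+1}) \mid x_t\bigr] = \lambda_i\, e_i(x_t),
\qquad
\sigma_t^2 = a_0 + \sum_{i=1}^{p} a_i\, e_i(x_t),
\]

so each \(e_i(x_t)\) follows an AR(1) with coefficient \(\lambda_i\), and \(\sigma_t^2\), being a finite linear combination of such processes, inherits ARMA dynamics. For a Gaussian AR(1) state (the log-normal model) the \(e_i\) are Hermite polynomials; for a square-root state they are Laguerre polynomials.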
Abstract:
The GARCH and Stochastic Volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: conditional variance given past values of the same series, or conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about the structural level of disaggregation, a well-written volatility model should be specified in such a way that one is always allowed to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. In this paper, we stress a square-root autoregressive stochastic volatility (SR-SARV) model which remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the required robustness properties with respect to various kinds of aggregation. It is shown that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and the conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other. By relaxing these assumptions, thanks to a state-space setting, we obtain aggregation results without renouncing the conditional variance concept (and related leverage effects), in contrast to the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations. Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher-order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we are able to characterize the exact relationships between our SR-SARV models (including higher-order dynamics, leverage effects and in-mean effects), usual GARCH models, and continuous-time stochastic volatility models, so that previous results about aggregation of weak GARCH and continuous-time GARCH modeling can be recovered in our framework.
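A compact statement of the SR-SARV(1) structure the abstract refers to, again in notation we introduce for illustration:

\[
r_{t+1} = \sigma_t\, \varepsilon_{t+1},
\qquad
\sigma_t^2 = f_t,
\qquad
\mathbb{E}\bigl[f_{t+1} \mid J_t\bigr] = \omega + \gamma f_t, \quad 0 \le \gamma < 1,
\]

where \(J_t\) is an information set containing the current state \(f_t\). Because the AR(1) restriction on the variance state is stated as a conditional moment, it survives when \(J_t\) is reduced to a coarser information set, which is the robustness-to-aggregation property the abstract emphasises.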