45 results for "Volatility of volatility"
in Helda - Digital Repository of University of Helsinki
Abstract:
Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time scales and to larger time scales, implying that so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, to a significant extent, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing, carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for highly active stocks. The results benefit risk management and market mechanism design.
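For orientation, the baseline the second essay generalizes is the standard ACD(1,1) model, in which a duration x_i equals a conditional expected duration psi_i times a positive unit-mean innovation. A minimal Python sketch with illustrative parameter values (the essay's generalized model and its distribution are not shown here):

```python
# Standard exponential ACD(1,1): x_i = psi_i * eps_i, E[eps_i] = 1,
# psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}.
import numpy as np

def simulate_acd(omega=0.1, alpha=0.1, beta=0.8, n=1000, seed=0):
    """Simulate an exponential ACD(1,1) trade-duration series."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    psi = omega / (1.0 - alpha - beta)   # unconditional mean duration
    for i in range(n):
        x[i] = psi * rng.exponential(1.0)
        psi = omega + alpha * x[i] + beta * psi
    return x

def acd_loglik(params, x):
    """Log-likelihood of the exponential ACD(1,1); maximize to estimate."""
    omega, alpha, beta = params
    psi = np.mean(x)                     # simple start-up value
    ll = 0.0
    for xi in x:
        ll += -np.log(psi) - xi / psi    # exponential density with mean psi
        psi = omega + alpha * xi + beta * psi
    return ll
```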
Abstract:
Atmospheric aerosol particles affect the global climate as well as human health. In this thesis, the formation of nanometre-sized atmospheric aerosol particles and their subsequent growth was observed to occur all around the world. Typical formation rates of 3 nm particles varied from 0.01 to 10 cm⁻³ s⁻¹. Formation rates one order of magnitude higher were detected in urban environments. The highest formation rates, up to 10⁵ cm⁻³ s⁻¹, were detected in coastal areas and in industrial pollution plumes. Subsequent growth rates varied from 0.01 to 20 nm h⁻¹. The smallest growth rates were observed in polar areas and the largest in polluted urban environments, probably due to competition between growth by condensation and loss by coagulation. The observed growth rates were used to calculate a proxy condensable vapour concentration and its source rate in vastly different environments, from pristine Antarctica to polluted India. The estimated concentrations varied by only two orders of magnitude, but the source rates for the vapours varied by up to four orders of magnitude. The highest source rates were in New Delhi and the lowest in Antarctica. Indirect methods were applied to study the growth of freshly formed particles in the atmosphere. A newly developed water condensation particle counter, the TSI 3785, was also found to be a potential candidate for detecting the water solubility, and thus indirectly the composition, of atmospheric ultra-fine particles. Based on indirect methods, the relative roles of sulphuric acid, non-volatile material and coagulation were investigated in rural Melpitz, Germany. Condensation of non-volatile material explained 20-40% of the growth, and sulphuric acid most of the remainder, up to the point when the nucleation mode reached 10 to 20 nm in diameter. Coagulation typically contributed less than 5%. Furthermore, hygroscopicity measurements were applied to detect the contribution of water-soluble and insoluble components in Athens. During more polluted days, the water-soluble components contributed more to the growth. During periods of less anthropogenic influence, non-soluble compounds explained a larger fraction of the growth. In addition, long-range transport in a relatively polluted air mass to a measurement station in Finland was found to affect the hygroscopicity of the particles. This aging could have implications for cloud formation far away from the pollution sources.
Abstract:
One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices. Later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. What has been lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with nine different error distributions on Standard & Poor's 500 index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared to using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard & Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not further improve the density forecasts but, on the contrary, makes them slightly worse. In Essay 3 a new model incorporating conditional variance, skewness and kurtosis based on the Normal Inverse Gaussian (NIG) distribution is proposed. The new model and two previously used NIG models are evaluated by their Value-at-Risk (VaR) forecasts on a long series of daily Standard & Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis appears not to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
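A minimal sketch of the Essay 1 comparison, assuming the `arch` Python package and a pandas Series `returns` of daily index futures returns (both assumptions; the dissertation does not specify its software, and its exact estimation setup may differ):

```python
import numpy as np
from arch import arch_model

def garch_variance_forecast(returns, dist):
    """Fit GARCH(1,1) with the given error distribution; 1-day-ahead variance."""
    res = arch_model(returns, vol="GARCH", p=1, q=1, dist=dist).fit(disp="off")
    return res.forecast(horizon=1).variance.iloc[-1, 0]

def realized_variance(intraday):
    """Ex post benchmark: sum of squared intra-day (e.g., 5-minute) returns."""
    return np.sum(np.asarray(intraday) ** 2)

# Compare normal vs. leptokurtic (Student-t) errors against the benchmark:
# for d in ("normal", "t"):
#     print(d, garch_variance_forecast(returns, d))
```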
Abstract:
First, in Essay 1, we test whether it is possible to forecast Finnish Options Index return volatility by examining the out-of-sample predictive ability of several common volatility models using alternative well-known methods, and we find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Secondly, in Essay 2, the aggregate volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is more substantial than the other two components, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Thirdly, in Essay 3, we are among the first in the literature to seek firm-specific determinants of idiosyncratic volatility in a multivariate setting. For the cross-section of stocks listed on the Helsinki Stock Exchange, we find that industrial focus, trading volume and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama-French three-factor model with local and international benchmark portfolios), whereas firm age and size are negatively related to idiosyncratic volatility.
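A hedged sketch of one standard way to obtain the firm-level estimates mentioned above: the volatility of residuals from a CAPM-style regression of excess stock returns on excess market returns (variable names are illustrative; the essays also use the Fama-French three-factor model and international benchmarks):

```python
import numpy as np

def idiosyncratic_vol(stock_ex, market_ex):
    """Residual volatility from the OLS regression r_i = a + b * r_m + e."""
    X = np.column_stack([np.ones_like(market_ex), market_ex])
    coef, *_ = np.linalg.lstsq(X, stock_ex, rcond=None)
    resid = stock_ex - X @ coef
    return resid.std(ddof=2)  # ddof=2: two estimated parameters
```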
Abstract:
This paper examines how volatility in financial markets can best be modeled. The examination investigates how well linear and nonlinear volatility models absorb skewness and kurtosis. The examination is carried out on the Nordic stock markets of Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study exhibit asymmetric patterns only to a certain degree. Negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. However, in terms of absorbing skewness and kurtosis, nonlinear models outperform linear ones.
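"Absorbing skewness and kurtosis" can be checked by testing whether the standardized residuals of a fitted volatility model are closer to normality than the raw returns. A minimal sketch, assuming arrays `returns` and `sigma` (the fitted conditional volatility) from whichever model is under evaluation:

```python
from scipy import stats

def normality_report(returns, sigma):
    """Compare higher moments of raw returns and standardized residuals."""
    z = returns / sigma
    for name, x in (("raw returns", returns), ("standardized", z)):
        jb, p = stats.jarque_bera(x)
        print(f"{name}: skew={stats.skew(x):.3f}, "
              f"excess kurt={stats.kurtosis(x):.3f}, JB p-value={p:.3f}")
```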
Abstract:
In this paper, we examine the predictability of observed volatility smiles in three major European index options markets, utilising the historical return distributions of the respective underlying assets. The analysis applies the Black (1976) pricing model adjusted in accordance with the methodology proposed by Jarrow and Rudd (1982). We thereby adjust the expected future returns for the third and fourth central moments, as these represent deviations from normality in the distributions of observed returns and are thus considered one possible explanation for the existence of the smile. The obtained results indicate that including the higher moments in the pricing model reduces the volatility smile to some extent compared with the unadjusted Black-76 model. However, as the smile is partly a function of supply, demand and liquidity, and as such difficult to model, this modification does not appear sufficient to fully capture the characteristics of the smile.
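For reference, the unadjusted Black-76 benchmark against which the moment-adjusted prices are compared, with implied volatility backed out numerically (the Jarrow-Rudd skewness and kurtosis correction terms sit on top of this base price and are not reproduced here):

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def black76_call(F, K, r, T, sigma):
    """Black (1976) call on a futures price F, strike K, maturity T."""
    d1 = (np.log(F / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return np.exp(-r * T) * (F * norm.cdf(d1) - K * norm.cdf(d2))

def implied_vol(price, F, K, r, T):
    """Volatility that reproduces an observed option price."""
    return brentq(lambda s: black76_call(F, K, r, T, s) - price, 1e-6, 5.0)

# Plotting implied_vol across strikes K traces out the volatility smile.
```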
Abstract:
The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by e.g. Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced in implied volatility. This has two important consequences. First, implied volatility is actually expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequently occurring periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market assessed ex ante, the classic regression test tends to reject the hypothesis of informational efficiency even if markets are informationally efficient.
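The classic regression test referred to above regresses realized on implied volatility and checks for a zero intercept and unit slope. A minimal numpy sketch, assuming aligned arrays `rv` and `iv`:

```python
import numpy as np

def forecast_regression(rv, iv):
    """OLS of realized volatility on implied volatility: rv = a + b*iv + e."""
    X = np.column_stack([np.ones_like(iv), iv])
    (a, b), *_ = np.linalg.lstsq(X, rv, rcond=None)
    return a, b  # under unbiasedness/efficiency, a ≈ 0 and b ≈ 1

# The paper's point: if fewer volatility jumps occur in-sample than the
# market priced ex ante, the estimated slope b is biased downward even
# when markets are informationally efficient.
```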
Abstract:
This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry and size portfolio returns using asymmetric GARCH specifications following Glosten et al. (1993). We use daily return data from January 2, 1987 to December 30, 1998. We find that the world shock enters the domestic models significantly, and that its impact has increased over time. The same applies to the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is surprisingly insignificant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns does not usually include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
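The Glosten et al. (1993) asymmetric (GJR-)GARCH variance recursion used for the portfolio volatility processes, as a minimal sketch with illustrative parameter values (not the paper's estimates, and without the world-shock spillover terms):

```python
import numpy as np

def gjr_garch_variance(eps, omega=0.02, alpha=0.05, gamma=0.08, beta=0.90):
    """sigma2_t = omega + (alpha + gamma*I[eps<0]) * eps_{t-1}^2 + beta * sigma2_{t-1}."""
    sigma2 = np.empty(len(eps))
    sigma2[0] = np.var(eps)              # start at the sample variance
    for t in range(1, len(eps)):
        neg = float(eps[t - 1] < 0.0)    # asymmetry indicator
        sigma2[t] = (omega + (alpha + gamma * neg) * eps[t - 1] ** 2
                     + beta * sigma2[t - 1])
    return sigma2

# gamma > 0 lets negative shocks raise next-period variance more than equal
# positive shocks; the paper finds this term insignificant for Finland.
```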
Abstract:
The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX and ESX index options markets. Continuing the line of research on Implied Volatility Functions, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black-Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity, and is tested out-of-sample. Considering the dynamics, the results support the hypotheses put forward in this study: the smile increases in magnitude as maturity and ATM volatility decrease, and a change in the underlying asset (in time to maturity) is negatively (positively) correlated with implied ATM volatility. Further, the Standardized Log-Moneyness model improves pricing accuracy compared to previous Implied Volatility Function models, although the parameters of the models must be re-estimated continuously for the models to fully capture the changing dynamics of the volatility smiles.
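An illustrative Implied Volatility Function fit in this spirit: implied volatility as a quadratic in (log-)moneyness m plus maturity terms, estimated by least squares. The paper's Standardized Log-Moneyness model differs in its standardization; this sketch only shows the general mechanics:

```python
import numpy as np

def fit_iv_surface(m, tau, iv):
    """Fit iv ≈ b0 + b1*m + b2*m^2 + b3*tau + b4*m*tau by OLS."""
    X = np.column_stack([np.ones_like(m), m, m**2, tau, m * tau])
    beta, *_ = np.linalg.lstsq(X, iv, rcond=None)
    return beta

# Re-fitting beta on each trading day (as the out-of-sample testing
# implies) tracks the day-to-day deformation of the smile and surface.
```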
Abstract:
Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging and risk management activities, all of which require accurate volatility estimates. The task has become challenging since the 1987 stock market crash, as implied volatilities recovered from stock index options display two patterns, the volatility smirk (skew) and the volatility term structure, which together form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE 100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions, monotonically increasing from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at the upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk-adjusted (skew-adjusted) volatility index measure than with the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional coverage, independence, conditional coverage and quadratic-score tests. It is found that the VDAX subsumes almost all information required for daily VaR forecasts for a portfolio on the DAX 30 index, and the implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, calibration results show that the string market model can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
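A sketch of the factor-extraction step behind the "three factors explain 69-88% / 94-97%" results: principal components of daily changes in a panel of implied volatilities (rows = days, columns = moneyness or maturity buckets). `iv_panel` is a hypothetical 2-D array; the thesis's exact estimation procedure may differ:

```python
import numpy as np

def iv_factors(iv_panel, k=3):
    """Variance shares and loadings of the first k principal components."""
    d = np.diff(iv_panel, axis=0)          # daily IV changes
    d = d - d.mean(axis=0)                 # demean each column
    _, s, vt = np.linalg.svd(d, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return explained[:k], vt[:k]           # variance shares, loading vectors

# A roughly flat first loading vector is the "level" factor; a second one
# that slopes across maturities is the term-structure factor.
```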
Abstract:
The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. The results indicate that the information content of implied volatility is useful for predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
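A minimal sketch of the two-step procedure: (1) revalue the option portfolio under simulated market scenarios, (2) summarize the P&L distribution into a single-sum figure (here a simple quantile, as in Value-at-Risk). The Student-t scenario generator is an assumption standing in for a heavy-tailed fitted distribution; the paper itself fits a hyperbolic distribution to returns:

```python
import numpy as np

def scenario_risk(revalue, spot, n_scenarios=10_000, level=0.01, seed=0):
    """`revalue(s)` prices the option portfolio at underlying level s."""
    rng = np.random.default_rng(seed)
    shocks = rng.standard_t(df=4, size=n_scenarios) * 0.01  # heavy-tailed returns
    pnl = np.array([revalue(spot * (1 + z)) for z in shocks]) - revalue(spot)
    return -np.quantile(pnl, level)  # loss not exceeded with prob. 1 - level
```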
Abstract:
This study examines intraday and weekend volatility on the German DAX. Intraday volatility is partitioned into smaller intervals and compared to a whole day's volatility. The estimated intraday variance is U-shaped, and the weekend variance is estimated at 19% of that of a normal trading day. The patterns in intraday and weekend volatility are used to develop an extension to the Black and Scholes formula based on a new time basis. Calendar or trading days are commonly used for measuring time in option pricing; the Continuous Time using Discrete Approximations (CTDA) model developed in this study instead measures time in smaller intervals, approaching continuous time, and accounts only for the lapse of time during trading. Arbitrage pricing suggests that the option price equals the expected cost of hedging volatility during the option's remaining life. In this model, time is allowed to lapse as volatility occurs on an intraday basis, and the measure of time is modified to correct for non-constant volatility and to account for the patterns in volatility.
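A hedged sketch of the underlying idea, under assumptions not taken from the study itself: let option time decay in proportion to realized variance rather than uniformly, so the clock runs fast at the volatile open and close (the U-shape), stops outside trading, and counts a weekend at 19% of a trading day. The function and its weights are illustrative, not the CTDA specification:

```python
import numpy as np

def volatility_time_remaining(interval_var_shares, intervals_left,
                              whole_days_left, weekends_left=0,
                              weekend_factor=0.19):
    """Remaining maturity in trading-day units under a variance-based clock.

    interval_var_shares: each intraday interval's share of one trading
    day's variance (sums to 1); intervals_left: intervals left today.
    """
    shares = np.asarray(interval_var_shares)
    today = shares[-intervals_left:].sum() if intervals_left else 0.0
    return today + whole_days_left + weekend_factor * weekends_left
```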