942 results for VOLATILITY SPILLOVERS


Relevance: 20.00%

Abstract:

It is observed in the real world that taxes matter for location decisions and that multinationals shift profits by transfer pricing. The US and Canada use so-called formula apportionment (FA) to tax corporate income, and the EU is debating a switch from separate accounting (SA) to FA. This paper develops a theoretical model that compares the basic properties of FA with those of SA. The focal point of the analysis is how changes in tax rates affect capital formation, input choice, and transfer pricing, as well as spillovers onto tax revenue in other countries. The analysis shows that a move from SA to FA will not eliminate such spillovers and will, in cases identified in the paper, actually aggravate them.

Relevance: 20.00%

Abstract:

Atmospheric aerosol particles affect the global climate as well as human health. In this thesis, the formation of nanometre-sized atmospheric aerosol particles and their subsequent growth was observed to occur all around the world. The typical formation rate of 3 nm particles varied from 0.01 to 10 cm⁻³ s⁻¹. Formation rates one order of magnitude higher were detected in urban environments, and the highest formation rates, up to 10⁵ cm⁻³ s⁻¹, were detected in coastal areas and in industrial pollution plumes. Subsequent growth rates varied from 0.01 to 20 nm h⁻¹; the smallest growth rates were observed in polar areas and the largest in polluted urban environments, probably owing to competition between growth by condensation and loss by coagulation. The observed growth rates were used to calculate a proxy condensable-vapour concentration and its source rate in vastly different environments, from pristine Antarctica to polluted India. The estimated concentrations varied by only two orders of magnitude, but the source rates for the vapours varied by up to four orders of magnitude, with the highest source rates in New Delhi and the lowest in Antarctica. Indirect methods were applied to study the growth of freshly formed particles in the atmosphere. A newly developed Water Condensation Particle Counter, the TSI 3785, was also found to be a potential instrument for detecting the water solubility, and thus indirectly the composition, of atmospheric ultra-fine particles. Based on indirect methods, the relative roles of sulphuric acid, non-volatile material and coagulation were investigated in rural Melpitz, Germany. Condensation of non-volatile material explained 20-40% of the growth, and sulphuric acid most of the remainder, up to the point when the nucleation mode reached 10 to 20 nm in diameter; coagulation typically contributed less than 5%. Furthermore, hygroscopicity measurements were applied to detect the contribution of water-soluble and insoluble components in Athens.
During more polluted days, the water-soluble components contributed more to the growth, whereas under weaker anthropogenic influence, non-soluble compounds explained a larger fraction of the growth. In addition, long-range transport in a relatively polluted air mass to a measurement station in Finland was found to affect the hygroscopicity of the particles. This aging could have implications for cloud formation far away from the pollution sources.

Relevance: 20.00%

Abstract:

One of the most fundamental and widely accepted ideas in finance is that investors are compensated through higher returns for taking on non-diversifiable risk. Hence the quantification, modeling and prediction of risk have been, and still are, among the most prolific research areas in financial economics. It was recognized early on that there are predictable patterns in the variance of speculative prices, and later research has shown that there may also be systematic variation in the skewness and kurtosis of financial returns. What has been lacking in the literature so far is an out-of-sample forecast evaluation of the potential benefits of these new, more complicated models with time-varying higher moments. Such an evaluation is the topic of this dissertation. Essay 1 investigates the forecast performance of the GARCH(1,1) model when estimated with nine different error distributions on Standard & Poor's 500 Index futures returns. By utilizing the theory of realized variance to construct an appropriate ex post measure of variance from intra-day data, it is shown that allowing for a leptokurtic error distribution leads to significant improvements in variance forecasts compared with using the normal distribution. This result holds for daily, weekly as well as monthly forecast horizons. It is also found that allowing for skewness and time variation in the higher moments of the distribution does not further improve forecasts. In Essay 2, using 20 years of daily Standard & Poor's 500 index returns, it is found that density forecasts are much improved by allowing for constant excess kurtosis but not by allowing for skewness. Allowing the kurtosis and skewness to be time-varying does not improve the density forecasts further; on the contrary, it makes them slightly worse. In Essay 3, a new model incorporating conditional variance, skewness and kurtosis, based on the Normal Inverse Gaussian (NIG) distribution, is proposed.
The new model and two previously used NIG models are evaluated by their Value at Risk (VaR) forecasts on a long series of daily Standard & Poor's 500 returns. The results show that only the new model produces satisfactory VaR forecasts at both the 1% and 5% levels. Taken together, the results of the thesis show that kurtosis does not appear to exhibit predictable time variation, whereas some predictability is found in the skewness. However, the dynamic properties of the skewness are not completely captured by any of the models.
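The variance recursion and the ex post variance measure that Essay 1 relies on can be sketched in a few lines. This is a minimal illustration with made-up parameter and return values, not estimates from the thesis data:

```python
def garch11_variance(returns, omega, alpha, beta, v0):
    """Conditional variance path of a GARCH(1,1) model:
    v_t = omega + alpha * r_{t-1}**2 + beta * v_{t-1}."""
    v = [v0]
    for r in returns[:-1]:
        v.append(omega + alpha * r * r + beta * v[-1])
    return v

def realized_variance(intraday_returns):
    """Ex post variance proxy: the sum of squared intra-day returns."""
    return sum(r * r for r in intraday_returns)

# Illustrative numbers only.
rets = [0.01, -0.02, 0.015, -0.005]
path = garch11_variance(rets, omega=1e-6, alpha=0.05, beta=0.90, v0=1e-4)
rv = realized_variance([0.002, -0.003, 0.001])
```

The chosen error distribution enters through the likelihood used to estimate the parameters; the resulting variance forecasts are then compared against the realized-variance proxy rather than against noisy daily squared returns.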

Relevance: 20.00%

Abstract:

First, in Essay 1, we test whether it is possible to forecast Finnish Options Index return volatility by examining the out-of-sample predictive ability of several common volatility models with alternative well-known methods, and find additional evidence for the predictability of volatility and for the superiority of the more complicated models over the simpler ones. Secondly, in Essay 2, the aggregate volatility of stocks listed on the Helsinki Stock Exchange is decomposed into market-, industry- and firm-level components, and it is found that firm-level (i.e., idiosyncratic) volatility has increased over time, is more substantial than the other two components, predicts GDP growth, moves countercyclically and, like the other components, is persistent. Thirdly, in Essay 3, we are among the first in the literature to search for firm-specific determinants of idiosyncratic volatility in a multivariate setting. For the cross-section of stocks listed on the Helsinki Stock Exchange, we find that industrial focus, trading volume and block ownership are positively associated with idiosyncratic volatility estimates (obtained from both the CAPM and the Fama and French three-factor model with local and international benchmark portfolios), whereas firm age and size are negatively related to idiosyncratic volatility.
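A common way to obtain idiosyncratic volatility estimates of the kind discussed in Essays 2 and 3 is the standard deviation of factor-model residuals. A minimal single-factor (CAPM-style) sketch, with invented return series for illustration:

```python
def ols_beta(x, y):
    """OLS intercept and slope for y = a + b*x + e."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return my - b * mx, b

def idiosyncratic_vol(stock, market):
    """Sample standard deviation of CAPM residuals e_t = r_t - a - b*m_t."""
    a, b = ols_beta(market, stock)
    resid = [r - a - b * m for r, m in zip(stock, market)]
    n = len(resid)
    mu = sum(resid) / n
    return (sum((e - mu) ** 2 for e in resid) / (n - 1)) ** 0.5

# By construction this stock has no idiosyncratic component.
market = [0.01, -0.005, 0.02, 0.0]
stock = [0.01 + 2.0 * m for m in market]
ivol = idiosyncratic_vol(stock, market)
```

The multi-factor (Fama-French) variant replaces the single regressor with three, but the residual-volatility step is the same.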

Relevance: 20.00%

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here, the hyperbolic distribution is used as one alternative for dealing with heavy tails. The results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
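The two-step calculation described above can be sketched as follows; the quantile convention and the scenario numbers are our illustrative assumptions, not taken from the paper:

```python
def scenario_var(pnl_scenarios, level=0.95):
    """Step 2: summarize revalued scenarios into a single-sum, VaR-style
    figure: the loss not exceeded with probability `level`, read off the
    empirical scenario distribution."""
    losses = sorted(-p for p in pnl_scenarios)  # losses as positive numbers
    k = min(int(level * len(losses)), len(losses) - 1)
    return losses[k]

# Step 1 would revalue the option portfolio under each simulated market
# scenario; here we simply assume 20 hypothetical profit/loss outcomes.
pnl = [-5.0, -3.2, -1.1, 0.4, 0.9, 1.2, 1.5, 2.0, 2.2, 2.5,
       2.7, 3.0, 3.1, 3.3, 3.6, 3.8, 4.0, 4.2, 4.5, 5.0]
var95 = scenario_var(pnl, 0.95)
```

With so few scenarios this convention lands on the worst outcome; a production system would use many more scenarios and a heavy-tailed (e.g. hyperbolic) scenario generator, as the paper argues.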

Relevance: 20.00%

Abstract:

This paper examines how volatility in financial markets can best be modeled, investigating how well linear and nonlinear volatility models absorb skewness and kurtosis. The study covers the Nordic stock markets of Finland, Sweden, Norway and Denmark. Different linear and nonlinear models are applied, and the results indicate that a linear model can almost always be used for modeling the series under investigation, even though nonlinear models perform slightly better in some cases. These results indicate that the markets under study exhibit asymmetric patterns only to a certain degree: negative shocks generally have a more prominent effect on the markets, but these effects are not particularly strong. In terms of absorbing skewness and kurtosis, however, nonlinear models outperform linear ones.
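Whether a volatility model has "absorbed" skewness and kurtosis is typically judged from the higher moments of its standardized residuals. A minimal sketch of the sample moments involved, on an invented series:

```python
def sample_moments(x):
    """Sample skewness and excess kurtosis; for standardized residuals of a
    well-specified volatility model both should be close to zero."""
    n = len(x)
    mu = sum(x) / n
    m2 = sum((v - mu) ** 2 for v in x) / n
    m3 = sum((v - mu) ** 3 for v in x) / n
    m4 = sum((v - mu) ** 4 for v in x) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3.0

# A symmetric two-point series: zero skewness, minimal excess kurtosis (-2).
skew, exkurt = sample_moments([1.0, -1.0, 1.0, -1.0])
```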

Relevance: 20.00%

Abstract:

In this paper, we examine the predictability of observed volatility smiles in three major European index options markets, utilising the historical return distributions of the respective underlying assets. The analysis applies the Black (1976) pricing model adjusted in accordance with the methodology proposed by Jarrow and Rudd in 1982. We thereby adjust the expected future returns for the third and fourth central moments, as these represent deviations from normality in the distributions of observed returns and are thus considered one possible explanation for the existence of the smile. The results indicate that including the higher moments in the pricing model reduces the volatility smile to some extent compared with the unadjusted Black-76 model. However, as the smile is partly a function of supply, demand and liquidity, and as such is intricate to model, this modification does not appear sufficient to fully capture the characteristics of the smile.
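The lognormal baseline to which the Jarrow-Rudd moment corrections are added is the Black (1976) formula. A minimal pricer (the correction terms themselves are omitted here):

```python
import math

def black76_call(F, K, T, r, sigma):
    """Black (1976) price of a European call on a forward/futures price F:
    C = exp(-r*T) * (F*Phi(d1) - K*Phi(d2))."""
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    d1 = (math.log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * Phi(d1) - K * Phi(d2))

c = black76_call(F=100.0, K=100.0, T=1.0, r=0.0, sigma=0.2)  # at-the-money
```

The Jarrow-Rudd adjustment appends terms proportional to the differences between the third and fourth moments of the actual and the approximating lognormal distribution; when those differences are zero, the adjusted price collapses back to this formula.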

Relevance: 20.00%

Abstract:

The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by, e.g., Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced into implied volatility. This has two important consequences. First, implied volatility is expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequently occurring periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than the market assessed ex ante, the classic regression test tends to reject the hypothesis of informational efficiency even if markets are in fact informationally efficient.
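The classic forecasting regression mentioned above is a simple OLS of realized on implied volatility. The series below are invented so that implied exceeds realized throughout, as in a sample where the priced-in volatility jumps never materialized, and the fitted slope falls below one:

```python
def forecast_regression(iv, rv):
    """OLS for rv_t = a + b*iv_t + e_t; informational efficiency is
    usually read as a = 0, b = 1."""
    n = len(iv)
    mx, my = sum(iv) / n, sum(rv) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(iv, rv))
         / sum((x - mx) ** 2 for x in iv))
    return my - b * mx, b

# Invented sample: implied consistently above realized (no jump occurred).
iv = [0.20, 0.22, 0.25, 0.30, 0.18]
rv = [0.17, 0.19, 0.21, 0.24, 0.16]
a, b = forecast_regression(iv, rv)
```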

Relevance: 20.00%

Abstract:

This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through "world volatility". We operationalize the volatility processes of Finnish leverage, industry and size portfolio returns with asymmetric GARCH specifications following Glosten et al. (1993), using daily return data from January 2, 1987 to December 30, 1998. We find that the world shock enters the domestic models significantly, and that its impact has increased over time; the same applies to the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is, surprisingly, non-significant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns does not usually include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
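The asymmetric specification of Glosten et al. (1993) adds a leverage term to the GARCH variance recursion. A minimal sketch with illustrative parameters (the paper, surprisingly, estimates the asymmetry parameter as non-significant):

```python
def gjr_garch_variance(returns, omega, alpha, gamma, beta, v0):
    """GJR-GARCH recursion:
    v_t = omega + (alpha + gamma*I[r_{t-1} < 0]) * r_{t-1}**2 + beta*v_{t-1},
    so a negative shock adds an extra gamma to next-period variance."""
    v = [v0]
    for r in returns[:-1]:
        leverage = gamma if r < 0 else 0.0
        v.append(omega + (alpha + leverage) * r * r + beta * v[-1])
    return v

# Same-sized positive and negative shock, illustrative parameters.
v_pos = gjr_garch_variance([0.01, 0.0], 1e-6, 0.03, 0.05, 0.90, 1e-4)[1]
v_neg = gjr_garch_variance([-0.01, 0.0], 1e-6, 0.03, 0.05, 0.90, 1e-4)[1]
```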

Relevance: 20.00%

Abstract:

The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX and ESX index options markets. Continuing the line of research on Implied Volatility Functions, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity, and is tested out-of-sample. Considering the dynamics, the results support the hypotheses put forward in this study: the smile increases in magnitude when maturity and ATM volatility decrease, and implied ATM volatility is negatively correlated with changes in the underlying asset and positively correlated with changes in time to maturity. Further, the Standardized Log-Moneyness model improves pricing accuracy compared with previous Implied Volatility Function models, although the parameters of the models must be re-estimated continuously for the models to fully capture the changing dynamics of the volatility smiles.
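Replacing the single Black & Scholes volatility parameter with a moneyness-by-maturity matrix means pricing requires a lookup-and-interpolation step. A bilinear-interpolation sketch over an invented grid (not the fitted DAX/ESX surface):

```python
def smile_lookup(vols, m_grid, t_grid, m, t):
    """Bilinear interpolation in a (moneyness x maturity) volatility matrix."""
    def bracket(grid, x):
        for i in range(len(grid) - 1):
            if grid[i] <= x <= grid[i + 1]:
                return i
        raise ValueError("point outside grid")
    i, j = bracket(m_grid, m), bracket(t_grid, t)
    wm = (m - m_grid[i]) / (m_grid[i + 1] - m_grid[i])
    wt = (t - t_grid[j]) / (t_grid[j + 1] - t_grid[j])
    return ((1 - wm) * (1 - wt) * vols[i][j] + (1 - wm) * wt * vols[i][j + 1]
            + wm * (1 - wt) * vols[i + 1][j] + wm * wt * vols[i + 1][j + 1])

m_grid = [-0.1, 0.0, 0.1]      # standardized log-moneyness (invented)
t_grid = [0.25, 0.5]           # years to maturity (invented)
vols = [[0.25, 0.23],          # higher vols away from the money,
        [0.20, 0.19],          # smile flattening as maturity grows
        [0.22, 0.21]]
sigma = smile_lookup(vols, m_grid, t_grid, 0.05, 0.25)
```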

Relevance: 20.00%

Abstract:

This study examines intraday and weekend volatility on the German DAX. Intraday volatility is partitioned into smaller intervals and compared to a whole day's volatility. The estimated intraday variance is U-shaped, and the weekend variance is estimated at 19% of that of a normal trading day. The patterns in intraday and weekend volatility are used to develop an extension to the Black and Scholes formula that forms a new time basis. Calendar or trading days are commonly used for measuring time in option pricing; the Continuous Time using Discrete Approximations (CTDA) model developed in this study instead measures time in smaller intervals, approaching continuous time, and accounts for the lapse of time during trading only. Arbitrage pricing suggests that the option price equals the expected cost of hedging volatility during the option's remaining life. In this model, time lapses as volatility occurs on an intraday basis, and the measure of time is modified in CTDA to correct for non-constant volatility and to account for the patterns in volatility.
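The new time basis can be illustrated with a toy version: calendar time to expiry is converted into "volatility time" by weighting each period by its share of a normal trading day's variance. The per-weekend weight below uses the study's 19% estimate; treating the whole weekend as a single 0.19-day block is our simplifying assumption:

```python
def volatility_time(trading_days, weekends, weekend_weight=0.19):
    """Time to expiry in equivalent trading days: each weekend contributes
    only `weekend_weight` of a full trading day's variance."""
    return trading_days + weekend_weight * weekends

# One calendar week to expiry: 5 trading days and 1 intervening weekend.
t = volatility_time(5, 1)
```

A full CTDA implementation would further subdivide each trading day with U-shaped intraday weights, following the estimated intraday pattern.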

Relevance: 20.00%

Abstract:

The objective of this paper is to suggest a method that accounts for the impact of volatility smile dynamics when performing scenario analysis for a portfolio of vanilla options. As the volatility smile is documented to change at least with the level of implied at-the-money volatility, a suitable model of this dependence is included in the calculation of the simulated market scenarios. By constructing simple portfolios of index options and comparing the ex ante risk exposure measured with different pricing methods to realized market values ex post, the improvement from incorporating the model is monitored. The examples analyzed in the study generate results that statistically support that the most accurate scenarios are those calculated using the model that accounts for the dynamics of the smile. Thus, we show that the differences emanating from the volatility smile are apparent and should be accounted for, and that the methodology presented herein is one suitable way of doing so.

Relevance: 20.00%

Abstract:

The objective of this paper is to investigate pricing accuracy under stochastic volatility, where the volatility follows a square-root process. The theoretical prices are compared with market price data from the German DAX index options market using two different techniques of parameter estimation: the method of moments and implicit estimation by inversion. Standard Black & Scholes pricing is used as a benchmark. The results indicate that the stochastic volatility model with parameters estimated by inversion from the prices available on the preceding day is the most accurate pricing method of the three in this study and can be considered satisfactory. However, as the same model with parameters estimated using a rolling window (the method of moments) proved inferior to the benchmark, the importance of stable and correct parameter estimation is evident.
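The square-root variance dynamics referred to above take the form dv = kappa*(theta - v) dt + sigma*sqrt(v) dW. A minimal Euler simulation sketch; the discretization scheme and parameter values are our assumptions, not the paper's estimates:

```python
import math
import random

def simulate_sqrt_variance(v0, kappa, theta, sigma, dt, n, seed=0):
    """Euler path of the square-root process; flooring v at zero inside
    the square root (full truncation) keeps each step well defined."""
    rng = random.Random(seed)
    v = [v0]
    for _ in range(n):
        vp = max(v[-1], 0.0)
        shock = rng.gauss(0.0, 1.0) * math.sqrt(dt)
        v.append(v[-1] + kappa * (theta - vp) * dt + sigma * math.sqrt(vp) * shock)
    return v

# One year of daily steps, illustrative parameters.
path = simulate_sqrt_variance(v0=0.04, kappa=2.0, theta=0.04, sigma=0.3,
                              dt=1 / 252, n=252)
```

Averaging option payoffs over many such paths gives the theoretical prices that the paper compares against market quotes.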