976 results for Volatility models


Relevance:

30.00%

Publisher:

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk-management activities, all of which require an accurate volatility estimate. The task has become challenging since the 1987 stock market crash, as implied volatilities recovered from stock index options exhibit two patterns: the volatility smirk (skew) and the volatility term structure, which together form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, different models are estimated by nonlinear optimization to produce rich, smooth IVSs. The constant-volatility model fails to explain the variation in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS: on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distribution, and it increases monotonically when moving from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at upper quantiles.
Additionally, the asymmetric relationship is more pronounced with the smirk (skew)-adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all information required for daily VaR forecasts for a portfolio on the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. Three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
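The factor decomposition described in the first and fourth essays rests on principal component analysis of the IV panel. As a rough illustration (the one-factor synthetic panel and all numbers below are hypothetical, not the thesis's data), the share of variance captured by the leading "level" factor can be computed with a stdlib-only power iteration:

```python
import math
import random

def covariance(X):
    """Sample covariance matrix of data X (rows are observations)."""
    n, p = len(X), len(X[0])
    mu = [sum(row[j] for row in X) / n for j in range(p)]
    return [[sum((row[i] - mu[i]) * (row[j] - mu[j]) for row in X) / (n - 1)
             for j in range(p)] for i in range(p)]

def leading_eigenvalue(C, iters=500):
    """Largest eigenvalue of a symmetric PSD matrix via power iteration."""
    p = len(C)
    v = [1.0] * p
    lam = 0.0
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(p)) for i in range(p)]
        lam = math.sqrt(sum(x * x for x in w))  # ||Cv|| -> lambda_1 for unit v
        v = [x / lam for x in w]
    return lam

# Synthetic IV panel: 6 moneyness points driven by one common "level" factor.
random.seed(0)
panel = []
for _ in range(300):
    level = random.gauss(0.0, 0.05)
    panel.append([0.20 + level + random.gauss(0.0, 0.01) for _ in range(6)])

C = covariance(panel)
share = leading_eigenvalue(C) / sum(C[i][i] for i in range(6))
print(f"variance explained by the first (level) factor: {share:.0%}")
```

Because the synthetic panel has a single dominant common factor, the first principal component absorbs most of the cross-sectional variance, mirroring the dominant role of the level factor reported above.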

Relevance:

30.00%

Publisher:

Abstract:

This study examined the effects of different time units, or option-pricing time models, on the Greeks of options and on the trading results of delta hedging strategies. The three time units were calendar time, trading time, and continuous time using discrete approximation (CTDA) time. The CTDA time model is a pricing model that, among other things, accounts for intraday and weekend patterns in volatility. For the CTDA time model, some additional theta measures believed to be usable in trading were developed. The study appears to verify that the Greeks differ across time units, and that these differences influence the delta hedging of options and portfolios. Although it is difficult to say which time model is most usable, since much depends on the trader's view of the passing of time, market conditions, and the portfolio, the CTDA time model can be viewed as an attractive alternative.
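The sensitivity of the Greeks to the time unit can be illustrated with the standard Black-Scholes formulas: the same option gets a slightly different time to expiry, and hence different Greeks, depending on whether its remaining life is measured in calendar or trading days. The strike, rate, volatility, and day counts below are illustrative assumptions, not the study's data:

```python
from math import log, sqrt, exp, erf, pi

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_pdf(x):
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def call_greeks(S, K, r, sigma, T):
    """Black-Scholes delta and (per-year) theta of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    delta = norm_cdf(d1)
    theta = (-S * norm_pdf(d1) * sigma / (2.0 * sqrt(T))
             - r * K * exp(-r * T) * norm_cdf(d2))
    return delta, theta

# Same 30-calendar-day option, with time to expiry measured in two units:
T_calendar = 30.0 / 365.0  # calendar time
T_trading = 21.0 / 252.0   # ~21 trading days in 30 calendar days
d_cal, th_cal = call_greeks(100.0, 100.0, 0.02, 0.25, T_calendar)
d_trd, th_trd = call_greeks(100.0, 100.0, 0.02, 0.25, T_trading)
```

Even for this at-the-money example the two conventions yield different delta and theta values, which is the kind of divergence the study traces through to delta hedging results.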

Relevance:

30.00%

Publisher:

Abstract:

This study evaluates three different time units in option pricing: trading time, calendar time, and continuous time using discrete approximations (CTDA). The CTDA time model partitions the trading day into 30-minute intervals, where each interval is given a weight corresponding to the historical volatility in that interval. Furthermore, the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day in the CTDA model. The three models are tested on market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with previous studies but contrary to expectations under no-arbitrage option pricing, under which the option premium should reflect the cost of hedging the expected volatility during the option's remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
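The CTDA weighting step described above can be sketched in a few lines: each 30-minute interval gets a weight proportional to its share of historical variance, with non-trading (overnight/weekend) variance folded into the first interval. The U-shaped intraday variance profile below is a made-up illustration, not the study's estimates:

```python
def ctda_weights(intraday_vars, nontrading_var):
    """Weight each 30-minute interval by its share of historical variance;
    overnight/weekend variance is folded into the day's first interval."""
    v = list(intraday_vars)
    v[0] += nontrading_var
    total = sum(v)
    return [x / total for x in v]

# Hypothetical U-shaped variance profile over 13 half-hour intervals
profile = [4.0, 2.5, 1.8, 1.4, 1.2, 1.0, 1.0, 1.0, 1.2, 1.5, 1.9, 2.6, 3.5]
w = ctda_weights(profile, nontrading_var=6.0)
```

Under these weights, "model time" passes fastest at the open (which also carries the non-trading variance) and slowest around midday, which is how the CTDA model encodes intraday and weekend volatility patterns.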

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns with asymmetric GARCH specifications following Glosten et al. (1993), using daily return data from January 2, 1987 to December 30, 1998. We find that the world shock enters the domestic models significantly and that its impact has increased over time; the same holds for the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is surprisingly insignificant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns usually does not include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
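The asymmetric specification of Glosten et al. (1993), commonly called GJR-GARCH, adds an extra loading on squared negative returns to the GARCH(1,1) recursion. A minimal sketch follows; the parameter values are illustrative, not the paper's estimates:

```python
def gjr_garch_variance(returns, omega, alpha, gamma, beta, h0):
    """Conditional variance path of GJR-GARCH(1,1):
    h_t = omega + (alpha + gamma * 1{r_{t-1} < 0}) * r_{t-1}^2 + beta * h_{t-1}."""
    h = [h0]
    for r in returns[:-1]:
        neg = 1.0 if r < 0.0 else 0.0
        h.append(omega + (alpha + gamma * neg) * r * r + beta * h[-1])
    return h

# Illustrative parameters (hypothetical, not estimated from the Finnish data)
params = dict(omega=1e-6, alpha=0.05, gamma=0.10, beta=0.90, h0=1e-4)
h_after_drop = gjr_garch_variance([-0.02, 0.0], **params)[-1]
h_after_rally = gjr_garch_variance([0.02, 0.0], **params)[-1]
```

With gamma > 0, a negative return raises next-period variance more than a positive return of equal size; the paper's finding of an insignificant asymmetry parameter corresponds to gamma being statistically indistinguishable from zero.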

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX and ESX index options markets. Continuing the line of Implied Volatility Function research, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity, and is tested out-of-sample. Considering the dynamics, the results support the hypotheses put forward in this study: the smile increases in magnitude as maturity and ATM volatility decrease, and a change in the underlying asset (time to maturity) is negatively (positively) correlated with implied ATM volatility. Further, the Standardized Log-Moneyness model improves pricing accuracy relative to previous Implied Volatility Function models, although the parameters must be re-estimated continuously for the models to fully capture the changing dynamics of the volatility smiles.

Relevance:

30.00%

Publisher:

Abstract:

The performance of prediction models is often based on "abstract metrics" that estimate the model's ability to limit residual errors between the observed and predicted values. However, meaningful evaluation and selection of prediction models for end-user domains requires holistic and application-sensitive performance measures. Inspired by energy consumption prediction models used in the emerging "big data" domain of Smart Power Grids, we propose a suite of performance measures to rationally compare models along the dimensions of scale independence, reliability, volatility and cost. We include both application independent and dependent measures, the latter parameterized to allow customization by domain experts to fit their scenario. While our measures are generalizable to other domains, we offer an empirical analysis using real energy use data for three Smart Grid applications: planning, customer education and demand response, which are relevant for energy sustainability. Our results underscore the value of the proposed measures to offer a deeper insight into models' behavior and their impact on real applications, which benefits both data mining researchers and practitioners.
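Two of the dimensions named above, scale independence and (error) volatility, can be illustrated with simple generic measures such as the coefficient of variation of the RMSE and the standard deviation of residuals. These are stand-ins for the idea, not necessarily the exact measures in the paper's suite, and the data are made up:

```python
from math import sqrt

def cv_rmse(obs, pred):
    """Scale-independent error: RMSE normalized by the mean observation."""
    n = len(obs)
    rmse = sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / n)
    return rmse / (sum(obs) / n)

def error_volatility(obs, pred):
    """Volatility of the residuals: standard deviation of the error series."""
    errs = [o - p for o, p in zip(obs, pred)]
    mu = sum(errs) / len(errs)
    return sqrt(sum((e - mu) ** 2 for e in errs) / len(errs))

# Hypothetical hourly energy-use observations vs. model predictions (kWh)
obs = [10.0, 12.0, 11.0, 13.0, 12.5]
pred = [10.5, 11.5, 11.2, 12.4, 12.9]
```

Normalizing by the observation scale lets models be compared across consumers of very different sizes, while the residual volatility flags models whose errors are erratic even when their average error is small.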

Relevance:

30.00%

Publisher:

Abstract:

Published as an article in: Investigaciones Economicas, 2005, vol. 29, issue 3, pages 483-523.

Relevance:

30.00%

Publisher:

Abstract:

Revisions of US macroeconomic data are not white noise. They are persistent, correlated with real-time data, and highly variable (around 80% of the volatility observed in US real-time data). Their business cycle effects are examined in an estimated DSGE model extended with both real-time and final data. After implementing a Bayesian estimation approach, the roles of both habit formation and price indexation fall significantly in the extended model. The results show that revision shocks to both output and inflation are expansionary, because they occur when real-time published data are too low and the Fed reacts by cutting interest rates. Consumption revisions, by contrast, are countercyclical, as consumption habits mirror the observed reduction in real-time consumption. In turn, revisions of the three variables account for 9.3% of output in its long-run variance decomposition.

Relevance:

30.00%

Publisher:

Abstract:

This paper models the mean and volatility spillovers of prices within the integrated Iberian electricity market and between the interconnected Spanish and French electricity markets. Using the constant (CCC) and dynamic (DCC) conditional correlation bivariate models with three different specifications of the univariate variance processes, we study the extent to which increasing interconnection and regulatory harmonization have favoured price convergence. The data consist of daily prices calculated as the arithmetic mean of the hourly prices over the span from July 1, 2007 to February 29, 2012. A DCC model in which the univariate variances follow a VARMA(1,1) process fits the data best for the integrated MIBEL, whereas a CCC model with GARCH(1,1) univariate variances is selected for the Spanish and French price series. Results show significant mean and volatility spillovers in the MIBEL, indicating strong interdependence between the two markets, while there is weaker evidence of integration between the Spanish and French markets. We provide new evidence that the EU target of achieving a single electricity market largely depends on increasing trade between countries and on homogeneous rules of market functioning.
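The CCC model selected for the Spanish-French pair combines univariate GARCH(1,1) conditional variances with a single, constant correlation, so the conditional covariance is rho times the product of the conditional standard deviations. A minimal sketch, with hypothetical daily returns and parameters rather than the paper's estimates:

```python
from math import sqrt

def garch11(returns, omega, alpha, beta, h0):
    """GARCH(1,1) conditional variance path."""
    h = [h0]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

def ccc_covariance(returns_a, returns_b, rho, **g):
    """CCC model: H12_t = rho * sqrt(h1_t * h2_t), rho constant over time."""
    h1 = garch11(returns_a, **g)
    h2 = garch11(returns_b, **g)
    return [rho * sqrt(a * b) for a, b in zip(h1, h2)]

# Hypothetical daily price-return series for the two markets
spain = [0.010, -0.020, 0.015, -0.010]
france = [0.008, -0.015, 0.010, -0.012]
cov12 = ccc_covariance(spain, france, rho=0.6,
                       omega=1e-6, alpha=0.08, beta=0.90, h0=1e-4)
```

The DCC variant used for the MIBEL relaxes exactly this restriction by letting the correlation itself follow a dynamic recursion.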

Relevance:

30.00%

Publisher:

Abstract:

The prediction of time-changing variances is an important task in the modeling of financial data. Standard econometric models are often limited as they assume rigid functional relationships for the evolution of the variance. Moreover, functional parameters are usually learned by maximum likelihood, which can lead to overfitting. To address these problems we introduce GP-Vol, a novel non-parametric model for time-changing variances based on Gaussian processes. This new model can capture highly flexible functional relationships for the variances. Furthermore, we introduce a new online algorithm for fast inference in GP-Vol. This method is much faster than current offline inference procedures and avoids overfitting by following a fully Bayesian approach. Experiments with financial data show that GP-Vol performs significantly better than current standard alternatives.
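GP-Vol places a Gaussian-process prior on the variance transition function and fits it with a bespoke online algorithm. As a much-simplified illustration of the Gaussian-process regression at its core (not GP-Vol itself), here is a stdlib-only sketch with hypothetical one-dimensional data:

```python
from math import exp

def rbf(x, y, ell=1.0):
    """Squared-exponential (RBF) kernel."""
    return exp(-0.5 * ((x - y) / ell) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4, ell=1.0):
    """GP posterior mean at x_star: k_*^T (K + noise*I)^{-1} y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ell) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xs[i], ell) * alpha[i] for i in range(n))

# Hypothetical inputs and targets (e.g. a toy log-variance series)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.10, 0.18, 0.15, 0.22]
pred = gp_predict(xs, ys, 1.0)
```

The key property GP-Vol exploits is visible even here: the functional relationship between inputs and outputs is learned non-parametrically from the kernel and the data, with no rigid parametric form imposed.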

Relevance:

30.00%

Publisher:

Abstract:

We first examine the model of Hobson and Rogers for the volatility of a financial asset such as a stock or share. The main feature of this model is the specification of volatility in terms of past price returns. The volatility process and the underlying price process share the same source of randomness, so the model is said to be complete. Complete models are advantageous as they allow a unique, preference-independent price for options on the underlying price process. One of the main objectives of the model is to reproduce the 'smiles' and 'skews' seen in market implied volatilities, and it produces the desired effect. In the first main piece of work, we numerically calibrate the model of Hobson and Rogers for comparison with the existing literature. We also develop parameter estimation methods based on the calibration of a GARCH model. We examine alternative specifications of the volatility and show an improved fit to market data based on these specifications. We also show how to process market data in order to take account of inter-day movements in the volatility surface. In the second piece of work, we extend the Hobson and Rogers model in a way that better reflects market structure, taking into account both first- and second-order effects. We derive and numerically solve the PDE that describes the price of options under this extended model, and show that the extension allows a better fit to the market data. Finally, we analyse the parameters of the extended model in order to understand intuitively their role in the volatility surface.
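The "past price returns" that drive volatility in the Hobson-Rogers model enter through an offset: an exponentially weighted sum of differences between the current log-price and its past values. A discrete-time sketch of the first-order offset follows; the decay parameter lam and the price series are hypothetical, and the model would then feed this offset into a volatility function such as eta*sqrt(1 + eps*O^2):

```python
from math import exp, log

def hr_offset(prices, lam=1.0):
    """First-order Hobson-Rogers offset (discrete approximation):
    O_t = sum_u w_u * (Z_t - Z_{t-u}),  w_u proportional to exp(-lam*u),
    where Z is the log-price."""
    z = [log(p) for p in prices]
    t = len(z) - 1
    weights = [exp(-lam * u) for u in range(1, t + 1)]
    total = sum(weights)
    return sum(w * (z[t] - z[t - u])
               for u, w in zip(range(1, t + 1), weights)) / total

# Hypothetical daily closing prices
rising = hr_offset([100.0, 101.0, 102.0, 103.0])
falling = hr_offset([103.0, 102.0, 101.0, 100.0])
```

Because the offset is positive after price rises and negative after falls, a volatility function that increases with the offset's magnitude can reproduce the smiles and skews mentioned above while keeping the model complete (the offset uses no extra source of randomness).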

Relevance:

30.00%

Publisher:

Abstract:

We develop general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit recent nonparametric asymptotic distributional results, are both easy to implement and highly accurate in empirically realistic situations. We also illustrate that properly accounting for the measurement errors in the volatility forecast evaluations reported in the existing literature can result in markedly higher estimates of the true degree of return volatility predictability.

Relevance:

30.00%

Publisher:

Abstract:

This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique is the generalization to a nonlinear process of Sims-style impulse response analysis for linear models. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.

Relevance:

30.00%

Publisher:

Abstract:

Empirical modeling of high-frequency currency market data reveals substantial evidence for nonnormality, stochastic volatility, and other nonlinearities. This paper investigates whether an equilibrium monetary model can account for nonlinearities in weekly data. The model incorporates time-nonseparable preferences and a transaction cost technology. Simulated sample paths are generated using Marcet's parameterized expectations procedure. The paper also develops a new method for estimation of structural economic models. The method forces the model to match (under a GMM criterion) the score function of a nonparametric estimate of the conditional density of observed data. The estimation uses weekly U.S.-German currency market data, 1975-90.

Relevance:

30.00%

Publisher:

Abstract:

We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.