964 results for idiosyncratic volatility


Relevance: 20.00%

Abstract:

Understanding and predicting the consequences of warming for complex ecosystems, and indeed for individual species, remains a major ecological challenge. Here, we investigated the effect of increased seawater temperatures on the metabolic and consumption rates of five distinct marine species. The experimental species reflected different trophic positions within a typical benthic East Atlantic food web and included a herbivorous gastropod, a scavenging decapod, a predatory echinoderm, a second decapod and a benthic-feeding fish. We examined the metabolism-body mass and consumption-body mass scaling for each species, and assessed changes in their consumption efficiencies. Our results indicate that body mass and temperature effects on metabolism were inconsistent across species and that some species were unable to meet metabolic demand at higher temperatures, highlighting the vulnerability of individual species to warming. While body size explains a large proportion of the variation in species' physiological responses to warming, it is clear that idiosyncratic species responses, irrespective of body size, complicate predictions of population- and ecosystem-level responses to future scenarios of climate change. © 2012 The Royal Society.
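
The metabolism-body mass scaling referred to above is conventionally estimated with the allometric model of the metabolic theory of ecology, B = b0 * M^b * exp(-E/(k*T)). A minimal sketch of that fit follows; the data, variable names and parameter values are illustrative assumptions, not the study's dataset.

    import numpy as np

    K_BOLTZ = 8.617e-5  # Boltzmann constant, eV/K

    def fit_metabolic_scaling(mass_g, temp_K, metabolic_rate):
        """OLS fit of log B = log b0 + b*log M - E/(k*T).

        Returns [log b0, b, E]: normalization constant, mass-scaling
        exponent, and apparent activation energy."""
        X = np.column_stack([
            np.ones_like(mass_g),
            np.log(mass_g),
            -1.0 / (K_BOLTZ * temp_K),
        ])
        coef, *_ = np.linalg.lstsq(X, np.log(metabolic_rate), rcond=None)
        return coef

    # Synthetic data generated with b = 0.75 and E = 0.65 eV
    rng = np.random.default_rng(0)
    M = rng.uniform(0.5, 200.0, 300)       # body mass (g)
    T = rng.uniform(283.0, 295.0, 300)     # seawater temperature (K)
    B = np.exp(-2.0) * M**0.75 * np.exp(-0.65 / (K_BOLTZ * T))
    B *= np.exp(0.2 * rng.standard_normal(300))   # lognormal noise
    print(fit_metabolic_scaling(M, T, B))  # recovers ~[-2.0, 0.75, 0.65]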

Relevance: 20.00%

Abstract:

The predominant fear in capital markets is that of a price spike. Commodity markets differ in that there is fear of both upward and downward jumps, and as a result implied volatility curves display distinct shapes compared with equity markets. A novel functional data analysis (FDA) approach provides a framework for producing and interpreting functional objects that characterise the underlying dynamics of oil futures options. We use the FDA framework to examine implied volatility, jump risk, and pricing dynamics within crude oil markets. Examining a WTI crude oil sample for the 2007–2013 period, which includes the global financial crisis and the Arab Spring, we find strong evidence of converse jump dynamics during periods of demand- and supply-side weakness. This is used as the basis for an FDA-derived, Merton (1976) jump-diffusion optimised delta-hedging strategy, which exhibits superior portfolio management results over traditional methods.
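
A delta-hedging strategy built on the Merton (1976) jump-diffusion model rests on the standard representation of the call price as a Poisson-weighted sum of Black-Scholes prices. The sketch below implements that textbook formula only; the FDA-based calibration of the jump parameters described in the abstract is not reproduced, and all parameter values are illustrative.

    import numpy as np
    from scipy.stats import norm

    def bs_call_and_delta(S, K, r, sigma, T):
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2), norm.cdf(d1)

    def merton_call_and_delta(S, K, r, sigma, T, lam, kappa, delta, n_max=60):
        """Merton (1976) call price and delta as a Poisson-weighted sum of
        Black-Scholes values: lam = jump intensity, kappa = mean relative
        jump size, delta = jump-size volatility."""
        lam_p = lam * (1.0 + kappa)          # intensity under the pricing measure
        price = dlt = 0.0
        w = np.exp(-lam_p * T)               # Poisson weight for n = 0
        for n in range(n_max):
            if n > 0:
                w *= lam_p * T / n           # update the Poisson weight
            sigma_n = np.sqrt(sigma**2 + n * delta**2 / T)
            r_n = r - lam * kappa + n * np.log1p(kappa) / T
            p_n, d_n = bs_call_and_delta(S, K, r_n, sigma_n, T)
            price += w * p_n
            dlt += w * d_n
        return price, dlt

    print(merton_call_and_delta(S=100.0, K=100.0, r=0.02, sigma=0.30,
                                T=0.5, lam=0.5, kappa=-0.05, delta=0.20))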

Relevance: 20.00%

Abstract:

This paper analyses the forecastability of the monthly volatility of stock returns. The forecasts obtained from GARCH and AGARCH models with Normal and Student's t errors are evaluated against proxies for the unobserved volatility obtained by sampling at different frequencies. It is found that aggregating daily multi-step-ahead GARCH-type forecasts provides rather accurate predictions of monthly volatility.
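
The aggregation step can be sketched for the plain GARCH(1,1) case: the h-step-ahead daily variance forecast reverts geometrically to the unconditional variance, and the monthly forecast is the sum of the daily ones. This is a minimal illustration under assumed parameter values, not the paper's estimated models.

    import numpy as np

    def monthly_variance_forecast(omega, alpha, beta, sigma2_next, horizon=22):
        """Sum of daily GARCH(1,1) variance forecasts over one month.

        For h >= 1: E[sigma2_{t+h}] = sigma2_bar
                    + (alpha + beta)**(h-1) * (sigma2_next - sigma2_bar)."""
        persistence = alpha + beta
        sigma2_bar = omega / (1.0 - persistence)   # unconditional daily variance
        h = np.arange(horizon)
        daily = sigma2_bar + persistence**h * (sigma2_next - sigma2_bar)
        return daily.sum()

    # Illustrative daily parameters; prints the implied monthly volatility
    print(np.sqrt(monthly_variance_forecast(1e-6, 0.08, 0.90, 2e-4)))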

Relevance: 20.00%

Abstract:

This paper provides an empirical study assessing the forecasting performance of a wide range of models for predicting volatility and VaR in the Madrid Stock Exchange. The models' performance was measured using different loss functions and criteria. The results show that FIAPARCH processes capture and forecast the dynamics of IBEX-35 return volatility most accurately. It is also observed that assuming a heavy-tailed distribution does not improve the models' ability to predict volatility. However, when the aim is forecasting VaR, we find evidence that the Student's t FIAPARCH outperforms the models it nests, increasingly so the lower the target quantile.
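
Given any conditional volatility forecast (FIAPARCH in the paper), the VaR step with Student's t errors is a scaled quantile of the standardized t distribution. A generic sketch, not the paper's estimated model:

    import numpy as np
    from scipy.stats import t as student_t

    def var_student_t(mu, sigma, nu, level=0.01):
        """One-day VaR with standardized Student's t innovations.

        The raw t quantile is rescaled by sqrt((nu - 2) / nu) so the
        innovation has unit variance; VaR is reported as a positive loss."""
        q = student_t.ppf(level, df=nu) * np.sqrt((nu - 2.0) / nu)
        return -(mu + sigma * q)

    # Illustrative inputs: zero mean, 1.5% daily volatility, nu = 6
    print(var_student_t(0.0, 0.015, 6.0, level=0.01))   # ~0.038, a 3.8% VaR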

Relevance: 20.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics.

Relevance: 20.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics.

Relevance: 20.00%

Abstract:

Margin policy is used by regulators to inhibit excessive volatility and stabilize the stock market in the long run. The effect of this policy on the stock market has been widely tested empirically. However, most prior studies are limited in that they investigate the margin requirement for the overall stock market rather than for individual stocks, and the time periods examined are confined to the pre-1974 period, as no change in the margin requirement occurred after 1974 in the U.S. This thesis addresses these limitations by directly examining the effect of margin requirements on the return, volume, and volatility of individual companies, using more recent data from the Canadian stock market. Using variance ratio tests and an event study with a conditional volatility (EGARCH) model, we find no convincing evidence that changes in margin requirements affect subsequent stock return volatility. We find similar results for returns and trading volume. These empirical findings lead us to conclude that the use of margin policy by regulators fails to achieve the goal of inhibiting speculative activity and stabilizing volatility.
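
The variance ratio methodology mentioned above compares the variance of q-period returns with q times the variance of one-period returns. A minimal sketch (without the Lo-MacKinlay heteroskedasticity-robust inference a full test would add):

    import numpy as np

    def variance_ratio(returns, q):
        """VR(q) = Var(q-period returns) / (q * Var(1-period returns)).

        Uses overlapping q-period sums of demeaned returns; values near 1
        are consistent with a random walk."""
        r = np.asarray(returns) - np.mean(returns)
        rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-sums
        return np.var(rq, ddof=1) / (q * np.var(r, ddof=1))

    rng = np.random.default_rng(1)
    iid = 0.01 * rng.standard_normal(5000)
    print(variance_ratio(iid, 5))   # close to 1 for an uncorrelated series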

Relevance: 20.00%

Abstract:

For predicting future volatility, empirical studies report mixed results on two issues: (1) whether model-free implied volatility has more information content than Black-Scholes model-based implied volatility; (2) whether implied volatility outperforms historical volatilities. In this thesis, we address these two issues using Canadian financial data. First, we examine the information content and forecasting power of VIXC, a model-free implied volatility, and MVX, a model-based implied volatility. The GARCH in-sample test indicates that VIXC subsumes all information that is reflected in MVX. The out-of-sample examination indicates that VIXC is superior to MVX for predicting realized volatility over the next 1, 5, 10, and 22 trading days. Second, we investigate the predictive power of VIXC relative to alternative volatility forecasts derived from historical index prices. We find that for horizons shorter than 10 trading days, VIXC provides more accurate forecasts. For longer horizons, however, the historical volatilities, particularly the random walk, provide better forecasts. We conclude that VIXC cannot incorporate all information contained in historical index prices for predicting future volatility.
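
Comparisons of this kind are commonly run as Mincer-Zarnowitz and forecast-encompassing regressions; a generic sketch follows, with placeholder series rather than the thesis data:

    import numpy as np

    def mz_regression(realized, forecast):
        """Mincer-Zarnowitz: realized_t = a + b*forecast_t + e_t.
        Unbiasedness corresponds to (a, b) = (0, 1); R^2 measures
        information content."""
        X = np.column_stack([np.ones_like(forecast), forecast])
        coef, res_ss, *_ = np.linalg.lstsq(X, realized, rcond=None)
        r2 = 1.0 - res_ss[0] / np.sum((realized - realized.mean())**2)
        return coef, r2

    def encompassing(realized, f1, f2):
        """realized_t = a + b1*f1_t + b2*f2_t + e_t; b2 ~ 0 means f1
        (e.g. VIXC) subsumes the information in f2 (e.g. MVX)."""
        X = np.column_stack([np.ones_like(f1), f1, f2])
        coef, *_ = np.linalg.lstsq(X, realized, rcond=None)
        return coef

    # Placeholder series standing in for implied and realized volatility
    rng = np.random.default_rng(2)
    f1 = rng.uniform(0.1, 0.4, 200)
    f2 = f1 + 0.03 * rng.standard_normal(200)     # noisier competing forecast
    rv = 0.02 + 0.9 * f1 + 0.02 * rng.standard_normal(200)
    print(mz_regression(rv, f1))
    print(encompassing(rv, f1, f2))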

Relevance: 20.00%

Abstract:

We assess the predictive ability of three VPIN metrics on the basis of two highly volatile market events in China, and examine the association between VPIN and toxic-flow-induced volatility through conditional probability analysis and multiple regression. We examine the dynamic relationship between VPIN and high-frequency liquidity using Vector Auto-Regression models, Granger causality tests, and impulse response analysis. Our results suggest that Bulk Volume VPIN has the best risk-warning effect among the major VPIN metrics. VPIN is positively associated with market volatility induced by toxic information flow. Most importantly, we document a positive feedback effect between VPIN and high-frequency liquidity, whereby a negative liquidity shock pushes VPIN up, which in turn leads to a further liquidity drain. Our study provides empirical evidence of an intrinsic game between informed traders and market makers facing toxic information in the high-frequency trading world.
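
A sketch of the Bulk Volume VPIN construction in the spirit of Easley, Lopez de Prado and O'Hara: volume is accumulated into equal-volume buckets, the buy share of each bar's volume is the normal CDF of its standardized price change, and VPIN is a rolling mean of the absolute order-flow imbalance. Bucket sizes and the standardization below are simplifying assumptions.

    import numpy as np
    from scipy.stats import norm

    def bulk_volume_vpin(price_changes, volumes, bucket_volume, window=50):
        """Bulk Volume VPIN: within each volume bucket the buy share of a
        bar's volume is Phi(dP / sigma_dP); VPIN is the rolling mean of
        |V_buy - V_sell| / bucket volume over `window` buckets."""
        dp = np.asarray(price_changes, dtype=float)
        buy_frac = norm.cdf(dp / dp.std())
        imbalances, v_buy, v_sell, filled = [], 0.0, 0.0, 0.0
        for frac, v in zip(buy_frac, volumes):
            v_buy += v * frac
            v_sell += v * (1.0 - frac)
            filled += v
            if filled >= bucket_volume:            # close the bucket
                imbalances.append(abs(v_buy - v_sell) / filled)
                v_buy = v_sell = filled = 0.0
        imb = np.array(imbalances)
        return np.convolve(imb, np.ones(window) / window, mode="valid")

    # Placeholder bar data: price changes and volumes
    rng = np.random.default_rng(7)
    dp = 0.01 * rng.standard_normal(5000)
    vol = rng.uniform(50.0, 150.0, 5000)
    print(bulk_volume_vpin(dp, vol, bucket_volume=2000.0)[:5])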

Relevance: 20.00%

Abstract:

Latent variable models in finance originate both from asset pricing theory and from time series analysis. These two strands of literature appeal to two different concepts of latent structure, both of which are useful for reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns given a small number of state variables. In this paper, we use the concept of the Stochastic Discount Factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of state variables which summarize their dynamics. In beta pricing models, it is often said that only factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, conditional independence between the contemporaneous returns of a large number of assets given a small number of factors, as in standard factor analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specification of dynamic asset pricing models, covering the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role in the validity of standard CAPM-like stock pricing and preference-free option pricing.
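
As a compact illustration of the unifying principle, in standard SDF notation (a sketch, not the paper's exact specification), the pricing equation and the conditional beta pricing relation it implies can be written as

    E\bigl[m_{t+1} R_{i,t+1} \mid J_t\bigr] = 1,
    \qquad
    m_{t+1} = \theta_0(Y_t) + \theta(Y_t)^{\prime} F_{t+1},

    E\bigl[R_{i,t+1} \mid J_t\bigr] - r_{f,t} = \beta_{i,t}^{\prime} \lambda_t,
    \qquad
    \beta_{i,t} = \mathrm{Var}\bigl[F_{t+1} \mid J_t\bigr]^{-1}
                  \mathrm{Cov}\bigl[F_{t+1},\, R_{i,t+1} \mid J_t\bigr],

where J_t is the conditioning information set, Y_t are the state variables driving the SDF coefficients, F_{t+1} are the factors and \lambda_t the factor risk premia; idiosyncratic residuals earn no premium because they are conditionally uncorrelated with m_{t+1}.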

Relevance: 20.00%

Abstract:

In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature in assuming that the variance is a function of a state variable. However, instead of assuming an ad hoc loading function (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated with the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials respectively. The eigenfunction approach has at least six advantages: (i) it is general, since any square-integrable function may be written as a linear combination of the eigenfunctions; (ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of linear principal component analysis; (iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; (iv) more importantly, this generates fat tails for the variance and return processes; (v) in contrast to popular models, the variance of the variance is a flexible function of the variance; (vi) these models are closed under temporal aggregation.
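
The eigenfunction property can be checked numerically in the log-normal (Gaussian AR(1) state) case: for a stationary Gaussian AR(1) with autocorrelation rho, the probabilists' Hermite polynomials satisfy E[He_n(x_{t+1}) | x_t] = rho^n * He_n(x_t), so they are eigenfunctions of the conditional expectation operator with eigenvalues rho^n. A Monte Carlo sketch:

    import numpy as np
    from numpy.polynomial import hermite_e as He

    rho, x_t, n, draws = 0.9, 1.0, 3, 1_000_000
    rng = np.random.default_rng(3)

    # One-step transition of the stationary Gaussian AR(1) state variable
    x_next = rho * x_t + np.sqrt(1.0 - rho**2) * rng.standard_normal(draws)

    coef = np.zeros(n + 1)
    coef[n] = 1.0                                 # selects He_n
    lhs = He.hermeval(x_next, coef).mean()        # E[He_n(x_{t+1}) | x_t = 1]
    rhs = rho**n * He.hermeval(x_t, coef)         # rho^n * He_n(1)
    print(lhs, rhs)                               # agree up to Monte Carlo error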

Relevance: 20.00%

Abstract:

The GARCH and Stochastic Volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: conditional variance given past values of the same series, or conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about the structural level of disaggregation, a well-written volatility model should be specified in such a way that one is always allowed to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. In this paper, we stress a square-root autoregressive stochastic volatility (SR-SARV) model which remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the required robustness properties with respect to various kinds of aggregation. It is shown that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and the conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other. By relaxing these assumptions, thanks to a state-space setting, we obtain aggregation results without renouncing the conditional variance concept (and related leverage effects), as is the case for the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations. Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher-order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we are able to characterize the exact relationships between our SR-SARV models (including higher-order dynamics, leverage effects and in-mean effects), usual GARCH models and continuous-time stochastic volatility models, so that previous results on the aggregation of weak GARCH and on continuous-time GARCH modeling can be recovered in our framework.
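
The ARMA property that SR-SARV retains from GARCH can be illustrated in the simplest case: for a GARCH(1,1), the squared innovations follow an ARMA(1,1) with autoregressive root alpha + beta, so their autocorrelations decay as rho_k = (alpha + beta)^(k-1) * rho_1. A simulation sketch under assumed parameter values (standard GARCH(1,1), not the paper's SR-SARV estimates):

    import numpy as np

    def simulate_garch11(omega, alpha, beta, n, seed=4):
        """Simulate a Gaussian GARCH(1,1) innovation series."""
        rng = np.random.default_rng(seed)
        z = rng.standard_normal(n)
        sigma2 = omega / (1.0 - alpha - beta)   # start at the unconditional variance
        eps = np.empty(n)
        for t in range(n):
            eps[t] = np.sqrt(sigma2) * z[t]
            sigma2 = omega + alpha * eps[t]**2 + beta * sigma2
        return eps

    def acf(x, k):
        x = x - x.mean()
        return np.dot(x[:-k], x[k:]) / np.dot(x, x)

    eps2 = simulate_garch11(1e-6, 0.08, 0.90, 300_000)**2
    rho1 = acf(eps2, 1)
    for k in (2, 3, 5):
        # sample ACF vs. the ARMA(1,1) decay (alpha + beta)^(k-1) * rho_1
        print(k, round(acf(eps2, k), 4), round(0.98**(k - 1) * rho1, 4))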

Relevance: 20.00%

Abstract:

Recent work suggests that the conditional variance of financial returns may exhibit sudden jumps. This paper extends to higher conditional moments, in particular the conditional variance, a non-parametric procedure developed by Delgado and Hidalgo (1996) for detecting discontinuities in otherwise continuous functions of a random variable. Simulation results show that the procedure provides reasonable estimates of the number and location of jumps. The procedure detects several jumps in the conditional variance of daily returns on the S&P 500 index.
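
The flavour of such a procedure can be sketched with one-sided kernel estimates: estimate the conditional second moment just to the left and just to the right of each candidate point and flag large discrepancies. This is a simplified stand-in for the Delgado-Hidalgo (1996) statistic; its exact weighting and critical values are not reproduced here.

    import numpy as np

    def jump_statistic(x, y2, grid, bandwidth):
        """Difference of one-sided Nadaraya-Watson means of y2 (e.g. squared
        returns) on either side of each grid point; a large |right - left|
        suggests a jump in the conditional second moment there."""
        stats = []
        for g in grid:
            w = np.exp(-0.5 * ((x - g) / bandwidth)**2)   # Gaussian kernel
            left, right = w * (x < g), w * (x >= g)
            m_left = np.sum(left * y2) / np.sum(left)
            m_right = np.sum(right * y2) / np.sum(right)
            stats.append(m_right - m_left)
        return np.array(stats)

    # Illustrative data: the conditional variance jumps from 1 to 4 at x = 0
    rng = np.random.default_rng(5)
    x = rng.uniform(-1.0, 1.0, 20_000)
    y = np.where(x < 0.0, 1.0, 2.0) * rng.standard_normal(x.size)
    grid = np.linspace(-0.8, 0.8, 9)
    print(np.round(jump_statistic(x, y**2, grid, 0.05), 2))  # spikes near 0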

Relevance: 20.00%

Abstract:

This note develops general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit the recent asymptotic distributional results of Barndorff-Nielsen and Shephard (2002a), are both easy to implement and highly accurate in empirically realistic situations. On properly accounting for the measurement errors in the volatility forecast evaluations reported by Andersen, Bollerslev, Diebold and Labys (2003), the adjustments result in markedly higher estimates of the true degree of return-volatility predictability.
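
The flavour of the adjustment can be sketched from the Barndorff-Nielsen and Shephard asymptotics: with n intraday returns per day, RV - IV is approximately N(0, (2/3) * sum r_i^4), so a feasible Mincer-Zarnowitz R^2 computed against noisy RV can be scaled up by a reliability ratio. All names and numbers below are illustrative; this is not the note's exact procedure.

    import numpy as np

    def rv_and_error_var(intraday_returns):
        """Daily realized variance and a feasible estimate of its
        measurement-error variance, Var(RV - IV) ~= (2/3) * sum r_i^4."""
        r = np.asarray(intraday_returns)
        return np.sum(r**2), (2.0 / 3.0) * np.sum(r**4)

    def adjusted_r2(r2_feasible, rv_series, err_var_series):
        """Scale a Mincer-Zarnowitz R^2 (forecast vs. noisy RV) toward the
        R^2 vs. latent IV using the reliability ratio Var(IV)/Var(RV)."""
        reliability = 1.0 - np.mean(err_var_series) / np.var(rv_series)
        return r2_feasible / reliability

    # Illustrative: 78 five-minute returns per day, volatility varying by day
    rng = np.random.default_rng(6)
    sig = 0.001 * np.exp(0.25 * rng.standard_normal(500))
    days = [s * rng.standard_normal(78) for s in sig]
    rv, ev = map(np.array, zip(*[rv_and_error_var(d) for d in days]))
    print(adjusted_r2(0.30, rv, ev))   # > 0.30 once noise is accounted for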