113 results for multivariate stochastic volatility
Abstract:
The problem of adjusting the weights (learning) in multilayer feedforward neural networks (NN) is known to be of high importance when applying NN techniques in practical applications. The learning procedure should be performed as fast as possible and in a computationally simple fashion, two requirements that are usually not satisfied in practice by the methods developed so far. Moreover, the presence of random inaccuracies is usually not taken into account. In view of these three issues, the alternative stochastic approximation approach discussed in this paper seems very promising.
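One concrete instance of the stochastic approximation idea described above is a simultaneous-perturbation (SPSA-style) weight update, which estimates a descent direction from just two noisy loss evaluations. A minimal sketch, assuming a toy quadratic loss standing in for a network's training error (the update rule, gain schedules, and all numbers here are illustrative, not taken from the paper):

```python
import random

random.seed(0)

def spsa_step(w, loss, k, a=0.1, c=0.1):
    """One SPSA update: estimate a descent direction for `loss` at `w`
    from two evaluations along a random +/-1 perturbation."""
    delta = [random.choice((-1.0, 1.0)) for _ in w]
    ck = c / (k + 1) ** 0.101            # perturbation size decays with k
    ak = a / (k + 1) ** 0.602            # step size decays with k
    wp = [wi + ck * di for wi, di in zip(w, delta)]
    wm = [wi - ck * di for wi, di in zip(w, delta)]
    g = (loss(wp) - loss(wm)) / (2.0 * ck)
    return [wi - ak * g / di for wi, di in zip(w, delta)]

def noisy_loss(w):
    # Quadratic toy loss plus random measurement inaccuracy.
    return sum(wi * wi for wi in w) + random.gauss(0.0, 0.01)

w = [1.0, -2.0]
for k in range(500):
    w = spsa_step(w, noisy_loss, k)
```

Only two loss evaluations per step are needed regardless of the number of weights, which is what makes the approach computationally simple and tolerant of random inaccuracies in the loss.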
Abstract:
We develop a general model to price VIX futures contracts. The model is adapted to test both the constant elasticity of variance (CEV) and the Cox–Ingersoll–Ross formulations, with and without jumps. Empirical tests on VIX futures prices provide out-of-sample estimates within 2% of the actual futures price for almost all futures maturities. We show that although jumps are present in the data, the models with jumps do not typically outperform the others; in particular, we demonstrate the important benefits of the CEV feature in pricing futures contracts. We conclude by examining errors in the model relative to the VIX characteristics.
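The Cox–Ingersoll–Ross leg of such a model can be illustrated with a Monte Carlo sketch: simulate the square-root variance process and form a VIX-style quote from expected terminal variance. This is only a schematic of the CIR ingredient, not the paper's pricing model; the parameters and the full-truncation Euler scheme are assumptions for illustration:

```python
import math, random

random.seed(42)

def cir_terminal(v0, kappa, theta, sigma, T, n_steps, n_paths):
    """Full-truncation Euler scheme for the CIR process
    dv = kappa*(theta - v) dt + sigma*sqrt(v) dW; returns terminal values."""
    dt = T / n_steps
    out = []
    for _ in range(n_paths):
        v = v0
        for _ in range(n_steps):
            v_pos = max(v, 0.0)  # truncate at zero inside drift and diffusion
            v += (kappa * (theta - v_pos) * dt
                  + sigma * math.sqrt(v_pos * dt) * random.gauss(0.0, 1.0))
        out.append(max(v, 0.0))
    return out

# Illustrative parameters (not estimates from the paper).
vT = cir_terminal(v0=0.04, kappa=3.0, theta=0.04, sigma=0.3,
                  T=0.25, n_steps=100, n_paths=2000)
# A VIX-style quote is the square root of expected variance, in percent.
futures_proxy = 100.0 * math.sqrt(sum(vT) / len(vT))
```

The CEV variant would replace `sqrt(v)` in the diffusion with `v**gamma` for an estimated elasticity `gamma`.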
Abstract:
This article examines the characteristics of key measures of volatility for different types of futures contracts to provide a better foundation for modeling volatility behavior and derivative values. Particular attention is focused on analyzing how different measures of volatility affect volatility persistence relationships. Intraday realized measures of volatility are found to be more persistent than daily measures, the type of GARCH procedure used for conditional volatility analysis is critical, and realized volatility persistence is not coherent with conditional volatility persistence. Specifically, although there is a good fit between the realized and conditional volatilities, no coherence exists between their degrees of persistence, a counterintuitive finding that shows realized and conditional volatility measures are not substitutes for one another.
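The two volatility measures contrasted above can be made concrete in a few lines: conditional volatility from a GARCH(1,1) recursion, whose persistence is conventionally measured by alpha + beta, and realized variance as a sum of squared intraday returns. A minimal sketch with simulated returns and fixed (illustrative) parameters rather than estimated ones:

```python
import math, random

random.seed(1)

# Simulated daily returns standing in for futures returns.
returns = [random.gauss(0.0, 0.01) for _ in range(250)]

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}.
    Shock persistence is summarized by alpha + beta."""
    h = [omega / (1.0 - alpha - beta)]       # start at unconditional variance
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

h = garch11_variance(returns, omega=1e-6, alpha=0.05, beta=0.90)
persistence = 0.05 + 0.90                    # alpha + beta

# Realized variance for one day from 48 intraday returns: RV = sum of r_i^2.
intraday = [random.gauss(0.0, 0.01 / math.sqrt(48)) for _ in range(48)]
rv = sum(r * r for r in intraday)
```

The article's point is that the persistence of the `h` series and the persistence of a daily `rv` series need not agree, even when the two level series track each other closely.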
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
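The gain from the multivariate approach described above comes from the off-diagonal terms of the conditional covariance matrix. A minimal sketch of the value-at-risk step, assuming a one-step-ahead covariance forecast `H` is already in hand from a multivariate GARCH model (the numbers below are illustrative, not the paper's estimates):

```python
import math

# One-step-ahead covariance forecast for three futures contracts.
H = [[4.0e-4, 1.5e-4, 1.0e-4],
     [1.5e-4, 2.5e-4, 0.8e-4],
     [1.0e-4, 0.8e-4, 3.0e-4]]
w = [1 / 3, 1 / 3, 1 / 3]            # equally weighted portfolio

# Portfolio variance w' H w uses the full correlation structure; a
# univariate approach would in effect discard the off-diagonal terms.
port_var = sum(w[i] * H[i][j] * w[j]
               for i in range(3) for j in range(3))

# 95% one-day VaR under a Gaussian assumption (z = 1.645).
VaR_95 = 1.645 * math.sqrt(port_var)
```

A minimum capital risk requirement would then be set as a multiple of such a VaR figure over the relevant holding period.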
Abstract:
This paper investigates the properties of implied volatility series calculated from options on Treasury bond futures, traded on LIFFE. We demonstrate that the use of near-maturity at-the-money options to calculate implied volatilities causes less mispricing and is therefore superior to a weighted average measure encompassing all relevant options. We demonstrate that, whilst a set of macroeconomic variables has some predictive power for implied volatilities, we are not able to earn excess returns by trading on the basis of these predictions once we allow for typical investor transaction costs.
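For options on futures, implied volatility is conventionally backed out of the Black (1976) formula; since the price is monotone in volatility, simple bisection suffices. A minimal sketch for a near-maturity at-the-money call (the numbers are illustrative; the paper's data are LIFFE Treasury bond futures options):

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(F, K, T, r, sigma):
    """Black (1976) price of a European call on a futures contract."""
    d1 = (math.log(F / K) + 0.5 * sigma * sigma * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

def implied_vol(price, F, K, T, r, lo=1e-4, hi=2.0, tol=1e-8):
    """Invert Black-76 by bisection; the call price is increasing in sigma."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if black76_call(F, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check on an at-the-money, near-maturity contract.
p = black76_call(F=100.0, K=100.0, T=0.1, r=0.05, sigma=0.08)
iv = implied_vol(p, F=100.0, K=100.0, T=0.1, r=0.05)
```

A weighted-average measure would repeat this inversion across strikes and maturities and combine the results, which is precisely the construction the paper finds inferior.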
Abstract:
Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme, developed in this paper, solves this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the necessary space-time averaging required to produce a good representation of the input large-scale state is not in conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes, the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
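The random-selection idea can be caricatured in a few lines: draw the number of convective plumes in a grid box from a Poisson distribution whose mean is set by the large-scale state, and draw each plume's mass flux from an exponential distribution, so that individual grid-box responses fluctuate about the equilibrium value rather than being fixed by it. This is only a schematic of the sampling step, assuming Poisson plume counts and exponential plume fluxes; the actual scheme's closure and cloud model are far richer:

```python
import math, random

random.seed(7)

def poisson(lam):
    """Poisson draw via Knuth's product-of-uniforms algorithm."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def convective_response(mean_total_flux, mean_plume_flux):
    """One random grid-box response: plume count is Poisson with mean
    <M>/<m>; each plume's mass flux is exponential with mean <m>."""
    n = poisson(mean_total_flux / mean_plume_flux)
    return sum(random.expovariate(1.0 / mean_plume_flux) for _ in range(n))

# Averaging many draws recovers the large-scale equilibrium total flux,
# while individual draws vary, as the scheme intends.
draws = [convective_response(mean_total_flux=20.0, mean_plume_flux=1.0)
         for _ in range(5000)]
avg = sum(draws) / len(draws)
```

Because the relative fluctuation of such a compound-Poisson sum shrinks as the expected plume count grows, the same construction automatically produces larger variability in smaller grid boxes, consistent with the scale adaptivity described above.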
Abstract:
An approach to incorporating spatial dependence into stochastic frontier analysis is developed and applied to a sample of 215 dairy farms in England and Wales. A number of alternative specifications for the spatial weight matrix are used to analyse their effect on the estimation of spatial dependence. Estimation is conducted using a Bayesian approach, and the results indicate that spatial dependence is present when explaining technical inefficiency.
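The spatial weight matrix at the centre of such specifications encodes who counts as a neighbour and how strongly. A minimal sketch of one common choice, an inverse-distance matrix with row normalization, so that multiplying by it averages over neighbours (the coordinates are made up; the paper compares several alternative specifications):

```python
import math

# Illustrative farm coordinates (easting, northing).
coords = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]

def inverse_distance_weights(coords, row_normalize=True):
    """Spatial weight matrix W with w_ij = 1/d_ij for i != j; each row is
    rescaled to sum to one so that (W u)_i is a weighted average of unit i's
    neighbours' values."""
    n = len(coords)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i][j] = 1.0 / math.dist(coords[i], coords[j])
        if row_normalize:
            s = sum(W[i])
            W[i] = [wij / s for wij in W[i]]
    return W

W = inverse_distance_weights(coords)
```

Alternative specifications (k-nearest neighbours, contiguity, distance cut-offs) change only how the raw `w_ij` entries are filled in, which is exactly the sensitivity the paper examines.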
Abstract:
Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid-scale fluxes or tendencies in any given model grid box from the truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.
Abstract:
In 2005, the ECMWF held a workshop on stochastic parameterisation, at which convection was seen as a key issue. That much is clear from the working group reports, and particularly from the statement of working group 1 that “it is clear that a stochastic convection scheme is desirable”. The present note aims to assess our current status against some of the issues raised and hopes expressed in that working group report.
Abstract:
This paper investigates the degree of return volatility persistence and the time-varying behaviour of systematic risk (beta) for 31 market segments in the UK real estate market. The findings suggest that different property types exhibit differences in volatility persistence and time variability. There is also evidence that the volatility persistence of each market segment and its systematic risk are significantly positively related. Thus, the systematic risks of different property types tend to move in different directions during periods of increased market volatility. Finally, the market segments with systematic risks less than one tend to show negative time variability, while market segments with systematic risk greater than one generally show positive time variability, indicating a positive relationship between the volatility of the market and the systematic risk of individual market segments. Consequently, safer and riskier market segments are affected differently by increases in market volatility.
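Time-varying betas of the kind studied above are often tracked with a rolling OLS estimate, cov(segment, market) / var(market) over a moving window. A minimal sketch on simulated returns with a known true beta (the window length, data, and estimator choice are illustrative; the paper may use a different time-varying specification):

```python
import random

random.seed(3)

def cov(x, y):
    """Sample covariance (cov(x, x) gives the sample variance)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def rolling_beta(seg, mkt, window=60):
    """Time-varying systematic risk: OLS beta cov(seg, mkt) / var(mkt)
    recomputed over a moving window of returns."""
    betas = []
    for t in range(window, len(mkt) + 1):
        m, s = mkt[t - window:t], seg[t - window:t]
        betas.append(cov(s, m) / cov(m, m))
    return betas

# Simulated market and segment returns with a true beta of 1.2.
market = [random.gauss(0.0, 0.02) for _ in range(200)]
segment = [1.2 * m + random.gauss(0.0, 0.01) for m in market]
betas = rolling_beta(segment, market)
```

Plotting such a `betas` series against a rolling market-volatility series is one simple way to inspect the sign of the time variability discussed above.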
Abstract:
Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.
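The central spectral relation stated in the abstract can be written compactly. In the notation assumed here (following the abstract's wording), for an observable $A$, additive white-noise forcing of variance $\sigma^{2}$, and linear susceptibility $\chi_A(\omega)$ describing the response of $A$ to deterministic forcings with the same spatial pattern as the noise:

```latex
S_A^{(\sigma)}(\omega) \;=\; S_A^{(0)}(\omega) \;+\; \sigma^{2}\,\bigl|\chi_A(\omega)\bigr|^{2},
```

where $S_A^{(\sigma)}$ and $S_A^{(0)}$ are the power spectra of $A$ in the perturbed and unperturbed systems. Since the correction term $\sigma^{2}|\chi_A(\omega)|^{2}$ is non-negative, the perturbed spectrum dominates the unperturbed one at every frequency, which is the positive-definiteness result cited above.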