39 results for Forecasting Volatility
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper evaluates the forecasting performance of a continuous-time stochastic volatility model with two volatility factors (SV2F) and compares it to that of GARCH and ARFIMA models. The empirical results show that the volatility forecasting ability of the SV2F model is better than that of the GARCH and ARFIMA models, especially when volatility seems to change pattern. We use realized volatility obtained from intraday data as an ex-post proxy of volatility, and the SV2F forecasts are calculated using the reprojection technique proposed by Gallant and Tauchen (1998).
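As a hedged illustration of the GARCH benchmark in the comparison above, the following minimal Python sketch computes multi-step conditional variance forecasts from a GARCH(1,1); the parameter values and simulated returns are assumptions for illustration, not the paper's data.

# A minimal GARCH(1,1) variance-forecast sketch (illustrative parameters).
import numpy as np

def garch11_forecast(returns, omega, alpha, beta, horizon):
    """Multi-step-ahead conditional variance forecasts from a GARCH(1,1):
    sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t; for h > 1 the
    squared return is replaced by its conditional expectation."""
    # Filter in-sample conditional variances, starting from the
    # unconditional variance omega / (1 - alpha - beta).
    sigma2 = np.empty(len(returns))
    sigma2[0] = omega / (1.0 - alpha - beta)
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    # Iterate the recursion forward out of sample.
    forecasts = np.empty(horizon)
    forecasts[0] = omega + alpha * returns[-1] ** 2 + beta * sigma2[-1]
    for h in range(1, horizon):
        forecasts[h] = omega + (alpha + beta) * forecasts[h - 1]
    return forecasts

# Example with simulated returns and made-up parameters.
rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(500)
print(garch11_forecast(r, omega=1e-6, alpha=0.08, beta=0.90, horizon=5))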
Abstract:
The increasing interest aroused by more advanced forecasting techniques, together with the requirement for more accurate forecasts of tourism demand at the destination level due to the constant growth of world tourism, has led us to evaluate the forecasting performance of neural modelling relative to that of time series methods at a regional level. Seasonality and volatility are important features of tourism data, which makes it a particularly favourable context in which to compare the forecasting performance of linear models to that of nonlinear alternative approaches. Pre-processed official statistical data on overnight stays and tourist arrivals from all the different countries of origin to Catalonia from 2001 to 2009 is used in the study. When comparing the forecasting accuracy of the different techniques for different time horizons, autoregressive integrated moving average models outperform self-exciting threshold autoregressions and artificial neural network models, especially for shorter horizons. These results suggest that there is a trade-off between the degree of pre-processing and the accuracy of the forecasts obtained with neural networks, which are more suitable in the presence of nonlinearity in the data. In spite of the significant differences between countries, which can be explained by different patterns of consumer behaviour, we also find that forecasts of tourist arrivals are more accurate than forecasts of overnight stays.
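For illustration, a minimal sketch of this kind of horizon-by-horizon accuracy comparison for the linear benchmark; the simulated monthly series and the seasonal ARIMA order are stand-ins, not the paper's data or specification.

# Horizon-by-horizon MAPE of a seasonal ARIMA on a simulated monthly series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
t = np.arange(120)
# Stand-in for monthly tourist arrivals: trend + seasonality + noise.
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 120)

train, test = y[:108], y[108:]
res = ARIMA(train, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()

for horizon in (1, 3, 6, 12):
    pred = res.forecast(steps=horizon)
    mape = np.mean(np.abs((test[:horizon] - pred) / test[:horizon])) * 100
    print(f"h={horizon:2d}  MAPE={mape:.2f}%")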
Abstract:
This paper provides empirical evidence that continuous-time models with one volatility factor are, under some conditions, able to fit the main characteristics of financial data. It also reports the importance of the feedback factor in capturing the strong volatility clustering of the data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) of Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
Abstract:
We give sufficient conditions for the existence, uniqueness and ergodicity of invariant measures for Musiela's stochastic partial differential equation with deterministic volatility and a Hilbert-space-valued driving Lévy noise. Conditions for the absence of arbitrage and for the existence of mild solutions are also discussed.
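For orientation, a schematic form of the equation in question (a paraphrase; the paper's precise setup may differ): the forward curve r_t(x), parametrized by time to maturity x, solves

\[ \mathrm{d}r_t(x) \;=\; \Big( \frac{\partial}{\partial x}\, r_t(x) + \alpha_t(x) \Big)\,\mathrm{d}t \;+\; \sigma_t(x)\,\mathrm{d}L_t , \]

where L is the Hilbert-space-valued Lévy driver, \sigma is deterministic, and the HJM no-arbitrage condition determines the drift \alpha from \sigma and the Lévy exponent of L.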
Abstract:
Introducing bounded rationality in a standard consumption-based asset pricing model with time separable preferences strongly improves empirical performance. Learning causes momentum and mean reversion of returns and thereby excess volatility, persistence of price-dividend ratios, long-horizon return predictability and a risk premium, as in the habit model of Campbell and Cochrane (1999), but for lower risk aversion. This is obtained, even though our learning scheme introduces just one free parameter and we only consider learning schemes that imply small deviations from full rationality. The findings are robust to the learning rule used and other model features. What is key is that agents forecast future stock prices using past information on prices.
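The mechanism in the last sentence can be sketched as a constant-gain learning simulation. Everything below (the pricing map, the gain, the dividend process) is an illustrative assumption rather than the paper's exact model; it only shows how forecasting prices from past prices makes optimism self-reinforcing.

# Stylized constant-gain learning: beliefs about price growth are updated
# from PAST prices only, and realized prices feed back into beliefs.
import numpy as np

rng = np.random.default_rng(2)
delta, gain, T = 0.99, 0.02, 300          # discount factor, learning gain
dividends = np.exp(rng.normal(0.0, 0.02, T)).cumprod()

beliefs = np.empty(T)                     # subjective expected gross price growth
prices = np.empty(T)
beliefs[0], prices[0] = 1.0, dividends[0] * delta / (1 - delta)

for t in range(1, T):
    if t >= 2:
        observed_growth = prices[t - 1] / prices[t - 2]
        beliefs[t] = beliefs[t - 1] + gain * (observed_growth - beliefs[t - 1])
    else:
        beliefs[t] = beliefs[t - 1]
    # Hypothetical pricing map: the price capitalizes the dividend at a rate
    # that rises with expected growth, so optimism is self-reinforcing.
    growth = min(beliefs[t], 1.0 / delta - 1e-3)   # keep the map well defined
    prices[t] = delta * dividends[t] / (1 - delta * growth)

pd_ratio = prices / dividends
print("price-dividend ratio: min %.1f, max %.1f" % (pd_ratio.min(), pd_ratio.max()))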
Abstract:
This technical report is a document prepared as a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analyses of sustainability. It addresses the epistemological implications of complexity, which entail the need to deal with the existence of multiple scales and non-equivalent narratives (multiple dimensions/attributes) used to define sustainability issues. There is an unavoidable tension between a “steady-state view” (the perception of what is going on now, reflecting a PAST → PRESENT view of reality) and an “evolutionary view” (the unknown transformation that we have to expect in the process of becoming of the observed reality and in the observer, reflecting a PRESENT → FUTURE view of reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators, using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints to sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. The section then continues by exploring external constraints to sustainability (economic development versus the environment). The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development based on externalization and the “bubbles disease”; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting and then introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the actual system of accounting in energy statistics; (ii) a critical appraisal of the actual system of accounting in BP statistics; (iii) a critical appraisal of the actual system of accounting in Eurostat statistics. The section ends by proposing an innovative method of representing energy statistics which can prove more useful for those willing to develop sustainability indicators.
Abstract:
Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and forecasts the density of deaths in the life table. Since these values obey a unit sum constraint for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), death densities are transformed into the real space so that the full range of multivariate statistics can be applied, then back-transformed to positive values so that the unit sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
Abstract:
I use a multi-layer feedforward perceptron, with backpropagation learning implemented via stochastic gradient descent, to extrapolate the volatility smile of Euribor derivatives over low strikes by training the network on parametric prices.
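A minimal sketch of this approach under stated assumptions: the “parametric” smile is an arbitrary quadratic stand-in, and the network is scikit-learn's MLPRegressor trained with SGD rather than the author's own implementation.

# Train a small MLP on smile points from a parametric model, then query it
# below the lowest quoted strike (the extrapolation region).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
strikes = np.linspace(0.0, 0.03, 200)                 # quoted strike range
smile = 0.25 - 4.0 * strikes + 120.0 * strikes**2     # hypothetical parametric vols
X, y = strikes.reshape(-1, 1), smile + rng.normal(0, 1e-4, 200)

net = MLPRegressor(hidden_layer_sizes=(16, 16), solver="sgd",
                   learning_rate_init=0.01, max_iter=5000, random_state=0)
net.fit(X, y)

low_strikes = np.linspace(-0.01, 0.0, 5).reshape(-1, 1)
print(net.predict(low_strikes))                       # extrapolated smile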
Abstract:
Using a suitable Hull and White type formula, we develop a methodology to obtain a second-order approximation to the implied volatility for very short maturities. Using this approximation, we accurately calibrate the full set of parameters of the Heston model. One of the reasons that makes our calibration for short maturities so accurate is that we also take into account the term structure for large maturities. We may say that the calibration is not "memoryless", in the sense that the option's behaviour far away from maturity does influence the calibration when the option gets close to expiration. Our results provide a way to perform a quick calibration of a closed-form approximation to vanilla options that can then be used to price exotic derivatives. The methodology is simple, accurate, fast, and requires minimal computational cost.
Abstract:
Among the underlying assumptions of the Black-Scholes option pricing model, those of a fixed volatility of the underlying asset and of a constant short-term riskless interest rate cause the largest empirical biases. Only recently has attention been paid to the simultaneous effects of the stochastic nature of both variables on the pricing of options. This paper has tried to estimate the effects of a stochastic volatility and a stochastic interest rate in the Spanish option market. A discrete approach was used. Symmetric and asymmetric GARCH models were tried, and the presence of in-the-mean and seasonality effects was allowed. The stochastic processes of the MIBOR90, a Spanish short-term interest rate, from March 19, 1990 to May 31, 1994, and of the volatility of the returns of the most important Spanish stock index (IBEX-35), from October 1, 1987 to January 20, 1994, were estimated. These estimators were used in pricing call options on the stock index from November 30, 1993 to May 30, 1994. Hull-White and Amin-Ng pricing formulas were used. These prices were compared with actual prices and with those derived from the Black-Scholes formula, trying to detect the biases reported previously in the literature. Whereas the conditional variance of the MIBOR90 interest rate seemed to be free of ARCH effects, an asymmetric GARCH with in-the-mean and seasonality effects and some evidence of persistence in variance (IEGARCH(1,2)-M-S) was found to be the model that best represents the behaviour of the stochastic volatility of the IBEX-35 stock returns. All the biases reported previously in the literature were found. All the formulas overpriced the options in the near-the-money case and underpriced them otherwise. Furthermore, in most option trading, Black-Scholes overpriced the options and, because of the time-to-maturity effect, the implied volatility computed from the Black-Scholes formula underestimated the actual volatility.
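The model-selection step described above can be sketched with the arch package. Below, a plain asymmetric EGARCH is fitted to simulated returns; the in-the-mean and seasonality terms of the paper's IEGARCH(1,2)-M-S specification are omitted, so this is only a simplified stand-in.

# Fit an asymmetric EGARCH(1,1) with a leverage term to stand-in returns.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(6)
returns = rng.standard_normal(1500)              # stand-in daily returns, in percent

model = arch_model(returns, mean="Constant", vol="EGARCH", p=1, o=1, q=1)
res = model.fit(disp="off")
print(res.summary())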
Abstract:
What determined the volatility of asset prices in Germany between the wars? This paper argues that the influence of political factors has been overstated. The majority of events increasing political uncertainty had little or no effect on the value of German assets and the volatility of returns on them. Instead, it was inflation (and the fear of it) that was largely responsible for most of the variability in asset returns.
Abstract:
We show that the price of a European call option in a stochastic volatility framework can be decomposed into the sum of four terms, which identify the main features of the market that affect option prices: the expected future volatility, the correlation between the volatility and the noise driving the stock prices, the market price of volatility risk, and the difference of the expected future volatility at different times. We also study some applications of this decomposition.
Abstract:
This paper presents a two-factor (Vasicek-CIR) model of the term structure of interest rates and develops its pricing and empirical properties. We assume that default-free discount bond prices are determined by the time to maturity and two factors: the long-term interest rate and the spread. Assuming a certain process for both factors, a general bond pricing equation is derived and a closed-form expression for bond prices is obtained. Empirical evidence on the model's performance in comparison with a double Vasicek model is presented. The main conclusion is that modeling the volatility in the long-term rate process can substantially help to fit the observed data and can improve, to a reasonable extent, the prediction of future movements in medium- and long-term interest rates. However, for shorter maturities, the pricing errors are basically negligible, and it is not so clear which model is best.