874 results for Forecasting Volatility


Relevance:

30.00%

Publisher:

Abstract:

This paper performs a thorough statistical examination of the time-series properties of the daily market volatility index (VIX) from the Chicago Board Options Exchange (CBOE). The motivation lies not only in the widespread consensus that the VIX is a barometer of overall market sentiment regarding investors' risk appetite, but also in the fact that many trading strategies rely on the VIX index for hedging and speculative purposes. Preliminary analysis suggests that the VIX index displays long-range dependence. This is well in line with the strong empirical evidence in the literature supporting long memory in both options-implied and realized variances. We thus resort to both parametric and semiparametric heterogeneous autoregressive (HAR) processes for modeling and forecasting purposes. Our main findings are as follows. First, we confirm the evidence in the literature that there is a negative relationship between the VIX index and the S&P 500 index return, as well as a positive contemporaneous link with the volume of the S&P 500 index. Second, the term spread has a slightly negative long-run impact on the VIX index once possible multicollinearity and endogeneity are controlled for. Finally, we cannot reject the linearity of the above relationships, either in sample or out of sample. As for the latter, we show that it is very hard to beat the pure HAR process because of the very persistent nature of the VIX index.
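
For orientation, the daily HAR regression typically explains today's (log) volatility with daily, weekly, and monthly averages of past values, in the spirit of Corsi's heterogeneous autoregression. The sketch below is a minimal illustration of that structure, assuming a pandas series of log VIX levels; the 5- and 22-day horizons and the use of statsmodels are assumptions for illustration, not details taken from the paper.

```python
# Minimal HAR-style regression sketch (illustrative, not the paper's code).
import pandas as pd
import statsmodels.api as sm

def fit_har(series: pd.Series):
    df = pd.DataFrame({"y": series})
    df["daily"] = df["y"].shift(1)                       # yesterday's level
    df["weekly"] = df["y"].rolling(5).mean().shift(1)    # past-week average
    df["monthly"] = df["y"].rolling(22).mean().shift(1)  # past-month average
    df = df.dropna()
    X = sm.add_constant(df[["daily", "weekly", "monthly"]])
    return sm.OLS(df["y"], X).fit()

# Usage: model = fit_har(log_vix); print(model.params)
```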

Relevance:

30.00%

Publisher:

Abstract:

Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling-window, EWMA, GARCH, and stochastic volatility are used in practice. GARCH- and EWMA-type models that incorporate the dynamic structure of volatility and are capable of forecasting the future behavior of risk should perform better than constant, rolling-window volatility models. For the same asset, the model that is 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms others over a class of re-sampled time-series data. The test is based on re-sampling the data using stationary bootstrapping. For each re-sample we identify the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA, and GARCH models using a quadratic utility function and a risk management measure as comparison criteria. No model consistently outperforms the benchmark.
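
For reference on the EWMA class, the RiskMetrics-style recursion updates the variance with a single decay parameter. A minimal sketch, assuming daily returns in a NumPy array and the classic decay of 0.94 (the abstract does not report the parameters actually used):

```python
import numpy as np

def ewma_variance(returns: np.ndarray, lam: float = 0.94) -> np.ndarray:
    """RiskMetrics-style EWMA variance; lam = 0.94 is the classic daily decay."""
    var = np.empty(len(returns))
    var[0] = returns[:20].var()  # seed with an early sample variance (a choice, not a rule)
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1.0 - lam) * returns[t - 1] ** 2
    return var
```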

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

The liberalization of electricity markets more than ten years ago in the vast majority of developed countries has introduced the need to model and forecast electricity prices and volatilities, in both the short and the long term. There is thus a need for methodology able to deal with the most important features of electricity price series, which are well known for exhibiting not only structure in the conditional mean but also time-varying conditional variances. In this work we propose a new model that extracts conditionally heteroskedastic common factors from the vector of electricity prices. These common factors are estimated jointly with their relationship to the original vector of series and with the dynamics affecting both their conditional mean and variance. The model is estimated under the state-space formulation. The proposed model is applied to extract seasonal common dynamic factors as well as common volatility factors for electricity prices, and the estimation results are used to forecast electricity prices and their volatilities in the Spanish zone of the Iberian Market. Several simplified or alternative models are also considered as benchmarks to illustrate that the proposed approach is superior to all of them in terms of explanatory and predictive power.
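
A stylized version of such a conditionally heteroskedastic common factor model in state-space form (a generic sketch for orientation only; the paper's exact specification may differ):

```latex
\begin{aligned}
p_t &= \Lambda f_t + \varepsilon_t, & \varepsilon_t &\sim N(0, \Sigma_\varepsilon),\\
f_t &= \Phi f_{t-1} + \eta_t, & \eta_{i,t} &\sim N(0, h_{i,t}),\\
h_{i,t} &= \omega_i + \alpha_i \eta_{i,t-1}^2 + \beta_i h_{i,t-1},
\end{aligned}
```

where $p_t$ is the vector of electricity prices, $f_t$ the common factors with GARCH(1,1) conditional variances $h_{i,t}$, and the loading matrix $\Lambda$ is estimated jointly with the factor dynamics via the Kalman filter.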

Relevance:

30.00%

Publisher:

Abstract:

In the current uncertain context affecting both the world economy and the energy sector, with the rapid increase in oil and gas prices and a very unstable political situation affecting some of the largest raw-materials producers, there is a need to develop efficient and powerful quantitative tools to model and forecast fossil fuel prices, CO2 emission allowance prices, and electricity prices. This will improve decision making for all the agents involved in energy issues. Although there are papers focused on modelling fossil fuel prices, CO2 prices, and electricity prices, the literature is scarce on attempts to consider all of them together. This paper focuses on building a multivariate model for the aforementioned prices and comparing its results with those of univariate models in terms of prediction accuracy (the models are compared over a large span of days, all in the first four months of 2011), as well as on extracting common features in the volatilities of all these relevant prices. The common features in volatility are extracted by means of a conditionally heteroskedastic dynamic factor model, which circumvents the curse-of-dimensionality problem that commonly arises when estimating multivariate GARCH models. Additionally, the common volatility factors obtained are useful for improving forecasting intervals and have a clear economic interpretation. Moreover, the results obtained and the methodology proposed can serve as a starting point for risk management or portfolio optimization under uncertainty in the current context of energy markets.
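
To see why the factor structure matters here, compare rough parameter counts for a full multivariate GARCH against a factor version. The counts below are a back-of-the-envelope illustration, not figures from the paper:

```python
# Rough parameter counts (illustrative only): a full VEC-GARCH(1,1) on n
# series grows like n^4, while a model with r GARCH factors needs only
# n*r loadings plus three GARCH parameters per factor.
def vec_garch_params(n: int) -> int:
    m = n * (n + 1) // 2   # distinct covariance entries
    return m + 2 * m * m   # intercepts + two m-by-m coefficient blocks

def factor_garch_params(n: int, r: int) -> int:
    return n * r + 3 * r   # loadings + (omega, alpha, beta) per factor

print(vec_garch_params(5))        # 465 parameters for just five series
print(factor_garch_params(5, 2))  # 16 parameters with two common factors
```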

Relevance:

30.00%

Publisher:

Abstract:

It is well known that one of the obstacles to effective forecasting of exchange rates is heteroscedasticity (non-stationary conditional variance). The autoregressive conditional heteroscedastic (ARCH) model and its variants have been used to estimate a time-dependent variance for many financial time series. However, such models are essentially linear in form, and we can ask whether a non-linear model for the variance can improve results just as non-linear models (such as neural networks) for the mean have done. In this paper we consider two neural network models for variance estimation. Mixture Density Networks (Bishop 1994, Nix and Weigend 1994) combine a Multi-Layer Perceptron (MLP) and a mixture model to estimate the conditional data density. They are trained using a maximum likelihood approach. However, it is known that maximum likelihood estimates are biased and lead to a systematic underestimate of the variance. More recently, a Bayesian approach to parameter estimation has been developed (Bishop and Qazaz 1996) that shows promise in removing the maximum likelihood bias. However, until now this model has not been used for time series prediction. Here we compare these algorithms with two other models that provide benchmark results: a linear model (from the ARIMA family) and a conventional neural network trained with a sum-of-squares error function (which estimates the conditional mean of the time series under a constant-variance noise model). The comparison is carried out on daily exchange rate data for five currencies.
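
For reference, the MDN in Bishop's formulation outputs the parameters of a $K$-component Gaussian mixture as functions of the input $x$, so the conditional density is

```latex
p(t \mid x) = \sum_{k=1}^{K} \alpha_k(x)\,
\mathcal{N}\!\big(t;\ \mu_k(x),\ \sigma_k^2(x)\big),
\qquad \sum_{k=1}^{K} \alpha_k(x) = 1,
```

and training minimizes the negative log-likelihood of this density over the data, with the mixing coefficients, means, and variances all produced by the MLP.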

Relevance:

30.00%

Publisher:

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann's seminal work on the estimation of highly non-linear model specifications ("Causality tests and observationally equivalent representations of econometric models", Journal of Econometrics, 1988, 39(1-2), 69-104), especially in developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of the RMESV-ALM model, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high frequency data for three US financial assets, the new model is estimated and evaluated. Its forecasting performance is compared with that of a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
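
The positive-definiteness guarantee is easy to verify numerically: the matrix exponential of any real symmetric matrix is symmetric with strictly positive eigenvalues. A toy illustration of that property (not the RMESV-ALM dynamics themselves), using SciPy:

```python
import numpy as np
from scipy.linalg import expm

# Any real symmetric matrix, with no sign or magnitude constraints...
A = np.array([[ 0.2, -0.7,  0.1],
              [-0.7,  0.0,  0.3],
              [ 0.1,  0.3, -0.5]])
Sigma = expm(A)  # ...maps to a valid covariance matrix.

print(np.allclose(Sigma, Sigma.T))            # True: symmetric
print(np.all(np.linalg.eigvalsh(Sigma) > 0))  # True: positive definite
```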

Relevance:

30.00%

Publisher:

Abstract:

We estimate the monthly volatility of the US economy from 1968 to 2006 by extending the coincident index model of Stock and Watson (1991). Our volatility index, which we call VOLINX, has four applications. First, it sheds light on the Great Moderation: VOLINX captures the decrease in volatility in the mid-80s as well as the different episodes of stress over the sample period. In the 70s and early 80s, stagflation and the two oil crises marked the pace of volatility, whereas 9/11 is the most relevant shock after the moderation. Second, it helps to identify the economic indicators that drive volatility. While the main determinant of the coincident index is industrial production, VOLINX is mainly affected by employment and income. Third, it adapts the confidence bands of the forecasts: in- and out-of-sample evaluations show that the confidence bands may differ by up to 50% with respect to a model with constant variance. Last, the methodology we use permits us to estimate monthly GDP, whose conditional volatility is partly explained by VOLINX. These applications can be used by policy makers for monitoring and surveillance of the stress of the economy.
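
A stylized version of such an extension of the Stock-Watson coincident index model (a generic sketch for orientation; the paper's exact specification may differ):

```latex
\begin{aligned}
\Delta y_{it} &= \gamma_i\,\Delta c_t + u_{it}, \qquad i = 1,\dots,4,\\
\phi(L)\,\Delta c_t &= \sigma_t\,\varepsilon_t, \qquad \varepsilon_t \sim N(0,1),
\end{aligned}
```

where the $\Delta y_{it}$ are the coincident indicators, $\Delta c_t$ is the common state of the economy, and the filtered path of the time-varying scale $\sigma_t$ plays the role of the volatility index.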

Relevance:

30.00%

Publisher:

Abstract:

Ph.D. in the Faculty of Business Administration

Relevance:

20.00%

Publisher:

Abstract:

A thermodynamic approach to predict bulk glass-forming compositions in binary metallic systems was recently proposed. In this approach, the parameter γ* = ΔH_amor/(ΔH_inter − ΔH_amor) indicates the glass-forming ability (GFA) from the standpoint of the driving force to form different competing phases, where ΔH_amor and ΔH_inter are the enthalpies of glass and intermetallic formation, respectively. Good glass-forming compositions should have a large negative enthalpy for glass formation and a very small difference relative to intermetallic formation, thus making the glassy phase easily reachable even at low cooling rates. The γ* parameter showed a good correlation with experimental GFA data in the Ni-Nb binary system. In this work, a simple extension of the γ* parameter is applied to the ternary Al-Ni-Y system. The calculated γ* isocontours in the ternary diagram are compared with experimental results on glass formation in that system. Despite some misfit, the best glass formers are found quite close to the highest γ* values, leading to the conclusion that this thermodynamic approach can be extended to ternary systems, serving as a useful tool for the development of new glass-forming compositions. Finally, the thermodynamic approach is compared with the topological instability criteria used to predict the thermal behavior of glassy Al alloys. (C) 2007 Elsevier B. V. All rights reserved.
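
The criterion itself is a one-line computation; the sketch below transcribes it directly from the abstract, with placeholder enthalpies rather than measured Al-Ni-Y values:

```python
def gamma_star(dH_amor: float, dH_inter: float) -> float:
    """gamma* = dH_amor / (dH_inter - dH_amor): a larger value signals a
    glassy phase that is easier to reach relative to competing intermetallics."""
    return dH_amor / (dH_inter - dH_amor)

# Placeholder enthalpies in kJ/mol (dummy values, not measured data):
print(gamma_star(dH_amor=-30.0, dH_inter=-40.0))  # 3.0
```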

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a comparative analysis of the long-term electric power forecasting methodologies used in some South American countries is presented. The purpose of this study is to compare these methodologies, determine whether they share similarities, and examine how their results behave when applied to the Brazilian electricity market. The power forecasts were performed for the four main consumption classes (residential, industrial, commercial, and rural), which are responsible for approximately 90% of national consumption. The analysis was carried out with SAS® software. The study identified various methodological similarities, mainly in the econometric variables used by these methods, a fact that strongly conditioned the comparative results obtained.

Relevance:

20.00%

Publisher:

Abstract:

There are several ways to model a building and its heat gains from external as well as internal sources in order to evaluate proper operation, audit retrofit actions, and forecast energy consumption. Different techniques, ranging from simple regression to models based on physical principles, can be used for simulation. A common requirement of all these models is that the input variables be based on realistic data when available; otherwise the estimated energy consumption may be strongly under- or overestimated. In this paper, a comparison is made between a simple model based on an artificial neural network (ANN) and a model based on physical principles (EnergyPlus) as auditing and prediction tools for forecasting building energy consumption. The Administration Building of the University of Sao Paulo is used as a case study. Building energy consumption profiles are collected, as well as campus meteorological data. Results show that both models are suitable for energy consumption forecasting. Additionally, a parametric analysis of the building is carried out in EnergyPlus to evaluate the influence of parameters such as the occupancy profile and the weather data on the forecasts. (C) 2008 Elsevier B.V. All rights reserved.
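
A minimal sketch of the ANN side of such a comparison, assuming scikit-learn; the feature set, network size, and synthetic data are illustrative assumptions, not the configuration used in the paper:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g. [outdoor temperature, humidity, occupancy]
y = 50 + 8 * X[:, 0] + 3 * X[:, 2] + rng.normal(scale=2, size=500)  # synthetic kWh

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,),
                                   max_iter=2000, random_state=0))
model.fit(X[:400], y[:400])
print(model.score(X[400:], y[400:]))  # out-of-sample R^2 on held-out data
```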

Relevance:

20.00%

Publisher:

Abstract:

This article analysed scenarios for Brazilian consumption of ethanol for the period 2006 to 2012. The results show that if the country's GDP sustains growth of 4.6% a year, domestic consumption of fuel ethanol could increase to 25.16 billion liters over this period, a volume relatively close to the forecast gasoline consumption of 31 billion liters. At a lower GDP growth of 1.22% a year, gasoline consumption would be reduced and domestic ethanol consumption in Brazil would be no higher than 18.32 billion liters. Contrary to the current situation, the forecasts indicate that hydrated ethanol consumption could become much higher than anhydrous consumption in Brazil. The former is consumed both in cars fuelled exclusively by ethanol and in flex-fuel cars, which were successfully introduced in the country in 2003. Flex cars allow Brazilian consumers to choose between gasoline and hydrated ethanol and to switch immediately to whichever fuel offers the more favourable relative price.

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses an object-oriented neural network model developed for predicting short-term traffic conditions on a section of the Pacific Highway between Brisbane and the Gold Coast in Queensland, Australia. The feasibility of this approach is demonstrated through a time-lag recurrent network (TLRN), which was developed for predicting speed data up to 15 minutes into the future. The results indicate that the TLRN is capable of predicting speed up to 5 minutes into the future with a high degree of accuracy (90-94%). Similar models, developed for predicting freeway travel times on the same facility, were successful in predicting travel times up to 15 minutes into the future with a similar degree of accuracy (93-95%). These results represent substantial improvements over conventional model performance and clearly demonstrate the feasibility of using the object-oriented approach for short-term traffic prediction. (C) 2001 Elsevier Science B.V. All rights reserved.
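
The paper's TLRN feeds time-lagged speed measurements into a recurrent network; the sketch below keeps only the time-lag idea, pairing lagged inputs with a plain feed-forward net (an assumption for illustration, not the authors' architecture). Lag depth and horizon are likewise illustrative:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(speeds: np.ndarray, n_lags: int = 6, horizon: int = 1):
    """Build (X, y): each row of X holds the last n_lags speeds and the
    matching entry of y is the speed `horizon` steps ahead."""
    n = len(speeds) - n_lags - horizon + 1
    X = np.column_stack([speeds[i:i + n] for i in range(n_lags)])
    y = speeds[n_lags + horizon - 1:]
    return X, y

# Usage with 1-minute speed data, predicting 5 minutes ahead:
# X, y = make_lagged(speed_series, n_lags=6, horizon=5)
# model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```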