915 results for Forecasting Volatility


Relevance: 30.00%

Publisher:

Abstract:

Successfully determining competitive optimal schedules for electricity generation hinges on accurate load forecasts. The nonstationarity and high volatility of loads make their accurate prediction problematic. The presence of uncertainty in the data also significantly degrades the accuracy of point predictions produced by deterministic load forecasting models, so operation planning based on these predictions will be unreliable. This paper aims at developing prediction intervals rather than exact point predictions. Prediction intervals are theoretically more reliable and practical than predicted values. The delta and Bayesian techniques for constructing prediction intervals for forecasted loads are implemented here. To objectively and comprehensively assess the quality of constructed prediction intervals, a new index based on the length and coverage probability of prediction intervals is developed. In experiments with real data, and through calculation of global statistics, it is shown that neural network point prediction performance is unreliable. In contrast, prediction intervals developed using the delta and Bayesian techniques are satisfactorily narrow, with a high coverage probability.
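The abstract does not give the exact form of its length-and-coverage index, but a coverage-width criterion of the following kind is a common way to combine the two quantities; the penalty constant and the toy data below are illustrative assumptions.

```python
# Sketch: scoring prediction intervals by coverage probability (PICP)
# and mean width, combined into one index. The paper's exact index is
# not specified; this CWC-style score is a stand-in assumption.

def interval_score(y, lower, upper, nominal=0.9):
    """Return (coverage probability, mean width, combined index)."""
    n = len(y)
    covered = sum(1 for yi, lo, hi in zip(y, lower, upper) if lo <= yi <= hi)
    picp = covered / n                                   # coverage probability
    width = sum(hi - lo for lo, hi in zip(lower, upper)) / n
    # Penalise width more heavily when coverage falls short of target.
    penalty = 1.0 if picp >= nominal else 1.0 + 10.0 * (nominal - picp)
    return picp, width, width * penalty

y = [1.0, 2.0, 3.0, 4.0]
lo = [0.5, 1.5, 2.5, 4.5]   # last interval misses its observation
hi = [1.5, 2.5, 3.5, 5.5]
picp, width, score = interval_score(y, lo, hi)
# picp = 0.75, width = 1.0, score = 2.5 (width inflated by the miss)
```

Narrow intervals with high coverage minimize the index, matching the quality notion described above.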

Relevance: 30.00%

Publisher:

Abstract:

This paper investigates the price volatility interaction between the crude oil and equity markets in the US using 5-min data over the period 2009-2012. Our main findings can be summarised as follows. First, we find strong evidence to demonstrate that the integration of the bid-ask spread and trading volume factors leads to a better performance in predicting price volatility. Second, trading information, such as bid-ask spread, trading volume, and the price volatility from cross-markets, improves the price volatility predictability for both in-sample and out-of-sample analyses. Third, the trading strategy based on the predictive regression model that includes trading information from both markets provides significant utility gains to mean-variance investors.
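The predictive-regression idea above can be sketched with a single trading-information regressor; the variable names and toy values below are hypothetical, not data from the paper.

```python
# Sketch: regress next-interval price volatility on a current
# trading-information variable (here a hypothetical bid-ask spread).
# One-regressor OLS in pure Python; toy data, illustrative only.

def ols_fit(x, y):
    """Return (slope, intercept) of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

spread = [1.0, 2.0, 3.0, 4.0]       # hypothetical bid-ask spreads
next_vol = [2.1, 3.9, 6.0, 8.0]     # volatility one interval ahead
slope, intercept = ols_fit(spread, next_vol)
forecast = intercept + slope * 5.0   # out-of-sample volatility forecast
```

The in-sample/out-of-sample comparison in the paper amounts to fitting such a regression on one subsample and forecasting on the remainder.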

Relevance: 30.00%

Publisher:

Abstract:

This paper performs a thorough statistical examination of the time-series properties of the daily market volatility index (VIX) from the Chicago Board Options Exchange (CBOE). The motivation lies not only in the widespread consensus that the VIX is a barometer of overall market sentiment concerning investors' risk appetite, but also in the fact that many trading strategies rely on the VIX index for hedging and speculative purposes. Preliminary analysis suggests that the VIX index displays long-range dependence. This is well in line with the strong empirical evidence in the literature supporting long memory in both options-implied and realized variances. We thus resort to both parametric and semiparametric heterogeneous autoregressive (HAR) processes for modeling and forecasting purposes. Our main findings are as follows. First, we confirm the evidence in the literature that there is a negative relationship between the VIX index and the S&P 500 index return, as well as a positive contemporaneous link with the volume of the S&P 500 index. Second, the term spread has a slightly negative long-run impact on the VIX index, once possible multicollinearity and endogeneity are controlled for. Finally, we cannot reject the linearity of the above relationships, neither in sample nor out of sample. As for the latter, we show that it is difficult to beat the pure HAR process because of the very persistent nature of the VIX index.
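The HAR process referred to above regresses tomorrow's index on backward-looking daily, weekly (5-day) and monthly (22-day) averages. A minimal sketch, in which the coefficient values are illustrative assumptions rather than estimates from the paper:

```python
# Sketch of HAR (heterogeneous autoregressive) forecasting:
# VIX_{t+1} = b0 + bd*daily_t + bw*weekly_t + bm*monthly_t.
# Coefficients below are assumed for illustration only.

def har_regressors(x, t):
    """Daily value and weekly/monthly backward-looking averages at day t."""
    daily = x[t]
    weekly = sum(x[t - 4:t + 1]) / 5      # last 5 observations
    monthly = sum(x[t - 21:t + 1]) / 22   # last 22 observations
    return daily, weekly, monthly

def har_forecast(x, t, beta=(0.1, 0.4, 0.3, 0.2)):
    b0, bd, bw, bm = beta
    d, w, m = har_regressors(x, t)
    return b0 + bd * d + bw * w + bm * m

vix = [20.0] * 30                 # flat toy series
f = har_forecast(vix, 25)         # 0.1 + (0.4 + 0.3 + 0.2) * 20 = 18.1
```

In practice the coefficients are fitted by OLS; the point of the cascade of averages is to capture long memory with only three regressors.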

Relevance: 30.00%

Publisher:

Abstract:

Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models, which incorporate the dynamic structure of volatility and are capable of forecasting the future behavior of risk, should perform better than constant, rolling-window volatility models. For the same asset, the model that is 'best' according to some criterion can change from period to period. We use the reality check test to verify whether one model outperforms the others over a class of re-sampled time series. The test is based on re-sampling the data using stationary bootstrapping. For each re-sample we identify the 'best' model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk management measure as comparison criteria. No model consistently outperforms the benchmark.
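The stationary bootstrap named above (Politis and Romano, 1994) resamples the series in blocks of random, geometrically distributed length, preserving weak dependence. A minimal sketch:

```python
import random

# Sketch of one stationary-bootstrap resample: blocks have geometric
# length with mean 1/p, and indices wrap around the sample end.

def stationary_bootstrap(x, p=0.1, seed=0):
    """Return one stationary-bootstrap resample of the series x."""
    rng = random.Random(seed)
    n = len(x)
    out = []
    i = rng.randrange(n)
    while len(out) < n:
        out.append(x[i])
        if rng.random() < p:          # with probability p, start a new block
            i = rng.randrange(n)
        else:                         # otherwise extend the current block
            i = (i + 1) % n
    return out

series = list(range(100))
resample = stationary_bootstrap(series)
```

The reality check repeats this many times, recomputing each model's performance statistic on every resample to build the distribution analyzed above.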

Relevance: 30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 30.00%

Publisher:

Abstract:

The liberalization of electricity markets more than ten years ago in the vast majority of developed countries has introduced the need to model and forecast electricity prices and volatilities, both in the short and long term. There is thus a need for methodology able to deal with the most important features of electricity price series, which are well known for presenting not only structure in the conditional mean but also time-varying conditional variances. In this work we propose a new model that extracts conditionally heteroskedastic common factors from the vector of electricity prices. These common factors are estimated jointly with their relationship to the original vector of series and with the dynamics affecting both their conditional mean and variance. The estimation of the model is carried out under the state-space formulation. The proposed model is applied to extract seasonal common dynamic factors as well as common volatility factors for electricity prices, and the estimation results are used to forecast electricity prices and their volatilities in the Spanish zone of the Iberian Market. Several simplified/alternative models are also considered as benchmarks to illustrate that the proposed approach is superior to all of them in terms of explanatory and predictive power.
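The time-varying conditional variances driving such factor models typically follow a GARCH(1,1)-type recursion. A minimal sketch, with parameter values that are illustrative assumptions rather than estimates from the paper:

```python
# Sketch of the GARCH(1,1) recursion for a single factor:
# h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}.
# Parameters are assumed for illustration only.

def garch_variances(returns, omega=0.05, alpha=0.10, beta=0.85):
    """One-step-ahead conditional variances h_t for a return series."""
    h = [omega / (1.0 - alpha - beta)]   # start at the unconditional variance
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

h = garch_variances([0.0, 2.0, 0.0, 0.0])
# h[0] = 1.0 (unconditional level); the 2.0 shock lifts h[2] above 1.0,
# after which the variance decays geometrically back toward 1.0
```

Embedding this recursion in a state-space form is what lets the factors, their loadings and their variance dynamics be estimated jointly, as described above.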

Relevance: 30.00%

Publisher:

Abstract:

In the current uncertain context affecting both the world economy and the energy sector, with rapidly increasing oil and gas prices and a very unstable political situation in some of the largest raw-material-producing countries, there is a need for efficient and powerful quantitative tools to model and forecast fossil fuel prices, CO2 emission allowance prices and electricity prices. This will improve decision making for all the agents involved in energy issues. Although there are papers focused on modelling fossil fuel prices, CO2 prices and electricity prices, the literature is scarce on attempts to consider all of them together. This paper focuses on building a multivariate model for the aforementioned prices and comparing its results with those of univariate models in terms of prediction accuracy (univariate and multivariate models are compared over a large span of days, covering the first four months of 2011), as well as on extracting common features in the volatilities of these prices. The common features in volatility are extracted by means of a conditionally heteroskedastic dynamic factor model, which avoids the curse-of-dimensionality problem that commonly arises when estimating multivariate GARCH models. Additionally, the common volatility factors obtained are useful for improving the forecasting intervals and have an appealing economic interpretation. The results obtained and the methodology proposed can also serve as a starting point for risk management or portfolio optimization under uncertainty in the current context of energy markets.

Relevance: 30.00%

Publisher:

Abstract:

It is well known that one of the obstacles to effective forecasting of exchange rates is heteroscedasticity (non-stationary conditional variance). The autoregressive conditional heteroscedastic (ARCH) model and its variants have been used to estimate a time dependent variance for many financial time series. However, such models are essentially linear in form and we can ask whether a non-linear model for variance can improve results just as non-linear models (such as neural networks) for the mean have done. In this paper we consider two neural network models for variance estimation. Mixture Density Networks (Bishop 1994, Nix and Weigend 1994) combine a Multi-Layer Perceptron (MLP) and a mixture model to estimate the conditional data density. They are trained using a maximum likelihood approach. However, it is known that maximum likelihood estimates are biased and lead to a systematic under-estimate of variance. More recently, a Bayesian approach to parameter estimation has been developed (Bishop and Qazaz 1996) that shows promise in removing the maximum likelihood bias. However, up to now, this model has not been used for time series prediction. Here we compare these algorithms with two other models to provide benchmark results: a linear model (from the ARIMA family), and a conventional neural network trained with a sum-of-squares error function (which estimates the conditional mean of the time series with a constant variance noise model). This comparison is carried out on daily exchange rate data for five currencies.
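The downward bias of maximum-likelihood variance estimation mentioned above is easy to verify: the MLE divides by n rather than n - 1, so its expectation is (n - 1)/n times the true variance. A quick Monte Carlo check:

```python
import random

# The maximum-likelihood variance estimator divides by n rather than
# n - 1, so it systematically under-estimates the variance. With
# n = 5, its expectation is (5 - 1)/5 = 0.8 of the true value.

rng = random.Random(1)
n, trials = 5, 20000
mle_avg = 0.0
for _ in range(trials):
    sample = [rng.gauss(0.0, 1.0) for _ in range(n)]   # true variance 1.0
    m = sum(sample) / n
    mle_avg += sum((x - m) ** 2 for x in sample) / n   # divide by n (MLE)
mle_avg /= trials
# mle_avg lands near 0.8, well below the true variance of 1.0
```

The Bayesian treatment cited above (Bishop and Qazaz 1996) aims to remove exactly this kind of bias in the network's conditional variance estimates.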

Relevance: 30.00%

Publisher:

Abstract:

The paper develops a novel realized matrix-exponential stochastic volatility model of multivariate returns and realized covariances that incorporates asymmetry and long memory (hereafter the RMESV-ALM model). The matrix exponential transformation guarantees the positive definiteness of the dynamic covariance matrix. The contribution of the paper ties in with Robert Basmann’s seminal work in terms of the estimation of highly non-linear model specifications (“Causality tests and observationally equivalent representations of econometric models”, Journal of Econometrics, 1988, 39(1-2), 69–104), especially for developing tests for leverage and spillover effects in the covariance dynamics. Efficient importance sampling is used to maximize the likelihood function of RMESV-ALM, and the finite sample properties of the quasi-maximum likelihood estimator of the parameters are analysed. Using high frequency data for three US financial assets, the new model is estimated and evaluated. The forecasting performance of the new model is compared with a novel dynamic realized matrix-exponential conditional covariance model. The volatility and co-volatility spillovers are examined via the news impact curves and the impulse response functions from returns to volatility and co-volatility.
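The positive-definiteness guarantee rests on a standard fact: exp(A) of a symmetric matrix A has eigenvalues exp(lambda_i) > 0, whatever the sign of the lambda_i. A toy check with an indefinite 2x2 matrix (values chosen for illustration, not taken from the model):

```python
# Sketch: the matrix exponential of a symmetric matrix is always
# positive definite. Computed here via a truncated Taylor series
# exp(A) = sum_k A^k / k! for a 2x2 example.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(a, terms=30):
    """exp(A) for a 2x2 matrix via its Taylor series."""
    result = [[1.0, 0.0], [0.0, 1.0]]        # running sum, starts at I
    term = [[1.0, 0.0], [0.0, 1.0]]          # current A^k / k!
    for k in range(1, terms):
        term = mat_mul(term, a)
        term = [[term[i][j] / k for j in range(2)] for i in range(2)]
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

A = [[-1.0, 2.0], [2.0, -3.0]]               # symmetric but indefinite
E = mat_exp(A)
trace = E[0][0] + E[1][1]
det = E[0][0] * E[1][1] - E[0][1] * E[1][0]
# For a symmetric 2x2 matrix, trace > 0 and det > 0 imply positive
# definiteness; also det(exp(A)) = exp(trace(A)) = exp(-4) here.
```

So modelling the log of the covariance matrix leaves its dynamics unconstrained while the exponential map keeps every implied covariance valid.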

Relevance: 30.00%

Publisher:

Abstract:

We estimate the monthly volatility of the US economy from 1968 to 2006 by extending the coincident index model of Stock and Watson (1991). Our volatility index, which we call VOLINX, has four applications. First, it sheds light on the Great Moderation. VOLINX captures the decrease in volatility in the mid-80s as well as the different episodes of stress over the sample period. In the 70s and early 80s the stagflation and the two oil crises marked the pace of the volatility, whereas 9/11 is the most relevant shock after the moderation. Second, it helps to understand the economic indicators that cause volatility. While the main determinant of the coincident index is industrial production, VOLINX is mainly affected by employment and income. Third, it adapts the confidence bands of the forecasts. In- and out-of-sample evaluations show that the confidence bands may differ up to 50% with respect to a model with constant variance. Last, the methodology we use permits us to estimate monthly GDP, which has conditional volatility that is partly explained by VOLINX. These applications can be used by policy makers for monitoring and surveillance of the stress of the economy.

Relevance: 30.00%

Publisher:

Abstract:

Ph.D. in the Faculty of Business Administration

Relevance: 20.00%

Publisher:

Abstract:

The Queensland Department of Public Works (DPW) holds a significant interest in the Brisbane Central Business District (CBD) in controlling approximately 20 percent of the office space within its confines. This comprises a total of 333,903 square metres of space, of which 170,111 square metres is owned and 163,792 square metres is leased from the private sector. The department’s nominal ownership extends to several enduring, landmark buildings as well as several modern office towers. The portfolio includes the oldest building in the CBD, being the former Commissariat Stores building and one of the newest, a 15,000 square metre office tower under construction at 33 Charlotte Street.

Relevance: 20.00%

Publisher:

Abstract:

Particle emissions, volatility, and the concentration of reactive oxygen species (ROS) were investigated for a pre-Euro I compression ignition engine to study the potential health impacts of employing ethanol fumigation technology. Engine testing was performed in two separate experimental campaigns with most testing performed at intermediate speed with four different load settings and various ethanol substitutions. A scanning mobility particle sizer (SMPS) was used to determine particle size distributions, a volatilization tandem differential mobility analyzer (V-TDMA) was used to explore particle volatility, and a new profluorescent nitroxide probe, BPEAnit, was used to investigate the potential toxicity of particles. The greatest particulate mass reduction was achieved with ethanol fumigation at full load, which contributed to the formation of a nucleation mode. Ethanol fumigation increased the volatility of particles by coating the particles with organic material or by making extra organic material available as an external mixture. In addition, the particle-related ROS concentrations increased with ethanol fumigation and were associated with the formation of a nucleation mode. The smaller particles, the increased volatility, and the increase in potential particle toxicity with ethanol fumigation may provide a substantial barrier for the uptake of fumigation technology using ethanol as a supplementary fuel.