874 results for Forecasting Volatility
Abstract:
We uncover high persistence in credit spread series that can obscure the relationship between the theoretical determinants of credit risk and observed credit spreads. We use a Markov-switching model, which also captures the stability (low-frequency changes) of credit ratings, to show why credit spreads may continue to respond to past levels of credit risk even though the state of the economy has changed. A bivariate model of credit spreads and either macroeconomic activity or equity market volatility detects large and significant correlations that are consistent with theory but have not been observed in previous studies. © 2010 Nova Science Publishers, Inc. All rights reserved.
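As a rough illustration of the modelling idea named in the abstract above (not the authors' code or data), the sketch below fits a two-state Markov-switching regression of a simulated credit-spread series on a simulated equity-volatility proxy with statsmodels; the variable names and the data-generating process are assumptions made purely for the example.

```python
# Illustrative only: two-state Markov-switching regression of "credit spreads"
# on an "equity-volatility" proxy, with regime-dependent intercept, slope and variance.
import numpy as np
from statsmodels.tsa.regime_switching.markov_regression import MarkovRegression

rng = np.random.default_rng(0)
n = 400
regime = np.repeat([0, 1, 0, 1], n // 4)                      # persistent fake regimes
vol = 0.15 + 0.05 * rng.standard_normal(n)                    # equity-volatility proxy
spread = np.where(regime == 1, 2.0 + 8.0 * vol,               # spreads react strongly in regime 1
                  0.5 + 2.0 * vol) + 0.1 * rng.standard_normal(n)

# Two regimes with switching intercept, slope and variance
mod = MarkovRegression(spread, k_regimes=2, exog=vol.reshape(-1, 1),
                       switching_variance=True)
res = mod.fit()
print(res.summary())
print(res.smoothed_marginal_probabilities[:5])                # regime probabilities, first obs.
```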
Abstract:
This paper compares the experience of forecasting the UK government bond yield curve before and after the dramatic lowering of short-term interest rates from October 2008. Out-of-sample forecasts for 1, 6 and 12 months are generated from each of a dynamic Nelson-Siegel model, autoregressive models for both yields and the principal components extracted from those yields, a slope regression and a random walk model. At short forecasting horizons, there is little difference in the performance of the models both prior to and after 2008. However, for medium- to longer-term horizons, the slope regression provided the best forecasts prior to 2008, while the recent experience of near-zero short interest rates coincides with a period of forecasting superiority for the autoregressive and dynamic Nelson-Siegel models. © 2014 John Wiley & Sons, Ltd.
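A minimal sketch of the dynamic Nelson-Siegel-plus-autoregression forecasting step mentioned above, assuming a simulated yield panel, an arbitrary decay parameter and OLS factor extraction (none of these choices are taken from the paper):

```python
# Illustrative sketch: extract Nelson-Siegel level/slope/curvature factors by
# cross-sectional OLS each period, forecast them with AR(1), rebuild the curve.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def ns_loadings(maturities, lam=0.0609):
    """Nelson-Siegel factor loadings for level, slope and curvature."""
    m = np.asarray(maturities, dtype=float)
    slope = (1 - np.exp(-lam * m)) / (lam * m)
    curv = slope - np.exp(-lam * m)
    return np.column_stack([np.ones_like(m), slope, curv])

maturities = np.array([3, 6, 12, 24, 36, 60, 84, 120])        # months (assumed grid)
X = ns_loadings(maturities)

# Fake yield panel: T dates x maturities, for illustration only
rng = np.random.default_rng(1)
T = 200
factors_true = np.cumsum(0.05 * rng.standard_normal((T, 3)), axis=0) + [4.0, -1.0, 0.5]
yields = factors_true @ X.T + 0.02 * rng.standard_normal((T, len(maturities)))

# Step 1: cross-sectional OLS each period to recover the three factors
betas = np.linalg.lstsq(X, yields.T, rcond=None)[0].T         # shape (T, 3)

# Step 2: AR(1) forecast of each factor h steps ahead, then rebuild the curve
h = 12
fc = [AutoReg(betas[:, j], lags=1).fit().forecast(h)[-1] for j in range(3)]
curve_forecast = ns_loadings(maturities) @ np.array(fc)
print(np.round(curve_forecast, 3))
```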
Abstract:
This empirical study examines the Pricing-To-Market (PTM) behaviour of 20 UK export sectors. Using both Exponential General Autoregressive Conditional Heteroscedasticity (EGARCH) and Threshold GARCH (TGARCH) estimation methods, we find evidence of PTM that is accompanied by strong conditional volatility and weak asymmetry effects. The PTM estimates suggest that when the exporters' currency appreciates in the current period, exporters pass on between 31% and 94% of the Foreign Exchange (FX) rate increase to importers. However, both export price changes and producers' prices are sluggish, perhaps driven by coordination failure and menu-driven costs, among other factors. Furthermore, export prices contain strong time-varying effects which affect PTM strategy. Exporters do not typically appear to put much more weight on negative news of (say) an FX rate appreciation than on positive news of an FX rate depreciation. Much depends on the export sector. © 2010 Taylor & Francis.
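The two volatility specifications named above are standard and available in the Python `arch` package; the sketch below fits an EGARCH(1,1) and a GJR/threshold GARCH(1,1) to a simulated return series purely for illustration (the data and model orders are assumptions, not the study's).

```python
# Rough sketch of the EGARCH and threshold-GARCH specifications on fake data.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(2)
returns = 0.1 * rng.standard_normal(1000)   # stand-in for export price changes (%)

# EGARCH(1,1) with an asymmetry (leverage) term
egarch = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="normal").fit(disp="off")

# GJR / threshold GARCH(1,1): the 'o' term lets negative shocks raise volatility more
tgarch = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="normal").fit(disp="off")

print(egarch.params)
print(tgarch.params)
```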
Abstract:
This study examines the information content of alternative implied volatility measures for the 30 components of the Dow Jones Industrial Average Index from 1996 until 2007. Along with the popular Black-Scholes and "model-free" implied volatility expectations, the recently proposed corridor implied volatility (CIV) measures are explored. For all pairwise comparisons, it is found that a CIV measure that is closely related to the model-free implied volatility nearly always delivers the most accurate forecasts for the majority of the firms. This finding remains consistent for different forecast horizons, volatility definitions, loss functions and forecast evaluation settings.
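For readers unfamiliar with the measures being compared, the sketch below illustrates the underlying idea on synthetic Black-Scholes option prices: the model-free implied variance integrates out-of-the-money option prices over all available strikes, while a corridor measure truncates the strike range. Discounting and the small forward-adjustment term are omitted, and the strike grid and corridor bounds are arbitrary choices for the example, not the study's.

```python
# Toy model-free vs. corridor implied variance from synthetic option prices.
import numpy as np
from scipy.stats import norm

def bs_price(F, K, T, sigma, call=True):
    """Black (r = 0) price of a European option on forward F."""
    d1 = (np.log(F / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if call:
        return F * norm.cdf(d1) - K * norm.cdf(d2)
    return K * norm.cdf(-d2) - F * norm.cdf(-d1)

def implied_variance(strikes, prices_otm, T, k_lo=None, k_hi=None):
    """Discrete approximation of (2/T) * integral Q(K)/K^2 dK,
    optionally truncated to the corridor [k_lo, k_hi]."""
    K = np.asarray(strikes, float)
    Q = np.asarray(prices_otm, float)
    mask = np.ones_like(K, bool)
    if k_lo is not None:
        mask &= K >= k_lo
    if k_hi is not None:
        mask &= K <= k_hi
    K, Q = K[mask], Q[mask]
    dK = np.gradient(K)
    return (2.0 / T) * np.sum(dK * Q / K**2)

F, T, sigma = 100.0, 30 / 365, 0.25
strikes = np.arange(50.0, 151.0, 1.0)
otm = np.where(strikes < F,
               bs_price(F, strikes, T, sigma, call=False),   # OTM puts below the forward
               bs_price(F, strikes, T, sigma, call=True))    # OTM calls above

mf_var = implied_variance(strikes, otm, T)                      # "model-free"
civ_var = implied_variance(strikes, otm, T, k_lo=80, k_hi=120)  # corridor
print(np.sqrt(mf_var), np.sqrt(civ_var))
```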
Abstract:
We model the effects of quantitative easing (QE) on the volatility of returns to individual gilts, examining both the effects of QE overall and of the specific days of asset purchases. QE successfully neutralized the sixfold increase in volatility that gilts had experienced since the start of the financial crisis. The volatility of longer-term bonds declined more quickly than the volatility of short- to medium-term bonds. The reversion of shorter-term bond volatility to pre-crisis levels was found to be more sensitive to the specific operational actions of QE, particularly where those bonds experienced relatively greater purchase activity.
Abstract:
The article deals with problems of forecasting economic macroparameters on the basis of the principle of "subjective multideterminism", i.e. expert accounting for the maximal number of interrelated "objective" and "subjective" causes. A decision-support system for forecasting the level of inflation and gross domestic product, based on the decision-tree method, is described.
Abstract:
This article is dedicated to the important problem of creating GIS systems for monitoring, forecasting and managing natural and technogenic catastrophes. Reducing risks, protecting economic assets and preventing human casualties caused by the dynamics of avalanche source areas depends on the effectiveness of the avalanche-danger forecasting procedures used. The article develops the structure of the information input for the forecasting subsystem and proposes a technology for the integrated forecasting of avalanche-prone situations.
Abstract:
This article considers problems of forecasting economic macroparameters, first of all the inflation index. A concept is proposed for developing synthetic forecasting methods that combine directly specified expert information with calculation results from objective economic-mathematical models used to forecast individual "slowly changing parameters". The article also discusses problems of working with macroparameters on the basis of an analysis of the forecast values obtained.
Abstract:
2002 Mathematics Subject Classification: 62M20, 62-07, 62J05, 62P20.
Abstract:
2000 Mathematics Subject Classification: 62M20, 62M10, 62-07.
Abstract:
Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian Processes (GPs) applied to the outputs of a Numerical Weather Prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, a Censored GP was used to model the relationship between the corrected wind speed and power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and Artificial Neural Networks (ANNs); the proposed model achieves about an 11% improvement in forecasting accuracy (Mean Absolute Error) compared to the ANN model on one dataset, and nearly 5% improvement on the other.
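A simplified sketch of this two-stage idea is given below using scikit-learn. A full Censored GP is not available there, so the cap at rated power is approximated by clipping the predictions; all data, the toy power curve and the rated-power value are assumptions made for illustration, not the paper's models.

```python
# Stage 1: GP correction of NWP wind speed. Stage 2: GP from corrected speed to
# power, with the censoring at rated power approximated by clipping.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
n = 300
nwp_speed = rng.uniform(0, 20, n)                            # NWP-forecast wind speed (m/s)
true_speed = 0.9 * nwp_speed + 1.0 + rng.normal(0, 0.8, n)   # "observed" wind speed

# Stage 1: GP correction of the NWP wind speed
gp_speed = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp_speed.fit(nwp_speed.reshape(-1, 1), true_speed)
corrected = gp_speed.predict(nwp_speed.reshape(-1, 1))

# Stage 2: map corrected speed to power; output is capped at rated power
rated = 1.0
power = np.clip((true_speed / 12.0) ** 3, 0, rated)          # toy power curve, censored at rated
gp_power = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp_power.fit(corrected.reshape(-1, 1), power)
forecast_power = np.clip(gp_power.predict(corrected.reshape(-1, 1)), 0, rated)

mae = np.mean(np.abs(forecast_power - power))
print(f"in-sample MAE: {mae:.4f}")
```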
Abstract:
2000 Mathematics Subject Classification: 65M06, 65M12.
Abstract:
Technology changes rapidly over the years, continuously providing more options for computer alternatives and making economic and other transactions easier. However, the introduction of new technology "pushes" old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use and is a bivariate function of the quantities sold and the probability that a specific quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust results for the forecasted quantities, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, was applied, selecting for each country the model that provides the best results in terms of minimum error indices (Mean Absolute Error and Mean Squared Error) for the in-sample estimation. As new technology does not diffuse in all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
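As a hedged illustration of the model-selection step described above (simulated data, arbitrary starting values, and only two of the seven candidate models), one could fit logistic and Gompertz diffusion curves with SciPy and compare their in-sample MAE and MSE:

```python
# Fit logistic and Gompertz curves to a simulated cumulative-sales series and
# pick the model with the lower in-sample error, mirroring the selection idea.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, M, k, t0):
    return M / (1.0 + np.exp(-k * (t - t0)))

def gompertz(t, M, b, c):
    return M * np.exp(-b * np.exp(-c * t))

t = np.arange(0, 25, dtype=float)                      # years since launch (assumed)
rng = np.random.default_rng(4)
sales = logistic(t, 100.0, 0.5, 10.0) + rng.normal(0, 2.0, t.size)  # cumulative, simulated

fits = {
    "logistic": curve_fit(logistic, t, sales, p0=[sales.max(), 0.3, t.mean()])[0],
    "gompertz": curve_fit(gompertz, t, sales, p0=[sales.max(), 5.0, 0.3])[0],
}

for name, params in fits.items():
    fitted = (logistic if name == "logistic" else gompertz)(t, *params)
    mae = np.mean(np.abs(sales - fitted))
    mse = np.mean((sales - fitted) ** 2)
    print(f"{name}: MAE={mae:.2f}, MSE={mse:.2f}")
```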