135 results for Forecast accuracy
in CentAUR: Central Archive University of Reading - UK
Abstract:
We compare and contrast the accuracy and uncertainty in forecasts of rents with those for a variety of macroeconomic series. The results show that forecasters in general tend to be marginally more accurate for macroeconomic series than for rents. In common across all of the series, forecasts tend to be smoothed, with forecasters underestimating performance during economic booms and overestimating it in recessions. We find that property forecasts are affected by economic uncertainty, as measured by disagreement across the macro forecasters. Increased uncertainty leads to increased dispersion in the rental forecasts and a reduction in forecast accuracy.
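As a concrete illustration of the two quantities this abstract relates, the sketch below computes forecast accuracy (mean absolute error of the consensus) and disagreement (cross-sectional dispersion across forecasters) for a hypothetical panel of rental growth forecasts; all numbers and array names are invented.

```python
import numpy as np

# Hypothetical panel: rows = forecasters, columns = years (invented numbers).
forecasts = np.array([
    [2.1, 1.8, -0.5, 3.0],
    [2.4, 1.5, -1.2, 2.6],
    [1.9, 2.0, -0.8, 3.4],
])
outturns = np.array([2.8, 1.2, -2.0, 4.1])  # realised rental growth

# Accuracy: mean absolute error of the consensus (mean) forecast.
consensus = forecasts.mean(axis=0)
mae = np.mean(np.abs(consensus - outturns))

# Disagreement: cross-sectional dispersion of forecasts in each period,
# used in the paper's spirit as a proxy for economic uncertainty.
disagreement = forecasts.std(axis=0, ddof=1)

print(f"consensus MAE: {mae:.2f}")
print("per-period disagreement:", np.round(disagreement, 2))
```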
Abstract:
Survey respondents who make point predictions and histogram forecasts of macro-variables reveal both how uncertain they believe the future to be, ex ante, and their ex post performance. Macroeconomic forecasters tend to be overconfident at horizons of a year or more, but overestimate (i.e., are underconfident regarding) the uncertainty surrounding their predictions at short horizons. Ex ante uncertainty remains at a high level compared to the ex post measure as the forecast horizon shortens. There is little evidence of a link between individuals’ ex post forecast accuracy and their ex ante subjective assessments.
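A minimal sketch of the ex ante versus ex post comparison, assuming a hypothetical histogram forecast and an invented track record of point predictions:

```python
import numpy as np

# Hypothetical histogram forecast: probabilities over inflation bins (invented).
bin_midpoints = np.array([0.5, 1.5, 2.5, 3.5])
probs = np.array([0.1, 0.4, 0.4, 0.1])

# Ex ante uncertainty: standard deviation implied by the histogram.
mean = probs @ bin_midpoints
ex_ante_sd = np.sqrt(probs @ (bin_midpoints - mean) ** 2)

# Ex post uncertainty: RMSE of past point predictions (invented history).
point_forecasts = np.array([2.0, 1.8, 2.2, 2.5])
outturns = np.array([2.3, 1.5, 2.1, 2.9])
ex_post_rmse = np.sqrt(np.mean((point_forecasts - outturns) ** 2))

# Overconfidence in the paper's sense: ex ante sd below ex post RMSE.
print(f"ex ante sd {ex_ante_sd:.2f} vs ex post RMSE {ex_post_rmse:.2f}")
```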
Abstract:
We consider the forecasting performance of two SETAR exchange rate models proposed by Kräger and Kugler [J. Int. Money Fin. 12 (1993) 195]. Assuming that the models are good approximations to the data-generating process, we show that whether the non-linearities inherent in the data can be exploited to forecast better than a random walk depends both on how forecast accuracy is assessed and on the ‘state of nature’. Evaluation based on traditional measures, such as (root) mean squared forecast errors, may mask the superiority of the non-linear models. Generalized impulse response functions are also calculated as a means of portraying the asymmetric response to shocks implied by such models.
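A one-step SETAR forecast can be sketched as follows; the threshold, delay and regime coefficients are illustrative assumptions, not the estimates from Kräger and Kugler:

```python
import numpy as np

def setar_forecast(history, threshold=0.0, delay=1,
                   phi_low=0.8, phi_high=-0.3):
    """One-step forecast from a two-regime SETAR model: the AR(1)
    coefficient depends on which side of the threshold the delayed
    observation lies (all parameters here are illustrative)."""
    phi = phi_low if history[-delay] <= threshold else phi_high
    return phi * history[-1]

rng = np.random.default_rng(0)
returns = rng.standard_normal(100) * 0.01    # stand-in exchange rate changes

setar_fc = setar_forecast(returns)
random_walk_fc = 0.0                         # random walk: no expected change
print(f"SETAR forecast {setar_fc:.4f} vs random walk {random_walk_fc:.4f}")
```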
Abstract:
This paper uses appropriately modified information criteria to select models from the GARCH family, which are subsequently used for predicting US dollar exchange rate return volatility. The out-of-sample forecast accuracy of models chosen in this manner compares favourably on mean absolute error grounds, although less favourably on mean squared error grounds, with those generated by the commonly used GARCH(1, 1) model. An examination of the orders of models selected by the criteria reveals that (1, 1) models are typically selected less than 20% of the time.
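The selection procedure can be sketched with standard information criteria standing in for the paper's modified ones; this assumes the Python arch package and simulated stand-in returns:

```python
import numpy as np
from arch import arch_model  # assumes the `arch` package is installed

rng = np.random.default_rng(1)
returns = rng.standard_normal(1000)  # stand-in for exchange rate returns

# Fit GARCH(p, q) over a small grid of orders and keep the best BIC.
best = None
for p in range(1, 4):
    for q in range(1, 4):
        res = arch_model(returns, vol="GARCH", p=p, q=q).fit(disp="off")
        if best is None or res.bic < best[0]:
            best = (res.bic, p, q)

print(f"selected GARCH({best[1]}, {best[2]}) by BIC = {best[0]:.1f}")
```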
Abstract:
Factor forecasting models are shown to deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
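The horse race described here can be sketched with principal-component factors and an AR(1) benchmark; the panel and target series below are simulated placeholders:

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 120, 40
panel = rng.standard_normal((T, N))          # stand-in predictor panel
y = rng.standard_normal(T)                   # stand-in activity variable

# Extract the first r principal-component factors from the standardised panel.
X = (panel - panel.mean(0)) / panel.std(0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
factors = U[:, :3] * S[:3]                   # r = 3 factors

# One-quarter-ahead forecasts: regress y_{t+1} on factors_t, and on y_t (AR).
A = np.column_stack([np.ones(T - 1), factors[:-1]])
beta = np.linalg.lstsq(A, y[1:], rcond=None)[0]
factor_fc = np.array([1.0, *factors[-1]]) @ beta

a = np.column_stack([np.ones(T - 1), y[:-1]])
rho = np.linalg.lstsq(a, y[1:], rcond=None)[0]
ar_fc = np.array([1.0, y[-1]]) @ rho

print(f"factor forecast {factor_fc:.3f} vs AR(1) forecast {ar_fc:.3f}")
```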
Abstract:
This paper characterizes the dynamics of jumps and analyzes their importance for volatility forecasting. Using high-frequency data on four prominent energy markets, we perform a model-free decomposition of realized variance into its continuous and discontinuous components. We find strong evidence of jumps in energy markets between 2007 and 2012. We then investigate the importance of jumps for volatility forecasting. To this end, we estimate and analyze the predictive ability of several Heterogeneous Autoregressive (HAR) models that explicitly capture the dynamics of jumps. Conducting extensive in-sample and out-of-sample analyses, we establish that explicitly modeling jumps does not significantly improve forecast accuracy. Our results are broadly consistent across our four energy markets, forecasting horizons, and loss functions.
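A standard way to carry out the model-free decomposition is to compare realized variance with bipower variation; the sketch below uses these textbook estimators on invented intraday returns and is not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(3)
r = rng.standard_normal(78) * 0.001          # hypothetical 5-min returns, one day
r[40] += 0.01                                # inject a "jump"

# Realized variance and bipower variation (Barndorff-Nielsen & Shephard).
rv = np.sum(r ** 2)
bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))

# Jump component: the non-negative difference between RV and BV.
jump = max(rv - bv, 0.0)
continuous = rv - jump

print(f"RV={rv:.6f}  BV={bv:.6f}  jump={jump:.6f}  continuous={continuous:.6f}")
```

A HAR-type forecasting regression would then project future realized variance on daily, weekly and monthly averages of the continuous component, with the jump series added as an extra regressor.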
Abstract:
Ocean prediction systems are now able to analyse and predict temperature, salinity and velocity structures within the ocean by assimilating measurements of the ocean’s temperature and salinity into physically based ocean models. Data assimilation combines current estimates of state variables, such as temperature and salinity, from a computational model with measurements of the ocean and atmosphere in order to improve forecasts and reduce forecast uncertainty. Data assimilation generally works well with ocean models away from the equator but has been found to induce vigorous and unrealistic overturning circulations near the equator. A pressure correction method was developed at the University of Reading and the Met Office to control these circulations using ideas from control theory and an understanding of equatorial dynamics. The method has been used for the last 10 years in seasonal forecasting and ocean prediction systems at the Met Office and the European Centre for Medium-Range Weather Forecasts (ECMWF). It has been an important element in recent re-analyses of the ocean heat uptake that mitigates climate change.
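The blending step at the heart of data assimilation can be sketched for a single state variable as a variance-weighted update (a generic textbook form, not the Met Office scheme itself):

```python
def assimilate(background, obs, var_b, var_o):
    """Optimal-interpolation update for one scalar state variable.
    The analysis weights background and observation by the inverse
    of their error variances; the analysis variance shrinks."""
    gain = var_b / (var_b + var_o)           # Kalman gain
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b             # reduced analysis uncertainty
    return analysis, var_a

# Example: model says 18.0 C, a float measures 18.6 C (invented values).
analysis, var_a = assimilate(background=18.0, obs=18.6, var_b=0.5, var_o=0.25)
print(f"analysed temperature {analysis:.2f} C, variance {var_a:.3f}")
```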
Abstract:
In this paper the meteorological processes responsible for transporting tracer during the second ETEX (European Tracer EXperiment) release are determined using the UK Met Office Unified Model (UM). The UM-predicted distribution of tracer is also compared with observations from the ETEX campaign. The dominant meteorological process is a warm conveyor belt which transports large amounts of tracer away from the surface up to a height of 4 km over a 36 h period. Convection is also an important process, transporting tracer to heights of up to 8 km. Potential sources of error when using an operational numerical weather prediction model to forecast air quality are also investigated. These potential sources of error include model dynamics, model resolution and model physics. In the UM a semi-Lagrangian monotonic advection scheme is used with cubic polynomial interpolation. This can predict unrealistic negative values of tracer which are subsequently set to zero, and hence results in an overprediction of tracer concentrations. In order to conserve mass in the UM tracer simulations it was necessary to include a flux-corrected transport method. Model resolution can also affect the accuracy of predicted tracer distributions. Low-resolution simulations (50 km grid length) were unable to resolve a change in wind direction observed during ETEX 2, which led to an error in the transport direction and hence an error in tracer distribution. High-resolution simulations (12 km grid length) captured the change in wind direction and hence produced a tracer distribution that compared better with the observations. The representation of convective mixing was found to have a large effect on the vertical transport of tracer. Turning off the convective mixing parameterisation in the UM significantly reduced the vertical transport of tracer. Finally, air quality forecasts were found to be sensitive to the timing of synoptic-scale features. Errors in the position of the cold front relative to the tracer release location of only 1 h resulted in changes in the predicted tracer concentrations that were of the same order of magnitude as the absolute tracer concentrations.
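The negative-tracer problem described above is easy to reproduce in one dimension; in this toy sketch (not the UM scheme) a cubic spline stands in for the cubic polynomial interpolation, and clipping the spurious negatives to zero visibly inflates the total tracer mass:

```python
import numpy as np
from scipy.interpolate import CubicSpline

nx, u, dt, dx = 100, 1.0, 0.4, 1.0
x = np.arange(nx) * dx
tracer = np.where(np.abs(x - 50.0) < 5.0, 1.0, 0.0)   # top-hat tracer puff

# One semi-Lagrangian step: evaluate the field at the departure points
# x - u*dt, using a periodic cubic interpolant.
spline = CubicSpline(x, tracer, bc_type="periodic", extrapolate="periodic")
advected = spline(x - u * dt)

# Cubic interpolation overshoots at the sharp edges, producing small
# negative values; setting them to zero adds spurious tracer mass.
clipped = np.clip(advected, 0.0, None)
print(f"min before clipping: {advected.min():.4f}")
print(f"mass before: {tracer.sum():.4f}  after clipping: {clipped.sum():.4f}")
```

A flux-corrected transport method avoids this by limiting the interpolated values (or fluxes) so that the update stays non-negative while conserving mass.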
Abstract:
In a recent study, Williams introduced a simple modification to the widely used Robert–Asselin (RA) filter for numerical integration. The main purpose of the Robert–Asselin–Williams (RAW) filter is to avoid the undesired numerical damping of the RA filter and to increase the accuracy. In the present paper, the effects of the modification are comprehensively evaluated in the Simplified Parameterizations, Primitive Equation Dynamics (SPEEDY) atmospheric general circulation model. First, the authors search for significant changes in the monthly climatology due to the introduction of the new filter. After testing both at the local level and at the field level, no significant changes are found, which is advantageous in the sense that the new scheme does not require a retuning of the parameterized model physics. Second, the authors examine whether the new filter improves the skill of short- and medium-term forecasts. January 1982 data from the NCEP–NCAR reanalysis are used to evaluate the forecast skill. Improvements are found in all the model variables (except the relative humidity, which is hardly changed). The improvements increase with lead time and are especially evident in medium-range forecasts (96–144 h). For example, in tropical surface pressure predictions, 5-day forecasts made using the RAW filter have approximately the same skill as 4-day forecasts made using the RA filter. The results of this work are encouraging for the implementation of the RAW filter in other models currently using the RA filter.
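The two filters differ only in how a single displacement term is shared between time levels; the sketch below integrates a linear oscillator with a leapfrog scheme, following the published filter definitions, with alpha = 0.53 as a typical RAW value and the remaining settings invented for illustration:

```python
import numpy as np

def leapfrog_raw(nsteps, dt=0.1, nu=0.2, alpha=0.53, omega=1.0):
    """Leapfrog integration of dx/dt = i*omega*x with the RAW filter.
    Setting alpha = 1 recovers the classical Robert-Asselin filter."""
    x_old = 1.0 + 0.0j
    x_now = np.exp(1j * omega * dt)          # exact value at the second level
    for _ in range(nsteps):
        x_new = x_old + 2.0 * dt * (1j * omega * x_now)   # leapfrog step
        d = 0.5 * nu * (x_old - 2.0 * x_now + x_new)      # filter displacement
        x_now += alpha * d              # RA applies all of d here (alpha = 1);
        x_new += (alpha - 1.0) * d      # RAW moves part of it to the new level
        x_old, x_now = x_now, x_new
    return x_now

# The RA filter visibly damps the physical mode; RAW largely preserves it.
ra = abs(leapfrog_raw(2000, alpha=1.0))
raw = abs(leapfrog_raw(2000))
print(f"amplitude after 2000 steps: RA {ra:.3f}, RAW {raw:.3f} (exact 1.000)")
```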
Abstract:
This paper proposes and tests a new framework for weighting recursive out-of-sample prediction errors according to their corresponding levels of in-sample estimation uncertainty. In essence, we show how to use the maximum possible amount of information from the sample in the evaluation of the prediction accuracy, by commencing the forecasts at the earliest opportunity and weighting the prediction errors. Via a Monte Carlo study, we demonstrate that the proposed framework selects the correct model from a set of candidate models considerably more often than the existing standard approach when only a small sample is available. We also show that the proposed weighting approaches result in tests of equal predictive accuracy that have much better sizes than the standard approach. An application to an exchange rate dataset highlights relevant differences in the results of tests of predictive accuracy based on the standard approach versus the framework proposed in this paper.
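A minimal sketch of the idea, using a recursive AR(1) exercise in which forecasts commence at the earliest opportunity and early errors are down-weighted; the weighting rule here is a crude placeholder for the paper's in-sample-uncertainty weights:

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.standard_normal(60)                     # stand-in exchange rate series

errors, weights = [], []
for t in range(10, len(y)):                     # commence forecasts very early
    hist = y[:t]
    a = np.column_stack([np.ones(t - 1), hist[:-1]])
    coef = np.linalg.lstsq(a, hist[1:], rcond=None)[0]   # recursive AR(1) fit
    errors.append(y[t] - (coef[0] + coef[1] * hist[-1]))
    weights.append(t)      # placeholder: weight by estimation sample size

errors, weights = np.array(errors), np.array(weights, dtype=float)
weighted_mse = np.sum(weights * errors**2) / np.sum(weights)
standard_mse = np.mean(errors[-20:] ** 2)   # standard approach: late start only
print(f"weighted MSE {weighted_mse:.3f} vs standard MSE {standard_mse:.3f}")
```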
Abstract:
Existing empirical evidence has frequently observed that professional forecasters are conservative and display herding behaviour. Whilst a large number of papers have considered equities as well as macroeconomic series, few have considered the accuracy of forecasts in alternative asset classes such as real estate. We consider the accuracy of forecasts for the UK commercial real estate market over the period 1999-2011. The results illustrate that forecasters display a tendency to underestimate growth rates during strong market conditions and to overestimate them when the market is performing poorly. This conservatism not only results in smoothed estimates but also implies that forecasters display herding behaviour. There is also a marked difference in the relative accuracy of capital and total returns versus rental figures. Whilst rental growth forecasts are relatively accurate, considerable inaccuracy is observed with respect to capital value and total returns.
Abstract:
The impact of selected observing systems on forecast skill is explored using the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40) system. Analyses have been produced for a surface-based observing system typical of the period prior to 1945/1950, a terrestrial-based observing system typical of the period 1950-1979 and a satellite-based observing system consisting of surface pressure and satellite observations. Global prediction experiments have been undertaken using these analyses, which are available every 6 h, as initial states, for the boreal winters of 1990/1991 and 2000/2001 and the summer of 2000, using a more recent version of the ECMWF model. The results show that for 500-hPa geopotential height, as a representative field, the terrestrial system in the Northern Hemisphere extratropics is only slightly inferior to the control system, which makes use of all observations for the analysis, and is also more accurate than the satellite system. There are indications that the skill of the terrestrial system worsens slightly and the satellite system improves somewhat between 1990/1991 and 2000/2001. The forecast skill in the Southern Hemisphere is dominated by the satellite information and this dominance is larger in the latter period. The overall skill is only slightly worse than that of the Northern Hemisphere. In the tropics (20°S-20°N), using the wind at 850 and 250 hPa as representative fields, the information content in the terrestrial and satellite systems is almost equal and complementary. The surface-based system has very limited skill, restricted to the lower troposphere of the Northern Hemisphere. Predictability calculations show a potential for a further increase in predictive skill of 1-2 days in the extratropics of both hemispheres, but a potential for a major improvement of many days in the tropics. As well as the Eulerian perspective of predictability, the storm tracks have been calculated from all experiments and validated for the extratropics to provide a Lagrangian perspective.
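Skill for fields such as 500-hPa geopotential height is conventionally summarised by the anomaly correlation coefficient; the abstract does not state its exact metric, so the following generic sketch is an assumption:

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Anomaly correlation coefficient between a forecast field and the
    verifying analysis, with anomalies taken relative to climatology."""
    fa = forecast - climatology
    aa = analysis - climatology
    return np.sum(fa * aa) / np.sqrt(np.sum(fa ** 2) * np.sum(aa ** 2))

# Invented 500-hPa height fields on a small grid, in metres.
rng = np.random.default_rng(5)
clim = 5500.0 + rng.standard_normal((10, 20)) * 50
truth = clim + rng.standard_normal((10, 20)) * 80
fcst = truth + rng.standard_normal((10, 20)) * 40   # imperfect forecast

print(f"ACC = {anomaly_correlation(fcst, truth, clim):.3f}")
```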
Abstract:
A parametrization for ice supersaturation is introduced into the ECMWF Integrated Forecast System (IFS), compatible with the cloud scheme that allows partial cloud coverage. It is based on the simple, but often justifiable, diagnostic assumption that the ice nucleation and subsequent depositional growth time-scales are short compared to the model time step; thus supersaturation is only permitted in the clear-sky portion of the grid cell. Results are presented from model integrations using the new scheme, which is demonstrated to increase upper-tropospheric humidity and to decrease high-level cloud cover and, to a much lesser extent, cloud ice amounts, all as expected from simple arguments. Evaluation of the relative distribution of supersaturated humidity amounts shows good agreement with the observed climatology derived from in situ aircraft observations. With the new scheme, the global distribution of frequency of occurrence of supersaturated regions compares well with remotely sensed microwave limb sounder (MLS) data, with the most marked errors of underprediction occurring in regions where the model is known to underpredict deep convection. Finally, it is also demonstrated that the new scheme leads to improved predictions of permanent contrail cloud over southern England, which indirectly implies that upper-tropospheric humidity fields are better represented for this region.
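The diagnostic assumption amounts to a cap on the gridbox-mean humidity: ice saturation inside the cloudy fraction, plus supersaturation up to a critical value in the clear-sky remainder. A schematic version (the variable names and critical ratio are illustrative assumptions, not the IFS code):

```python
def max_gridbox_humidity(cloud_frac, q_sat_ice, s_crit=1.5):
    """Maximum gridbox-mean specific humidity when supersaturation is
    permitted only in the clear-sky part of the cell: the cloudy
    fraction is held at ice saturation, while the clear-sky fraction
    may reach s_crit times saturation (schematic nucleation limit)."""
    cloudy = cloud_frac * q_sat_ice
    clear = (1.0 - cloud_frac) * s_crit * q_sat_ice
    return cloudy + clear

# Example: 30% cloud cover, q_sat_ice = 0.1 g/kg (invented values).
print(f"{max_gridbox_humidity(0.3, 0.1):.3f} g/kg")
```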
Abstract:
Two wavelet-based control variable transform schemes are described and are used to model some important features of forecast error statistics for use in variational data assimilation. The first is a conventional wavelet scheme and the other is an approximation of it. Their ability to capture the position and scale-dependent aspects of covariance structures is tested in a two-dimensional latitude-height context. This is done by comparing the covariance structures implied by the wavelet schemes with those found from the explicit forecast error covariance matrix, and with a non-wavelet-based covariance scheme used currently in an operational assimilation scheme. Qualitatively, the wavelet-based schemes show potential at modeling forecast error statistics well without giving preference to either position or scale-dependent aspects. The degree of spectral representation can be controlled by changing the number of spectral bands in the schemes, and the least number of bands that achieves adequate results is found for the model domain used. Evidence is found of a trade-off between the localization of features in positional and spectral spaces when the number of bands is changed. By examining implied covariance diagnostics, the wavelet-based schemes are found, on the whole, to give results that are closer to diagnostics found from the explicit matrix than from the non-wavelet scheme. Even though the nature of the covariances has the right qualities in spectral space, variances are found to be too low at some wavenumbers and vertical correlation length scales are found to be too long at most scales. The wavelet schemes are found to be good at resolving variations in position and scale-dependent horizontal length scales, although the length scales reproduced are usually too short. The second of the wavelet-based schemes is often found to be better than the first in some important respects, but, unlike the first, it has no exact inverse transform.
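The implied-covariance diagnostic can be sketched in one dimension by building B = Wᵀ Λ W from an orthogonal wavelet transform with one variance per spectral band, then applying it to a unit vector to read off a covariance column; this assumes the PyWavelets package and an invented band-variance profile:

```python
import numpy as np
import pywt  # assumes the PyWavelets package is installed

n, wavelet, level = 64, "db2", 3
band_var = [4.0, 2.0, 1.0, 0.5]   # invented variance for each spectral band

def apply_B(v):
    """Apply B = W^T diag(band variances) W, where W is the orthogonal
    wavelet analysis transform (periodization mode keeps W orthonormal),
    so B is a valid symmetric covariance operator."""
    coeffs = pywt.wavedec(v, wavelet, level=level, mode="periodization")
    scaled = [c * s for c, s in zip(coeffs, band_var)]
    return pywt.waverec(scaled, wavelet, mode="periodization")

# A column of the implied covariance matrix: apply B to a unit vector
# and inspect the covariance of the central point with its neighbours.
e = np.zeros(n)
e[n // 2] = 1.0
column = apply_B(e)
print(np.round(column[n // 2 - 4 : n // 2 + 5], 3))
```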