889 results for Forecast error variance


Relevance: 80.00%

Abstract:

The aim of the thesis was to examine the pricing efficiency of the Finnish stock market in the case of management forecast deviations. The study period is 1 January 2000 to 31 December 2006. Of particular interest were the medium-term and long-term (10 trading days and 40 trading days) abnormal returns. The empirical part of the work was carried out using the event study methodology. In the empirical part, the abnormal returns caused by a forecast deviation were computed using ordinary least squares regression. Abnormal returns caused by positive deviations did not differ statistically from market returns over any of the examined intervals. A deviation in negative abnormal returns was observed 10 trading days after the publication of the forecast deviation. In the industry-level analysis, a paired-samples t-test showed that the abnormal returns of health care, finance and information technology differed from market returns. Firm-specific factors were examined with a simple t-test. Of these, leverage, beta, the standard deviation of daily returns, daily trading volume and the P/E ratio were found to be able to predict the frequency of possible forecast deviations. In contrast, market capitalisation and the P/B ratio showed no connection with the frequency of forecast deviations. When examining the behavioural finance assumption of increased trading volume around the time of the forecast deviation, no statistical significance was found.
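The event-study procedure described in the abstract can be sketched as follows: fit a market model by OLS over an estimation window, then measure abnormal and cumulative abnormal returns in the event window. The series and window lengths below are synthetic illustrations, not the thesis's sample.

```python
import numpy as np

def abnormal_returns(stock, market, est_end, ev_start, ev_end):
    """Market-model event study: estimate alpha and beta by OLS on the
    estimation window, then abnormal return = actual - (alpha + beta*market)."""
    X = np.column_stack([np.ones(est_end), market[:est_end]])
    alpha, beta = np.linalg.lstsq(X, stock[:est_end], rcond=None)[0]
    ar = stock[ev_start:ev_end] - (alpha + beta * market[ev_start:ev_end])
    return ar, np.cumsum(ar)  # abnormal and cumulative abnormal returns

# synthetic daily returns: a market index and a stock with beta = 1.2
rng = np.random.default_rng(0)
market = rng.normal(0.0, 0.01, 300)
stock = 0.0002 + 1.2 * market + rng.normal(0.0, 0.005, 300)
ar, car = abnormal_returns(stock, market, est_end=250, ev_start=250, ev_end=290)
```

Significance of the abnormal returns would then be assessed with t-tests over the event window, as in the thesis.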

Relevance: 80.00%

Abstract:

Electricity price forecasting has become an important area of research in the aftermath of the worldwide deregulation of the power industry, which launched competitive electricity markets now embracing all market participants, including generation and retail companies, transmission network providers, and market managers. Based on the needs of the market, a variety of approaches to forecasting day-ahead electricity prices have been proposed over the last decades. However, most of the existing approaches are reasonably effective for prices in the normal range but disregard price spike events, which are caused by a number of complex factors and occur during periods of market stress. In early research, price spikes were truncated before the forecasting model was applied, to reduce the influence of such observations on the estimation of the model parameters; otherwise, a very large forecast error would be generated whenever a price spike occurred. Electricity price spikes, however, are significant for energy market participants seeking to stay competitive. Accurate price spike forecasting is important for generation companies, which need to bid strategically into the market and optimally manage their assets; for retail companies, which cannot pass the spikes on to final customers; and for market managers, who must provide better management and planning for the energy market. This doctoral thesis aims at deriving a methodology able to accurately predict not only day-ahead electricity prices within the normal range but also price spikes. The Finnish day-ahead energy market of Nord Pool Spot is selected as the case market, and its structure is studied in detail. It is almost universally agreed in the forecasting literature that no single method is best in every situation. Since real-world problems are often complex in nature, no single model can capture all patterns equally well. A hybrid methodology that enhances the modeling capabilities therefore appears to be a productive strategy for practical electricity price prediction. The price forecasting methodology is proposed through a hybrid model applied to price forecasting in the Finnish day-ahead energy market. An iterative search procedure within the methodology is developed to tune the model parameters and select the optimal input set of explanatory variables. The numerical studies show that the proposed methodology is more accurate than all other examined methods recently applied to case studies of energy markets in different countries. The obtained results provide extensive and useful information for participants of the day-ahead energy market, who have limited and uncertain information for price prediction when setting up an optimal short-term operation portfolio. Although the focus of this work is primarily on the Finnish price area of Nord Pool Spot, the results suggest that the same methodology is likely to perform well when forecasting prices on the energy markets of other countries.
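The iterative search over explanatory variables can be illustrated with a greedy forward-selection loop. The thesis's hybrid model is not specified here, so a plain linear least-squares model stands in for it, and all data are synthetic.

```python
import numpy as np

def select_inputs(X, y, n_train):
    """Greedy forward selection: iteratively add the explanatory variable
    that most reduces validation MSE (a stand-in for the iterative search
    used to pick the forecasting model's input set)."""
    chosen, best_mse = [], np.inf
    remaining = list(range(X.shape[1]))
    while remaining:
        scores = []
        for j in remaining:
            cols = chosen + [j]
            coef, *_ = np.linalg.lstsq(X[:n_train, cols], y[:n_train], rcond=None)
            mse = np.mean((X[n_train:, cols] @ coef - y[n_train:]) ** 2)
            scores.append((mse, j))
        mse, j = min(scores)
        if mse >= best_mse:          # stop when no candidate improves validation MSE
            break
        best_mse, chosen = mse, chosen + [j]
        remaining.remove(j)
    return chosen, best_mse

# synthetic candidate inputs: only columns 0 and 3 actually drive the target
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(0, 0.1, 200)
inputs, mse = select_inputs(X, y, n_train=150)
```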

Relevance: 80.00%

Abstract:

A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies seeking to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves intractable null distribution problems, in particular those raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The proposed method works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
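The Monte Carlo test idea is compact enough to sketch. For a location- and scale-invariant statistic computed from OLS residuals, simulating i.i.d. standard Gaussian samples reproduces the exact null distribution, so the p-value (1 + #{S_i >= S_0}) / (N + 1) is exact for any number of replications. The Goldfeld-Quandt-type statistic and the data below are illustrative, not the paper's experiments.

```python
import numpy as np

def gq_stat(X, y):
    """Goldfeld-Quandt-type statistic: ratio of residual sums of squares
    from the second and first halves of the sample."""
    u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    h = len(u) // 2
    return np.sum(u[h:] ** 2) / np.sum(u[:h] ** 2)

def mc_pvalue(X, y, n_rep=199, seed=0):
    """Monte Carlo (Dwass/Barnard) test: the statistic depends on the data
    only through OLS residuals and is scale-invariant, hence pivotal under
    i.i.d. Gaussian errors; N(0,1) draws reproduce its exact null law."""
    rng = np.random.default_rng(seed)
    s0 = gq_stat(X, y)
    hits = sum(gq_stat(X, rng.normal(size=len(y))) >= s0 for _ in range(n_rep))
    return (1 + hits) / (n_rep + 1)

rng = np.random.default_rng(42)
X = np.column_stack([np.ones(120), rng.normal(size=120)])
y_homo = X @ np.array([1.0, 2.0]) + rng.normal(size=120)
sig = np.linspace(0.5, 3.0, 120)   # error s.d. grows with observation index
y_het = X @ np.array([1.0, 2.0]) + sig * rng.normal(size=120)
p_homo, p_het = mc_pvalue(X, y_homo), mc_pvalue(X, y_het)
```

With 199 replications a rejection at p <= 0.05 has exactly 5% size under the null, which is the "provably exact" property the paper exploits.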

Relevance: 80.00%

Abstract:

Using the method of Lorenz (1982), we have estimated the predictability of a recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) model using two different estimates of the initial error, corresponding to 6- and 24-hr forecast errors, respectively. For a 6-hr forecast error of the extratropical 500-hPa geopotential height field, a potential increase in forecast skill of more than 3 d is suggested, indicating a further increase in predictability of another 1.5 d compared to the use of a 24-hr forecast error. This is due to a smaller initial error and to an initial error reduction resulting in a smaller averaged growth rate over the whole 7-d forecast. A similar assessment for the tropics, using the wind vector fields at 850 and 250 hPa, suggests a huge potential improvement, with a 7-d forecast providing the same skill as a 1-d forecast does now. A contributing factor to the increase in the estimate of predictability is the apparent slow growth of error during the early part of the forecast.
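Lorenz-style predictability estimates rest on a saturating error-growth law. A minimal sketch, assuming an illustrative logistic growth law with made-up growth rate, saturation level and skill threshold (not the values estimated from the ECMWF model), shows how a smaller initial error buys extra days of skill:

```python
import numpy as np

def error_curve(e0, a=0.6, e_sat=1.0, days=14, dt=0.01):
    """Integrate a Lorenz (1982)-style logistic error-growth law
    dE/dt = a*E*(1 - E/e_sat) from initial error e0 (Euler steps)."""
    t = np.arange(0, days, dt)
    e = np.empty_like(t)
    e[0] = e0
    for i in range(1, len(t)):
        e[i] = e[i - 1] + dt * a * e[i - 1] * (1 - e[i - 1] / e_sat)
    return t, e

def skilful_days(e0, threshold=0.71):
    """Days until the error exceeds an (illustrative) skill threshold."""
    t, e = error_curve(e0)
    return t[np.argmax(e > threshold)]

# reducing the initial error from 0.15 to 0.05 of saturation
gain = skilful_days(0.05) - skilful_days(0.15)
```

With these assumed parameters the smaller initial error yields roughly two extra days before the skill threshold is crossed, the same qualitative effect as the 6-hr versus 24-hr comparison in the abstract.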

Relevance: 80.00%

Abstract:

A regional study of the prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) has been performed. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast error statistics have then been produced for the position, intensity, and propagation speed of the storms. In previous work, data limitations meant it was only possible to present the diagnostics for the entire Northern Hemisphere (NH) or Southern Hemisphere. A larger data sample has allowed the diagnostics to be computed separately for smaller regions around the globe and has made it possible to explore the regional differences in the prediction of storms by the EPS. Results show that in the NH there is a larger ensemble mean error in the position of storms over the Atlantic Ocean. Further analysis revealed that this is mainly due to errors in the prediction of storm propagation speed rather than in direction. Forecast storms propagate too slowly in all regions, but the bias is about 2 times as large in the NH Atlantic region. The results show that storm intensity is generally overpredicted over the ocean and underpredicted over the land and that the absolute error in intensity is larger over the ocean than over the land. In the NH, large errors occur in the prediction of the intensity of storms that originate as tropical cyclones but then move into the extratropics. The ensemble is underdispersive for the intensity of cyclones (i.e., the spread is smaller than the mean error) in all regions. The spatial patterns of the ensemble mean error and ensemble spread are very different for the intensity of cyclones. Spatial distributions of the ensemble mean error suggest that large errors occur during the growth phase of storm development, but this is not indicated by the spatial distributions of the ensemble spread. In the NH there are further differences. 
First, the large errors in the prediction of the intensity of cyclones that originate in the tropics are not indicated by the spread. Second, the ensemble mean error is larger over the Pacific Ocean than over the Atlantic, whereas the opposite is true for the spread. The value of a storm-tracking approach to both weather forecasters and developers of forecast systems is also discussed.
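The underdispersion diagnostic used above (ensemble spread smaller than the ensemble-mean error) reduces to a simple computation; the synthetic "cyclone intensity" numbers below are purely illustrative:

```python
import numpy as np

def spread_error(ens, truth):
    """Spread-error diagnostic: RMS error of the ensemble mean versus the
    average ensemble spread; spread < error indicates underdispersion."""
    mean = ens.mean(axis=0)
    err = np.sqrt(np.mean((mean - truth) ** 2))
    spread = np.mean(ens.std(axis=0, ddof=1))
    return err, spread

rng = np.random.default_rng(3)
truth = rng.normal(960.0, 10.0, 500)           # e.g. cyclone core pressure (hPa)
# an underdispersive ensemble: member spread smaller than the mean error
mean_fc = truth + rng.normal(0, 4.0, 500)      # ensemble-mean error around 4 hPa
ens = mean_fc + rng.normal(0, 2.0, (20, 500))  # member spread around 2 hPa
err, spread = spread_error(ens, truth)
```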

Relevance: 80.00%

Abstract:

The impact of targeted sonde observations on the 1-3 day forecasts for northern Europe is evaluated using the Met Office four-dimensional variational data assimilation scheme and a 24 km gridlength limited-area version of the Unified Model (MetUM). The targeted observations were carried out during February and March 2007 as part of the Greenland Flow Distortion Experiment, using a research aircraft based in Iceland. Sensitive area predictions using either total energy singular vectors or an ensemble transform Kalman filter were used to predict where additional observations should be made to reduce errors in the initial conditions of forecasts for northern Europe. Targeted sonde data was assimilated operationally into the MetUM. Hindcasts show that the impact of the sondes was mixed. Only two out of the five cases showed clear forecast improvement; the maximum forecast improvement seen over the verifying region was approximately 5% of the forecast error 24 hours into the forecast. These two cases are presented in more detail: in the first the improvement propagates into the verification region with a developing polar low; and in the second the improvement is associated with an upper-level trough. The impact of cycling targeted data in the background of the forecast (including the memory of previous targeted observations) is investigated. This is shown to cause a greater forecast impact, but does not necessarily lead to a greater forecast improvement. Finally, the robustness of the results is assessed using a small ensemble of forecasts.

Relevance: 80.00%

Abstract:

A new spectral-based approach is presented to find orthogonal patterns from gridded weather/climate data. The method is based on optimizing the interpolation error variance. The optimally interpolated patterns (OIP) are then given by the eigenvectors of the interpolation error covariance matrix, obtained using the cross-spectral matrix. The formulation of the approach is presented, and the application to low-dimensional stochastic toy models and to various reanalysis datasets is performed. In particular, it is found that the lowest-frequency patterns correspond to the largest eigenvalues, that is, variances, of the interpolation error matrix. The approach has been applied to the Northern Hemispheric (NH) and tropical sea level pressure (SLP) and to the Indian Ocean sea surface temperature (SST). Two main OIP patterns are found for the NH SLP, representing respectively the North Atlantic Oscillation and the North Pacific pattern. The leading tropical SLP OIP represents the Southern Oscillation. For the Indian Ocean SST, the leading OIP pattern shows a tripole-like structure having one sign over the eastern and north- and southwestern parts and the opposite sign in the remaining parts of the basin. The pattern is also found to have a high lagged correlation with the Niño-3 index at a 6-month lag.
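The core linear-algebra step, extracting orthogonal patterns as eigenvectors of a covariance matrix, can be sketched on a toy field. Note the simplification: the actual OIP method builds the interpolation error covariance from the cross-spectral matrix, whereas the plain sample covariance below is only a stand-in for that step.

```python
import numpy as np

def leading_patterns(field, k=2):
    """Extract orthogonal patterns as the leading eigenvectors of a
    covariance matrix (here the ordinary sample covariance; the OIP
    method would use the interpolation error covariance instead)."""
    anom = field - field.mean(axis=0)
    cov = anom.T @ anom / (len(field) - 1)
    vals, vecs = np.linalg.eigh(cov)       # eigh returns ascending order
    order = np.argsort(vals)[::-1]
    return vals[order][:k], vecs[:, order][:, :k]

rng = np.random.default_rng(7)
# synthetic field: a dipole pattern modulated by a random time series
pattern = np.array([1.0, -1.0, 1.0, -1.0, 0.0])
ts = rng.normal(0, 3.0, 400)
field = np.outer(ts, pattern) + rng.normal(0, 0.5, (400, 5))
vals, vecs = leading_patterns(field)
```

The leading eigenvector recovers the planted dipole (up to sign), and its eigenvalue dominates, mirroring how the leading OIP patterns capture the dominant low-frequency modes.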

Relevance: 80.00%

Abstract:

The ECMWF full-physics and dry singular vector (SV) packages, using a dry energy norm and a 1-day optimization time, are applied to four high impact European cyclones of recent years that were almost universally badly forecast in the short range. It is shown that these full-physics SVs are much more relevant to severe cyclonic development than those based on dry dynamics plus boundary layer alone. The crucial extra ingredient is the representation of large-scale latent heat release. The severe winter storms all have a long, nearly straight region of high baroclinicity stretching across the Atlantic towards Europe, with a tongue of very high moisture content on its equatorward flank. In each case some of the final-time top SV structures pick out the region of the actual storm. The initial structures were generally located in the mid- to low troposphere. Forecasts based on initial conditions perturbed by moist SVs with opposite signs and various amplitudes show the range of possible 1-day outcomes for reasonable magnitudes of forecast error. In each case one of the perturbation structures gave a forecast very much closer to the actual storm than the control forecast. Deductions are made about the predictability of high-impact extratropical cyclone events. Implications are drawn for the short-range forecast problem and suggestions made for one practicable way to approach short-range ensemble forecasting. Copyright © 2005 Royal Meteorological Society.

Relevance: 80.00%

Abstract:

A fast backward elimination algorithm is introduced based on a QR decomposition and Givens transformations to prune radial-basis-function networks. Nodes are sequentially removed using an increment of error variance criterion. The procedure is terminated by using a prediction risk criterion so as to obtain a model structure with good generalisation properties. The algorithm can be used to postprocess radial basis centres selected using a k-means routine and, in this mode, it provides a hybrid supervised centre selection approach.
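A minimal sketch of backward elimination with an increment-of-error-variance criterion and a prediction-risk stopping rule (Akaike's final prediction error is assumed here; the paper's exact criterion may differ). The original algorithm performs the updates efficiently with a QR decomposition and Givens rotations; this sketch simply refits least squares at each step.

```python
import numpy as np

def backward_eliminate(P, y):
    """Drop, one at a time, the basis function whose removal least
    increases the error variance; stop when the FPE prediction risk
    criterion stops decreasing."""
    n = len(y)
    def sse(cols):
        w, *_ = np.linalg.lstsq(P[:, cols], y, rcond=None)
        return float(np.sum((y - P[:, cols] @ w) ** 2))
    active = list(range(P.shape[1]))
    best_fpe = sse(active) / n * (n + len(active)) / (n - len(active))
    while len(active) > 1:
        s, j = min((sse([c for c in active if c != j]), j) for j in active)
        k = len(active) - 1
        fpe = s / n * (n + k) / (n - k)
        if fpe >= best_fpe:
            break
        best_fpe = fpe
        active.remove(j)
    return active, best_fpe

rng = np.random.default_rng(5)
x = rng.uniform(-3, 3, 200)
centres = np.linspace(-3, 3, 10)                   # e.g. from a k-means routine
P = np.exp(-(x[:, None] - centres[None, :]) ** 2)  # Gaussian RBF design matrix
y = 2.0 * P[:, 2] - 1.5 * P[:, 7] + rng.normal(0, 0.05, 200)
kept, fpe = backward_eliminate(P, y)
```

Starting from centres chosen by k-means and pruning them in a supervised way, as here, is the hybrid centre-selection mode mentioned in the abstract.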

Relevance: 80.00%

Abstract:

Using the recently-developed mean–variance of logarithms (MVL) diagram, together with the TIGGE archive of medium-range ensemble forecasts from nine different centres, an analysis is presented of the spatiotemporal dynamics of their perturbations, showing how the differences between models and perturbation techniques can explain the shape of their characteristic MVL curves. In particular, a divide is seen between ensembles based on singular vectors or empirical orthogonal functions, and those based on bred vector, Ensemble Transform with Rescaling or Ensemble Kalman Filter techniques. Consideration is also given to the use of the MVL diagram to compare the growth of perturbations within the ensemble with the growth of the forecast error, showing that there is a much closer correspondence for some models than others. Finally, the use of the MVL technique to assist in selecting models for inclusion in a multi-model ensemble is discussed, and an experiment suggested to test its potential in this context.
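A point on an MVL diagram is simply the mean and variance, over grid points, of the logarithm of the absolute perturbation. A toy multiplicative-growth model (purely illustrative, not any of the TIGGE ensembles) traces out such a curve:

```python
import numpy as np

def mvl_point(pert, eps=1e-12):
    """One point on the MVL diagram: mean and variance, over grid points,
    of the log absolute perturbation field (eps guards against log(0))."""
    g = np.log(np.abs(pert) + eps)
    return g.mean(), g.var()

rng = np.random.default_rng(11)
pert = rng.normal(0, 1e-3, 1000)      # small initial perturbation field
points = []
for day in range(5):
    # toy dynamics: multiplicative perturbation growth at each grid point
    pert = pert * np.exp(rng.normal(0.5, 0.3, 1000))
    points.append(mvl_point(pert))
means = [m for m, v in points]
```

Plotting variance against mean for successive lead times gives the characteristic MVL curve whose shape, per the abstract, separates the perturbation techniques.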

Relevance: 80.00%

Abstract:

In this paper, we propose a new on-line learning algorithm for non-linear system identification: the swarm intelligence aided multi-innovation recursive least squares (SI-MRLS) algorithm. The SI-MRLS algorithm applies particle swarm optimization (PSO) to construct a flexible radial basis function (RBF) model so that both the model structure and the output weights can be adapted. By replacing an insignificant RBF node with a new one at every iteration, based on the increment of error variance criterion, the model remains of limited size. The multi-innovation RLS algorithm, known to be more accurate than classic RLS, is used to update the RBF output weights. The proposed method produces a parsimonious model with good performance. Simulation results are also shown to verify the SI-MRLS algorithm.
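The multi-innovation RLS update can be sketched in isolation from the PSO and RBF structure-adaptation parts: each recursion reuses the last p regressor/output pairs rather than only the newest one. The linear-regression data below are a synthetic stand-in for the RBF hidden-layer outputs.

```python
import numpy as np

def mi_rls(Phi, y, p=3, lam=0.99):
    """Multi-innovation recursive least squares: each update stacks the
    last p regressors and outputs (innovation length p), reducing to
    classic RLS when p = 1; lam is a forgetting factor."""
    n, d = Phi.shape
    theta = np.zeros(d)
    P = np.eye(d) * 1e4                    # large initial covariance
    for t in range(n):
        s = max(0, t - p + 1)
        F = Phi[s:t + 1].T                 # d x q stacked regressors
        Y = y[s:t + 1]
        G = F.T @ P @ F + lam * np.eye(F.shape[1])
        K = P @ F @ np.linalg.inv(G)       # gain for the stacked innovations
        theta = theta + K @ (Y - F.T @ theta)
        P = (P - K @ F.T @ P) / lam
    return theta

rng = np.random.default_rng(9)
Phi = rng.normal(size=(400, 3))            # stand-in for RBF node outputs
true_w = np.array([0.5, -1.0, 2.0])
y = Phi @ true_w + rng.normal(0, 0.05, 400)
w = mi_rls(Phi, y)
```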

Relevance: 80.00%

Abstract:

Given the significance of forecasting in real estate investment decisions, this paper investigates forecast uncertainty and disagreement in real estate market forecasts, comparing the performance of real estate forecasters with that of non-real estate forecasters. Using the Investment Property Forum (IPF) quarterly survey of UK independent real estate forecasters and a similar survey of macro-economic and capital market forecasters, these forecasts are compared with actual performance to assess a number of forecasting issues in the UK over 1999-2004, including forecast error, bias and consensus. The results suggest that both groups are biased, less volatile than market returns, and inefficient in that forecast errors tend to persist. The strongest finding is that forecasters display the characteristics associated with a consensus, indicating herding.
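The survey comparisons above reduce to a few simple statistics: bias (mean forecast error), accuracy (mean absolute error), and disagreement (cross-forecaster dispersion). A sketch on a synthetic forecast panel (the numbers are illustrative, not IPF data):

```python
import numpy as np

def survey_diagnostics(forecasts, actual):
    """Bias, mean absolute error, and disagreement (cross-forecaster
    standard deviation averaged over periods). Low disagreement paired
    with persistent error is the consensus/herding signature."""
    errs = forecasts - actual          # shape: (forecasters, periods)
    bias = errs.mean()
    mae = np.abs(errs).mean()
    disagreement = forecasts.std(axis=0, ddof=1).mean()
    return bias, mae, disagreement

rng = np.random.default_rng(2)
actual = rng.normal(8.0, 4.0, 24)                  # e.g. quarterly returns, %
common = actual + rng.normal(-1.0, 2.0, 24)        # shared (herded) view
forecasts = common + rng.normal(0, 0.5, (15, 24))  # forecasters cluster tightly
bias, mae, dis = survey_diagnostics(forecasts, actual)
```

In this synthetic panel disagreement is far smaller than the typical error, the pattern the paper reads as herding.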

Relevance: 80.00%

Abstract:

A developing polar low is targeted with dropsonde observations to improve the forecast of its landfall. Accurately forecasting a polar low's strength and location remains a challenge; polar lows form over the ocean in poorly observed regions, so initial condition errors may contribute significantly to forecast error. The targeted polar low formed in the Norwegian Sea on 3 March 2008, during the Norwegian IPY-THORPEX field campaign. Two flights, six hours apart, released dense networks of dropsondes into a sensitive region covering the polar low and the Arctic front to its west. The impact of the targeted observations is assessed using the limited-area Met Office Unified Model and three-dimensional variational (3D-Var) data assimilation scheme. Forecasts were verified using ECMWF analysis data, which show good agreement with both dropsonde data from a flight through the mature polar low and 10 m QuikSCAT winds. The impact of the targeted data moved southwards with the polar low as it developed and then hit the Norwegian coast after 24 hours. The results show that the forecast of the polar low is sensitive to the initial conditions; targeted observations from the first flight did not improve the forecast, but those from the second flight clearly improved the forecast polar low position and intensity. However, caution should be applied to attributing the forecast improvement to the assimilation of the targeted observations from a single case study, especially as the forecast improvement here is moderate relative to the spread from an operational ensemble forecast.

Relevance: 80.00%

Abstract:

The assimilation of observations with a forecast is often heavily influenced by the description of the error covariances associated with the forecast. When a temperature inversion is present at the top of the boundary layer (BL), a significant part of the forecast error may be described as a vertical positional error (as opposed to the amplitude error normally dealt with in data assimilation). In these cases, failing to account for positional error explicitly is shown to result in an analysis in which the inversion structure is erroneously weakened and degraded. In this article, a new assimilation scheme is proposed to explicitly include the positional error associated with an inversion. This is done through the introduction of an extra control variable to allow position errors in the a priori to be treated simultaneously with the usual amplitude errors. This new scheme, referred to as the 'floating BL scheme', is applied to the one-dimensional (vertical) variational assimilation of temperature. The floating BL scheme is tested with a series of idealised experiments and with real data from radiosondes. For each idealised experiment, the floating BL scheme gives an analysis whose inversion structure and position agree with the truth, and outperforms the assimilation which accounts only for forecast amplitude error. When the floating BL scheme is used to assimilate a large sample of radiosonde data, its ability to give an analysis with an inversion height in better agreement with that observed is confirmed. However, it is found that Gaussian statistics are an inappropriate description of the error statistics of the extra control variable. This problem is alleviated by incorporating a non-Gaussian description of the new control variable into the scheme. Anticipated challenges in implementing the scheme operationally are discussed towards the end of the article.

Relevance: 80.00%

Abstract:

This paper describes the implementation of a 3D variational (3D-Var) data assimilation scheme for a morphodynamic model applied to Morecambe Bay, UK. A simple decoupled hydrodynamic and sediment transport model is combined with a data assimilation scheme to investigate the ability of such methods to improve the accuracy of the predicted bathymetry. The inverse forecast error covariance matrix is modelled using a Laplacian approximation which is calibrated for the length scale parameter required. Calibration is also performed for the Soulsby-van Rijn sediment transport equations. The data used for assimilation purposes comprises waterlines derived from SAR imagery covering the entire period of the model run, and swath bathymetry data collected by a ship-borne survey for one date towards the end of the model run. A LiDAR survey of the entire bay carried out in November 2005 is used for validation purposes. The comparison of the predictive ability of the model alone with the model-forecast-assimilation system demonstrates that using data assimilation significantly improves the forecast skill. An investigation of the assimilation of the swath bathymetry as well as the waterlines demonstrates that the overall improvement is initially large, but decreases over time as the bathymetry evolves away from that observed by the survey. The result of combining the calibration runs into a pseudo-ensemble provides a higher skill score than for a single optimized model run. A brief comparison of the Optimal Interpolation assimilation method with the 3D-Var method shows that the two schemes give similar results.
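The assimilation step can be sketched as a 3D-Var analysis on a one-dimensional grid, with the inverse background-error covariance modelled by a Laplacian as described above. The grid size, error variances and length-scale below are illustrative, not the calibrated Morecambe Bay values, and the "observations" stand in for waterline or swath bathymetry data.

```python
import numpy as np

def var3d_analysis(xb, y, obs_idx, sig_b=1.0, sig_o=0.1, L=2.0):
    """Minimise J(x) = (x-xb)' Binv (x-xb) + (y-Hx)' Rinv (y-Hx) with the
    inverse background covariance modelled via a Laplacian penalty,
    Binv = (I + L^2 * D'D) / sig_b^2, where D is the second-difference
    operator; the quadratic J is minimised by one linear solve."""
    n = len(xb)
    D = np.diff(np.eye(n), 2, axis=0)          # (n-2) x n second differences
    Binv = (np.eye(n) + L ** 2 * (D.T @ D)) / sig_b ** 2
    H = np.zeros((len(obs_idx), n))
    H[np.arange(len(obs_idx)), obs_idx] = 1.0  # observe selected grid points
    Rinv = np.eye(len(obs_idx)) / sig_o ** 2
    A = Binv + H.T @ Rinv @ H                  # Hessian of J
    dx = np.linalg.solve(A, H.T @ Rinv @ (y - H @ xb))
    return xb + dx

truth = 5.0 * np.sin(np.linspace(0, np.pi, 50))  # synthetic bathymetry (m)
xb = np.zeros(50)                                # background: flat bed
idx = np.arange(0, 50, 5)                        # sparse observation points
y = truth[idx]
xa = var3d_analysis(xb, y, idx)
```

The Laplacian term spreads the observation increments smoothly between observed points, which is what lets sparse waterline data correct the whole bathymetry field.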