994 results for forecasting methods


Relevance:

30.00%

Publisher:

Abstract:

The purpose of the research is to determine the practical profit that can be achieved using neural network methods as a prediction instrument. The thesis investigates the ability of neural networks to forecast future events; this capability is checked on the example of price prediction during intraday trading on the stock market. The experiments produce predictions of average 1-, 2-, 5- and 10-minute prices based on one day of data, made by two different types of forecasting systems: one based on recurrent neural networks and the other on backpropagation neural networks. The precision of the predictions is measured by the absolute error and by the error of market direction, and the economic effectiveness is estimated by a dedicated trading system. In conclusion, the best neural network structures are tested on data from a 31-day interval. The best average percentages of profit per transaction (buying + selling) are 0.06668654, 0.188299453, 0.349854787 and 0.453178626, achieved for prediction periods of 1, 2, 5 and 10 minutes respectively. The investigation may be of interest to investors who have access to a fast information channel with minute-by-minute data refreshment.
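The abstract does not spell out its two error measures, but they can be sketched as follows. This is a minimal illustration, not the author's code; in particular, the exact definition of the market-direction error used here (the sign of the predicted move versus the realized move) is an assumption.

```python
import numpy as np

def absolute_error(actual, predicted):
    """Mean absolute error between actual and predicted prices."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean(np.abs(actual - predicted))

def direction_error(actual, predicted):
    """Fraction of steps where the predicted price move has the wrong
    sign relative to the realized move (market-direction error)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    actual_move = np.sign(np.diff(actual))
    predicted_move = np.sign(predicted[1:] - actual[:-1])
    return np.mean(actual_move != predicted_move)
```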

Relevance:

30.00%

Publisher:

Abstract:

Electricity price forecasting has become an important area of research in the aftermath of the worldwide deregulation of the power industry, which launched competitive electricity markets now embracing all market participants, including generation and retail companies, transmission network providers, and market managers. Based on the needs of the market, a variety of approaches for forecasting day-ahead electricity prices have been proposed over the last decades. However, most of the existing approaches are reasonably effective for prices in the normal range but disregard price spike events, which are caused by a number of complex factors and occur during periods of market stress. In early research, price spikes were truncated before the forecasting model was applied, to reduce the influence of such observations on the estimation of the model parameters; otherwise, very large forecast errors would be generated on price spike occasions. Electricity price spikes, however, are significant if energy market participants are to stay competitive. Accurate price spike forecasting is important for generation companies, so that they can bid strategically into the market and optimally manage their assets; for retail companies, since they cannot pass the spikes on to final customers; and for market managers, to provide better management and planning for the energy market. This doctoral thesis aims at deriving a methodology able to accurately predict not only day-ahead electricity prices within the normal range but also the price spikes. The Finnish day-ahead energy market of Nord Pool Spot is selected as the case market, and its structure is studied in detail. It is almost universally agreed in the forecasting literature that no single method is best in every situation. Since real-world problems are often complex in nature, no single model is able to capture all the different patterns equally well. A hybrid methodology that enhances the modeling capabilities therefore appears to be a productive strategy for practical use when electricity prices are predicted. The price forecasting methodology is proposed as a hybrid model applied to price forecasting in the Finnish day-ahead energy market. The iterative search procedure employed within the methodology is developed to tune the model parameters and select the optimal input set of explanatory variables. The numerical studies show that the proposed methodology is more accurate than all the other examined methods recently applied to case studies of energy markets in different countries. The obtained results provide extensive and useful information for participants of the day-ahead energy market, who have limited and uncertain information for price prediction when setting up an optimal short-term operation portfolio. Although the focus of this work is primarily on the Finnish price area of Nord Pool Spot, given its results it is very likely that the same methodology will give good results when forecasting prices on the energy markets of other countries.
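The abstract does not disclose which components make up the hybrid model, so the following sketch only illustrates the general idea of routing normal-range prices and price spikes through separate models. The gradient-boosting components, the synthetic data, and the 95th-percentile spike definition are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

# Hypothetical feature matrix X (load, temperature, lagged prices, ...)
# and day-ahead price vector y; a "spike" is defined here, purely for
# illustration, as a price above the 95th percentile of history.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))
y = np.exp(rng.normal(3.5, 0.3, size=1000))      # synthetic prices

threshold = np.percentile(y, 95)
is_spike = y > threshold

spike_clf = GradientBoostingClassifier().fit(X, is_spike)
normal_reg = GradientBoostingRegressor().fit(X[~is_spike], y[~is_spike])
spike_reg = GradientBoostingRegressor().fit(X[is_spike], y[is_spike])

def forecast(x_new):
    """Route each hour through the spike or normal-range model."""
    x_new = np.atleast_2d(x_new)
    spikes = spike_clf.predict(x_new).astype(bool)
    out = normal_reg.predict(x_new)
    out[spikes] = spike_reg.predict(x_new[spikes])
    return out
```

In a real market the spike class is rare, so the spike-regime model would need careful handling of class imbalance and far more data than this toy setup suggests.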

Relevance:

30.00%

Publisher:

Abstract:

The desire to create a statistical or mathematical model that would allow predicting future changes in stock prices was born many years ago. Economists and mathematicians have been trying to solve this task by applying statistical analysis and physical laws, but there are still no satisfactory results. The main reason is that a stock exchange is a non-stationary, unstable and complex system influenced by many factors. In this thesis the New York Stock Exchange was considered as the system to be explored. A topological analysis, basic statistical tools and singular value decomposition were applied to understand the behavior of the market. Two methods for normalization of the initial daily closing prices of the Dow Jones and S&P 500 were introduced and applied for further analysis. As a result, some unexpected features were identified, such as the shape of the distribution of correlation-matrix entries, the bulk of which is shifted to the right of zero. The non-ergodicity of the NYSE was also confirmed graphically, and it was shown that the singular vectors differ from each other by a constant factor. This work does not reach definitive conclusions, but it creates a good basis for further analysis of market topology.
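The correlation-matrix and singular-value analysis described above can be sketched as follows. The thesis's two index-based normalizations are not specified in the abstract, so this illustration uses plain standardized log returns on synthetic prices.

```python
import numpy as np

# prices: hypothetical (n_days, n_stocks) array of daily closing prices.
rng = np.random.default_rng(1)
prices = np.cumprod(1 + rng.normal(0.0005, 0.01, size=(500, 50)), axis=0)

# Log returns, standardized per stock (one common normalization choice).
returns = np.diff(np.log(prices), axis=0)
z = (returns - returns.mean(axis=0)) / returns.std(axis=0)

corr = (z.T @ z) / len(z)                         # empirical correlation matrix
U, s, Vt = np.linalg.svd(z, full_matrices=False)  # singular value decomposition

# Distribution of off-diagonal correlations: for real market data the
# thesis finds its bulk shifted to the right of zero; the independent
# synthetic data here will centre near zero.
off_diag = corr[~np.eye(len(corr), dtype=bool)]
print(off_diag.mean(), s[:5])
```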

Relevance:

30.00%

Publisher:

Abstract:

This research concerns different statistical methods that help increase the demand forecasting accuracy of company X's forecasting model. The current forecasting process was analyzed in detail and, as a result, a graphical scheme of its logical algorithm was developed. Based on the analysis of this algorithm and of the forecasting errors, all the potential directions for future improvements to the model's accuracy were gathered into a complete list. Three improvement directions were chosen for further practical research; on their basis, three test models were created and verified. The novelty of this work lies in the methodological approach of the original analysis of the model, which identified its critical points, as well as in the uniqueness of the developed test models. The results of the study formed the basis of a grant from the Government of St. Petersburg.

Relevance:

30.00%

Publisher:

Abstract:

For the past 20 years, researchers have applied the Kalman filter to modeling and forecasting the term structure of interest rates. Despite its impressive performance in in-sample fitting of yield curves, little research has focused on out-of-sample yield curve forecasting using the Kalman filter. The goal of this thesis is to develop a unified dynamic model based on Diebold and Li (2006) and Nelson and Siegel's (1987) three-factor model, and to estimate this dynamic model using the Kalman filter. We compare both the in-sample and the out-of-sample performance of our dynamic methods with various other models in the literature. We find that our dynamic model dominates existing models in medium- and long-horizon yield curve predictions. However, the dynamic model should be used with caution when forecasting short-maturity yields.
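The three-factor structure underlying the dynamic model is the Nelson and Siegel (1987) curve in the dynamic parameterization popularized by Diebold and Li (2006). A minimal sketch of the factor loadings follows; the state-space setup and Kalman-filter estimation are omitted, and the parameter values are purely illustrative.

```python
import numpy as np

def nelson_siegel(tau, beta, lam):
    """Nelson-Siegel yield for maturities tau (in years), given the
    three factors beta = (level, slope, curvature) and decay rate lam:
    y(tau) = b0 + b1*(1-e^(-x))/x + b2*((1-e^(-x))/x - e^(-x)), x = lam*tau."""
    tau = np.asarray(tau, dtype=float)
    x = lam * tau
    slope_load = (1 - np.exp(-x)) / x
    curv_load = slope_load - np.exp(-x)
    return beta[0] + beta[1] * slope_load + beta[2] * curv_load

# Example: a typical upward-sloping curve (illustrative parameter values).
print(nelson_siegel([0.25, 1, 5, 10], beta=(0.05, -0.02, 0.01), lam=0.6))
```

In the dynamic version, the three betas evolve over time as latent states, which is what the Kalman filter estimates.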

Relevance:

30.00%

Publisher:

Abstract:

Production Planning and Control (PPC) systems have grown and changed because of developments in planning tools and models as well as the use of computers and information systems in this area. Though much is available in research journals, the practice of PPC lags behind and makes little use of published research. The practices of PPC in SMEs lag behind for many reasons, which need to be explored. This research work deals with the effect on firm performance of identified variables such as the forecasting, planning and control methods adopted, the demographics of the key person, the standardization practices followed, and the effects of training, learning and IT usage. A model and framework have been developed based on the literature. The model was tested empirically on data collected with a questionnaire schedule administered to selected respondents from Small and Medium Enterprises (SMEs) in India; the final data set included 382 responses. Hypotheses linking SME performance with the use of forecasting, planning and controlling were formed and tested. Exploratory factor analysis was used for data reduction and for identifying the factor structure, high- and low-performing firms were classified using a logistic regression model, and a confirmatory factor analysis was used to study the structural relationship between firm performance and the dependent variables.
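The classification step can be sketched as below. The factor scores and performance labels are synthetic, and the actual factor structure identified in the thesis is not reproduced; this only illustrates how a logistic regression separates high- from low-performing firms.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical survey data: factor scores for forecasting, planning and
# control practices (columns), and a binary high/low performance label.
rng = np.random.default_rng(2)
factor_scores = rng.normal(size=(382, 3))      # 382 responses, as in the study
high_performer = (factor_scores @ np.array([0.8, 0.5, 0.6])
                  + rng.normal(size=382)) > 0  # synthetic labels

model = LogisticRegression().fit(factor_scores, high_performer)
# Coefficients indicate how each practice factor shifts the log-odds
# of a firm being classified as high-performing.
print(model.coef_, model.intercept_)
```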

Relevance:

30.00%

Publisher:

Abstract:

Planners in public and private institutions would like coherent forecasts of the components of age-specific mortality, such as causes of death. This has been difficult to achieve because the relative values of the forecast components often fail to behave in a way that is coherent with historical experience. In addition, when the group forecasts are combined the result is often incompatible with an all-groups forecast. It has been shown that cause-specific mortality forecasts are pessimistic when compared with all-cause forecasts (Wilmoth, 1995). This paper abandons the conventional approach of using log mortality rates and instead forecasts the density of deaths in the life table. Since these values obey a unit-sum constraint, for both conventional single-decrement life tables (only one absorbing state) and multiple-decrement tables (more than one absorbing state), they are intrinsically relative rather than absolute values, across decrements as well as ages. Using the methods of Compositional Data Analysis pioneered by Aitchison (1986), the death densities are transformed into real space so that the full range of multivariate statistics can be applied, and then back-transformed to positive values so that the unit-sum constraint is honoured. The structure of the best-known single-decrement mortality-rate forecasting model, devised by Lee and Carter (1992), is expressed in compositional form and the results from the two models are compared. The compositional model is then extended to a multiple-decrement form and used to forecast mortality by cause of death for Japan.
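The centred log-ratio transform is one standard Compositional Data Analysis mapping between the unit simplex and real space; whether the paper uses this or another of Aitchison's log-ratio transforms is not stated in the abstract, so treat the sketch below as illustrative.

```python
import numpy as np

def clr(d):
    """Centred log-ratio transform: maps a composition (positive values
    summing to one, e.g. life-table death densities) into real space,
    where ordinary multivariate statistics apply (Aitchison, 1986)."""
    d = np.asarray(d, dtype=float)
    g = np.exp(np.mean(np.log(d)))      # geometric mean of the parts
    return np.log(d / g)

def clr_inverse(z):
    """Back-transform to positive values honouring the unit-sum constraint."""
    e = np.exp(z)
    return e / e.sum()

density = np.array([0.05, 0.15, 0.30, 0.50])   # toy death density over 4 ages
assert np.allclose(clr_inverse(clr(density)), density)
```

Forecasting then proceeds on the transformed values (e.g. with a Lee-Carter-style model) before back-transforming, so every forecast automatically sums to one.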

Relevance:

30.00%

Publisher:

Abstract:

Fire blight is a disease that affects plants of the family Rosaceae, caused by the bacterium Erwinia amylovora. Its host range includes fruit trees, such as pear, apple and quince, and ornamental plants of great commercial and economic interest. The disease has now spread and is widely distributed throughout the temperate regions of the world. In Spain, where the disease is not endemic, fire blight was first detected in 1995 in the north of the country (Basque Country); several outbreaks have subsequently appeared in other locations and have been duly eradicated. Control of fire blight is very ineffective in plants already affected by the disease, so it is based on measures aimed at preventing dispersal of the pathogen and introduction of the disease into non-endemic regions. In this work, thermotherapy was evaluated as a method for eradicating E. amylovora from asymptomatic plant propagation material, and it was demonstrated to be a viable method. Almost all Rosaceae species and varieties kept under humid conditions survived 7 hours at 45 °C and more than 3 hours at 50 °C, whereas more than 1 hour of exposure to 50 °C with dry heat damaged the plant material and reduced budding. Treatments of 60 min at 45 °C or 30 min at 50 °C were sufficient to reduce the epiphytic population of E. amylovora to undetectable levels (5 × 10² CFU g⁻¹ fresh weight) on pear branches. Phosphonate derivatives and benzothiadiazole are effective in controlling fire blight in pear and apple under laboratory, greenhouse and field conditions; these plant defence inducers reduce disease levels by up to 40-60%. The minimum time intervals to achieve the best disease control were 5 days for fosetyl-Al and 7 days for ethephon and benzothiadiazole, and the optimal doses for fosetyl-Al and benzothiadiazole were 3.72 g HPO₃²⁻ L⁻¹ and 150 mg a.i. L⁻¹, respectively. The efficacy of fosetyl-Al and benzothiadiazole in fire blight control is improved when they are combined with antibiotics at half the dose of the latter. Although the strategy of mixing products is more practical and easier to carry out in the field than the strategy of combining them, the best level of disease control is achieved with the combination strategy. The effect of benzothiadiazole and phosphonates on the Erwinia amylovora-pear interaction was analysed at the histological and ultrastructural level. Neither benzothiadiazole, fosetyl-Al nor ethephon induced structural changes in pear tissues 7 days after application. However, after inoculation with E. amylovora, cellular structural disorganization was observed in plants treated with fosetyl-Al and ethephon, whereas in plants treated with benzothiadiazole these tissue alterations were delayed. Two forecasting models (Maryblyt, Cougarblight) were evaluated in a field in Spain affected by the disease, to determine the accuracy of their predictions, and two models were used to draw up the risk map: the combined BRS-Powell and the modified BIS95. The results showed two zones of high and low disease risk. Maryblyt and Cougarblight are two easy-to-use models, although their implementation in disease management programs requires that they be evaluated and validated over a longer period of time and in areas where the disease is present.

Relevance:

30.00%

Publisher:

Abstract:

The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
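A minimal sketch of the exact Gauss-Newton iteration described above, fitted to a toy exponential model, follows; the truncated and perturbed variants analysed in the paper would replace the inner linear solve with an approximate or simplified one.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Basic Gauss-Newton iteration for min_x 0.5*||r(x)||^2.
    Each step solves the "inner" linear least squares problem
    J(x) dx ~= -r(x); unlike Newton's method, no second-order
    derivatives of r are required."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jacobian(x)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)  # inner direct solve
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Example: fit y = a*exp(b*t) to synthetic data. A truncated variant
# would solve the inner problem only approximately (e.g. a few CG steps).
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.5 * t)
r = lambda p: p[0] * np.exp(p[1] * t) - y
J = lambda p: np.column_stack([np.exp(p[1] * t), p[0] * t * np.exp(p[1] * t)])
print(gauss_newton(r, J, x0=[1.0, 0.0]))   # converges to (2.0, -1.5)
```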

Relevance:

30.00%

Publisher:

Abstract:

A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts from the 1500 UTC run of the Rapid Update Cycle 2 (RUC-2) model, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
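The scale analysis identifies advection of relative vorticity as the leading-order source term; a grid-based sketch of that quantity follows. The grid geometry, differencing scheme and sign convention are assumptions here, and the full Lighthill–Ford diagnostic contains further terms.

```python
import numpy as np

def relative_vorticity_advection(u, v, dx, dy):
    """Horizontal advection of relative vorticity,
    -(u * dzeta/dx + v * dzeta/dy), with zeta = dv/dx - du/dy,
    computed on a regular grid with centred finite differences."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    zeta = dvdx - dudy                      # relative vorticity
    dzdx = np.gradient(zeta, dx, axis=1)
    dzdy = np.gradient(zeta, dy, axis=0)
    return -(u * dzdx + v * dzdy)
```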

Relevance:

30.00%

Publisher:

Abstract:

Recent research has suggested that forecast evaluation on the basis of standard statistical loss functions could prefer models which are sub-optimal when used in a practical setting. This paper explores a number of statistical models for predicting the daily volatility of several key UK financial time series. The out-of-sample forecasting performance of various linear and GARCH-type models of volatility is compared with forecasts derived from a multivariate approach. The forecasts are evaluated using traditional metrics, such as mean squared error, and also by how adequately they perform in a modern risk management setting. We find that the relative accuracies of the various methods are highly sensitive to the measure used to evaluate them. Such results have implications for any econometric time series forecasts which are subsequently employed in financial decision-making.
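A GARCH(1,1) variance forecast and its evaluation under a squared-error loss can be sketched as follows. The parameters are fixed rather than estimated by maximum likelihood, and the data are synthetic, so this only illustrates the kind of statistical loss criterion the paper contrasts with risk-management measures.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """One-step-ahead conditional variance path of a GARCH(1,1) model:
    sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t]."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = returns.var()               # initialize at sample variance
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r**2 + beta * sigma2[t]
    return sigma2

rng = np.random.default_rng(3)
r = rng.normal(0, 0.01, size=1000)          # synthetic daily returns
s2 = garch11_variance(r, omega=1e-6, alpha=0.08, beta=0.90)

# Statistical evaluation: MSE of variance forecasts against squared
# returns, the usual noisy proxy for realized variance.
mse = np.mean((r[1:]**2 - s2[1:-1])**2)
print(mse)
```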

Relevance:

30.00%

Publisher:

Abstract:

A set of random variables is exchangeable if its joint distribution function is invariant under permutation of the arguments. The concept of exchangeability is discussed, with a view towards potential application in evaluating ensemble forecasts. It is argued that the paradigm of ensembles being an independent draw from an underlying distribution function is probably too narrow; allowing ensemble members to be merely exchangeable might be a more versatile model. The question is discussed whether established methods of ensemble evaluation need alteration under this model, with reliability being given particular attention. It turns out that the standard methodology of rank histograms can still be applied. As a first application of the exchangeability concept, it is shown that the method of minimum spanning trees to evaluate the reliability of high dimensional ensembles is mathematically sound.
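The rank histogram, which the paper shows remains a sound reliability diagnostic under mere exchangeability, can be sketched as below; the ensemble data are synthetic and ties are ignored for simplicity.

```python
import numpy as np

def rank_histogram(observations, ensembles):
    """Rank of each observation within its ensemble, tallied over cases.
    If observation and members are exchangeable, all ranks are equally
    likely, so a flat histogram indicates a reliable ensemble."""
    n_cases, n_members = ensembles.shape
    ranks = np.sum(ensembles < observations[:, None], axis=1)
    return np.bincount(ranks, minlength=n_members + 1)

rng = np.random.default_rng(4)
ens = rng.normal(size=(5000, 10))       # 10-member synthetic ensemble
obs = rng.normal(size=5000)             # drawn from the same distribution
print(rank_histogram(obs, ens))         # approximately flat counts
```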

Relevance:

30.00%

Publisher:

Abstract:

The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications to make the Cumulative Distribution Function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the normal-score transform. This paper discusses some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems, and outlines a novel way to solve them by combining extreme value analysis and non-parametric regression methods. The method is illustrated by examples of hydrological stream-flow forecasts.
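A minimal sketch of the NQT using empirical plotting positions follows. The small-sample problem the paper addresses arises exactly here: the empirical transform is only defined within the range of the observed sample, so extrapolating to more extreme floods requires something like the extreme-value and non-parametric extensions the paper proposes. The Weibull plotting position is one common choice, not necessarily the paper's.

```python
import numpy as np
from scipy import stats

def nqt(sample):
    """Normal Quantile Transform (a.k.a. normal-score transform):
    replace each value by the standard-normal quantile of its
    plotting-position probability, making the sample Gaussian."""
    n = len(sample)
    ranks = stats.rankdata(sample)          # ranks 1..n, ties averaged
    p = ranks / (n + 1)                     # Weibull plotting positions
    return stats.norm.ppf(p)

# Example: a skewed synthetic discharge record becomes ~N(0, 1).
flows = np.random.default_rng(5).gamma(2.0, 50.0, size=200)
z = nqt(flows)
```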

Relevance:

30.00%

Publisher:

Abstract:

Many macroeconomic series, such as U.S. real output growth, are sampled quarterly, although potentially useful predictors are often observed at a higher frequency. We examine whether a mixed-data sampling (MIDAS) approach can improve forecasts of output growth. The MIDAS specification used in the comparison employs a novel way of including an autoregressive term. We find that the use of monthly data on the current quarter leads to significant improvements in forecasting current- and next-quarter output growth, and that MIDAS is an effective way to exploit monthly data compared with alternative methods.
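A generic MIDAS regression with exponentially weighted monthly lags can be sketched as follows; the paper's novel autoregressive specification is not reproduced here, and in practice the parameters would be estimated jointly by nonlinear least squares.

```python
import numpy as np

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag polynomial commonly used to parameterize
    MIDAS weights on high-frequency lags with only two parameters."""
    k = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * k + theta2 * k**2)
    return w / w.sum()

def midas_fitted(y_lag, x_monthly, beta0, beta1, rho, theta):
    """Quarterly fitted value from an AR(1) term plus a weighted sum of
    monthly indicator lags:
    y_t = beta0 + rho * y_{t-1} + beta1 * sum_k w_k * x_{t,k}."""
    w = exp_almon_weights(*theta, x_monthly.shape[1])
    return beta0 + rho * y_lag + beta1 * (x_monthly @ w)
```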

Relevance:

30.00%

Publisher:

Abstract:

Following trends in operational weather forecasting, where ensemble prediction systems (EPS) are now increasingly the norm, flood forecasters are beginning to experiment with similar ensemble methods. Most of the effort to date has focused on the substantial technical challenges of developing coupled rainfall-runoff systems to represent the full cascade of uncertainties involved in predicting future flooding. As a consequence, much less attention has been given to the communication and eventual use of EPS flood forecasts. Drawing on interviews and other research with operational flood forecasters from across Europe, this paper highlights a number of challenges to communicating and using ensemble flood forecasts operationally. It is shown that operational flood forecasters understand the skill, operational limitations, and informational value of EPS products in a variety of different and sometimes contradictory ways. Despite the efforts of forecasting agencies to design effective ways to communicate EPS forecasts to non-experts, operational flood forecasters were often skeptical about the ability of forecast recipients to understand or use them appropriately. It is argued that better training and closer contacts between operational flood forecasters and EPS system designers can help ensure that the uncertainty represented by EPS forecasts is conveyed in ways that are appropriate and meaningful for the intended consumers. Some fundamental political and institutional challenges to using ensembles, such as differing attitudes to false alarms and to responsibility for the management of blame in the event of poor or mistaken forecasts, are also highlighted. Copyright © 2010 Royal Meteorological Society.