905 results for Forecasting methodologies


Relevance: 100.00%

Abstract:

In this paper, a comparative analysis of the long-term electric power forecasting methodologies used in some South American countries is presented. The purpose of this study is to compare these methodologies, observe whether they share similarities, and examine the behavior of their results when applied to the Brazilian electricity market. The power forecasts were performed for the four main consumption classes (residential, industrial, commercial and rural), which together account for approximately 90% of national consumption. The analysis was carried out with the SAS software. The study identified several methodological similarities, mainly related to the econometric variables used by the methods, a fact that strongly conditioned the comparative results obtained.

Relevance: 70.00%

Abstract:

This paper presents several forecasting methodologies, based on Artificial Neural Networks (ANN) and Support Vector Machines (SVM), for predicting solar radiance intensity. The methodologies differ in the information used to train the methods, i.e., in the complementary environmental variables considered, such as wind speed, temperature, and humidity. Different ways of organizing the data series information were also considered. Sensitivity tests were performed on all methodologies in order to find the best parameterization for each proposed approach. Results show that the SVM approach using the exponential Radial Basis Function (eRBF) kernel achieves the best forecasting results, in half the execution time of the ANN-based approaches.
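
As a hedged illustration of the eRBF kernel mentioned in this abstract (not the authors' code), the sketch below uses kernel ridge regression — a simpler stand-in for SVM regression — with the exponential RBF kernel k(x, y) = exp(-||x - y|| / sigma). Feature names (wind speed, temperature, humidity) and the bandwidth/regularization values are illustrative assumptions.

```python
import numpy as np

def erbf_kernel(X, Y, sigma=1.0):
    """Exponential RBF kernel: k(x, y) = exp(-||x - y|| / sigma)."""
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return np.exp(-d / sigma)

def fit(X, y, sigma=1.0, lam=1e-6):
    """Kernel ridge regression fit: solve (K + lam*I) alpha = y."""
    K = erbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    """Predicted radiance for new rows of environmental features
    (e.g. wind speed, temperature, humidity) -- illustrative only."""
    return erbf_kernel(X_new, X_train, sigma) @ alpha
```

An actual SVM regressor would add the epsilon-insensitive loss and sparsity in alpha; the kernel itself is used the same way.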

Relevance: 60.00%

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for the degree of Master in Logistics.

Relevance: 60.00%

Abstract:

This paper examines to what extent crops and their environment should be viewed as a coupled system. Crop impact assessments currently use climate model output offline to drive process-based crop models. However, in regions where local climate is sensitive to land surface conditions more consistent assessments may be produced with the crop model embedded within the land surface scheme of the climate model. Using a recently developed coupled crop–climate model, the sensitivity of local climate, in particular climate variability, to climatically forced variations in crop growth throughout the tropics is examined by comparing climates simulated with dynamic and prescribed seasonal growth of croplands. Interannual variations in land surface properties associated with variations in crop growth and development were found to have significant impacts on near-surface fluxes and climate; for example, growing season temperature variability was increased by up to 40% by the inclusion of dynamic crops. The impact was greatest in dry years where the response of crop growth to soil moisture deficits enhanced the associated warming via a reduction in evaporation. Parts of the Sahel, India, Brazil, and southern Africa were identified where local climate variability is sensitive to variations in crop growth, and where crop yield is sensitive to variations in surface temperature. Therefore, offline seasonal forecasting methodologies in these regions may underestimate crop yield variability. The inclusion of dynamic crops also altered the mean climate of the humid tropics, highlighting the importance of including dynamical vegetation within climate models.

Relevance: 30.00%

Abstract:

In recent years, power systems have undergone many paradigm changes. The introduction of new players in the management of distributed generation leads to the decentralization of control and decision making, so that each player is able to act in the market environment. In this new context, aggregator players will be essential in enabling medium, small and micro players to act in a competitive environment. In order to achieve their objectives, virtual power players and single players are required to optimize their energy resource management process, which demands financial resources capable of providing access to appropriate decision support tools. As small players have difficulty accessing such tools, they need alternative methodologies to support their decisions. This paper presents a methodology, based on Artificial Neural Networks (ANN), intended to support smaller players. The methodology uses a training set created from energy resource scheduling solutions obtained with a mixed-integer linear programming (MIP) approach as the reference optimization methodology. The trained network is used to obtain locational marginal prices in a distribution network. The main goal of the paper is to verify the accuracy of the ANN-based approach. Moreover, the use of a single ANN is compared with the use of two or more ANNs to forecast the locational marginal price.
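
The idea of training a cheap predictor on reference MIP solutions can be sketched as follows. This is a minimal stand-in, not the paper's method: it fits a linear least-squares surrogate (instead of an ANN) mapping operating scenarios to locational marginal prices, using previously computed MIP solutions as the training set; all variable names and data are hypothetical.

```python
import numpy as np

def fit_surrogate(scenarios, lmps):
    """Fit a least-squares surrogate from scenario features to locational
    marginal prices, trained on reference MIP scheduling solutions.
    (Linear stand-in for the ANN described in the abstract.)"""
    A = np.column_stack([np.ones(len(scenarios)), scenarios])
    coef, *_ = np.linalg.lstsq(A, lmps, rcond=None)
    return coef

def predict_lmp(coef, scenario):
    """Predict the locational marginal price for one new scenario."""
    return float(np.concatenate(([1.0], scenario)) @ coef)
```

Once trained, the surrogate answers price queries without re-solving the MIP, which is the point of the approach for small players without optimization tools.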

Relevance: 30.00%

Abstract:

Price forecasting is a matter of concern for all participants in electricity markets, from suppliers and consumers to policy makers, who are interested in accurate forecasts of day-ahead electricity prices, whether for better decision making or for an improved evaluation of the effectiveness of market rules and structure. This paper describes a methodology to forecast market prices in an electricity market using an ARIMA model applied to the conjectural variations of the firms acting in the market. The methodology is applied to the Iberian electricity market to forecast prices for the 24 hours of a working day, and is compared with two other methodologies: a naive forecast and a direct forecast of market prices, also using an ARIMA model. Results show that the conjectural-variations forecast outperforms the naive forecast and performs slightly better than the direct price forecast.
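
The mechanics of fitting an autoregressive model and iterating it over a 24-hour horizon can be sketched as below. This is a hedged, minimal AR(1) stand-in for the ARIMA models compared in the abstract, not the paper's implementation; the series and coefficients are illustrative.

```python
import numpy as np

def fit_ar1(series):
    """OLS fit of p_t = c + phi * p_{t-1}: a minimal ARIMA(1,0,0) stand-in."""
    x, y = series[:-1], series[1:]
    A = np.column_stack([np.ones_like(x), x])
    c, phi = np.linalg.lstsq(A, y, rcond=None)[0]
    return c, phi

def forecast_prices(series, c, phi, horizon=24):
    """Iterate the fitted recursion to produce the hourly forecasts."""
    preds, last = [], series[-1]
    for _ in range(horizon):
        last = c + phi * last
        preds.append(last)
    return np.array(preds)
```

A full ARIMA model adds differencing and moving-average terms, but the forecast loop — feed each prediction back in as the next lag — is the same.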

Relevance: 30.00%

Abstract:

This paper proposes a wind speed forecasting model that contributes to the development and implementation of adequate methodologies for energy resource management in a distribution power network with intensive use of wind-based power generation. The proposed forecasting methodology aims to support operation within the scope of the intraday resource scheduling model, namely with a time horizon of 10 minutes. A case study uses a real database from the meteorological station installed in the GECAD renewable energy lab. A new wind speed forecasting model has been implemented, and its estimated accuracy was evaluated and compared with a previously developed forecasting model. Using the wind speed of the previous 3 hours as input attributes yields highly accurate results for short-term wind forecasting.
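
Turning "the previous 3 hours of wind speed" into model inputs is a standard lag-feature construction; at a 10-minute resolution, 3 hours corresponds to 18 lags. The sketch below is an illustrative assumption about how such inputs could be built, not the paper's code.

```python
import numpy as np

def make_lag_matrix(series, n_lags=18):
    """Build supervised pairs from a wind speed series: each row of X holds
    the n_lags previous readings (18 ten-minute readings = 3 hours) and
    y holds the value to be forecast one step ahead."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y
```

Any regressor can then be trained on (X, y) to produce the 10-minute-ahead forecast.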

Relevance: 30.00%

Abstract:

Forecasting future sales is one of the most important issues underlying all strategic and planning decisions in the effective operation of retail businesses. For profitable retailers, accurate demand forecasting is crucial in organizing and planning production, purchasing, transportation and the labor force. Retail sales series belong to a special type of time series that typically contains trend and seasonal patterns, presenting challenges for developing effective forecasting models. This work compares the forecasting performance of state space models and ARIMA models, demonstrated through a case study of retail sales of five categories of women's footwear: Boots, Booties, Flats, Sandals and Shoes. For both methodologies, the model with the minimum Akaike Information Criterion (AIC) value over the in-sample period was selected from all admissible models for further out-of-sample evaluation. Both one-step and multiple-step forecasts were produced. The results show that, when an automatic algorithm is used, the overall out-of-sample forecasting performance of state space and ARIMA models, evaluated via RMSE, MAE and MAPE, is quite similar for both one-step and multi-step forecasts. We also conclude that state space and ARIMA models produce coverage probabilities close to the nominal rates for both one-step and multi-step forecasts.
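
The three accuracy measures named in this abstract have standard definitions, sketched below for reference (this is generic code, not the authors'):

```python
import numpy as np

def rmse(y, yhat):
    """Root mean squared error."""
    return float(np.sqrt(np.mean((np.asarray(y, float) - np.asarray(yhat, float)) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y, float) - np.asarray(yhat, float))))

def mape(y, yhat):
    """Mean absolute percentage error, in percent; undefined when y contains zeros."""
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(100.0 * np.mean(np.abs((y - yhat) / y)))
```

MAPE's division by the actuals is why it can behave badly on series with near-zero sales, a known caveat when comparing models on retail data.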

Relevance: 30.00%

Abstract:

This work compares the forecast efficiency of different methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over horizons of up to twelve months ahead. The disaggregated models were estimated by SARIMA at different levels of disaggregation. The aggregated models were estimated by time series techniques such as SARIMA, state-space structural models and Markov switching. Forecast accuracy is compared using the Model Confidence Set selection procedure and the Diebold-Mariano test. We find evidence of forecast accuracy gains in models using more disaggregated data.
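
The Diebold-Mariano test mentioned above compares two forecast error series via their loss differential. The sketch below is a simplified version without the autocorrelation (HAC) correction normally used for multi-step forecasts; it is illustrative, not the authors' implementation.

```python
import numpy as np

def diebold_mariano(e1, e2, power=2):
    """Simplified Diebold-Mariano statistic for equal predictive accuracy,
    with loss |e|**power. Positive values indicate model 2 is more accurate.
    No autocorrelation correction is applied (needed for horizons > 1)."""
    e1 = np.abs(np.asarray(e1, float)) ** power
    e2 = np.abs(np.asarray(e2, float)) ** power
    d = e1 - e2                     # loss differential series
    se = np.sqrt(d.var(ddof=1) / len(d))
    return float(d.mean() / se)
```

Under the null of equal accuracy the statistic is approximately standard normal, so values beyond about ±1.96 indicate a significant accuracy difference at the 5% level.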

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Multinodal load forecasting deals with the loads of several nodes of interest in an electrical network, and is also known as bus load forecasting. This task requires a technique that is precise, reliable and fast. This paper proposes two methodologies based on general regression neural networks for short-term multinodal load forecasting. The first forecasts the local loads individually; the second forecasts the global load and individually forecasts the load participation factors used to estimate the local loads. No prior study of the local loads was necessary to design the forecasters. Tests were made using a New Zealand distribution subsystem, and the results obtained are compatible with those found in the specialized literature. © 2011 IEEE.
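
The second methodology's disaggregation step — recovering bus loads from a global forecast via participation factors — can be sketched as follows (an illustrative assumption about the arithmetic, not the paper's code):

```python
import numpy as np

def local_loads(global_load, factors):
    """Estimate bus (local) loads from a global load forecast and per-bus
    participation factors. Factors are renormalised defensively so each
    period's local loads sum to the global forecast."""
    factors = np.asarray(factors, dtype=float)
    factors = factors / factors.sum()
    # outer product: one row of bus loads per forecast period
    return np.outer(np.atleast_1d(global_load), factors)
```

In the paper's scheme the participation factors are themselves forecast per period, so in practice `factors` would vary over time rather than stay fixed.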

Relevance: 30.00%

Abstract:

Forecasting the time, location, nature, and scale of volcanic eruptions is one of the most urgent aspects of modern applied volcanology. The reliability of probabilistic forecasting procedures is strongly related to the reliability of the input information provided, implying objective criteria for interpreting the historical and monitoring data. For this reason, both detailed analysis of past data and more basic research into the processes of volcanism are fundamental tasks of a continuous information-gain process; in this way the precursor events of eruptions can be better interpreted in terms of their physical meaning, with correlated uncertainties. This should lead to better predictions of the nature of eruptive events. In this work we have studied different problems associated with long- and short-term eruption forecasting. First, we discuss different approaches to the analysis of the eruptive history of a volcano, most of them generally applied for long-term eruption forecasting purposes; furthermore, we present a model based on the characteristics of a Brownian passage-time process to describe recurrent eruptive activity, and apply it to long-term, time-dependent eruption forecasting (Chapter 1). Conversely, in an effort to define further monitoring parameters as input data for short-term eruption forecasting in probabilistic models (for example, the Bayesian Event Tree for eruption forecasting, BET_EF), we analyze some characteristics of the typical seismic activity recorded in active volcanoes; in particular, we use methodologies that may be applied to analyze long-period (LP) events (Chapter 2) and volcano-tectonic (VT) seismic swarms (Chapter 3); our analyses are generally oriented toward tracking phenomena that can provide information about magmatic processes.
Finally, we discuss possible ways to integrate the results presented in Chapter 1 (for long-term eruption forecasting) and Chapters 2 and 3 (for short-term eruption forecasting) into the BET_EF model (Chapter 4).
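
The Brownian passage-time distribution used for recurrent-event forecasting has a known closed-form density (the inverse Gaussian), parameterised by a mean recurrence interval and an aperiodicity (coefficient of variation). The sketch below states that density for reference; the parameter values in the usage are illustrative, not from this work.

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage-time (inverse Gaussian) probability density:
    f(t) = sqrt(mu / (2*pi*alpha^2*t^3)) * exp(-(t - mu)^2 / (2*mu*alpha^2*t)),
    with mean recurrence interval mu and aperiodicity alpha."""
    if t <= 0:
        return 0.0
    coef = math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3))
    return coef * math.exp(-(t - mu) ** 2 / (2.0 * mu * alpha ** 2 * t))
```

Time-dependent forecasts follow by integrating this density: the probability of an event in (t, t + dt), given quiescence up to t, is the hazard f(t) / (1 - F(t)).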

Relevance: 30.00%

Abstract:

This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely, recurrent neural networks and kernel recursive least squares regression - techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.

Relevance: 30.00%

Abstract:

This paper provides the most fully comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid 2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two nonlinear techniques, namely, recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were nonlinear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. Beyond its economic findings, our study is in the tradition of physicists' long-standing interest in the interconnections among statistical mechanics, neural networks, and related nonparametric statistical methods, and suggests potential avenues of extension for such studies. © 2010 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate a lane or lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common, with managed lane demand usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes.

Managed lanes are particularly effective when the road is functioning at near capacity, so capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when modeling managed lanes in congested environments. In addition, the study develops processes to support the effective use of DTA to model managed lane operations.

Static and dynamic traffic assignment consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.

With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used at different stages of modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as proper definition of performance measures, results in a calibrated and stable model which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.