987 results for Forecasts


Relevance:

10.00%

Publisher:

Abstract:

Uncertainty is a concomitant factor of almost all real-world commodities, such as oil prices, stock prices, and the sales and demand of products. As a consequence, forecasting problems are becoming more and more challenging and ridden with uncertainty. Such uncertainties are generally quantified by statistical tools such as prediction intervals (PIs). PIs quantify the uncertainty related to forecasts by estimating the ranges of the targeted quantities. PIs generated by traditional neural network based approaches are limited by a high computational burden and impractical assumptions about the distribution of the data. A novel technique for constructing high-quality PIs using support vector machines (SVMs) is proposed in this paper. The proposed technique directly estimates the upper and lower bounds of the PI in a short time and without any assumptions about the data distribution. The SVM parameters are tuned using a particle swarm optimization technique by minimizing a modified PI-based objective function. Electricity price and demand data from the Ontario electricity market are used to validate the performance of the proposed technique. Several case studies for different months indicate the superior performance of the proposed method in terms of high-quality PI generation and shorter computational times.
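As a rough illustration of the kind of PI-based objective minimized in such methods (the paper's exact formulation is not reproduced; the coverage-width form below, including the names `pi_objective` and `eta` and the exponential penalty switch, is an assumption), a cost can reward narrow intervals while penalizing coverage below a nominal level:

```python
# Hypothetical sketch of a PI-based objective: reward narrow intervals
# (low PINAW) and penalize coverage (PICP) below the nominal level.
import math

def pi_objective(y, lower, upper, nominal=0.90, eta=50.0):
    """Coverage-width cost for a set of prediction intervals."""
    n = len(y)
    covered = sum(l <= t <= u for t, l, u in zip(y, lower, upper))
    picp = covered / n                          # PI coverage probability
    y_range = max(y) - min(y)
    pinaw = sum(u - l for l, u in zip(lower, upper)) / (n * y_range)
    # Penalty switches on only when coverage falls below the nominal level.
    gamma = 1.0 if picp < nominal else 0.0
    return pinaw * (1.0 + gamma * math.exp(-eta * (picp - nominal)))

y = [10.0, 12.0, 11.0, 13.0]
wide = pi_objective(y, [8, 8, 8, 8], [15, 15, 15, 15])       # covers, but wide
tight = pi_objective(y, [9.5, 11.5, 10.5, 12.5],
                        [10.5, 12.5, 11.5, 13.5])            # covers, narrow
assert tight < wide
```

A swarm optimizer would then search the SVM parameters that minimize this cost over a validation set.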


Understanding how agents formulate their expectations about Fed behavior is important for market participants because they can potentially use this information to make more accurate estimates of stock and bond prices. Although it is commonly assumed that agents learn over time, there is scant empirical evidence in support of this assumption. Thus, in this paper we test whether the forecast of the three-month T-bill rate in the Survey of Professional Forecasters (SPF) is consistent with least squares learning when there are discrete shifts in monetary policy. We first derive the mean, variance and autocovariances of the forecast errors from a recursive least squares learning algorithm when there are breaks in the structure of the model. We then apply the Bai and Perron (1998) test for structural change to a forecasting model for the three-month T-bill rate in order to identify changes in monetary policy. Having identified the policy regimes, we then estimate the implied biases in the interest rate forecasts within each regime. We find that when the forecast errors from the SPF are corrected for the biases due to shifts in policy, the forecasts are consistent with least squares learning.
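The mechanism can be illustrated with a deliberately minimal sketch (a constant-mean model with one discrete break, not the paper's T-bill specification): under decreasing-gain least squares learning, forecast errors are unbiased within a regime but systematically biased just after a policy shift.

```python
# Minimal sketch: recursive least squares learning of a constant mean,
# with a discrete break halfway through the sample. Early post-break
# forecast errors are systematically biased, the kind of regime-induced
# bias the paper corrects for.
def rls_mean_forecasts(series):
    theta, forecasts = series[0], []
    for t, y in enumerate(series[1:], start=1):
        forecasts.append(theta)              # forecast made before seeing y
        theta += (y - theta) / (t + 1)       # decreasing-gain LS update
    return forecasts

pre = [2.0] * 50                             # regime 1: rate = 2%
post = [4.0] * 50                            # regime 2: rate = 4%
series = pre + post
fc = rls_mean_forecasts(series)
errors = [y - f for y, f in zip(series[1:], fc)]
pre_bias = sum(errors[:49]) / 49
post_bias = sum(errors[49:]) / len(errors[49:])
assert abs(pre_bias) < 1e-9        # unbiased within regime 1
assert post_bias > 0.5             # upward-biased just after the policy shift
```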


The penetration of intermittent renewable energy sources (IRESs) into power grids has increased over the last decade. The integration of wind farms and solar systems as the major IRESs has significantly raised the level of uncertainty in the operation of power systems. This paper proposes a comprehensive computational framework for the quantification and integration of uncertainties in distributed power systems (DPSs) with IRESs. Different sources of uncertainty in DPSs, such as electrical load, wind and solar power forecasts, and generator outages, are covered by the proposed framework. Load forecast uncertainty is assumed to follow a normal distribution. Wind and solar forecast uncertainties are represented by a list of prediction intervals (PIs) with nominal confidence levels ranging from 5% to 95%, and are further converted into scenarios using a scenario generation method. Generator outage uncertainty is modeled as discrete scenarios. The integrated uncertainties are then incorporated into a stochastic security-constrained unit commitment (SCUC) problem, and a heuristic genetic algorithm is utilized to solve this stochastic SCUC problem. To demonstrate the effectiveness of the proposed method, five deterministic and four stochastic case studies are implemented. Generation costs as well as different reserve strategies are discussed from the perspectives of system economics and reliability. Comparative results indicate that the planned generation costs and reserves differ from the realized ones, that the stochastic models show better robustness than the deterministic ones, and that power systems run a higher level of risk during peak load hours.
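One simple way to turn a normal load-forecast assumption into discrete scenarios is quantile-midpoint sampling, sketched below; the paper's exact scenario-generation method may differ, and `load_scenarios` is an invented name:

```python
# Hedged sketch of a scenario-generation step: load-forecast uncertainty
# is taken as normal, and equiprobable discrete scenarios are formed from
# the quantile midpoints of that distribution.
from statistics import NormalDist

def load_scenarios(mean_mw, sigma_mw, n=5):
    """Equiprobable scenarios from quantile midpoints of N(mean, sigma)."""
    dist = NormalDist(mean_mw, sigma_mw)
    probs = [(i + 0.5) / n for i in range(n)]
    return [(dist.inv_cdf(p), 1.0 / n) for p in probs]

scen = load_scenarios(1000.0, 50.0, n=5)
values = [v for v, _ in scen]
assert abs(sum(w for _, w in scen) - 1.0) < 1e-9    # probabilities sum to 1
assert values == sorted(values)                     # quantiles are ordered
assert abs(sum(values) / len(values) - 1000.0) < 1.0  # mean preserved
```

Each (value, probability) pair would then enter the stochastic SCUC problem as one load scenario.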


The value of accurate weather forecast information is substantial. In this paper we examine competition among forecast providers and its implications for the quality of forecasts. A simple economic model shows that an economic bias, namely geographical inequality in forecast accuracy, arises from the extent of the market. Using unique data on daily high-temperature forecasts for 704 U.S. cities, we find that forecast accuracy increases with population and income. Furthermore, the economic bias grows larger as the day of forecasting approaches the target day, i.e., when people are more concerned about the quality of forecasts. The results hold even after we control for location-specific heterogeneity and the difficulty of forecasting.


Uncertainty in electricity prices makes accurate forecasting quite difficult for electricity market participants. Prediction intervals (PIs) are statistical tools that quantify the uncertainty related to forecasts by estimating the ranges of future electricity prices. Traditional approaches based on neural networks (NNs) generate PIs at the cost of a high computational burden and doubtful assumptions about data distributions. In this work, we propose a novel technique that is not plagued by these limitations and generates high-quality PIs in a short time. The proposed method directly generates the lower and upper bounds of future electricity prices using support vector machines (SVMs). Optimal model parameters are obtained by minimizing a modified PI-based objective function using a particle swarm optimization (PSO) technique. The efficiency of the proposed method is illustrated using data from the Ontario and Pennsylvania-New Jersey-Maryland (PJM) interconnection day-ahead and real-time markets.


Neural networks (NNs) are an effective tool to model nonlinear systems. However, their forecasting performance significantly drops in the presence of process uncertainties and disturbances. NN-based prediction intervals (PIs) offer an alternative solution to appropriately quantify uncertainties and disturbances associated with point forecasts. In this paper, an NN ensemble procedure is proposed to construct quality PIs. A recently developed lower-upper bound estimation method is applied to develop NN-based PIs. Then, constructed PIs from the NN ensemble members are combined using a weighted averaging mechanism. Simulated annealing and a genetic algorithm are used to optimally adjust the weights for the aggregation mechanism. The proposed method is examined for three different case studies. Simulation results reveal that the proposed method improves the average PI quality of individual NNs by 22%, 18%, and 78% for the first, second, and third case studies, respectively. The simulation study also demonstrates that a 3%-4% improvement in the quality of PIs can be achieved using the proposed method compared to the simple averaging aggregation method.
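The weighted-averaging aggregation step alone can be sketched as follows; the weights here are placeholders for those the paper tunes with simulated annealing or a genetic algorithm, and `combine_pis` is an invented name:

```python
# Sketch of weighted aggregation of ensemble PIs: the combined interval
# is the weight-normalized average of the members' lower and upper bounds.
def combine_pis(member_pis, weights):
    """Weighted average of (lower, upper) PIs from ensemble members."""
    s = sum(weights)
    w = [x / s for x in weights]               # normalize the weights
    lower = sum(wi * l for wi, (l, _) in zip(w, member_pis))
    upper = sum(wi * u for wi, (_, u) in zip(w, member_pis))
    return lower, upper

pis = [(9.0, 13.0), (10.0, 12.0), (8.5, 12.5)]
lo, up = combine_pis(pis, [1.0, 2.0, 1.0])
assert lo < up
assert abs(lo - (9.0 + 20.0 + 8.5) / 4) < 1e-9
```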


The aim of this research is to examine the efficiency of applying different aggregation algorithms to the forecasts obtained from individual neural network (NN) models in an ensemble. In this study, an ensemble of 100 NN models is constructed with heterogeneous architectures. The outputs of the NN models are combined by three different aggregation algorithms: a simple average, a trimmed mean, and Bayesian model averaging. These methods are used with certain modifications and applied to the forecasts obtained from all individual NN models. The output of each aggregation algorithm is analyzed and compared with the individual NN models in the ensemble and with a naive approach. Thirty-minute interval electricity demand data from the Australian Energy Market Operator (AEMO) and the New York Independent System Operator (NYISO) are used in the empirical analysis. The aggregation algorithms are observed to perform better than many of the individual NN models, and in comparison with the naive approach they exhibit somewhat better forecasting performance.
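Two of the three combiners reduce to a few lines; the sketch below, with an invented outlier example, shows why a trimmed mean can be more robust than a simple average (the Bayesian model averaging combiner is omitted for brevity):

```python
# Two ensemble combiners: a simple average and a trimmed mean that drops
# the top and bottom fraction of member forecasts before averaging.
def simple_average(fc):
    return sum(fc) / len(fc)

def trimmed_mean(fc, trim=0.1):
    """Average after dropping the top and bottom `trim` fraction."""
    k = int(len(fc) * trim)
    kept = sorted(fc)[k:len(fc) - k] if k else sorted(fc)
    return sum(kept) / len(kept)

forecasts = [98.0, 100.0, 101.0, 102.0, 140.0]   # one outlier member
assert abs(simple_average(forecasts) - 108.2) < 1e-9
assert trimmed_mean(forecasts, trim=0.2) < simple_average(forecasts)
```

Here the trimmed mean discards the 140.0 outlier, pulling the combined forecast back toward the majority of the ensemble.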


The bulk of existing work on the statistical forecasting of air quality is based on either neural networks or linear regressions, both of which are subject to important drawbacks. In particular, while neural networks are complicated and prone to in-sample overfitting, linear regressions are highly dependent on the specification of the regression function. The present paper shows how combining linear regression forecasts can be used to circumvent these problems. The usefulness of the proposed combination approach is verified using both Monte Carlo simulation and an extensive application to air quality in Bogota, one of the largest and most polluted cities in Latin America.


The objective of this paper is to demonstrate the ability of visualization and simulation techniques to aid and simulate current and future directions in coastal planning. The visualization process interrogates the coastal settlements of Portland, Apollo Bay, Anglesea and Hobsons Bay on the south-eastern Australian seaboard through a progression of projections and simulated forecasts from 2014 to 2050, to see whether a process or methodology could help in planning the future growth of coastal settlements. The analysis uses Geographic Information Systems (GIS) together with planning application software.


This paper examines volatility asymmetry in a financial market using a stochastic volatility framework. We use the MCMC method for model estimations. There is evidence of volatility asymmetry in the data. Our asymmetric stochastic volatility in mean model, which nests both asymmetric stochastic volatility (ASV) and stochastic volatility in mean models (SVM), indicates ASV sufficiently captures the risk-return relationship; therefore, augmenting it with volatility in mean does not improve its performance. ASV fits the data better and yields more accurate out-of-sample forecasts than alternatives. We also demonstrate that asymmetry mainly emanates from the systematic parts of returns. As a result, it is more pronounced at the market level and the volatility feedback effect dominates the leverage effect.
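In generic notation (illustrative only; the paper's exact specification is not reproduced), an asymmetric stochastic volatility in mean model of the kind nested above can be written as:

```latex
r_t = \mu + d\,\sigma_t^2 + \sigma_t \varepsilon_t, \qquad
\log\sigma_{t+1}^2 = \alpha + \phi \log\sigma_t^2 + \sigma_\eta \eta_t, \qquad
\operatorname{corr}(\varepsilon_t, \eta_t) = \rho,
```

where $\rho < 0$ produces the volatility asymmetry and setting the in-mean coefficient $d = 0$ recovers the plain ASV model.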


Prediction intervals (PIs) have been extensively used to forecast nonlinear systems, as PI-based forecasts are superior to point forecasts in quantifying the uncertainties and disturbances associated with real processes. In addition, PIs carry more information than point forecasts, such as the forecast accuracy. The aim of this paper is to integrate the concept of informative PIs into control applications to improve the tracking performance of nonlinear controllers. In the present work, a PI-based controller (PIC) is proposed to control nonlinear processes. A neural network (NN) inverse model is used as the controller in the proposed method. First, a PI-based model is developed to construct PIs for every sample or time instance. The PIs are then fed to the NN inverse model along with other effective process inputs and outputs. The PI-based NN inverse model predicts the plant input required to obtain the desired plant output. The performance of the proposed PIC is examined for a nonlinear process. Simulation results indicate that the tracking performance of the PIC is highly acceptable and better than that of the traditional NN inverse model-based controller.
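The controller's input assembly can be pictured with a minimal sketch; the function name and regressor layout below are assumptions, and the trained NN inverse model itself is not reproduced:

```python
# Illustrative input assembly for a PI-based inverse-model controller:
# the PI bounds for the current sample are appended to the usual
# inverse-model regressors (reference, past outputs, past inputs)
# before querying the trained NN inverse model for the plant input.
def controller_input(y_ref, y_hist, u_hist, pi):
    """Regressor vector for a PI-based inverse model."""
    lower, upper = pi
    return [y_ref] + list(y_hist) + list(u_hist) + [lower, upper]

x = controller_input(y_ref=1.2, y_hist=[1.0, 0.9], u_hist=[0.4], pi=(0.8, 1.3))
assert x == [1.2, 1.0, 0.9, 0.4, 0.8, 1.3]
```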


This paper makes use of the idea of prediction intervals (PIs) to capture the uncertainty associated with wind power generation in power systems. Since the forecasting errors cannot be appropriately modeled using probability distribution functions, here we employ a powerful nonparametric approach called the lower upper bound estimation (LUBE) method to construct the PIs. The proposed LUBE method uses a new framework based on a combination of PIs to overcome the performance instability of the neural networks (NNs) used in the LUBE method. Also, a new fuzzy-based cost function is proposed with the purpose of having more freedom and flexibility in adjusting the NN parameters used for the construction of PIs. In comparison with other cost functions in the literature, this new formulation allows decision-makers to apply their preferences for satisfying the PI coverage probability and PI normalized average width individually. As the optimization tool, a bat algorithm with a new modification is introduced to solve the problem. The feasibility and satisfying performance of the proposed method are examined using datasets taken from different wind farms in Australia.
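The fuzzy cost itself is not reproduced here; the sketch below only illustrates the underlying idea of scoring coverage (PICP) and width (PINAW) with separate decision-maker preference weights rather than one fused criterion (all names and weights are invented):

```python
# Toy preference-weighted PI cost: coverage shortfall and interval width
# are penalized with separate decision-maker weights.
def preference_cost(picp, pinaw, nominal=0.90, w_cov=10.0, w_width=1.0):
    coverage_shortfall = max(0.0, nominal - picp)
    return w_cov * coverage_shortfall + w_width * pinaw

# A cautious operator (large w_cov) prefers the wider, better-covering PI.
a = preference_cost(picp=0.92, pinaw=0.40)   # wide, but covers
b = preference_cost(picp=0.80, pinaw=0.20)   # narrow, but under-covers
assert a < b
```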


The reliable evaluation of flood forecasting is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed or lumped) have been proposed to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty in the correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed, uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented to estimate the outflow hydrograph. Then, synthetic observations of uncertain discharge values were generated as a function of the observed and simulated flow at the basin outlet and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed to update the model state in response to the uncertain discharge observations. The results of this work point out that, overall, the assimilation of uncertain observations can improve hydrologic model performance. In particular, it was found that the model structure is an important factor, difficult to characterize, since it can induce different forecasts of the outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
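The assimilation step can be illustrated with a scalar Kalman update (the numbers are illustrative, not from the Bacchiglione study): the more uncertain the discharge observation, the less the modelled state is corrected.

```python
# Scalar sketch of the assimilation step: a modelled discharge state is
# corrected toward an uncertain observation via the Kalman gain.
def kalman_update(x_model, p_model, z_obs, r_obs):
    """Updated state and variance given observation z with variance r."""
    k = p_model / (p_model + r_obs)          # Kalman gain in [0, 1]
    x_new = x_model + k * (z_obs - x_model)
    p_new = (1.0 - k) * p_model
    return x_new, p_new

x, p = kalman_update(x_model=120.0, p_model=25.0, z_obs=140.0, r_obs=100.0)
assert 120.0 < x < 140.0     # update moves toward, not past, the observation
assert p < 25.0              # uncertainty shrinks after assimilation
```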


Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin, while the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation with respect to flood damage minimization. In the final phase of this research the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
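A toy version of the GA half of the MPC-GA loop can be sketched as follows; the stand-in cost function plays the role of the conceptual river model, and all names and numbers are invented:

```python
# Toy GA over weir gate openings in [0, 1]: candidates are scored by a
# stand-in flood-cost function and evolved by elitist selection plus
# Gaussian mutation. In the real method the cost of each candidate comes
# from the predictive (conceptual) river model.
import random

random.seed(7)

def flood_cost(gates):
    # Stand-in for the river model: penalize squared deviation from a
    # hypothetical best opening per gate.
    target = [0.3, 0.7, 0.5]
    return sum((g - t) ** 2 for g, t in zip(gates, target))

def evolve(pop_size=20, generations=40, sigma=0.1):
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=flood_cost)
        parents = pop[: pop_size // 2]                  # keep the fitter half
        children = [[min(1.0, max(0.0, g + random.gauss(0, sigma)))
                     for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=flood_cost)

best = evolve()
assert all(0.0 <= g <= 1.0 for g in best)   # feasible gate positions
assert flood_cost(best) < 0.05              # far better than typical random
```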


We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP-connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach in the efficient real-time measurement of stormwater systems.