969 results for Forecasting model


Relevance:

30.00%

Publisher:

Abstract:

A massive amount has been written about forecasting, but few articles address the development of time series models for the call volumes of emergency services. In this study, we apply different forecasting techniques and compare them on the call volume of the emergency service Rescue 1122 in Lahore, Pakistan. The data consist of 731 daily observations of emergency calls to Rescue 1122 from 1 January 2008 to 31 December 2009. Our goal is to develop a simple model that can be used to forecast the daily call volume. Two approaches are used: the Box-Jenkins (ARIMA) methodology and the smoothing methodology. We build forecasting models for the call volume and present a comparison of the two techniques.
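As a toy illustration (not the paper's actual Rescue 1122 models), the two approaches can be compared on a synthetic daily series using only the standard library; here an AR(1)-based forecast (a minimal stand-in for the Box-Jenkins route) is pitted against simple exponential smoothing:

```python
import random

def ses_forecasts(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts for series[1:]."""
    level = series[0]
    out = []
    for y in series[1:]:
        out.append(level)                      # forecast made before observing y
        level = alpha * y + (1 - alpha) * level
    return out

def ar1_forecasts(series):
    """Fit y_t = a + b*y_{t-1} by OLS and return one-step forecasts for series[1:]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return [a + b * xi for xi in x]

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

# Synthetic "daily call volume": 731 days of an AR(1)-like process around 200
rng = random.Random(1)
calls = [200.0]
for _ in range(730):
    calls.append(60.0 + 0.7 * calls[-1] + rng.gauss(0, 10))

actual = calls[1:]
err_smooth = mae(actual, ses_forecasts(calls))
err_ar = mae(actual, ar1_forecasts(calls))
```

Since the synthetic data are generated by an AR(1), the fitted autoregression should track it more closely than the smoother; on real call data the ranking would be an empirical question, as in the study.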

Relevance:

30.00%

Publisher:

Abstract:

This work concerns forecasting with vector nonlinear time series models when the errors are correlated. Point forecasts are obtained numerically using bootstrap methods and illustrated by two examples. The evaluation concentrates on studying forecast equality and forecast encompassing. Nonlinear impulse responses are further considered and summarized graphically by highest density regions. Finally, two macroeconomic data sets are used to illustrate our work. The forecasts from a linear or a nonlinear model can contribute useful information absent from the forecasts of the other model.
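The bootstrap route to point forecasts can be sketched for a univariate nonlinear autoregression (the paper works with vector models; the threshold function and residual set below are illustrative assumptions, not the authors' specification):

```python
import random

def bootstrap_forecasts(y_last, f, residuals, horizon=3, reps=2000, seed=0):
    """Multi-step point forecasts for y_t = f(y_{t-1}) + e_t, obtained by
    averaging simulated paths that resample the fitted residuals."""
    rng = random.Random(seed)
    sums = [0.0] * horizon
    for _ in range(reps):
        y = y_last
        for h in range(horizon):
            y = f(y) + rng.choice(residuals)   # resample one shock per step
            sums[h] += y
    return [s / reps for s in sums]

# A simple threshold (nonlinear) autoregression standing in for the fitted model
f = lambda y: 0.8 * y if y > 0 else 0.2 * y
resid = [-0.5, -0.1, 0.0, 0.1, 0.5]            # zero-mean fitted residuals
fc = bootstrap_forecasts(1.0, f, resid, horizon=3)
```

Simulation is needed because, unlike the linear case, iterating f on the point forecast does not in general equal the conditional mean at horizons beyond one.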

Relevance:

30.00%

Publisher:

Abstract:

The reliable evaluation of flood forecasting is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed or lumped) have been proposed to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty in the correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into different sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented to estimate the outflow hydrograph. Then, synthetic observations of uncertain discharge values were generated as a function of the observed and simulated flow at the basin outlet and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed to update the model state in response to the uncertain discharge observations. The results of this work point out that, overall, the assimilation of uncertain observations can improve hydrologic model performance. In particular, it was found that the model structure is an important factor, difficult to characterize, since it can induce different forecasts in terms of outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
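The core of the assimilation step is the Kalman update, which blends the model's forecast with an uncertain observation in proportion to their variances. A minimal scalar sketch (the paper's filter operates on the full model state; the numbers here are illustrative):

```python
def kalman_update(x, P, z, R, H=1.0):
    """One scalar Kalman filter update of a model state x (variance P)
    with an uncertain observation z (observation variance R)."""
    K = P * H / (H * P * H + R)       # Kalman gain
    x_new = x + K * (z - H * x)       # corrected state
    P_new = (1.0 - K * H) * P         # reduced state uncertainty
    return x_new, P_new

# Model forecasts 100 m3/s with variance 25; a sensor reports 110 m3/s, variance 25
x, P = kalman_update(100.0, 25.0, 110.0, 25.0)
```

With equal variances the gain is 0.5, so the corrected state lands halfway between forecast and observation, and the posterior variance halves; a noisier sensor (larger R) would pull the state less.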

Relevance:

30.00%

Publisher:

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the unrestricted VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
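The intuition for the gains is dimension reduction. A rough parameter count (ignoring normalization of the cointegrating vectors, and before any SCCF restriction shrinks the short-run matrices further) shows how much the cointegration restriction alone tightens the model:

```python
def var_params(n, p):
    """Free parameters of an unrestricted VAR(p) with n variables,
    counting n x n coefficient matrices at each lag plus intercepts."""
    return n * n * p + n

def vecm_params(n, p, r):
    """Rough parameter count of a VECM with cointegrating rank r:
    loadings alpha (n x r) + cointegrating vectors beta (n x r)
    + (p-1) short-run n x n matrices + intercepts."""
    return 2 * n * r + n * n * (p - 1) + n

unrestricted = var_params(4, 2)     # 4 variables, 2 lags
restricted = vecm_params(4, 2, 1)   # same system, rank-1 cointegration
```

For a 4-variable VAR(2) this already drops the count from 36 to 28 free parameters; each SCCF cofeature vector removes further short-run coefficients, which is where the simulation study finds its estimation and forecasting payoffs.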

Relevance:

30.00%

Publisher:

Abstract:

This paper studies the hourly electricity load demand in the area covered by a utility situated in the southeast of Brazil. We propose a stochastic model which employs generalized long memory (by means of Gegenbauer processes) to model the seasonal behavior of the load. The model is proposed for sectional data, that is, each hour's load is studied separately as a single series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies across days of the week and seasons. The forecasting performance of the model is compared with a SARIMA benchmark using the years 1999 and 2000 as the out-of-sample period. The model clearly outperforms the benchmark. We conclude that the series exhibit generalized long memory.
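A Gegenbauer process generalizes fractional (long-memory) differencing to a seasonal frequency: the filter (1 - 2uB + B^2)^(-d) has weights given by the Gegenbauer polynomial recurrence, with u the cosine of the cyclical frequency. A sketch of those weights (a single Gegenbauer factor, not the authors' full sectional model):

```python
def gegenbauer_weights(d, u, n):
    """First n coefficients of the Gegenbauer filter (1 - 2uB + B^2)^(-d),
    via the standard Gegenbauer polynomial recurrence:
    C_0 = 1, C_1 = 2du, C_j = (2u(j+d-1)C_{j-1} - (j+2d-2)C_{j-2}) / j."""
    c = [1.0, 2.0 * d * u]
    for j in range(2, n):
        c.append((2.0 * u * (j + d - 1.0) * c[-1]
                  - (j + 2.0 * d - 2.0) * c[-2]) / j)
    return c[:n]

# d = 0.3 gives stationary long memory at the frequency with cos = 0.9
w = gegenbauer_weights(0.3, 0.9, 5)
```

For 0 < d < 1/2 these weights decay hyperbolically rather than exponentially, which is what lets the model capture persistent, slowly damping seasonal cycles in the hourly load.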

Relevance:

30.00%

Publisher:

Abstract:

This paper studies the behavior of electricity load demand during the 2001 rationing period, which was implemented because of the Brazilian energy crisis. The hourly data refer to a utility situated in the southeast of the country. We use the model proposed by Soares and Souza (2003), which employs generalized long memory to model the seasonal behavior of the load. The rationing period is shown to have imposed a structural break in the series, decreasing the load by about 20%. Even so, forecast accuracy deteriorates only marginally, and the forecasts rapidly readapt to the new situation. The forecast errors from this model also make it possible to verify the public response to pieces of information released regarding the crisis.

Relevance:

30.00%

Publisher:

Abstract:

The goal of this paper is to present a comprehensive empirical analysis of the return and conditional variance of four Brazilian financial series using models of the ARCH class. Selected models are then compared with regard to forecasting accuracy and goodness-of-fit statistics. To help in understanding the empirical results, a self-contained theoretical discussion of ARCH models is also presented in a way that is useful for the applied researcher. The empirical results show that although all series exhibit ARCH effects and are leptokurtic relative to the Normal, the return on the US$ clearly exhibits regime switching and no asymmetry in the variance, the return on COCOA shows no asymmetry, while the returns on the CBOND and TELEBRAS show clear signs of asymmetry consistent with the leverage effect. Regarding forecasting, the best model overall was the EGARCH(1,1) in its Gaussian version. Regarding goodness-of-fit statistics, the SWARCH model did well, followed closely by the Student-t GARCH(1,1).
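The workhorse of the ARCH class is the GARCH(1,1) variance recursion, sketched below with illustrative parameter values (the asymmetric variants favored for CBOND and TELEBRAS, such as EGARCH, additionally let the sign of the lagged return shift the variance):

```python
def garch11_variances(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1):
        h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1},
    initialized at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

# Two observed returns: a calm day (0.0) then a shock (2.0)
h = garch11_variances([0.0, 2.0], omega=0.1, alpha=0.1, beta=0.8)
```

The calm day pulls the variance down toward omega/(1 - beta) territory, while the shock pushes it back up through the alpha term; persistence is governed by alpha + beta, here 0.9.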

Relevance:

30.00%

Publisher:

Abstract:

This work compares the forecasting efficiency of different types of methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over horizons of up to twelve months ahead. The disaggregated models were estimated by SARIMA at different levels of disaggregation. The aggregated models were estimated by time series techniques such as SARIMA, state-space structural models and Markov switching. The forecasting accuracy comparison is made with the model selection procedure known as the Model Confidence Set and with the Diebold-Mariano procedure. We find evidence of forecast accuracy gains in models using more disaggregated data.
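The Diebold-Mariano comparison tests whether two sequences of forecast errors have equal expected loss. A minimal sketch for squared-error loss in the one-step-ahead case (real applications at longer horizons would add a HAC correction to the variance, which is omitted here for brevity):

```python
def diebold_mariano(e1, e2):
    """DM statistic for equal squared-error loss of two forecast error
    series (1-step-ahead case, no HAC correction).
    Positive values indicate model 2 has the smaller loss."""
    d = [a * a - b * b for a, b in zip(e1, e2)]   # loss differential
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    return mean / (var / n) ** 0.5

# Model 2's errors are uniformly half of model 1's
stat = diebold_mariano([1.0, 2.0, 1.0, 2.0], [0.5, 1.0, 0.5, 1.0])
```

Under the null of equal accuracy the statistic is asymptotically standard normal, so values near 2.9 (as here) would reject at conventional levels in a large sample.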

Relevance:

30.00%

Publisher:

Abstract:

Aiming at empirical findings, this work focuses on applying the HEAVY model to daily volatility using financial data from the Brazilian market. Quite similar to GARCH, this model seeks to harness high-frequency data in order to achieve its objectives. Four variations of it were implemented and their fit compared with GARCH equivalents, using metrics from the literature. The results suggest that, in this market, HEAVY does seem to specify daily volatility better, but does not necessarily produce better predictions for it, which is normally the ultimate goal. The data set used in this work consists of intraday trades of U.S. Dollar and Ibovespa futures contracts from BM&FBovespa.
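The key difference from GARCH is the driver of the variance recursion: a stylized sketch of the HEAVY-style equation replaces the lagged squared daily return with a lagged realized measure built from intraday data (parameter values below are illustrative, not estimates from the paper):

```python
def heavy_variances(realized, omega, alpha, beta, h0):
    """HEAVY-style conditional variance recursion driven by a realized
    measure RM (e.g. realized variance from intraday trades) instead of
    the squared daily return:
        h_t = omega + alpha * RM_{t-1} + beta * h_{t-1}."""
    h = [h0]
    for rm in realized:
        h.append(omega + alpha * rm + beta * h[-1])
    return h

# Realized variance doubles on day 2; the conditional variance reacts next day
h = heavy_variances([1.0, 2.0], omega=0.1, alpha=0.3, beta=0.6, h0=1.0)
```

Because the realized measure is a far less noisy proxy for latent volatility than a single squared return, the recursion can react faster to volatility moves, which is the channel behind the better in-sample specification reported above.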

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Simulation is a very powerful tool for developing more efficient systems; hence it has been widely used with the goal of improving productivity. Its results, compared with those of other methods, are not always optimal; however, if the experiment is properly designed, its results will represent the real situation, enabling its use with a good level of reliability. This work used simulation (through the ProModel (R) software) to study, understand, model and improve the expedition system of an enterprise, with the premise of keeping the production-delivery flow quick, controlled and reliable.

Relevance:

30.00%

Publisher:

Abstract:

Bit performance prediction has been a challenging problem for the petroleum industry. It is essential for the cost reduction associated with well planning and for drilling performance prediction, especially when rig leasing rates tend to follow project demand and rises in the barrel price. A methodology to model and predict one of the drilling bit performance evaluators, the Rate of Penetration (ROP), is presented herein. As the parameters affecting the ROP are complex and their relationships are not easily modeled, the application of a neural network is suggested. In the present work, a dynamic neural network based on the Auto-Regressive with Extra Input Signals (ARX) model is used to approach the ROP modeling problem. The network was applied to a real offshore oil field data set, consisting of information from seven wells drilled with an equal-diameter bit.
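The ARX structure underlying the network regresses the output on its own lags plus lags of exogenous inputs (for ROP, quantities such as weight on bit or rotary speed). A sketch of the linear ARX(1,1) backbone fitted by ordinary least squares (the paper's model replaces this linear map with a neural network; variable names are illustrative):

```python
def fit_arx(y, u):
    """OLS fit of the linear ARX(1,1) backbone y_t = a*y_{t-1} + b*u_{t-1},
    solved from the 2x2 normal equations (no intercept, for brevity)."""
    Y, X1, X2 = y[1:], y[:-1], u[:-1]
    s11 = sum(x * x for x in X1)
    s12 = sum(p * q for p, q in zip(X1, X2))
    s22 = sum(x * x for x in X2)
    t1 = sum(p * q for p, q in zip(X1, Y))
    t2 = sum(p * q for p, q in zip(X2, Y))
    det = s11 * s22 - s12 * s12
    return (t1 * s22 - t2 * s12) / det, (t2 * s11 - t1 * s12) / det

# Noise-free data generated with a = 0.5, b = 2.0 are recovered exactly:
# y is the "ROP" series, u a drilling input such as weight on bit
u = [1.0, 0.0, 2.0, 1.0, 0.0, 3.0, 1.0, 2.0]
y = [1.0]
for k in range(len(u) - 1):
    y.append(0.5 * y[-1] + 2.0 * u[k])
a, b = fit_arx(y, u)
```

A dynamic neural network keeps the same lagged-input/lagged-output regressor vector but learns a nonlinear mapping, which is what lets it capture the complex parameter interactions noted in the abstract.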

Relevance:

30.00%

Publisher:

Abstract:

The effect of the ionosphere on the signals of Global Navigation Satellite Systems (GNSS), such as the Global Positioning System (GPS) and the proposed European Galileo, depends on the ionospheric electron density, given by its Total Electron Content (TEC). Time-varying ionospheric density irregularities may cause scintillations, which are fluctuations in the phase and amplitude of the signals. Scintillations occur more often at equatorial and high latitudes. They can degrade navigation and positioning accuracy and may cause loss of signal tracking, disrupting safety-critical applications such as marine navigation and civil aviation. This paper addresses the results of initial research carried out on two fronts that are relevant to GNSS users if they are to counter ionospheric scintillations, i.e. forecasting and mitigating their effects. On the forecasting front, the dynamics of scintillation occurrence were analysed during the severe ionospheric storm that took place on the evening of 30 October 2003, using data from a network of GPS Ionospheric Scintillation and TEC Monitor (GISTM) receivers set up in Northern Europe. Previous results [1] indicated that GPS scintillations in that region can originate from ionospheric plasma structures from the American sector. In this paper we describe experiments that confirmed those findings. On the mitigation front, we used the variance of the output error of the GPS receiver DLL (Delay Locked Loop) to modify the least squares stochastic model applied by an ordinary receiver to compute position. This error was modelled according to [2], as a function of the S4 amplitude scintillation index measured by the GISTM receivers. An improvement of up to 21% in relative positioning accuracy was achieved with this technique.
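The mitigation idea is inverse-variance weighting: observations whose DLL error variance is inflated by scintillation get less weight in the least squares solution. A stylized scalar analogue (a real receiver weights per-satellite range observations, with the variance modelled from the S4 index; the numbers below are illustrative):

```python
def wls_scalar(observations, variances):
    """Weighted least squares estimate of a single scalar quantity,
    weighting each observation by the inverse of its error variance."""
    weights = [1.0 / v for v in variances]
    return sum(w * z for w, z in zip(weights, observations)) / sum(weights)

# Two range-like observations; the second is degraded by scintillation,
# so its modelled variance is three times larger
equal = wls_scalar([10.0, 14.0], [1.0, 1.0])   # plain average: 12.0
scint = wls_scalar([10.0, 14.0], [1.0, 3.0])   # pulled toward the clean one
```

Down-weighting the scintillation-affected observation moves the estimate toward the cleaner measurement, which is the mechanism behind the reported positioning improvement.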

Relevance:

30.00%

Publisher:

Abstract:

An agent-based model for spatial electric load forecasting is presented, using a local movement approach for the spatiotemporal allocation of new loads in the service zone. The load density of each of the major consumer classes in each sub-zone is used as the current state of the agents. Spatial growth is simulated with a walking agent that starts its path in one of the activity centers of the city and moves to the limits of the city following a radial path that depends on the different load levels. A series of update rules is established to simulate the S-curve growth behavior and the complementarity between classes. The results are presented as future load density maps. Tests on a real system from a mid-size city show a high rate of success compared with other techniques. The most important features of this methodology are the small amount of data required and the simplicity of the algorithm, allowing for future scalability. © 2009 IEEE.
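The S-curve growth behavior can be reproduced with a logistic update rule applied to the sub-zone densities visited along the agent's radial path; the sketch below shows one such update (the rule, the saturation level and the growth rate are illustrative assumptions, not the paper's calibrated values):

```python
def s_growth_step(densities, saturation, rate=0.3):
    """One update of sub-zone load densities along the agent's path,
    using a logistic rule: growth is fast at intermediate densities and
    slows as each sub-zone approaches its saturation level."""
    return [d + rate * d * (1.0 - d / saturation) for d in densities]

# Load density (e.g. MVA/km2) in three sub-zones, all saturating at 5.0:
# a young zone, a growing zone and a nearly saturated one
zones = [0.5, 2.0, 4.5]
updated = s_growth_step(zones, saturation=5.0)
```

Iterating this step traces the classic S shape for each sub-zone: slow initial growth, a rapid middle phase, and flattening near saturation, which is exactly the load evolution the future density maps aim to capture.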