21 results for analysts’ forecast accuracy

in the Repositório digital da Fundação Getúlio Vargas - FGV


Relevance:

100.00%

Publisher:

Abstract:

Industrial companies in developing countries are facing rapid growth, which requires having in place the right organizational processes to cope with market demand. Sales forecasting, as a tool aligned with the company's general strategy, needs to be as accurate as possible in order to meet sales targets: it supplies the purchasing and production planning and control areas with the right information so that the demand generated can be met on time and in full. This dissertation uses a single case study of Maxam, the Brazilian subsidiary of an international explosives company, which is experiencing high sales growth and therefore faces the challenge of adapting its structure and processes to the rapid growth expected. Several sales forecasting techniques are analyzed to compare the current monthly sales forecast, based on the sales representatives' market knowledge, with forecasts based on the analysis of historical sales data. The findings show how combining qualitative and quantitative forecasts, in a single forecast that considers both the sales force's knowledge of client demand and time series analysis, improves the accuracy of the company's sales forecast.
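
A minimal sketch of such a combination: blend the sales force's judgmental forecast with a statistical one, weighting each source by its past accuracy. The inverse-MSE weighting and all numbers are illustrative assumptions, not taken from the dissertation.

```python
import numpy as np

def combine_forecasts(judgmental, statistical, past_err_j, past_err_s):
    """Blend a sales-force forecast with a time-series forecast,
    weighting each source inversely to its past mean squared error
    (an assumed scheme; the dissertation's exact weights are not given)."""
    mse_j = np.mean(np.square(past_err_j))
    mse_s = np.mean(np.square(past_err_s))
    w_j = (1.0 / mse_j) / (1.0 / mse_j + 1.0 / mse_s)  # weight on the judgmental forecast
    return w_j * judgmental + (1.0 - w_j) * statistical

# Hypothetical monthly figures: units forecast by sales reps vs. a time-series model
print(combine_forecasts(1200.0, 1100.0,
                        past_err_j=np.array([80.0, -120.0, 60.0]),
                        past_err_s=np.array([40.0, 30.0, -50.0])))
```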

Relevance:

90.00%

Publisher:

Abstract:

This work compares the forecasting efficiency of different types of methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over horizons of up to twelve months ahead. The disaggregated models were estimated by SARIMA at different levels of disaggregation. The aggregated models were estimated by time series techniques such as SARIMA, state-space structural models and Markov-switching. Forecast accuracy is compared through the model selection procedure known as the Model Confidence Set and through the Diebold-Mariano procedure. We find evidence of forecast accuracy gains in models using more disaggregated data.
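
The Diebold-Mariano comparison mentioned above can be sketched in a few lines. This is the simple one-step-ahead version under squared-error loss, run on simulated errors; the paper's comparison also uses the Model Confidence Set, which is not reproduced here.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2):
    """One-step-ahead Diebold-Mariano test with squared-error loss."""
    d = np.square(e1) - np.square(e2)             # loss differential
    # For one-step forecasts the long-run variance reduces to var(d).
    dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
    pval = 2.0 * (1.0 - stats.norm.cdf(abs(dm)))  # DM is asymptotically N(0, 1)
    return dm, pval

rng = np.random.default_rng(0)
e_agg = rng.normal(0.0, 1.2, 144)  # pseudo-errors of an aggregated model
e_dis = rng.normal(0.0, 1.0, 144)  # pseudo-errors of a disaggregated model
print(diebold_mariano(e_agg, e_dis))
```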

Relevance:

80.00%

Publisher:

Abstract:

Despite the commonly held belief that aggregate data display short-run comovement, there has been little discussion of the econometric consequences of this feature of the data. We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common cyclical features for estimates and forecasts based on vector autoregressive models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions, owing to possible differences in the lag lengths chosen by model selection criteria for the two alternative models. Second, we show that the costs of ignoring common cyclical features in vector autoregressive modelling can be high, both in terms of forecast accuracy and efficient estimation of variance decomposition coefficients. Third, we find that the Hannan-Quinn criterion performs best among model selection criteria in simultaneously selecting the lag length and rank of vector autoregressions.
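
As an illustration of the lag-selection step, the sketch below compares information criteria (including Hannan-Quinn) on a simulated bivariate system with a common serially correlated cycle. Note that statsmodels selects only the lag length; the paper's point concerns selecting lag length and rank jointly, which is not implemented here.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
cycle = np.zeros(400)
for t in range(1, 400):                      # common AR(1) cycle shared by both series
    cycle[t] = 0.7 * cycle[t - 1] + rng.normal()
y = np.column_stack([cycle + 0.3 * rng.normal(size=400),
                     0.5 * cycle + 0.3 * rng.normal(size=400)])

order = VAR(y).select_order(maxlags=8)
print(order.selected_orders)                 # lags chosen by AIC, BIC, FPE, HQIC
```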

Relevance:

80.00%

Publisher:

Abstract:

This paper studies electricity load demand behavior during the 2001 rationing period, which was implemented because of the Brazilian energy crisis. The hourly data refer to a utility located in the southeast of the country. We use the model proposed by Soares and Souza (2003), which applies generalized long memory to model the seasonal behavior of the load. The rationing period is shown to have imposed a structural break in the series, decreasing the load by about 20%. Even so, forecast accuracy deteriorates only marginally, and the forecasts rapidly adapt to the new situation. The forecast errors from this model also make it possible to verify the public response to information released about the crisis.
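
The roughly 20% level shift can be illustrated with an intervention-dummy regression on simulated load data. This is only a stand-in for the abstract's point; the paper itself models the load with the generalized long memory specification of Soares and Souza (2003), which is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, break_at = 500, 300
load = 100.0 + rng.normal(0.0, 3.0, n)       # pre-rationing load level (simulated)
load[break_at:] *= 0.8                       # ~20% drop during rationing

rationing = (np.arange(n) >= break_at).astype(float)
fit = sm.OLS(load, sm.add_constant(rationing)).fit()
print(fit.params)                            # intercept near 100, dummy near -20
```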

Relevance:

30.00%

Publisher:

Abstract:

This study investigates the one-month-ahead out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant papers that conclude that macroeconomic models can explain short-run exchange rates, and we also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate changes. To contribute to the topic, this work presents its own evidence by implementing the model with the best predictive performance described by Molodtsova and Papell (2009), the “symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant”. To this end, we use a sample of 14 currencies against the U.S. dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we select currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so that robust and rigorous tests are required for a proper evaluation of the model. Having found that it is not possible to claim that the implemented model provides more accurate forecasts than a random walk, we assess whether the model is at least capable of generating “rational”, or “consistent”, forecasts. For this we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are “inconsistent”. Finally, we perform Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
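
A stylized sketch of the out-of-sample exercise: recursively estimate a regression of exchange rate returns on Taylor-rule fundamentals and compare its RMSE with a driftless random walk, which forecasts zero change. Variable construction, window size, and data are illustrative assumptions, not the Molodtsova and Papell (2009) implementation.

```python
import numpy as np
import statsmodels.api as sm

def taylor_rule_vs_random_walk(ds, infl_dif, gap_dif, rate_dif, window=60):
    """Recursive one-month-ahead forecasts of exchange rate returns (ds)
    from Taylor-rule fundamentals, against a random-walk benchmark."""
    X = sm.add_constant(np.column_stack([infl_dif, gap_dif, rate_dif]))
    preds, actual = [], []
    for t in range(window, len(ds) - 1):
        fit = sm.OLS(ds[1:t + 1], X[:t]).fit()    # returns on lagged fundamentals
        preds.append(fit.predict(X[t:t + 1])[0])
        actual.append(ds[t + 1])
    preds, actual = np.array(preds), np.array(actual)
    rmse_model = np.sqrt(np.mean((actual - preds) ** 2))
    rmse_rw = np.sqrt(np.mean(actual ** 2))       # random walk predicts no change
    return rmse_model, rmse_rw

rng = np.random.default_rng(3)
T = 171  # roughly the Jan/2000-Mar/2014 monthly span, simulated
print(taylor_rule_vs_random_walk(rng.normal(0, 0.03, T), rng.normal(size=T),
                                 rng.normal(size=T), rng.normal(size=T)))
```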

Relevance:

20.00%

Publisher:

Abstract:

Forecast combination is characterized by an increase in forecast accuracy arising from the complementarity of the information contained in the individual forecasts. This work builds on the ideas of the seminal article by Bates and Granger (1969) with the objective of investigating whether there is room to improve the accuracy of price index forecasts. There is evidence that, although the gains from combination are limited, the risks involved in combining are smaller than its benefits.
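
For two competing forecasts, the Bates and Granger (1969) combination has a closed form: the weight on forecast 1 minimizes the variance of the combined error. A minimal sketch on simulated forecast errors:

```python
import numpy as np

def bates_granger_weight(e1, e2):
    """Variance-minimizing weight on forecast 1, given the two series
    of past forecast errors (Bates and Granger, 1969)."""
    s11, s22 = e1.var(ddof=1), e2.var(ddof=1)
    s12 = np.cov(e1, e2, ddof=1)[0, 1]
    return (s22 - s12) / (s11 + s22 - 2.0 * s12)

rng = np.random.default_rng(4)
e1 = rng.normal(0.0, 1.0, 120)               # errors of forecast 1 (simulated)
e2 = 0.4 * e1 + rng.normal(0.0, 0.8, 120)    # correlated errors of forecast 2
w = bates_granger_weight(e1, e2)
print(w, np.var(w * e1 + (1.0 - w) * e2, ddof=1))  # combined error variance
```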

Relevance:

20.00%

Publisher:

Abstract:

Using vector autoregressive (VAR) models and Monte-Carlo simulation methods, we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria selecting lag length and rank simultaneously perform better in this case. Second, this translates into superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecast accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
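
The cointegration side of this setup can be sketched with statsmodels' VECM on two simulated I(1) series sharing one stochastic trend. The SCCF restrictions and the joint estimation algorithm proposed in the paper have no off-the-shelf implementation and are not shown.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_order

rng = np.random.default_rng(5)
trend = np.cumsum(rng.normal(size=300))      # shared stochastic trend
y = np.column_stack([trend + rng.normal(size=300),
                     2.0 * trend + rng.normal(size=300)])

lags = select_order(y, maxlags=6).hqic       # Hannan-Quinn lag choice
fit = VECM(y, k_ar_diff=lags, coint_rank=1).fit()
print(fit.beta)                              # estimated cointegrating vector
```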

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the (feasible) bias-corrected average forecast. Using panel-data sequential asymptotics, we show that it is potentially superior to other techniques in several contexts. In particular, it is asymptotically equivalent to the conditional expectation, i.e., it has an optimal limiting mean-squared error. We also develop a zero-mean test for the average bias and discuss the forecast-combination puzzle in small and large samples. Monte-Carlo simulations are conducted to evaluate the performance of the feasible bias-corrected average forecast in finite samples. An empirical exercise based upon data from a well-known survey is also presented. Overall, theoretical and empirical results show promise for the feasible bias-corrected average forecast.
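
A minimal sketch of the idea, assuming the average bias can be estimated by pooling past errors across the N forecasts: average the panel's current forecasts and subtract the pooled bias estimate. The paper's formal treatment rests on sequential (N, T) asymptotics not captured here.

```python
import numpy as np

def bias_corrected_average(current_forecasts, past_actuals, past_forecasts):
    """Average the N current forecasts, then subtract the average bias
    estimated by pooling past forecast errors over models and periods."""
    avg_bias = np.mean(past_forecasts - past_actuals[:, None])
    return current_forecasts.mean() - avg_bias

rng = np.random.default_rng(6)
truth = rng.normal(size=50)                                # target series
panel = truth[:, None] + 0.3 + rng.normal(0, 1, (50, 12))  # 12 biased forecasters
print(bias_corrected_average(panel[-1], truth[:-1], panel[:-1]))
```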

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the bias-corrected average forecast. Using panel-data sequential asymptotics, we show that it is potentially superior to other techniques in several contexts. In particular, it delivers a zero-limiting mean-squared error if the number of forecasts and the number of post-sample time periods are sufficiently large. We also develop a zero-mean test for the average bias. Monte-Carlo simulations are conducted to evaluate the performance of this new technique in finite samples. An empirical exercise, based upon data from well-known surveys, is also presented. Overall, these results show promise for the bias-corrected average forecast.
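
The zero-mean test for the average bias can be caricatured as a one-sample t-test on the pooled forecast errors. This ignores the cross-forecast dependence the paper's test accounts for, so treat it purely as an illustration.

```python
import numpy as np
from scipy import stats

def zero_mean_bias_test(forecast_panel, actuals):
    """t-test of H0: the pooled forecast bias equals zero (dependence
    across forecasters is ignored here for simplicity)."""
    errors = (forecast_panel - actuals[:, None]).ravel()
    return stats.ttest_1samp(errors, popmean=0.0)

rng = np.random.default_rng(7)
truth = rng.normal(size=60)
panel = truth[:, None] + 0.2 + rng.normal(0, 1, (60, 10))  # bias of 0.2
print(zero_mean_bias_test(panel, truth))                   # should reject H0
```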

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a novel approach to econometric forecasting of stationary and ergodic time series within a panel-data framework. Our key element is to employ the (feasible) bias-corrected average forecast. Using panel-data sequential asymptotics, we show that it is potentially superior to other techniques in several contexts. In particular, it is asymptotically equivalent to the conditional expectation, i.e., it has an optimal limiting mean-squared error. We also develop a zero-mean test for the average bias and discuss the forecast-combination puzzle in small and large samples. Monte-Carlo simulations are conducted to evaluate the performance of the feasible bias-corrected average forecast in finite samples. An empirical exercise, based upon data from a well-known survey, is also presented. Overall, these results show promise for the feasible bias-corrected average forecast.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual horizons. The data consist of monthly metal-commodity prices from 1957 to 2012, taken from the International Financial Statistics of the IMF for individual metal series. We also employ the (relatively large) list of covariates used in Welch and Goyal (2008) and in Hong and Yogo (2009), which are available for download. Regarding short- and long-run comovement, we apply the techniques and tests proposed in the common-feature literature to build parsimonious VARs, which may entail quasi-structural relationships between different commodity prices and/or between a given commodity price and its potential demand determinants. These parsimonious VARs are later used as forecasting models to be combined into optimal forecasts of metal-commodity prices. Regarding out-of-sample forecasts, we use a variety of models (linear and non-linear, single-equation and multivariate) and a variety of covariates to forecast the returns and prices of metal commodities. With forecasts from a large number of models (N large) over a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. The main contribution of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is chosen optimally, taking into account optimal production in the industrial sector; this is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but also holds, to a lesser degree, for the U.S. economy. Regarding forecasting, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation. Still, in most cases, forecast combination techniques outperform individual models.
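
The short-run claim, that metal-price growth co-moves positively with industrial-production growth, amounts to a positive slope in a regression of one growth rate on the other. A sketch on simulated growth rates (the paper uses IFS metal prices and industrial production):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
d_ip = rng.normal(0.0, 1.0, 240)                  # industrial-production growth (simulated)
d_price = 1.5 * d_ip + rng.normal(0.0, 2.0, 240)  # metal-price growth, positively linked

fit = sm.OLS(d_price, sm.add_constant(d_ip)).fit()
print(fit.params[1], fit.pvalues[1])              # positive and significant slope
```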

Relevance:

20.00%

Publisher:

Abstract:

The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, extracted from the IFS, and annual data for 1900-2010 provided by the U.S. Geological Survey (USGS). We also employ the (relatively large) list of covariates used in Welch and Goyal (2008) and in Hong and Yogo (2009). We investigate short- and long-run comovement by applying the techniques and tests proposed in the common-feature literature. One of the main contributions of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run and demand is chosen optimally, taking into account optimal production in the industrial sector; this is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but also holds, to a lesser degree, for the U.S. economy. Regarding out-of-sample forecasts, our main contribution is to show the benefits of forecast-combination techniques, which outperform individual-model forecasts, including the random-walk model. We use a variety of models (linear and non-linear, single-equation and multivariate) and a variety of covariates and functional forms to forecast the returns and prices of metal commodities. Using a large number of models (N large) and a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. Empirically, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation.

Relevance:

20.00%

Publisher:

Abstract:

Forecast combination is characterized by an improvement in the accuracy of predictions due to the complementarity of the information contained in individual forecasts. This paper follows the seminal work of Bates and Granger (1969) with the objective of investigating whether room exists to improve the accuracy of price index forecasts. There is evidence that, even though the gains from combining forecasts are limited, the risks incurred are smaller than the benefits gained.

Relevance:

20.00%

Publisher:

Abstract:

Outliers are observations that appear inconsistent with the others. Also called atypical, extreme, or aberrant values, these inconsistencies can be caused by policy changes or economic crises, unexpected cold or heat waves, and measurement or typing errors, among other causes. Outliers are not necessarily incorrect values but, when they stem from measurement or typing errors, they can distort the results of an analysis and lead the researcher to mistaken conclusions. The objective of this work is to study and compare different methods for detecting abnormalities in price series of the Consumer Price Index (IPC) computed by the Brazilian Institute of Economics (IBRE) of Fundação Getulio Vargas (FGV). The IPC measures the price variation of a fixed basket of goods and services that make up the usual expenses of families with income between 1 and 33 monthly minimum wages, and it is mainly used as a reference index for assessing consumer purchasing power. Besides the method currently used by the price analysts at IBRE, the methods considered in this study are: variations of the IBRE method, the Boxplot method, the SIQR Boxplot method, the Adjusted Boxplot method, the Resistant Fences method, the Quartile method, the Modified Quartile method, the Median Absolute Deviation method, and Tukey's algorithm. These methods were applied to data from the municipalities of Rio de Janeiro and São Paulo. To analyze the performance of each method, the true extreme values must be known in advance. In this work, therefore, the analysis assumes that the prices discarded or changed by the analysts during the vetting process are the true outliers. The IBRE method is highly correlated with the prices changed or discarded by the analysts, so this assumption may influence the results, favoring the IBRE method over the others. Nevertheless, it allows computing two measures by which the methods are evaluated. The first is the method's hit rate, which reports the proportion of true outliers detected. The second is the number of false positives produced by the method, which reports how many values had to be flagged for one true outlier to be detected. The higher the hit rate and the lower the number of false positives, the better the method's performance. It was thus possible to rank the methods by performance and identify the best among those analyzed. For the municipality of Rio de Janeiro, some variations of the IBRE method performed as well as or better than the original method, while for São Paulo the IBRE method performed best. Future work is expected to test the methods on simulated data or on datasets widely used in the literature, so that the assumption that the prices discarded or changed by the analysts are the true outliers does not interfere with the results.
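
Two of the listed detection rules are easy to sketch, together with the two evaluation measures (hit rate and false positives). The fence constants and planted outliers are illustrative assumptions; the study evaluates against the prices actually edited by IBRE's analysts.

```python
import numpy as np

def boxplot_flags(x, k=1.5):
    """Tukey boxplot rule: flag points beyond the k*IQR fences."""
    q1, q3 = np.percentile(x, [25, 75])
    return (x < q1 - k * (q3 - q1)) | (x > q3 + k * (q3 - q1))

def mad_flags(x, k=3.0):
    """Median-absolute-deviation rule: flag points more than k robust
    standard deviations from the median (1.4826 rescales MAD to sigma)."""
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))
    return np.abs(x - med) > k * mad

def evaluate(flags, true_outliers):
    """Hit rate (share of true outliers caught) and false-positive count."""
    hit_rate = np.sum(flags & true_outliers) / max(true_outliers.sum(), 1)
    return hit_rate, int(np.sum(flags & ~true_outliers))

rng = np.random.default_rng(9)
prices = rng.normal(100.0, 5.0, 200)
truth = np.zeros(200, dtype=bool)
truth[:5], prices[:5] = True, prices[:5] + 40.0   # plant five extreme quotes
for name, flags in [("boxplot", boxplot_flags(prices)), ("MAD", mad_flags(prices))]:
    print(name, evaluate(flags, truth))
```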