910 results for Forecasts
Resumo:
Using a sequence of nested multivariate VAR-based models, we discuss the different layers of restrictions imposed by present-value models (PVMs hereafter) on the VAR in levels for series subject to present-value restrictions. Our focus is novel: we are interested in the short-run restrictions entailed by PVMs (Vahid and Engle, 1993, 1997) and their implications for forecasting. Using the well-known database kept by Robert Shiller, we implement a forecasting competition that imposes the different layers of PVM restrictions. Our exhaustive investigation of several multivariate models reveals that better forecasts are achieved when restrictions are applied to the unrestricted VAR. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for the target variables of PVMs and 63.33% of the time when all variables in the system are considered.
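As a stylized illustration of this kind of forecasting competition, the sketch below fits an unrestricted VAR(1) by OLS and compares its one-step-ahead hold-out RMSE with a naive no-change benchmark. It uses simulated data rather than Shiller's database, and it imposes no PVM restrictions; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t (illustrative DGP,
# not the price-dividend data used in the paper).
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
T = 400
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

train, test = y[:300], y[300:]

# OLS estimate of the unrestricted VAR(1): regress y_t on y_{t-1}.
X, Y = train[:-1], train[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# One-step-ahead forecasts over the hold-out sample.
prev = np.vstack([train[-1], test[:-1]])
fc_var = prev @ A_hat.T          # VAR forecast
fc_rw = prev                     # random-walk (no-change) benchmark

rmse = lambda f: np.sqrt(np.mean((test - f) ** 2))
print(rmse(fc_var), rmse(fc_rw))
```

In the paper's setting, restricted models would be added to this race by constraining `A_hat` before forecasting; here only the unrestricted fit and the naive device are shown.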
Resumo:
This paper investigates the role of the consumption-wealth ratio in predicting future stock returns through a panel approach. We follow the theoretical framework proposed by Lettau and Ludvigson (2001), in which a model derived from a nonlinear consumer's budget constraint is used to establish the link between the consumption-wealth ratio and stock returns. Using G7 quarterly aggregate and financial data ranging from the first quarter of 1981 to the first quarter of 2014, we build an unbalanced panel that we use both for estimating the parameters of the cointegrating residual from the shared trend among consumption, asset wealth and labor income, cay, and for performing in- and out-of-sample forecasting regressions. Because of the panel structure, we propose methodologies for estimating cay and producing forecasts that differ from the one applied by Lettau and Ludvigson (2001). The results indicate that cay is in fact a strong and robust predictor of future stock returns at intermediate and long horizons, but performs poorly in predicting one- or two-quarter-ahead stock returns.
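The two-step logic behind cay can be sketched in a few lines: first estimate the cointegrating residual among consumption, asset wealth and labor income, then use its lag in a predictive regression. The sketch below uses simulated single-country data (not the G7 panel of the paper) and plain OLS in place of the panel estimators the paper proposes; the coefficient values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200

# Simulated stand-ins for log asset wealth, labor income and consumption
# sharing one stochastic trend (hypothetical data).
trend = np.cumsum(rng.normal(size=T))
a = trend + rng.normal(scale=0.5, size=T)        # asset wealth
yl = trend + rng.normal(scale=0.5, size=T)       # labor income
cay_true = rng.normal(scale=0.3, size=T)         # stationary deviation
c = 0.6 * a + 0.4 * yl + cay_true                # consumption

# Step 1: estimate the cointegrating residual cay_t = c_t - b0 - b_a a_t - b_y yl_t.
X = np.column_stack([np.ones(T), a, yl])
beta = np.linalg.lstsq(X, c, rcond=None)[0]
cay = c - X @ beta

# Step 2: predictive regression r_{t+1} = alpha + gamma * cay_t + u_{t+1},
# with "returns" simulated to load on the true deviation.
r = 0.8 * cay_true[:-1] + rng.normal(scale=0.5, size=T - 1)
Z = np.column_stack([np.ones(T - 1), cay[:-1]])
alpha, gamma = np.linalg.lstsq(Z, r, rcond=None)[0]
print(round(gamma, 2))
```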
Resumo:
This study evaluates the forecasts of three nonlinear methods (the Markov Switching Autoregressive Model, the Logistic Smooth Transition Autoregressive Model, and Autometrics with Dummy Saturation) for Brazilian monthly industrial production and tests whether they are more accurate than those of naive predictors, such as the autoregressive model of order p and the double-differencing device. The results show that step-dummy saturation and the Logistic Smooth Transition Autoregressive Model can outperform the double-differencing device, but the linear autoregressive model is more accurate than all the other methods analyzed.
Resumo:
Our focus is on information in expectation surveys that can now be built on thousands (or millions) of respondents on an almost continuous-time basis (big data) and on continuous macroeconomic surveys with a limited number of respondents. We show that, under standard microeconomic and econometric techniques, survey forecasts are an affine function of the conditional expectation of the target variable. This is true whether or not the survey respondent knows the data-generating process (DGP) of the target variable, and whether or not the econometrician knows the respondent's individual loss function. If the econometrician has a mean-squared-error risk function, we show that asymptotically efficient forecasts of the target variable can be built using Hansen's (Econometrica, 1982) generalized method of moments in a panel-data context, when N and T diverge or when T diverges with N fixed. Sequential asymptotic results are obtained using Phillips and Moon's (Econometrica, 1999) framework. Possible extensions are also discussed.
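The affine-function result suggests a simple correction: if a survey forecast is f = a0 + b0 E[y | info], regressing realizations on forecasts undoes the affine distortion. The sketch below illustrates this with simulated data and a single respondent; it is a minimal stand-in for the GMM panel estimators of the paper, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500

# Simulated conditional expectation of the target and its realization.
Ey = rng.normal(size=n)
y = Ey + rng.normal(scale=0.5, size=n)

# A survey forecast that is affine in E[y | info], e.g. because of an
# asymmetric loss function (coefficients chosen for illustration).
f = 0.3 + 1.2 * Ey

# Moment-based correction: regress y on f to undo the affine distortion.
X = np.column_stack([np.ones(n), f])
a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
corrected = a_hat + b_hat * f

mse = lambda g: np.mean((g - y) ** 2)
print(mse(f), mse(corrected))
```

The corrected series approximates E[y | info] up to sampling error, so its mean squared error is lower than that of the raw survey forecast.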
Resumo:
This paper investigates the expectations formation process of economic agents about inflation rate. Using the Market Expectations System of Central Bank of Brazil, we perceive that agents do not update their forecasts every period and that even agents who update disagree in their predictions. We then focus on the two most popular types of inattention models that have been discussed in the recent literature: sticky-information and noisy-information models. Estimating a hybrid model we find that, although formally fitting the Brazilian data, it happens at the cost of a much higher degree of information rigidity than observed.
Resumo:
This study examines the existence and relevance of calendar effects in industrial indicators. Linear univariate models are explored for the monthly Brazilian industrial production index and some of its components. First, an in-sample analysis is carried out using structural state-space models and the Autometrics selection algorithm, which points to significant effects for most calendar-related variables. Next, using the Diebold-Mariano (1995) procedure and the Model Confidence Set proposed by Hansen, Lunde and Nason (2011), forecasts from Autometrics-derived models are compared with a simple double-differencing device over horizons of up to 24 months ahead. In general, the Autometrics models that include calendar variables prove superior at the one- and two-month-ahead horizons and outperform the simple model at all horizons. When the use-category components are aggregated to form the total industrial index, there is evidence of gains at shorter forecast horizons.
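The Diebold-Mariano comparison used here tests whether the mean loss differential between two forecasts is zero. A minimal sketch, with hypothetical loss series in place of the industrial-production forecasts of the study, and without the HAC variance correction needed for multi-step horizons:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150

# Hypothetical squared-error losses of two competing forecasts of the
# same target (simulated; model A is more accurate by construction).
loss_a = rng.normal(loc=1.0, scale=0.3, size=n) ** 2
loss_b = rng.normal(loc=1.4, scale=0.3, size=n) ** 2

d = loss_a - loss_b                            # loss differential
dm = d.mean() / np.sqrt(d.var(ddof=1) / n)     # DM statistic (one-step case)
print(dm)
```

A DM statistic far below the normal critical value (about -1.96 at 5%) indicates that model A is significantly more accurate; for horizons beyond one step the denominator should use a HAC estimator of the long-run variance of d.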
Resumo:
Using a unique dataset on Brazilian nominal and real yield curves combined with daily survey forecasts of macroeconomic variables such as GDP growth, inflation, and exchange rate movements, we identify the effect of surprises to the Brazilian interbank target rate on expected future nominal and real short rates, term premia, and inflation expectations. We find that positive surprises to target rates lead to higher expected nominal and real interest rates and reduced nominal and inflation term premia. We also find a strongly positive relation between both real and nominal term premia and measures of dispersion in survey forecasts. Uncertainty about future exchange rates is a particularly important driver of variations in Brazilian term premia.
Resumo:
The financial crisis and Great Recession have been followed by a jobs shortage crisis that most forecasts predict will persist for years under current policies. This paper argues for a wage-led recovery and growth program, which is the only way to remedy the deep causes of the crisis and escape the jobs crisis. Such a program is the polar opposite of the current policy orthodoxy, showing how much is at stake. Winning the argument for wage-led recovery will require winning the war of ideas about economics whose roots go back to Keynes's challenge to classical macroeconomics in the 1920s and 1930s. That will involve showing how the financial crisis and Great Recession were the ultimate result of three decades of neoliberal policy, which produced wage stagnation by severing the wage-productivity growth link and made asset price inflation and debt the engine of demand growth in place of wages; showing how wage-led policy resolves the current problem of global demand shortage without pricing out labor; and developing a detailed set of policy proposals that flow from these understandings. The essence of a wage-led policy approach is to rebuild the link between wages and productivity growth, combined with expansionary macroeconomic policy that fills the current demand shortfall so as to push the economy onto a recovery path. Both sets of measures are necessary. Expansionary macro policy (i.e., fiscal stimulus and easy monetary policy) without rebuilding the wage mechanism will not produce sustainable recovery and may end in fiscal crisis. Rebuilding the wage mechanism without expansionary macro policy is likely to leave the economy stuck in the orbit of stagnation.
Resumo:
The present work analyzes the establishment of a startup's operations and the structuring of all the processes required to start the business, launch the platform and keep it working. The thesis' main focus can therefore be described as designing and structuring a startup's operations in an emerging market before and during its global launch. This business project aims to provide a case study of the creation of a business and its launch into an emerging market, illustrating a practical example of how to structure a business' operations within a limited time frame. Moreover, this work also performs a complete economic analysis of Brazil, thorough analyses of the industries the company is related to, and a competitive analysis of the market the venture operates in. Furthermore, an assessment of the venture's business model and of its first six-month performance is also included. The thesis' ultimate goal lies in evaluating the company's potential for success in the next few years, by highlighting its strengths and critical issues. Beyond providing the company's management with findings and forecasts about its own business, the present work represents a reference and a practical roadmap for any entrepreneur willing to establish operations in Brazil.
Resumo:
In this dissertation, different ways of combining neural predictive models or neural-based forecasts are discussed. The proposed approaches mostly consider Gaussian radial basis function networks, which can be efficiently identified and estimated through recursive/adaptive methods. Two ways of combining are explored to obtain a final estimate, model mixing and model synthesis, with the aim of improving both efficiency and effectiveness. In the context of model mixing, the usual framework for linearly combining estimates from different models is extended to deal with the case where the forecast errors from those models are correlated. In the context of model synthesis, and to address the problems raised by heavily nonstationary time series, we propose hybrid dynamic models for more advanced time series forecasting, composed of a dynamic trend regressive model (or even a dynamic harmonic regressive model) and a Gaussian radial basis function network. Additionally, using the model mixing procedure, two approaches to decision-making from forecasting models are discussed and compared: inferring decisions from combined predictive estimates, or combining prescriptive solutions derived from different forecasting models. Finally, the application of some of the proposed models and methods is illustrated with two case studies, based on time series from finance and from tourism.
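For the model-mixing case with correlated forecast errors, the classical variance-minimizing linear combination has a closed form: w = Σ⁻¹1 / (1'Σ⁻¹1), where Σ is the error covariance matrix. The sketch below computes these weights for a hypothetical three-model covariance matrix; it illustrates the standard Bates-Granger-type result, not the dissertation's specific neural combination scheme.

```python
import numpy as np

# Error covariance of three unbiased forecasts (hypothetical numbers).
Sigma = np.array([[1.0, 0.6, 0.2],
                  [0.6, 1.5, 0.3],
                  [0.2, 0.3, 2.0]])
ones = np.ones(3)

# Variance-minimizing weights subject to summing to one:
# w = Sigma^{-1} 1 / (1' Sigma^{-1} 1).
w = np.linalg.solve(Sigma, ones)
w /= w.sum()

var_comb = w @ Sigma @ w   # variance of the combined forecast error
print(w, var_comb)
```

Because the off-diagonal correlations are accounted for, the combined error variance is never larger than that of the best individual model (here, the one with variance 1.0).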
Resumo:
Crude oils vary widely in density and viscosity, the characteristics used to classify an oil as light, heavy or even ultra-heavy. The occurrence of heavy oil has increased significantly, pointing to a need for greater investment in the exploitation of deposits and, therefore, for new recovery methods. Economic forecasts indicate that by 2025 heavy oil will be the main source of fossil energy in the world. One such method is VAPEX (vapor extraction), a recovery method that uses two parallel horizontal wells, one injector and one producer; a vaporized solvent is injected to reduce the viscosity of the oil or bitumen, easing its flow toward the producing well. The method was proposed by Dr. Roger Butler in 1991. The importance of this study is to analyze how reservoir and operational parameters influence the VAPEX process, in terms of cumulative oil production, recovery factor, injection rate and production rate. Parameters such as injection rate, well spacing, type of injected solvent, vertical permeability and oil viscosity were addressed. The results showed that oil viscosity was the parameter with statistically significant influence; injecting heptane as the solvent yielded greater oil recovery than the other solvents considered; and, regarding well spacing, a greater distance between the wells led to higher oil production.
Resumo:
The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB) and the reduction of total ozone motivated the present study. The overall objective was to identify and understand the variability of the UV Index in the capitals of the east coast of the NEB and to fit stochastic models to UV index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models and the Mann-Kendall test. Modeling via ADL consisted of parameter estimation, diagnostics, residual analysis and evaluation of the quality of predictions and forecasts via mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) features a stabilization/reduction of the UV index in September and October because of the annual maximum in total ozone concentration; the increased amount of aerosol during this period contributes to this event with lesser intensity. The application of cluster analysis to the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB, analyzed for the city of Natal, were associated with the absence of cloud cover and with total ozone levels below the annual average, and do not occur across the entire region because of the uneven spatial distribution of these variables.
The ADL(4, 1) model, fitted to UV index and total ozone data for 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043), indicating an increase of approximately one unit in the UV index by the end of that period, should total ozone maintain the downward trend observed in the study period.
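The core of the ADL approach is an OLS regression of the UV index on its own lags and lagged ozone. A minimal sketch with simulated monthly series (hypothetical data, and an ADL(1, 1) with one lag each instead of the study's ADL(4, 1), to keep it short):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 300

# Simulated stand-ins for total ozone (with an annual cycle) and UV index.
ozone = 280 + 5 * np.sin(2 * np.pi * np.arange(T) / 12) + rng.normal(scale=2, size=T)
uv = np.zeros(T)
for t in range(1, T):
    uv[t] = 12 + 0.5 * uv[t - 1] - 0.02 * (ozone[t - 1] - 280) + rng.normal(scale=0.3)

# ADL(1, 1) by OLS: uv_t = c + phi * uv_{t-1} + beta * ozone_{t-1} + e_t.
X = np.column_stack([np.ones(T - 1), uv[:-1], ozone[:-1]])
c, phi, beta = np.linalg.lstsq(X, uv[1:], rcond=None)[0]

# One-step-ahead forecast from the last observation; iterating this map
# under an assumed ozone path yields multi-year projections.
forecast = c + phi * uv[-1] + beta * ozone[-1]
print(round(phi, 2), round(beta, 3), round(forecast, 1))
```

The negative coefficient on lagged ozone mirrors the study's mechanism: a sustained downward ozone trend propagates into a rising UV index when the fitted equation is iterated forward.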
Resumo:
The increasing demand for energy, the environmental consequences of fossil fuels and the future scarcity of oil, currently the world's main energy source, have stimulated research on biodiesel production. In this work, the synthesis of cottonseed biodiesel via the methyl route was carried out using two commercial homogeneous catalysts, Na-Methylat® and K-Methylat®, in order to evaluate their efficiency. A 2³ experimental design was used to assess the influence of the variables (oil/alcohol molar ratio, catalyst percentage and temperature) on the process and to indicate the optimal operating point in each case. The biodiesel was analyzed by gas chromatography, indicating a conversion of 96.79% with Na-Methylat® as catalyst and 95.65% with K-Methylat®. The best conversion, 96.79%, was obtained under the following conditions: oil/alcohol molar ratio of 1:8, temperature of 40 °C and 1% Na-Methylat® catalyst, exceeding the 96.5% established by the European standard. Regression analysis showed that the only significant effect, at a 95% confidence level, was that of temperature. The analysis of variance showed that the proposed model fits the experimental response quite well and is statistically significant; however, it cannot be used to make forecasts within the intervals established for each variable. The best samples were analyzed by infrared (IR) spectroscopy, which identified the strong C=O axial deformation bands of methyl esters; physicochemical analyses indicated conformity with ANP standards; and thermal and rheological analyses together showed that this biodiesel can be used as an alternative fuel in substitution for diesel.
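In a 2³ factorial design like the one above, each main effect is the mean response at the factor's high level minus the mean at its low level. The sketch below computes these effects for hypothetical conversion figures (not the thesis data), constructed so that temperature dominates, as in the study's conclusion:

```python
import numpy as np

# 2^3 factorial design in coded units (-1/+1) for molar ratio, catalyst %
# and temperature; the conversion responses are hypothetical.
levels = np.array([[x1, x2, x3]
                   for x1 in (-1, 1) for x2 in (-1, 1) for x3 in (-1, 1)])
yconv = np.array([90.1, 94.8, 90.5, 95.2, 90.3, 95.0, 90.2, 95.1])

# Main effect of each factor: mean response at +1 minus mean at -1.
effects = {name: yconv[levels[:, j] == 1].mean() - yconv[levels[:, j] == -1].mean()
           for j, name in enumerate(["ratio", "catalyst", "temperature"])}
print(effects)
```

Comparing each effect with an estimate of experimental error (from replicates or an ANOVA) then decides significance; in the data above only the temperature effect is large.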