4 results for OPTIMIZATION MODEL
in the Repositório Digital da Fundação Getúlio Vargas (FGV)
Abstract:
This paper investigates an intertemporal optimization model to analyze the current account of the G-7 countries, measured as the present value of future changes in net output. The study compares the observed series with the series forecast by the model, using Campbell & Shiller's (1987) methodology. In the estimation, the countries are considered separately (with the OLS technique) as well as jointly (the SURE approach), to capture contemporaneous correlations of the shocks to net output. The paper also offers a note on Granger causality and its implications for the optimal current account. The empirical results are sensitive to the estimation technique and suggest rejection of the model for the G-7 countries, except for the USA and Japan, in line with some papers in the literature.
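For context, the present-value relation and the Campbell & Shiller (1987) restriction behind tests of this kind can be written as follows (a standard textbook presentation in illustrative notation; the paper's own notation and assumptions may differ):

\[
  CA_t \;=\; -\sum_{j=1}^{\infty}\Big(\tfrac{1}{1+r}\Big)^{j} E_t\,\Delta NO_{t+j},
\]

where NO_t is net output and r the world real interest rate. Stacking z_t = (ΔNO_t, CA_t)' in a first-order VAR z_t = A z_{t-1} + ε_t, the model-implied ("optimal") current account is

\[
  CA_t^{*} \;=\; -\,e_{\Delta NO}'\,\beta A\,(I-\beta A)^{-1} z_t, \qquad \beta=\tfrac{1}{1+r},
\]

and the model holds if CA_t^{*} = CA_t for every z_t, i.e. if -e_{ΔNO}' βA (I - βA)^{-1} = e_{CA}', a set of cross-equation restrictions that can be evaluated with a Wald test.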
Abstract:
This thesis is composed of three essays on topics in macroeconometrics and finance. In each essay, which corresponds to one chapter, the objective is to investigate and analyze advanced econometric techniques applied to relevant macroeconomic questions, such as the capital mobility hypothesis and the sustainability of public debt. A finance topic regarding portfolio risk management is also investigated, through an econometric technique used to evaluate Value-at-Risk models. The first chapter investigates an intertemporal optimization model to analyze the current account. Based on Campbell & Shiller's (1987) approach, a Wald test is conducted to analyze a set of restrictions imposed on a VAR used to forecast the current account. The estimation is based on three different procedures: OLS, SUR and the two-way error decomposition of Fuller & Battese (1974), due to the presence of global shocks. A note on Granger causality is also provided, which is shown to be a necessary condition to perform the Wald test, with serious implications for the validation of the model. An empirical exercise for the G-7 countries is presented, and the results change substantially with the different estimation techniques. A small Monte Carlo simulation is also presented to investigate the size and power of the Wald test based on the considered estimators. The second chapter presents a study of fiscal sustainability based on a quantile autoregression (QAR) model. A novel methodology to separate periods of nonstationarity from stationary ones is proposed, which allows one to identify trajectories of public debt that are not compatible with fiscal sustainability. Moreover, such trajectories are used to construct a debt ceiling, that is, the largest value of public debt that does not jeopardize long-run fiscal sustainability. An out-of-sample forecast of such a ceiling is also constructed and can be used by policy makers interested in keeping the public debt on a sustainable path. An empirical exercise using Brazilian data is conducted to show the applicability of the methodology. In the third chapter, an alternative backtest to evaluate the performance of Value-at-Risk (VaR) models is proposed. The econometric methodology allows one to directly test the overall performance of a VaR model, as well as to identify periods of increased risk exposure, which seems to be a novelty in the literature. Quantile regressions provide an appropriate environment to investigate VaR models, since a VaR model can naturally be viewed as a conditional quantile function of a given return series. An empirical exercise is conducted for the daily S&P 500 series, and a Monte Carlo simulation is also presented, revealing that the proposed test may exhibit more power than other backtests.
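As a rough illustration of the quantile-regression view of VaR described in the third chapter, the sketch below estimates a 5% conditional Value-at-Risk by quantile regression and computes its empirical coverage. This is not the thesis's own procedure: the simulated returns and the lagged-absolute-return regressor are invented for the example.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated daily returns stand in for the S&P 500 series used in the thesis.
rng = np.random.default_rng(0)
ret = pd.Series(rng.standard_t(df=5, size=1000) * 0.01, name="ret")

df = pd.DataFrame({"ret": ret, "abs_lag": ret.abs().shift(1)}).dropna()

# The conditional 5% quantile of returns serves as the VaR forecast;
# the lagged absolute return is an illustrative volatility proxy.
fit = smf.quantreg("ret ~ abs_lag", df).fit(q=0.05)
df["var_5"] = fit.predict(df)

# Hit sequence: exceedances of the VaR forecast. An adequate model should give
# unconditional coverage close to 5% and hits that are not predictable over time.
hits = (df["ret"] < df["var_5"]).astype(int)
print(fit.params)
print("empirical coverage:", hits.mean())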
Abstract:
Using McKenzie’s taxonomy of optimal accumulation in the long run, we report a “uniform turnpike” theorem of the third kind in a model originally due to Robinson, Solow and Srinivasan (RSS) and further studied by Stiglitz. Our results are presented in the undiscounted, discrete-time setting emphasized in the recent work of Khan-Mitra, and they rely on the importance of strictly concave felicity functions or, alternatively, on the value of a “marginal rate of transformation”, ξσ, from one period to the next not being unity. Our results, despite their specificity, contribute to the methodology of intertemporal optimization theory, as developed in economics by Ramsey, von Neumann and their followers.
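For readers less familiar with the undiscounted, discrete-time setting mentioned above, optimality there is usually defined by a catching-up (overtaking-type) criterion rather than by a discounted sum. In illustrative notation (not necessarily the paper's), a program with consumption sequence {c_t} is optimal if

\[
  \limsup_{T\to\infty} \sum_{t=0}^{T} \big[\, w(c'_t) - w(c_t) \,\big] \;\le\; 0
\]

for every program {c'_t} feasible from the same initial stocks, where w(·) is the felicity function (strictly concave in the cases the theorem emphasizes).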
Abstract:
The search for efficiency in supply chains has usually focused on logistics optimization; initiatives such as ECR are one example. This research questions the appropriateness of this focus by comparing detailed cost structures of fifteen consumer products covering five different product categories. It compares the supply chains of private-label products, presumably more efficient due to closer collaboration between chain members, with those of national brands. The major source of cost differences lies in other indirect costs incurred by the national brands that are not directly assignable to advertising. Results indicate that a complete reconception of the supply chain, exploring different governance structures, offers greater opportunities for cost savings than the logistics aspect in isolation. The research was done in the UK in 1995-1997, but the results are only now publishable due to confidentiality agreements.