16 results for Two-step langmuir model
in Repositório digital da Fundação Getúlio Vargas - FGV
Abstract:
We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We consider model selection criteria which have data-dependent penalties as well as the traditional ones. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. Our Monte Carlo simulations measure the improvements in forecasting accuracy that can arise from the joint determination of lag length and rank using our proposed procedure, relative to an unrestricted VAR or a cointegrated VAR estimated by the commonly used procedure of selecting the lag length only and then testing for cointegration. Two empirical applications, forecasting Brazilian inflation and the growth rates of U.S. macroeconomic aggregates respectively, show the usefulness of the model-selection strategy proposed here. The gains in different measures of forecasting accuracy are substantial, especially for short horizons.
Abstract:
We study the joint determination of the lag length, the dimension of the cointegrating space and the rank of the matrix of short-run parameters of a vector autoregressive (VAR) model using model selection criteria. We suggest a new two-step model selection procedure which is a hybrid of traditional criteria and criteria with data-dependent penalties, and we prove its consistency. A Monte Carlo study explores the finite sample performance of this procedure and evaluates the forecasting accuracy of models selected by this procedure. Two empirical applications confirm the usefulness of the model selection procedure proposed here for forecasting.
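For flavor, the first of the two steps — choosing a VAR's lag length with a traditional information criterion such as BIC — can be sketched on simulated data. This is only an illustrative sketch of lag selection, not the authors' hybrid criterion; the function name, DGP, and grid are all made up for the example:

```python
import numpy as np

def var_bic(y, p):
    """BIC of a VAR(p) fitted by least squares to a T x k series."""
    T, k = y.shape
    Y = y[p:]
    # Regressors: intercept plus p lags of every variable
    X = np.hstack([np.ones((T - p, 1))] + [y[p - j:T - j] for j in range(1, p + 1)])
    B = np.linalg.lstsq(X, Y, rcond=None)[0]
    resid = Y - X @ B
    sigma = resid.T @ resid / (T - p)
    n_params = k * X.shape[1]
    return np.log(np.linalg.det(sigma)) + n_params * np.log(T - p) / (T - p)

# Simulated bivariate VAR(1); coefficients are illustrative
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1], [0.0, 0.4]])
y = np.zeros((500, 2))
for t in range(1, 500):
    y[t] = A @ y[t - 1] + rng.standard_normal(2) * 0.5

bics = {p: var_bic(y, p) for p in range(1, 5)}
p_hat = min(bics, key=bics.get)   # lag length chosen by BIC
```

In the papers above, a second step would then determine the cointegrating rank, with a data-dependent penalty replacing the fixed BIC penalty.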
Abstract:
We characterize optimal policy in a two-sector growth model with fixed coefficients and with no discounting. The model is a specialization, to a single type of machine, of a general vintage capital model originally formulated by Robinson, Solow and Srinivasan; its simplicity belies rich dynamics, which seem to have been missed in earlier work. Our results are obtained by viewing the model as a specific instance of the general theory of resource allocation as initiated originally by Ramsey and von Neumann and brought to completion by McKenzie. In addition to the more recent literature on chaotic dynamics, we relate our results to the older literature on optimal growth with one state variable: specifically, to the one-sector setting of Ramsey, Cass and Koopmans, as well as to the two-sector setting of Srinivasan and Uzawa. The analysis is purely geometric, and from a methodological point of view, our work can be seen as an argument, at least in part, for the rehabilitation of geometric methods as an engine of analysis.
Abstract:
In this paper, we propose a two-step estimator for panel data models in which a binary covariate is endogenous. In the first stage, a random-effects probit model is estimated, having the endogenous variable as the left-hand side variable. Correction terms are then constructed and included in the main regression.
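The control-function idea behind this two-step estimator can be sketched in a simplified cross-sectional setting (ignoring the panel and random-effects structure for brevity). All data, names, and coefficients below are illustrative assumptions, not the paper's specification:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Simulated data: d is binary and endogenous because u enters both equations
rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal(n)                       # exogenous covariate
z = rng.standard_normal(n)                       # instrument excluded from the outcome
u = rng.standard_normal(n)                       # unobservable driving the endogeneity
d = (0.5 * x + 1.0 * z + u > 0).astype(float)    # endogenous binary covariate
y = 1.0 + 2.0 * d + 0.5 * x + 0.8 * u + rng.standard_normal(n)

# Step 1: probit of d on (1, x, z), fit by maximum likelihood
W = np.column_stack([np.ones(n), x, z])

def negll(b):
    p = norm.cdf(W @ b).clip(1e-10, 1 - 1e-10)
    return -(d * np.log(p) + (1 - d) * np.log(1 - p)).sum()

b_hat = minimize(negll, np.zeros(3), method="BFGS").x

# Step 2: construct correction terms (generalized residuals, i.e. inverse
# Mills ratios) and include them in the main regression
idx = W @ b_hat
lam = np.where(d == 1, norm.pdf(idx) / norm.cdf(idx),
               -norm.pdf(idx) / (1.0 - norm.cdf(idx)))
X2 = np.column_stack([np.ones(n), d, x, lam])
beta = np.linalg.lstsq(X2, y, rcond=None)[0]
# beta[1] recovers the effect of d (true value 2.0 in this simulation)
```

With the correction term included, the coefficient on `d` is consistent despite the endogeneity; omitting `lam` would bias it upward here.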
Abstract:
Most studies that try to verify the existence of regulatory risk look mainly at developed countries. Looking at regulatory risk in regulated sectors of emerging markets is no less important to improving and increasing investment in those markets. This thesis comprises three papers on regulatory risk issues. In the first paper I check whether CAPM betas capture information on regulatory risk by using a two-step procedure: in the first step I run Kalman Filter estimates, and then I use these estimated betas as inputs in a random-effects panel data model. I find evidence of regulatory risk in electricity, telecommunications and all regulated sectors in Brazil. I find further evidence that regulatory changes in the country either do not reduce or even increase the betas of the regulated sectors, going in the opposite direction to the buffering hypothesis proposed by Peltzman (1976). In the second paper I check whether CAPM alphas say something about regulatory risk. I investigate a methodology similar to those used by some regulatory agencies around the world, such as the Brazilian Electricity Regulatory Agency (ANEEL), that incorporate a specific component of regulatory risk when setting tariffs for regulated sectors. Using SUR estimates, I find negative and significant alphas for all regulated sectors, especially the electricity and telecommunications sectors. This flies in the face of theory, which predicts alphas that are not statistically different from zero. I suspect that the significant alphas are related to misspecifications in the traditional CAPM that fail to capture true regulatory risk factors. One of the reasons is that the CAPM does not consider factors that are proven to have significant effects on asset pricing, such as the Fama and French size (ME) and price-to-book value (ME/BE) factors. In the third paper, I use two additional factors as controls in the estimation of alphas, and the results are similar.
Nevertheless, I find evidence that the negative alphas may be the result of the regulated sectors' premiums associated with the three Fama and French factors, particularly the market risk premium. When ME and ME/BE are taken together, they diminish the statistical significance of the market factor premiums in the regulated sectors, especially the electricity sector. This shows how important the inclusion of these factors is, something unfortunately still scarce in emerging markets like Brazil.
Abstract:
The aim of this article is to assess the role of real effective exchange rate volatility in long-run economic growth for a set of 82 advanced and emerging economies, using a panel data set ranging from 1970 to 2009. With an accurate measure of exchange rate volatility, the results for the two-step system GMM panel growth models show that a more (less) volatile RER has a significant negative (positive) impact on economic growth, and the results are robust across different model specifications. In addition, exchange rate stability seems to be more important for fostering long-run economic growth than exchange rate misalignment.
Abstract:
We study semiparametric two-step estimators which have the same structure as parametric doubly robust estimators in their second step. The key difference is that we do not impose any parametric restriction on the nuisance functions that are estimated in a first stage, but retain a fully nonparametric model instead. We call these estimators semiparametric doubly robust estimators (SDREs), and show that they possess superior theoretical and practical properties compared to generic semiparametric two-step estimators. In particular, our estimators have substantially smaller first-order bias, allow for a wider range of nonparametric first-stage estimates, rate-optimal choices of smoothing parameters and data-driven estimates thereof, and their stochastic behavior can be well-approximated by classical first-order asymptotics. SDREs exist for a wide range of parameters of interest, particularly in semiparametric missing data and causal inference models. We illustrate our method with a simulation exercise.
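The structure of a doubly robust second step with a nonparametric first stage can be illustrated with a generic AIPW (augmented inverse probability weighting) estimate of an average treatment effect. This is only a minimal sketch under assumed simulated data, with kernel-based first stages and bandwidths chosen by hand, not the SDRE construction of the paper:

```python
import numpy as np

# Simulated data with a known average treatment effect of 1
rng = np.random.default_rng(2)
n = 2000
x = rng.uniform(-1.0, 1.0, n)
e_true = 1.0 / (1.0 + np.exp(-x))                 # true propensity score
d = (rng.uniform(size=n) < e_true).astype(float)  # treatment indicator
y = np.sin(np.pi * x) + 1.0 * d + rng.standard_normal(n) * 0.3

def nw(x0, xs, ys, h=0.1):
    """Nadaraya-Watson kernel regression of ys on xs, evaluated at x0."""
    k = np.exp(-0.5 * ((x0[:, None] - xs[None, :]) / h) ** 2)
    return (k * ys).sum(axis=1) / k.sum(axis=1)

# Fully nonparametric first stage: outcome regressions and propensity score
m1 = nw(x, x[d == 1], y[d == 1])
m0 = nw(x, x[d == 0], y[d == 0])
ehat = nw(x, x, d).clip(0.05, 0.95)

# Second step: the doubly robust (AIPW) score, averaged over the sample
psi = m1 - m0 + d * (y - m1) / ehat - (1 - d) * (y - m0) / (1.0 - ehat)
ate = psi.mean()
```

The estimator remains consistent if either the outcome regressions or the propensity score is estimated well, which is the "doubly robust" property the abstract refers to.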
Abstract:
The traditional representation of the term structure of interest rates by three latent factors (level, slope and curvature) was originally formulated by Charles R. Nelson and Andrew F. Siegel in 1987. Since then, numerous applications have been developed by academics and market practitioners based on this class of models, mostly with the aim of anticipating movements in yield curves. At the same time, recent studies such as Diebold, Piazzesi and Rudebusch (2010), Diebold, Rudebusch and Aruoba (2006), Pooter, Ravazallo and van Dijk (2010) and Li, Niu and Zeng (2012) suggest that incorporating macroeconomic information into term-structure models can improve predictive power. In this work, the dynamic version of the Nelson-Siegel model, as proposed by Diebold and Li (2006), is compared with an analogous model that includes exogenous macroeconomic variables. In parallel, two different estimation methods were tested: the traditional two-step approach (Two-Step DNS) and estimation with the Extended Kalman Filter, which allows the parameters to be estimated recursively each time new information is added to the system. Regarding the models tested, the results are inconclusive, indicating only a marginal improvement in the in-sample and out-of-sample estimates when the exogenous variables are included. The Extended Kalman Filter, however, showed more consistent results than the two-step method for virtually all time horizons studied.
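The two-step DNS procedure mentioned above can be sketched on simulated data: first a cross-sectional OLS per date recovers the level, slope and curvature factors at a fixed decay parameter, then a time-series model fitted to the factors produces forecasts. The decay value is the one popularized by Diebold and Li for monthly maturities; the simulated factor dynamics and noise levels are illustrative assumptions:

```python
import numpy as np

lam = 0.0609                                     # fixed decay parameter (Diebold-Li)
tau = np.array([3.0, 6.0, 12.0, 24.0, 36.0, 60.0, 120.0])   # maturities in months
slope = (1 - np.exp(-lam * tau)) / (lam * tau)
# Nelson-Siegel loadings: level, slope, curvature
L = np.column_stack([np.ones_like(tau), slope, slope - np.exp(-lam * tau)])

# Simulated latent factors and the implied yield panel (illustrative DGP)
rng = np.random.default_rng(3)
T = 200
f = np.zeros((T, 3))
for t in range(1, T):
    f[t] = 0.95 * f[t - 1] + rng.standard_normal(3) * 0.1
yields = f @ L.T + rng.standard_normal((T, len(tau))) * 0.02

# Step 1: cross-sectional OLS, date by date, recovers the three factors
f_hat = np.linalg.lstsq(L, yields.T, rcond=None)[0].T

# Step 2: fit an AR(1) to each estimated factor and forecast one step ahead
forecast = np.empty(3)
for i in range(3):
    Xi = np.column_stack([np.ones(T - 1), f_hat[:-1, i]])
    c, a = np.linalg.lstsq(Xi, f_hat[1:, i], rcond=None)[0]
    forecast[i] = c + a * f_hat[-1, i]
```

A state-space (Kalman-filter) formulation instead estimates both steps jointly and recursively, which is the alternative compared in the work above.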
Abstract:
The Import Substitution Process in Latin America was an attempt to enhance GDP growth and productivity by raising trade barriers on capital-intensive products. Our main goal is to analyze how an increase in the import tariff on a particular type of good affects the production choices and trade pattern of an economy. We develop an extension of the dynamic Heckscher-Ohlin model – a combination of a static two-good, two-factor Heckscher-Ohlin model and a two-sector growth model – allowing for an import tariff. We then calibrate the closed-economy model to the US. The results show that the economy will produce less of both consumption and investment goods under autarky for low and high levels of capital stock per worker. We also find that total GDP may be lower under free trade in comparison to autarky.
Abstract:
This paper argues that monetary models can, and usually do, present the phenomenon of over-banking; that is, the market solution of the model yields a banking sector that is larger than the social optimum. Applying a two-sector monetary model of capital accumulation in the presence of a banking sector that supplies liquidity services, it is shown that a tax discouraging the acquisition of banking services has the following impacts on welfare: if the technology is the same across sectors, the tax increases welfare; otherwise, steady-state utility increases if the banking sector is labor-intensive compared to the real sector. Additionally, it is proved that higher inflation has the following impact on the economy's equilibrium: the banking sector's share of output increases; output and the capital stock increase or decrease depending on whether the banking sector is capital-intensive or labor-intensive; and steady-state utility falls. The results were derived under a quite general setup, requiring only standard hypotheses regarding concavity of preferences, convexity of technology, and normality of goods.
Abstract:
In this dissertation, we investigate the effect of foreign capital participation on Brazilian companies' performance. To carry out this analysis, we constructed two sets of models based on EBITDA margin and return on equity. Panel data analysis is used to examine the relationship between foreign capital ownership and Brazilian firms' performance. We construct a cross-section time-series sample of companies listed on the BOVESPA index from 2006 to 2010. Empirical results led us to validate two hypotheses. First, foreign capital participation improves companies' performance up to a certain level of participation. Second, joint control or a strategic partnership between a Brazilian company and a foreign investor yields higher operating performance.
Abstract:
This paper proposes a two-step procedure to back out the conditional alpha of a given stock using high-frequency data. We first estimate the realized factor loadings of the stocks, and then retrieve their conditional alphas by estimating the conditional expectation of their risk-adjusted returns. We start with the underlying continuous-time stochastic process that governs the dynamics of every stock price and then derive the conditions under which we may consistently estimate the daily factor loadings and the resulting conditional alphas. We also contribute empirically to the conditional CAPM literature by examining the main drivers of the conditional alphas of the S&P 100 index constituents from January 2001 to December 2008. In addition, to confirm whether these conditional alphas indeed relate to pricing errors, we assess the performance of both cross-sectional and time-series momentum strategies based on the conditional alpha estimates. The findings are very promising in that these strategies not only seem to perform pretty well both in absolute and relative terms, but also exhibit virtually no systematic exposure to the usual risk factors (namely, market, size, value and momentum portfolios).
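The two steps can be sketched in a stylized single-factor setting on simulated 5-minute returns. The sampling frequency, single-factor structure, and all coefficients below are illustrative assumptions, not the paper's continuous-time estimator:

```python
import numpy as np

# Simulated intraday data: 60 days of 78 five-minute returns each
rng = np.random.default_rng(4)
days, m = 60, 78
rm = rng.standard_normal((days, m)) * 0.001       # intraday market returns
rs = 0.00002 + 1.2 * rm + rng.standard_normal((days, m)) * 0.0005  # stock returns

# Step 1: realized beta per day from intraday covariation
realized_beta = (rs * rm).sum(axis=1) / (rm ** 2).sum(axis=1)

# Step 2: daily risk-adjusted returns; their conditional expectation is the
# conditional alpha (here summarized by an unconditional average, for brevity)
alpha_daily = rs.sum(axis=1) - realized_beta * rm.sum(axis=1)
alpha_hat = alpha_daily.mean()
```

In the paper's application, the second step conditions the risk-adjusted returns on observable drivers rather than taking a simple average.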
Abstract:
The peculiarities of banking activity - commonly seen as fundamental to the pursuit of development, as well as strongly shaped by law - have stimulated the emergence of an international regime of banking regulation. This came about in the wake of the work of international organizations such as the Basel Committee (BCBS) and the Financial Stability Board (FSB), and from the perception that we live in a world in which markets are highly interconnected yet remain nationally regulated. Leaving aside the merits and effectiveness of the regulatory standards proposed by these organizations, in a context in which many countries seek to implement them, this work examines the elements that define the appropriate degree of implementation discretion granted in their formulation. The analysis of this problem suggests two extremes to be avoided: regulatory arbitrage and one-size-fits-all. Avoiding regulatory arbitrage is a concern of the banking-regulation literature that translates into limiting excessive variation across the regulatory regimes of different jurisdictions. This yields three vectors favoring a lower degree of discretion, represented by the aims of greater coordination, greater competitiveness, and avoiding a regulatory race to the bottom among countries. Avoiding one-size-fits-all, in turn, is a recurring concern of the law-and-development literature, which points to the need to attend to local peculiarities when formulating regulatory policies. This, in turn, yields another three vectors, this time toward a greater degree of discretion.
These are represented by concerns with the efficiency of the measures adopted, with guaranteeing room for maneuver that respects countries' self-determination - at least mitigating possible democratic deficits in the setting of international standards - and with the practical viability of experimentalism. To analyze this problem, taking these extremes into account, a two-part strategy is proposed: the construction of a theoretical framework and the verification of a research hypothesis, according to which a specific case of banking regulation can show how these elements interact in defining the degree of discretion. Thus, at first - after the necessary contextualization and methodological description - a theoretical framework of the problem is built in light of the banking-regulation literature and the toolkit used in discussions about the impact of law on development, discussions that for years have addressed the formulation of international standards and their implementation in diverse national contexts. Also in this first stage, as part of laying the theoretical groundwork, an excursus seeks to verify the hypothesis that trust in the banking system is a kind of common, as well as its possible consequences. Building on this framework, the segment of banking regulation concerning deposit insurers is chosen for a case study. This analysis - carried out with input from bibliographic and empirical research - seeks to show with what degree of discretion, and in what manner, international standards in this segment were formulated and implemented. Finally, it analyzes how the vectors determining the degree of discretion interact in the case of deposit insurers, as well as the lessons that may be inferred from this verification for the other segments of banking regulation.
Abstract:
This study tests the predictive quality of the two-factor Vasicek model coupled with the Kalman Filter. Applied to an investment strategy, we include a stop-loss criterion for the periods in which the model does not respond satisfactorily to movements in interest rates. Using DI futures contracts available on BMFBovespa between March 1, 2007 and May 30, 2014, the simulations were run at different market moments, checking which estimation window yields the best model parameters and for how long those parameters estimate the behavior of interest rates optimally. The results were compared with those obtained by a first-order vector autoregressive model, and we found that the Kalman Filter applied to the two-factor Vasicek model is not the most suitable for interest-rate forecasting: its inability to estimate the entire yield curve at once limits the model and degrades its results.
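The filtering machinery involved can be illustrated with a deliberately minimal state-space sketch: a single latent AR(1) factor observed with noise, far simpler than the two-factor Vasicek model. All parameter values and data below are illustrative assumptions:

```python
import numpy as np

# Simulated state-space model: latent AR(1) state x, noisy observations y
rng = np.random.default_rng(5)
T, phi, q, r = 300, 0.95, 0.1, 0.5     # length, persistence, state and obs noise variances
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.standard_normal() * np.sqrt(q)
y = x + rng.standard_normal(T) * np.sqrt(r)

# Kalman filter: alternate predict and update steps
x_f, P = 0.0, 1.0                      # initial state mean and variance
filtered = np.empty(T)
for t in range(T):
    x_p, P_p = phi * x_f, phi ** 2 * P + q      # predict state and variance
    K = P_p / (P_p + r)                         # Kalman gain
    x_f = x_p + K * (y[t] - x_p)                # update with observation t
    P = (1 - K) * P_p
    filtered[t] = x_f
```

The filtered estimate tracks the latent state with lower error than the raw observations; in the two-factor Vasicek application, the same recursion runs over a two-dimensional state with yields of several maturities as observations.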