884 results for Fundação Getúlio Vargas
Abstract:
The study reviews the main debates held during the early years of the Vargas Era on how the Judiciary should be organized, seeking the motivations that led to the abolition of the first-instance Federal Courts (Justiça Federal de 1ª Instância) by the Constitution imposed on November 10, 1937. Starting from the Revolution of 1930, it presents the main currents of thought on the justice system discussed in the sessions of the Itamarati subcommittee, created to draft a constitutional bill at the request of Getúlio Vargas, then head of the Provisional Government, and in the sessions of the National Constituent Assembly of 1934. Drawing on primary sources such as statutes, session minutes, letters, and newspaper articles of the period, the research highlights the importance of the debates on the Judiciary for the conception of the National State, then in full construction. To situate these primary sources in their context, the study relies mainly on academic work produced in the 1980s, chiefly by the Centro de Pesquisa e Documentação de História Contemporânea do Brasil (CPDOC), which helps explain a turbulent phase of the country's recent past. The study argues that, more than administrative or legal-doctrinal questions, it was the set of ideas surrounding the so-called Estado Novo that created the enabling ideological and political conditions, not consolidated at any earlier moment, which resulted in the exclusion of the first-instance Federal Courts from the organs of the Judiciary in the Constitution of 1937.
Abstract:
This thesis is the result of my Master's degree studies at the Graduate School of Economics, Getúlio Vargas Foundation, from January 2004 to August 2006. I am indebted to my thesis advisor, Professor Luiz Renato Lima, who introduced me to the world of Econometrics. In this thesis, we study time-varying quantile processes and develop two applications, presented here as Part I and Part II. Each part was turned into a paper, and both papers have been submitted. Part I shows that asymmetric persistence induces ARCH effects, but that the LM-ARCH test has power against it. On the other hand, the test for asymmetric dynamics proposed by Koenker and Xiao (2004) has correct size in the presence of ARCH errors. These results suggest that the LM-ARCH and the Koenker-Xiao tests may be used in applied research as complementary tools. In Part II, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with an ARCH effect dominates other methods that require distributional assumptions. In particular, we show that the non-robust methodologies have a higher probability of predicting VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the smallest number of violations.
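A minimal illustrative sketch (not the thesis code; the returns below are simulated and all parameter values are hypothetical) of the idea behind a quantile-regression VaR: the 5% conditional quantile of returns is estimated directly from lagged absolute returns, and its coverage is checked by counting violations.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
# Hypothetical daily returns with volatility clustering (GARCH-like behaviour)
n = 1000
sigma = np.empty(n)
r = np.empty(n)
sigma[0] = 0.01
r[0] = sigma[0] * rng.standard_normal()
for t in range(1, n):
    sigma[t] = np.sqrt(1e-6 + 0.1 * r[t - 1] ** 2 + 0.85 * sigma[t - 1] ** 2)
    r[t] = sigma[t] * rng.standard_normal()

# Regressor: yesterday's absolute return (a simple proxy for volatility)
df = pd.DataFrame({"ret": r, "abs_lag": np.abs(np.roll(r, 1))}).iloc[1:]

# 5% VaR: the conditional 0.05 quantile of returns given yesterday's absolute return
fit = smf.quantreg("ret ~ abs_lag", df).fit(q=0.05)
var_5pct = fit.predict(df)                     # fitted conditional quantiles
violations = (df["ret"] < var_5pct).mean()     # should be close to 0.05
print(f"empirical violation rate: {violations:.3f} (target 0.05)")
```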
Abstract:
We first highlight the problem that credit subsidies do not necessarily represent only an incentive to export, but rather an incentive to take out subsidized credit, which, even with efficient controls, may not amount to exactly the same thing. Under assumptions that differ from earlier studies, especially regarding the capital market, three variables are constructed to represent the effect of subsidized credit on exports.
Abstract:
In this article we investigate fiscal sustainability in Brazil using a Quantile Autoregressive (QAR) model. This methodology allows us to characterize the dynamics of the public debt and to construct a debt ceiling compatible with fiscal sustainability. Such a debt ceiling is an indicator of great importance for guiding public debt managers, helping keep the debt sustainable in the long run while avoiding excessive fiscal austerity. Our results indicate that the Brazilian (federal, domestic) public debt is globally sustainable at the 10% significance level, despite having exceeded the debt ceiling numerous times over the last two years. Finally, we suggest a 5% reduction in the debt/GDP ratio by the end of 2006 in order to guarantee Brazil's long-run fiscal sustainability. We thus present a consistent theoretical framework, together with its practical application, intended to contribute to the strategic planning and management of the public debt in Brazil.
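As a rough illustration of the QAR methodology referenced above (not the authors' code; the debt/GDP series below is simulated and purely hypothetical), a QAR(1) can be fitted at several quantiles with statsmodels; an autoregressive coefficient at or above one in the upper quantiles is the kind of locally explosive behaviour the debt-ceiling argument builds on.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Hypothetical debt/GDP ratio series (in %); replace with actual data
debt = pd.Series(50 + np.cumsum(rng.normal(0, 0.5, 200)), name="debt")

df = pd.DataFrame({"debt": debt, "debt_lag": debt.shift(1)}).dropna()

# Fit QAR(1): Q_debt(tau | debt_lag) = a(tau) + b(tau) * debt_lag
for tau in (0.1, 0.5, 0.9):
    fit = smf.quantreg("debt ~ debt_lag", df).fit(q=tau)
    print(f"tau={tau:.1f}  intercept={fit.params['Intercept']:.3f}  "
          f"slope={fit.params['debt_lag']:.3f}")
```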
Abstract:
This paper investigates the income inequality generated by a jobsearch process when di§erent cohorts of homogeneous workers are allowed to have di§erent degrees of impatience. Using the fact the average wage under the invariant Markovian distribution is a decreasing function of the discount factor (Cysne (2004, 2006)), I show that the Lorenz curve and the between-cohort Gini coe¢ cient of income inequality can be easily derived in this case. An example with arbitrary measures regarding the wage o§ers and the distribution of time preferences among cohorts provides some insights into how much income inequality can be generated, and into how it varies as a function of the probability of unemployment and of the probability that the worker does not Önd a job o§er each period.
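For readers who want to reproduce the kind of inequality measures mentioned above, the following is a minimal sketch (the cohort incomes are hypothetical, not data from the paper) of how a Lorenz curve and a Gini coefficient are computed from cohort average wages:

```python
import numpy as np

def lorenz_and_gini(incomes):
    """Return the Lorenz-curve points and the Gini coefficient of a 1-D income array."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    # Cumulative income shares, prepended with 0 so the curve runs from (0, 0) to (1, 1)
    lorenz = np.insert(np.cumsum(x) / x.sum(), 0, 0.0)
    # Gini = 1 - 2 * (area under the Lorenz curve), trapezoidal approximation
    area = np.sum((lorenz[1:] + lorenz[:-1]) / 2.0) / n
    return lorenz, 1.0 - 2.0 * area

# Hypothetical cohort-average wages for cohorts with different discount factors
_, g = lorenz_and_gini([0.8, 0.9, 1.0, 1.2, 1.5])
print(f"between-cohort Gini: {g:.3f}")
```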
Abstract:
Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are imposed jointly. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length; criteria that select lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy, reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
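The first exercise described above, lag selection by information criteria for a cointegrated VAR, can be illustrated with a short sketch (simulated data, standard unrestricted criteria and a Johansen rank test only; the authors' SCCF-restricted algorithm is not reproduced here):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(1)
# Two hypothetical I(1) series sharing a common stochastic trend (a cointegrated pair)
trend = np.cumsum(rng.normal(size=300))
y = pd.DataFrame({"y1": trend + rng.normal(scale=0.5, size=300),
                  "y2": 0.8 * trend + rng.normal(scale=0.5, size=300)})

# Lag length chosen by AIC / BIC / HQ, as in the information-criteria comparison above
order = VAR(y).select_order(maxlags=8)
print(order.summary())

# Johansen trace test for the cointegrating rank
jres = coint_johansen(y, det_order=0, k_ar_diff=order.aic or 1)
print("trace statistics:     ", jres.lr1)
print("5% critical values:   ", jres.cvt[:, 1])
```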
Abstract:
We consider exchange economies with a continuum of agents and differential information about finitely many states of nature. It was proved in Einy, Moreno and Shitovitz (2001) that if we allow for free disposal in the market-clearing (feasibility) constraints, then an irreducible economy has a competitive (or Walrasian expectations) equilibrium, and moreover, the set of competitive equilibrium allocations coincides with the private core. However, when feasibility is defined with free disposal, competitive equilibrium allocations may not be incentive compatible and contracts may not be enforceable (see e.g. Glycopantis, Muir and Yannelis (2002)). This is the main motivation for considering equilibrium solutions with exact feasibility. We first prove that the results in Einy et al. (2001) remain valid without free disposal. Then we define an incentive compatibility property motivated by the issue of contract execution, and we prove that every Pareto-optimal exact-feasible allocation is incentive compatible, implying that contracts of competitive or core allocations are enforceable.
Abstract:
This paper presents the classical and neoclassical models developed by Jorgenson. It seeks to fill in some details that were omitted and to summarize the main argument, which the author presents with a wealth of detail. In this way it aims to make the celebrated author's work on the dual economy easier to read.
Abstract:
What is it that, in models of trading with asymmetric information in capital markets, generates the buying and selling of assets? Is the fact that information differs across the agents operating on the stock exchange the reason for speculation? Does it make sense to speak of buying and selling assets when the various agents that make up the market act rationally and know that everyone else does so as well? The main motivation for this work is that all the papers developed to date on equilibrium with trading in capital markets rely in some way on irrational behaviour by some agent or by the market as a whole. See, for example, the models in Kyle (1985) and Glosten and Milgrom (1985), where the irrationality lies in the behaviour of the so-called noise (random) traders. These investors demand assets at random, that is, they have no strategy determining their desire to buy or sell shares. What struck us as odd is that these are rational-expectations models, that is, they assume rationality among the individuals trading in the financial sector. The presence of noise traders therefore makes these models inconsistent. The aim of this chapter is to remove these random investors from the market and thereby find out whether, without them, a Pareto-superior allocation can be reached through trade.
Abstract:
Lucas (1987) has shown the surprising result that the welfare cost of business cycles is quite small. Using standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-WWII era using the multivariate Beveridge-Nelson decomposition for trends and cycles, which considers not only business-cycle uncertainty but also uncertainty from the stochastic trend in consumption. The post-WWII period is relatively quiet, with the welfare costs of uncertainty being about 0.9% of per-capita consumption. Although changing the decomposition method substantially changed the initial results, the welfare cost of uncertainty remains qualitatively small in the post-WWII era: about $175.00 a year per capita in the U.S. We also computed the marginal welfare cost of macroeconomic uncertainty using this same technique. It is about twice as large as the welfare cost, at about $350.00 a year per capita.
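As background for the trend-cycle decomposition named above, the sketch below shows a univariate Beveridge-Nelson decomposition under an assumed AR(1) for consumption growth, on hypothetical data; the paper itself uses a multivariate version, which is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
# Hypothetical log per-capita consumption: random walk with drift
g = 0.005 + rng.normal(scale=0.01, size=400)
y = np.cumsum(g)

dy = np.diff(y)
# Fit Delta y_t = c + phi * Delta y_{t-1} + eps_t
fit = sm.OLS(dy[1:], sm.add_constant(dy[:-1])).fit()
c, phi = fit.params
mu = c / (1.0 - phi)                       # unconditional mean growth

# BN permanent component: y_t plus the sum of expected future growth deviations,
# which for an AR(1) equals phi/(1-phi) * (Delta y_t - mu)
trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
cycle = y[1:] - trend                      # transitory (cyclical) component
print("std of BN cycle:", cycle.std())
```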
Abstract:
After more than forty years of growth research, two classes of growth models have emerged: exogenous and endogenous growth models. Since both try to mimic the same set of long-run stylized facts, they are observationally equivalent in some respects. Our goals in this paper are twofold. First, we discuss the time-series properties of growth models in a way that is useful for assessing their fit to the data. Second, we investigate whether these two models successfully conform to U.S. post-war data. We use cointegration techniques to estimate and test long-run capital elasticities, exogeneity tests to investigate the exogeneity status of TFP, and Granger-causality tests to examine the temporal precedence of TFP with respect to infrastructure expenditures. The empirical evidence robustly confirms the existence of a unity long-run capital elasticity. The analysis of TFP reveals that it is not weakly exogenous in the exogenous growth model. Granger-causality test results show unequivocally that there is no evidence, in either model, that TFP precedes infrastructure expenditures without being preceded by it. On the contrary, we find some evidence that infrastructure investment precedes TFP. Our estimated impact of infrastructure on TFP lies roughly in the interval (0.19, 0.27).
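The Granger-causality exercise described above can be illustrated with a short sketch on simulated series (the data and coefficients are hypothetical, not the paper's dataset):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(3)
n = 200
infra = np.zeros(n)
tfp = np.zeros(n)
for t in range(1, n):
    # Hypothetical stationary infrastructure-spending growth series
    infra[t] = 0.6 * infra[t - 1] + rng.normal()
    # TFP growth responds to lagged infrastructure, so infra should Granger-cause TFP
    tfp[t] = 0.4 * tfp[t - 1] + 0.3 * infra[t - 1] + rng.normal()

data = pd.DataFrame({"tfp": tfp, "infra": infra})
# Null hypothesis: the second column ("infra") does not Granger-cause the first ("tfp")
grangercausalitytests(data[["tfp", "infra"]], maxlag=4)
```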
Abstract:
Our work is based on a simplified heterogeneous-agent shopping-time economy in which economic agents have distinct productivities in the production of the consumption good and differentiated access to transacting assets. The purpose of the model is to investigate whether, by focusing the analysis solely on endogenously determined shopping times, one can generate a positive correlation between inflation and income inequality. Our main result is to show that, provided the productivity of the interest-bearing asset in the transacting technology is high enough, a positive link between inflation and income inequality is indeed generated. Our next step is to show, through an analysis of the steady-state equations, that our approach can be interpreted as a mirror image of the usual inflation-tax argument for income concentration. An example is offered to illustrate the mechanism.