91 results for dynamic stochastic general equilibrium models
Abstract:
The aim of this study is to assess, using a multi-sector, multi-region computable general equilibrium model, the impacts on the Brazilian economy of a reduction in tariffs on non-agricultural goods based on the Swiss formula with different coefficients. The general equilibrium model used is the Global Trade Analysis Project (GTAP) model, and the tariff cuts were estimated from MAcMap data. Beyond the macroeconomic and sectoral impacts, the model's sensitivity to higher Armington elasticities and to the implementation of agricultural tariff liberalization was also tested.
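The Swiss formula mentioned above has a simple closed form: a tariff `t` is cut to `a*t/(a+t)`, where the coefficient `a` both controls the depth of the cut and caps the post-cut tariff. A minimal sketch (the coefficients and initial tariffs below are hypothetical, not those of the study):

```python
def swiss_cut(t_old: float, a: float) -> float:
    """Post-formula tariff (%) for an initial tariff t_old (%), coefficient a."""
    return a * t_old / (a + t_old)

for a in (8.0, 15.0, 25.0):        # hypothetical Swiss-formula coefficients
    for t in (10.0, 35.0, 100.0):  # hypothetical initial tariffs (%)
        print(f"a={a:>4}: {t:>5.1f}% -> {swiss_cut(t, a):.2f}%")
```

Note how the formula cuts high tariffs proportionally more than low ones: no resulting tariff can exceed the coefficient `a`, which is why negotiating the coefficient is the crux of such proposals.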
Abstract:
This article presents an assessment of the impact of the tax reform of PIS/PASEP and COFINS, which came to be collected under two regimes applied to domestic flows (cumulative and non-cumulative, i.e., mixed) and to be levied on imports of goods and services. The methodology relies on a Computable General Equilibrium (CGE) model, adapted to the new features of the tax system and specified to simulate the impacts on welfare indicators in Brazil. These impacts were evaluated in two steps: the shift from the cumulative regime to the new tax regime, and the full reform. The results show that the reform would have worsened macroeconomic, labor market, and welfare indicators.
Abstract:
Using a competitive search approach, I model a labor market with heterogeneous workers in which there is a moral hazard problem in the relationship between firms and workers. In this setting, I can predict how contracts respond to changes in market parameters (in particular, output risk), as well as how the probability that workers are hired varies. My main contribution is to show that, at the individual level, there is a negative relationship between risk and incentives, but general equilibrium effects imply that this relationship can be positive at the aggregate level. This result helps to reconcile contradictory empirical findings on the relationship between risk and incentives.
Abstract:
This article analyzes the importance of port delays for the competitiveness of Brazil's manufacturing industry. Based on recent estimates of the daily cost of trade delays and on World Bank and GTAP (Global Trade Analysis Project) databases, it reveals the magnitude of these barriers in the form of their ad valorem equivalents. Then, through general equilibrium simulations, it estimates the impact of improved customs procedures on the performance of Brazilian manufacturing under different scenarios. The results highlight the strategic character of trade facilitation for Brazil and of its inclusion as a relevant item on the country's long-run growth agenda.
Abstract:
We study the impact of the different stages of human capital accumulation on the evolution of labor productivity in a model calibrated to the U.S. from 1961 to 2008. We add early childhood education to a standard continuous-time life cycle economy and assume complementarity between educational stages. There are three sectors in the model: the goods sector, the early childhood sector and the formal education sector. Agents are homogeneous and choose the intensity of preschool education, how long to stay in formal school, labor effort and consumption, and there are exogenous distortions to these four decisions. The model matches the data very well and closely reproduces the paths of schooling, hours worked, relative prices and GDP. We find that the reduction in distortions to early education in the period was large and made a very strong contribution to human capital accumulation. However, due to general equilibrium effects of labor market taxation, marginal modifications in the incentives for early education in 2008 had a smaller impact than those for formal education. This is because the former do not decisively affect the decision to join the labor market, while the latter do. Without labor taxation, incentives for preschool are significantly stronger.
Abstract:
This article investigates the impact on the U.S. economy of making health care more affordable. We compare health care cost reductions with the Patient Protection and Affordable Care Act (PPACA) using a rich life cycle general equilibrium model with heterogeneous agents. We found that all policies were able to reduce the uninsured population, but the PPACA was the most effective: in the long run, less than 5% of Americans would remain uninsured. Cost reductions relieved the government budget, while tax hikes were needed to finance the reform. Feasible cost reductions are less welfare improving than the PPACA.
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS-IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS-IV estimator to be asymptotically equivalent to an optimal GLS-IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
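The optimal FGLS-IV machinery of this paper does not fit in a few lines, but the basic problem it builds on can be illustrated: with a regressor correlated with the error, OLS is biased while a simple just-identified IV estimator `beta_IV = (z'y)/(z'x)` remains consistent. A hedged toy sketch with an entirely hypothetical data-generating process:

```python
import random

# Toy illustration (not the paper's optimal FGLS-IV estimator): x is
# endogenous because it loads on the structural error u; z is a valid
# instrument (relevant for x, independent of u).
random.seed(0)
n = 50_000
obs = []
for _ in range(n):
    z = random.gauss(0.0, 1.0)                      # instrument
    u = random.gauss(0.0, 1.0)                      # structural error
    x = 0.8 * z + 0.5 * u + random.gauss(0.0, 1.0)  # endogenous regressor
    obs.append((z, x, 2.0 * x + u))                 # true coefficient = 2.0

beta_ols = sum(x * y for _, x, y in obs) / sum(x * x for _, x, _ in obs)
beta_iv = sum(z * y for z, _, y in obs) / sum(z * x for z, x, _ in obs)
print(f"OLS: {beta_ols:.3f}  IV: {beta_iv:.3f}")
```

The paper's contribution is precisely about enlarging the set of usable instruments: its moment conditions show that, under stationary VAR errors, lagged endogenous variables can play the role of `z` even though the contemporaneous variables cannot.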
Abstract:
In this article we use factor models to describe a certain class of covariance structures for financial time series models. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. We build on previous work by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in asset weights over time, motivated by applications with multiple time series of daily exchange rates. We explore and discuss potential extensions of the models presented here in the prediction area. This discussion leads to open issues on real-time implementation and natural model comparisons.
Abstract:
The past decade has witnessed a series of (well accepted and defined) financial crisis periods in the world economy. Most of these events are country-specific and eventually spread out across neighboring countries, with the concept of vicinity extrapolating geographic maps and entering contagion maps. Unfortunately, what contagion represents and how to measure it are still unanswered questions. In this article we measure the transmission of shocks by cross-market correlation coefficients, following Forbes and Rigobon's (2000) notion of shift-contagion. Our main contribution relies upon the use of traditional factor model techniques combined with stochastic volatility models to study the dependence among Latin American stock price indexes and the North American index. More specifically, we concentrate on situations where the factor variances are modeled by a multivariate stochastic volatility structure. From a theoretical perspective, we improve currently available methodology by allowing the factor loadings, in the factor model structure, to have a time-varying structure and to capture changes in the series' weights over time. By doing this, we believe that changes and interventions experienced by those five countries are well accommodated by our models, which learn and adapt reasonably fast to those economic and idiosyncratic shocks. We empirically show that the time-varying covariance structure can be modeled by one or two common factors and that some sort of contagion is present in most of the series' covariances during periods of economic instability, or crisis. Open issues on real-time implementation and natural model comparisons are thoroughly discussed.
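The shift-contagion notion cited above rests on a known pitfall: when the source market's volatility rises during a crisis, the sample cross-market correlation is mechanically inflated even if the transmission mechanism is unchanged. Forbes and Rigobon's correction divides the raw correlation by a factor that depends on the relative variance increase. A minimal sketch (the numbers are hypothetical):

```python
import math

def adjusted_correlation(rho: float, delta: float) -> float:
    """Forbes-Rigobon heteroskedasticity-adjusted correlation.
    rho:   crisis-period sample correlation
    delta: relative increase in the source market's return variance,
           sigma2_crisis / sigma2_tranquil - 1
    """
    return rho / math.sqrt(1.0 + delta * (1.0 - rho ** 2))

# Hypothetical example: a raw crisis correlation of 0.6, measured while
# the source market's variance tripled (delta = 2), shrinks to roughly
# 0.40 once the volatility effect is removed.
print(adjusted_correlation(0.6, 2.0))
```

Under this adjustment, "contagion" is only declared when the corrected correlation still rises significantly in the crisis period, which is the comparison the article's factor-model approach refines.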
Abstract:
In price-competition models, a positive consumer search cost alone does not generate an equilibrium with price dispersion. Dynamic switching-cost models, by contrast, consistently generate this well-documented feature of retail prices. Although both literatures are vast, few models have tried to combine the two frictions. This paper presents a dynamic price-competition model in which identical consumers face both search and switching costs. The equilibrium generates price dispersion. Moreover, since consumers must commit to a fixed sample of firms before prices are set, only two prices will be considered before each purchase. This result is independent of the size of the consumer's individual search cost.
Abstract:
In this paper we construct sunspot equilibria that arise from chaotic deterministic dynamics. These equilibria are robust and therefore observable. We prove that they may be learned by a simple rule based on the histograms of past state variables. This work gives a theoretical justification for deterministic models that might compete with stochastic models to explain real data.
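The histogram-learning idea can be illustrated with a textbook chaotic process (this sketch is not the paper's construction): an agent who tabulates a histogram of past states of the logistic map `x' = 4x(1-x)` recovers its long-run invariant distribution, which is known in closed form (the arcsine law, Beta(1/2, 1/2)).

```python
import math

def orbit_histogram(x0: float, n: int, bins: int) -> list:
    """Empirical bin frequencies of n iterates of the logistic map at r = 4."""
    counts = [0] * bins
    x = x0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        if x <= 0.0 or x >= 1.0:  # guard against floating-point absorption at 0
            x = x0
        counts[min(int(x * bins), bins - 1)] += 1
    return [c / n for c in counts]

freqs = orbit_histogram(0.123, 200_000, 10)
# Theoretical arcsine-law bin probabilities:
# P([a, b]) = (2/pi) * (asin(sqrt(b)) - asin(sqrt(a)))
theory = [(2 / math.pi) * (math.asin(math.sqrt((i + 1) / 10))
                           - math.asin(math.sqrt(i / 10))) for i in range(10)]
```

The empirical frequencies converge to the theoretical ones, which is the sense in which a deterministic chaotic series is observationally learnable "as if" it were stochastic.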
Abstract:
This paper develops a methodology for testing the term structure of volatility forecasts derived from stochastic volatility models, and implements it to analyze models of S&P 500 index volatility. Using measurements of the ability of volatility models to hedge and value term-structure-dependent option positions, we find that hedging tests support the Black-Scholes delta and gamma hedges, but not the simple vega hedge when there is no model of the term structure of volatility. With various models, it is difficult to improve on a simple gamma hedge assuming constant volatility. Of the volatility models, the GARCH components estimate of the term structure is preferred. Valuation tests indicate that all the models contain term structure information not incorporated in market prices.
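For reference, the delta, gamma and vega hedges compared above are built from the standard Black-Scholes greeks of a European call (no dividends). A minimal sketch with hypothetical inputs:

```python
import math

def bs_call_greeks(S, K, T, r, sigma):
    """Black-Scholes (delta, gamma, vega) of a European call, no dividends."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    pdf = math.exp(-0.5 * d1 ** 2) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))           # standard normal cdf
    delta = cdf                              # sensitivity to the spot price
    gamma = pdf / (S * sigma * math.sqrt(T)) # sensitivity of delta to the spot
    vega = S * pdf * math.sqrt(T)            # sensitivity to volatility
    return delta, gamma, vega

# Hypothetical at-the-money example: S = K = 100, six months, r = 2%, sigma = 20%.
delta, gamma, vega = bs_call_greeks(S=100, K=100, T=0.5, r=0.02, sigma=0.2)
```

The paper's point is that delta and gamma hedges built this way hold up empirically, whereas a vega hedge only makes sense once a model of the term structure of volatility specifies *which* volatility the position is exposed to.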
Abstract:
This paper analyses an overlapping generations model with an absolute bequest motive. It is shown that the widely accepted criterion to verify dynamic efficiency does not apply to this case. In our model the social planner maximizes welfare by choosing a capital stock larger than the golden rule and a real rate of interest smaller than the rate of growth of the economy.
Abstract:
In this paper we study the dynamic hedging problem using three different utility specifications: stochastic differential utility, terminal wealth utility, and a particular utility transformation that we propose connecting the two previous approaches. In all cases, we assume Markovian prices. Stochastic differential utility, SDU, impacts the pure hedging demand ambiguously, but decreases the pure speculative demand, because risk aversion increases. We also show that the consumption decision is, in some sense, independent of the hedging decision. With terminal wealth utility, we derive a general and compact hedging formula, which nests as special cases all those studied in Duffie and Jackson (1990). We then show how to obtain their formulas. With the third approach we find a compact formula for hedging, which makes the second utility framework a particular case, and show that the pure hedging demand is not impacted by this specification. In addition, with CRRA- and CARA-type utilities, risk aversion increases and, consequently, the pure speculative demand decreases. If futures prices are martingales, then the transformation plays no role in determining the hedging allocation. We also derive the relevant Bellman equation for each case, using semigroup techniques.
Abstract:
Asset allocation decisions and value-at-risk calculations rely strongly on volatility estimates. Volatility measures such as rolling window, EWMA, GARCH and stochastic volatility are used in practice. GARCH and EWMA type models that incorporate the dynamic structure of volatility and are capable of forecasting future behavior of risk should perform better than constant, rolling-window volatility models. For the same asset, the model that is ‘best’ according to some criterion can change from period to period. We use the reality check test to verify whether one model out-performs others over a class of re-sampled time-series data. The test is based on re-sampling the data using stationary bootstrapping. For each re-sample we check the ‘best’ model according to two criteria and analyze the distribution of the performance statistics. We compare constant volatility, EWMA and GARCH models using a quadratic utility function and a risk management measurement as comparison criteria. No model consistently out-performs the benchmark.
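Two of the simplest estimators in the comparison above can be sketched in a few lines: a rolling-window standard deviation and a RiskMetrics-style EWMA with decay factor 0.94. The window length, decay factor and simulated returns below are hypothetical, not the abstract's choices.

```python
import math
import random

def rolling_vol(returns, window=20):
    """Rolling-window standard deviation of returns (population variance)."""
    out = []
    for i in range(window, len(returns) + 1):
        chunk = returns[i - window:i]
        m = sum(chunk) / window
        out.append(math.sqrt(sum((r - m) ** 2 for r in chunk) / window))
    return out

def ewma_vol(returns, lam=0.94):
    """EWMA volatility: var_t = lam * var_{t-1} + (1 - lam) * r_t**2."""
    var = returns[0] ** 2   # initialize with the first squared return
    out = []
    for r in returns:
        var = lam * var + (1.0 - lam) * r ** 2
        out.append(math.sqrt(var))
    return out

random.seed(1)
rets = [random.gauss(0.0, 0.01) for _ in range(500)]  # simulated daily returns
rw = rolling_vol(rets)
ew = ewma_vol(rets)
```

The reality check's contribution is to discipline exactly this kind of horse race: instead of picking whichever estimator looks best on one sample, it bootstraps many stationary re-samples and asks whether any model beats the benchmark more often than chance would allow.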