13 results for Millionaire Problem, Efficiency, Verifiability, Zero Test, Batch Equation in the repository
Abstract:
This dissertation studies inference based on generalized method of moments (GMM) estimation with instruments. The motivation is that, under weak identification of the parameters, traditional inference can lead to misleading results. Accordingly, the most common tests proposed to overcome this problem are reviewed, and the frameworks of Moreira (2002), Moreira & Moreira (2013), and Kleibergen (2005) are presented. The work then reconciles the statistics these authors use to conduct inference, rewrites the score test proposed in Kleibergen (2005) using the statistics of Moreira & Moreira (2013), and obtains the optimal score test statistic using the asymptotic theory in Newey & McFadden (1984). In addition, the equivalence between the GMM approach and the system-of-equations, likelihood-based approach to the weak identification problem is shown.
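As a concrete illustration of the weak-identification-robust tests reviewed in this literature, the sketch below computes the Anderson-Rubin statistic for a linear IV model on simulated data. It is a minimal stand-in for the score-type statistics discussed in the abstract (the optimal score test itself is not implemented); the simulated design and the homoskedastic F reference distribution are assumptions made for the example.

```python
# Minimal sketch: Anderson-Rubin test of H0: beta = beta0 in y = x*beta + u,
# with instruments Z; robust to weak instruments (illustrative stand-in for
# the score-type tests discussed above; simulated, homoskedastic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 500, 3                      # sample size, number of instruments
Z = rng.normal(size=(n, k))        # instruments
pi = np.array([0.1, 0.05, 0.02])   # weak first-stage coefficients
v = rng.normal(size=n)
x = Z @ pi + v                     # endogenous regressor
u = 0.8 * v + rng.normal(size=n)   # structural error, correlated with v
y = 1.0 * x + u                    # true beta = 1.0

def anderson_rubin(y, x, Z, beta0):
    """AR statistic and p-value for H0: beta = beta0 (single regressor)."""
    n, k = Z.shape
    u0 = y - x * beta0
    Pu = Z @ np.linalg.solve(Z.T @ Z, Z.T @ u0)   # projection of u0 onto Z
    rss_p = Pu @ Pu                               # u0' Pz u0
    rss_m = u0 @ u0 - rss_p                       # u0' Mz u0
    ar = (n - k) / k * rss_p / rss_m
    return ar, 1 - stats.f.cdf(ar, k, n - k)

print(anderson_rubin(y, x, Z, beta0=1.0))   # evaluated at the true beta (H0 holds)
print(anderson_rubin(y, x, Z, beta0=0.0))   # evaluated at a false value
```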
Abstract:
A contractive method for computing stationary solutions of intertemporal equilibrium models is provided. The method is implemented using a contraction mapping derived from the first-order conditions. A deterministic dynamic programming problem is used to illustrate the method, and several numerical examples are presented.
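A minimal sketch of this kind of fixed-point computation: time iteration on the Euler equation (first-order condition) of a deterministic growth model with log utility and full depreciation, where the policy update is a contraction and the known closed-form policy c(k) = (1 - alpha*beta) k^alpha serves as a check. The specific model, grid, and parameter values are assumptions for the example, not the paper's own application.

```python
# Minimal sketch: contraction (time) iteration on the first-order condition
# of a deterministic growth model: u(c) = log c, k' = k^alpha - c.
# Euler equation: 1/c = beta * alpha * k'^(alpha-1) / c(k').
# Closed-form policy c(k) = (1 - alpha*beta) * k^alpha is used as a check.
import numpy as np
from scipy.optimize import brentq

alpha, beta = 0.33, 0.96
grid = np.linspace(0.05, 1.2, 200)           # capital grid
sigma = 0.5 * grid ** alpha                  # initial guess for consumption policy

def coleman_update(sigma):
    """One application of the Euler-equation operator to the policy sigma."""
    new = np.empty_like(sigma)
    for i, k in enumerate(grid):
        y = k ** alpha                       # resources available today
        def euler_gap(c):
            kp = y - c                       # capital carried to tomorrow
            cp = np.interp(kp, grid, sigma)  # tomorrow's consumption (interpolated)
            return 1.0 / c - beta * alpha * kp ** (alpha - 1.0) / cp
        new[i] = brentq(euler_gap, 1e-8, y - 1e-8)
    return new

for it in range(500):
    new = coleman_update(sigma)
    if np.max(np.abs(new - sigma)) < 1e-8:   # sup-norm stopping criterion
        break
    sigma = new

exact = (1.0 - alpha * beta) * grid ** alpha
print(it, np.max(np.abs(sigma - exact)))     # small interpolation error remains
```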
Abstract:
Consumption is an important macroeconomic aggregate, accounting for about 70% of GNP. Finding sub-optimal behavior in consumption decisions casts serious doubt on whether optimizing behavior is applicable on an economy-wide scale, which, in turn, challenges whether it is applicable at all. This paper makes several contributions to the literature on consumption optimality. First, we provide a new result on the basic rule-of-thumb regression, showing that it is observationally equivalent to the one obtained in a well-known optimizing real-business-cycle model. Second, for rule-of-thumb tests based on the Asset-Pricing Equation, we show that the omission of the higher-order term in the log-linear approximation yields inconsistent estimates when lagged observables are used as instruments. However, these are exactly the instruments that have been traditionally used in this literature. Third, we show that nonlinear estimation of a system of N Asset-Pricing Equations can be done efficiently even if the number of asset returns (N) is high vis-a-vis the number of time-series observations (T). We argue that efficiency can be restored by aggregating returns into a single measure that fully captures intertemporal substitution. Indeed, we show that there is no reason why return aggregation cannot be performed in the nonlinear setting of the Pricing Equation, since the latter is a linear function of individual returns. This forms the basis of a new test of rule-of-thumb behavior, which can be viewed as testing for the importance of rule-of-thumb consumers when the optimizing agent holds an equally-weighted portfolio or a weighted portfolio of traded assets. Using our setup, we find no signs of either rule-of-thumb behavior for U.S. consumers or of habit formation in consumption decisions in econometric tests. Indeed, we show that a simple representative-agent model with CRRA utility is able to explain the time-series data on consumption and aggregate returns. There, the intertemporal discount factor is significant and ranges from 0.956 to 0.969, while the relative risk-aversion coefficient is precisely estimated, ranging from 0.829 to 1.126. There is no evidence of rejection in over-identifying-restriction tests.
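As a concrete illustration of the kind of Asset-Pricing (Euler) Equation estimated here, the sketch below sets up the CRRA moment conditions E[(beta (C_{t+1}/C_t)^(-gamma) R_{t+1} - 1) z_t] = 0 and minimizes a one-step GMM objective on simulated data. The instrument set, the simulated series, and the identity weighting matrix are assumptions for the example, not the paper's estimator.

```python
# Minimal sketch: one-step GMM for the CRRA Asset-Pricing (Euler) Equation
#   E[(beta * (C_{t+1}/C_t)^(-gamma) * R_{t+1} - 1) * z_t] = 0,
# using simulated consumption growth and a single aggregated return, with
# lagged observables as instruments and an identity weighting matrix.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T = 400
lg = np.zeros(T)
for t in range(1, T):                              # AR(1) log consumption growth
    lg[t] = 0.01 + 0.5 * lg[t - 1] + 0.01 * rng.normal()
g = np.exp(lg)                                     # gross growth C_{t+1}/C_t
beta0, gamma0 = 0.97, 2.0                          # "true" values in the simulation
R = (1.0 / beta0) * g ** gamma0 * np.exp(0.02 * rng.normal(size=T))  # gross return

def moments(theta, g, R):
    beta, gamma = theta
    e = beta * g ** (-gamma) * R - 1.0             # pricing errors e_{t+1}
    Z = np.column_stack([np.ones(len(g) - 1), g[:-1], R[:-1]])  # lagged instruments
    return (Z * e[1:, None]).mean(axis=0)          # sample moment vector

def gmm_objective(theta, g, R):
    gbar = moments(theta, g, R)
    return gbar @ gbar                             # identity weighting matrix

res = minimize(gmm_objective, x0=np.array([0.9, 1.0]),
               args=(g, R), method="Nelder-Mead")
print("beta, gamma estimates:", res.x)             # close to (0.97, 2.0)
```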
Abstract:
It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced-rank restrictions) before additional tests on the value of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models that allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced-rank models. Their performance is evaluated in a Monte-Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
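A minimal sketch of the permanent-transitory decomposition discussed above, specialized to the simplest case in which the first difference of the series follows an AR(1) (Beveridge and Nelson, 1981). The closed-form trend used here applies only to this special case, and the simulated series and parameter values are assumptions for the example.

```python
# Minimal sketch: Beveridge-Nelson permanent-transitory decomposition when
# the first difference of y follows an AR(1):
#   dy_t = mu + phi * (dy_{t-1} - mu) + eps_t
# BN trend (permanent component): tau_t = y_t + phi/(1 - phi) * (dy_t - mu)
# Transitory component: c_t = y_t - tau_t.
import numpy as np

rng = np.random.default_rng(2)
T, mu, phi = 300, 0.1, 0.6
dy = np.zeros(T)
for t in range(1, T):                       # simulate AR(1) growth rates
    dy[t] = mu + phi * (dy[t - 1] - mu) + 0.5 * rng.normal()
y = np.cumsum(dy)                           # integrated (I(1)) level series

# Estimate mu and phi by OLS of dy_t on a constant and dy_{t-1}
X = np.column_stack([np.ones(T - 1), dy[:-1]])
b = np.linalg.lstsq(X, dy[1:], rcond=None)[0]
phi_hat = b[1]
mu_hat = b[0] / (1.0 - phi_hat)             # unconditional mean of dy

trend = y + phi_hat / (1.0 - phi_hat) * (dy - mu_hat)   # permanent component
cycle = y - trend                                       # transitory component
print(trend[-5:], cycle[-5:])
```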
Abstract:
In this work we approach unitization as a reinterpretation of a cartel, starting from the classic Green and Porter model. Geological uncertainty is represented by a stochastic component in marginal cost. We characterize the optimal contract and, through comparative statics, evaluate the efficiency and feasibility of cooperation. The price and the degree of the externality positively affect the efficiency level of the optimal contract. However, while high prices make agreements viable, a high degree of externality can lead to inefficient equilibria or even make production unviable. The same result holds for fixed costs. In addition, the larger the number of firms involved in the agreement, the smaller the chance that a contract more efficient than the rule of capture exists.
Abstract:
Local provision of public services has the positive effect of increasing efficiency because each locality has idiosyncrasies that determine a particular demand for public services. This dissertation addresses different aspects of the local demand for public goods and services and their relationship with political incentives. The text is divided into three essays. The first essay aims to test the existence of yardstick competition in education spending using panel data from Brazilian municipalities. The essay estimates two-regime spatial Durbin models with time and spatial fixed effects using maximum likelihood, where the regimes represent different electoral and educational accountability institutional settings. First, it is investigated whether lame-duck incumbents tend to engage in less strategic interaction as a result of the impossibility of reelection, which lowers their incentives to signal their type (good or bad) to voters by mimicking their neighbors' expenditures. Additionally, it is evaluated whether the lack of electorate support faced by minority governments causes incumbents to mimic the neighbors' spending to a greater extent to increase their odds of reelection. Next, the essay estimates the effects of the institutional change introduced by the disclosure in April 2007 of the Basic Education Development Index (known as IDEB) and its goals on strategic interaction at the municipality level. This institutional change potentially increased the incentives for incumbents to follow national best practices in an attempt to signal their type to voters, thus reducing the importance of local information spillover. The same model is also tested using school inputs that are believed to improve students' performance in place of education spending. The results show evidence of yardstick competition in education spending. Spatial autocorrelation is lower among lame ducks and higher among incumbents with minority support (a smaller vote margin). In addition, the institutional change introduced by the IDEB reduced the spatial interaction in education spending and input-setting, thus diminishing the importance of local information spillover. The second essay investigates the role played by the geographic distance between the poor and non-poor in the local demand for income redistribution. In particular, the study provides an empirical test of the geographically limited altruism model proposed in Pauly (1973), incorporating the possibility of participation costs associated with the provision of transfers (Van de Wale, 1998). First, the discussion is motivated by allowing for an "iceberg cost" of participation in the programs for poor individuals in Pauly's original model. Next, using data from the 2000 Brazilian Census and a panel of municipalities based on the National Household Sample Survey (PNAD) from 2001 to 2007, all the distance-related explanatory variables indicate that increased proximity between poor and non-poor is associated with better targeting of the programs (demand for redistribution). For instance, a 1-hour increase in the time spent commuting by the poor reduces targeting by 3.158 percentage points. This result is similar to that of Ashworth, Heyndels and Smolders (2002) but is definitely not due to program leakages.
To empirically disentangle participation costs and spatially restricted altruism effects, an additional test is conducted using unique panel data based on the 2004 and 2006 PNAD, which assess the number of benefits and the average benefit value received by beneficiaries. The estimates suggest that both cost and altruism play important roles in determining targeting in Brazil, and thus in determining the demand for redistribution. Lastly, the results indicate that 'size matters'; i.e., the budget for redistribution has a positive impact on targeting. The third essay aims to empirically test the validity of the median voter model for the Brazilian case. Information on municipalities is obtained from the Population Census and the Brazilian Supreme Electoral Court for the year 2000. First, the median voter demand for local public services is estimated. The bundles of services offered by reelection candidates are identified as the expenditures realized during incumbents' first term in office. The assumption of candidates' perfect information about the median demand is relaxed and a weaker hypothesis, of rational expectations, is imposed. Thus, incumbents make mistakes about the median demand that are referred to as misperception errors. At a given point in time, incumbents can therefore provide a bundle (given by the amount of expenditures per capita) that differs from the median voter's demand for public services by a multiplicative error term, which is included in the residuals of the demand equation. Next, the impact of the absolute value of this misperception error on the electoral performance of incumbents is estimated using a selection model. The results suggest that the median voter model is valid for the case of Brazilian municipalities.
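The strategic-interaction results above rest on measuring spatial dependence in municipal spending. As a simple illustration of that ingredient, the sketch below computes a global Moran's I statistic with a row-standardized weights matrix and a permutation p-value; the toy weights matrix and simulated spending series are assumptions for the example, and this diagnostic is not the two-regime spatial Durbin estimation used in the dissertation.

```python
# Minimal sketch: global Moran's I for spatial autocorrelation in a
# cross-section of municipal education spending, with a row-standardized
# weights matrix W and a permutation p-value.
import numpy as np

rng = np.random.default_rng(3)
n = 100
W = rng.random((n, n)) < 0.05                 # toy "neighbor" adjacency
W = np.triu(W, 1); W = W + W.T                # symmetric, zero diagonal
W = W / np.maximum(W.sum(axis=1, keepdims=True), 1)   # row-standardize
spending = rng.normal(size=n)                 # stand-in for spending per pupil

def morans_i(x, W):
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

obs = morans_i(spending, W)
perms = np.array([morans_i(rng.permutation(spending), W) for _ in range(999)])
pval = (1 + np.sum(perms >= obs)) / (1 + len(perms))
print(f"Moran's I = {obs:.3f}, permutation p-value = {pval:.3f}")
```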
Abstract:
We estimate the impact of having attended center-based daycare institutions during early childhood on Math test scores in the 4th grade of elementary school. Because enrollment in daycare centers may depend on unobservable characteristics of the family and the child, we build and estimate a structural model of endogenous school choice to deal with the selectivity problem. We find that attendance at daycare institutions is associated with a gain of approximately 0.04 standard deviations in Math test scores. This result is important to the extent that our OLS results, as well as most of the studies for Brazil, find no effect associated with daycare attendance, suggesting that selectivity may play a role in this finding.
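To make the selectivity concern concrete, here is a minimal two-step control-function sketch for an endogenous binary treatment (daycare attendance): a probit selection equation followed by an outcome regression augmented with the implied hazard terms. It is a textbook stand-in, on simulated data, for the structural endogenous-choice model actually estimated; the variable names and the exclusion restriction are assumptions for the example.

```python
# Minimal sketch: two-step control-function correction for an endogenous
# binary treatment (daycare attendance) in a test-score regression.
# Step 1: probit of attendance on an excluded instrument and covariates.
# Step 2: OLS of scores on attendance and covariates plus hazard terms.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 2000
z = rng.normal(size=n)                        # excluded instrument (e.g., local supply)
x = rng.normal(size=n)                        # family covariate
u = rng.normal(size=n)                        # unobservable affecting both equations
attend = (0.8 * z + 0.3 * x + u > 0).astype(float)
score = 0.04 * attend + 0.5 * x + 0.7 * u + rng.normal(size=n)

# Step 1: probit selection equation
Zmat = sm.add_constant(np.column_stack([z, x]))
probit = sm.Probit(attend, Zmat).fit(disp=0)
xb = Zmat @ probit.params                     # estimated linear index

# Hazard (generalized residual) terms for treated and untreated observations
lam = np.where(attend == 1, norm.pdf(xb) / norm.cdf(xb),
               -norm.pdf(xb) / (1.0 - norm.cdf(xb)))

# Step 2: outcome equation with the control-function term
Xmat = sm.add_constant(np.column_stack([attend, x, lam]))
ols = sm.OLS(score, Xmat).fit()
print(ols.params)   # second coefficient (on attendance) is the corrected effect
```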
Abstract:
Studies of ethanol consumption by light vehicles in Brazil attract considerable interest from researchers around the world, given the possibility of investigating key features of consumer behavior for this product. This dissertation estimates the demand equation for ethanol in Brazil in order to investigate the existence of inertia in the consumption decision, which we refer to here as habit. A two-stage econometric model was fitted with monthly data from ANP, using instrumental variables to control for price endogeneity, in order to search for empirical evidence of the influence of inertia on the consumption decision. States were classified in terms of the ethanol-gasoline price parity (close to or far from the 70% threshold) and in terms of income (rich or poor). The analysis was split into two periods to capture the effect of the entry of the flex-fuel fleet into the Brazilian economy. Finally, scenarios based on moving averages (for computing fuel price parities) were constructed to investigate the influence of habit on the consumption decision. We conclude that there are significant differences in own- and cross-price elasticities between the two periods studied and across price-parity ranges and income classes. Evidence of the influence of inertia on the consumption decision was found only for states classified as rich, since differences in the magnitudes of price elasticities across parity ranges appeared only under the criteria with greater stability over time, suggesting that consumers in these states may adopt a less advantageous option (in terms of fuel efficiency) as a consequence of inertia in the consumption decision. For states classified as poor, no such evidence was found.
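A minimal two-stage least squares sketch of the kind of demand equation with habit described above, with consumption regressed on its own (endogenous) price, the gasoline price, and lagged consumption, using a cost-side instrument. The simulated data, instrument, and variable names are assumptions for the example, not the ANP dataset or the dissertation's exact specification.

```python
# Minimal sketch: 2SLS estimation of an ethanol demand equation with habit,
#   q_t = a + b*p_eth_t + c*p_gas_t + d*q_{t-1} + e_t,
# instrumenting the endogenous ethanol price with a cost-side shifter.
import numpy as np

rng = np.random.default_rng(5)
T = 300
cost = rng.normal(size=T)                         # instrument: producer cost shifter
demand_shock = rng.normal(size=T)
p_gas = 0.2 * rng.normal(size=T)
p_eth = 0.6 * cost + 0.4 * demand_shock + 0.1 * rng.normal(size=T)  # endogenous price
q = np.zeros(T)
for t in range(1, T):                             # habit: lagged consumption matters
    q[t] = 0.5 * q[t - 1] - 1.2 * p_eth[t] + 0.8 * p_gas[t] + demand_shock[t]

y = q[1:]
X = np.column_stack([np.ones(T - 1), p_eth[1:], p_gas[1:], q[:-1]])   # regressors
Z = np.column_stack([np.ones(T - 1), cost[1:], p_gas[1:], q[:-1]])    # instruments

# 2SLS: beta = (X' Pz X)^{-1} X' Pz y, with Pz the projection onto Z
PzX = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
beta = np.linalg.solve(PzX.T @ X, PzX.T @ y)
print(beta)    # intercept, own-price, cross-price, habit coefficients
```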
Abstract:
In an economy whose primitives are exactly those in Mirrlees (1971), we investigate the efficiency of labor income tax schedules derived under the equal sacrifice principle. Starting from a given government revenue level, we use Werning's (2007b) approach to assess whether there is an alternative tax schedule to the one derived under the equal sacrifice principle that raises more revenue while leaving no one with less utility. For our preferred parametrizations of the problem we find that inefficiency arises only at very high levels of income. We also show how the multipliers of the Pareto problem may be extracted from the data and used to find the implicit marginal social weights associated with each level of income.
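To fix ideas, the sketch below computes an equal (absolute) sacrifice tax schedule for CRRA utility: for each income y, the tax T(y) solves u(y) - u(y - T(y)) = s for a common sacrifice level s. The utility specification, sacrifice level, and income grid are assumptions for the example; the efficiency assessment in the spirit of Werning (2007b) is not implemented.

```python
# Minimal sketch: equal absolute sacrifice tax schedule under CRRA utility
#   u(c) = c^(1-sigma) / (1-sigma),  T(y) solves u(y) - u(y - T(y)) = s.
import numpy as np
from scipy.optimize import brentq

sigma = 2.0          # curvature of utility (assumed)
s = 0.02             # common sacrifice level in utils (assumed)

def u(c):
    return c ** (1.0 - sigma) / (1.0 - sigma)

def equal_sacrifice_tax(y):
    # Find T in (0, y) such that the utility loss from paying T equals s.
    return brentq(lambda T: u(y) - u(y - T) - s, 1e-12, y - 1e-12)

incomes = np.linspace(1.0, 20.0, 8)
taxes = np.array([equal_sacrifice_tax(y) for y in incomes])
print(np.column_stack([incomes, taxes, taxes / incomes]))  # income, tax, average rate
```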
Abstract:
This work compares the forecasting efficiency of different methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over horizons of up to twelve months ahead. The disaggregated models were estimated by SARIMA at different levels of disaggregation. The aggregated models were estimated by time-series techniques such as SARIMA, state-space structural models, and Markov-switching models. Forecast accuracy is compared using the Model Confidence Set selection procedure and the Diebold-Mariano test. We find evidence of forecast accuracy gains in models using more disaggregated data.
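As an illustration of the forecast-comparison step, the sketch below implements a Diebold-Mariano test of equal predictive accuracy under squared-error loss, with a Newey-West (Bartlett) long-run variance for multi-step horizons. The simulated forecast errors and the horizon are assumptions for the example, and the Model Confidence Set procedure is not implemented.

```python
# Minimal sketch: Diebold-Mariano test of equal predictive accuracy under
# squared-error loss, with a Newey-West (Bartlett) long-run variance.
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1):
    """DM statistic and two-sided p-value; e1, e2 are forecast errors."""
    d = e1 ** 2 - e2 ** 2                       # loss differential
    T = len(d)
    dbar = d.mean()
    dc = d - dbar
    lrv = dc @ dc / T                           # lag-0 autocovariance
    for lag in range(1, h):                     # Bartlett-weighted autocovariances
        w = 1.0 - lag / h
        lrv += 2.0 * w * (dc[lag:] @ dc[:-lag]) / T
    dm = dbar / np.sqrt(lrv / T)
    return dm, 2.0 * (1.0 - stats.norm.cdf(abs(dm)))

rng = np.random.default_rng(6)
T = 120                                          # e.g., ten years of monthly data
e_agg = rng.normal(scale=1.0, size=T)            # errors of an aggregated model
e_disagg = rng.normal(scale=0.8, size=T)         # errors of a disaggregated model
print(diebold_mariano(e_agg, e_disagg, h=12))    # positive DM favors the second model
```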
Abstract:
Pairs trading is an old and well-known technique among traders. In this paper, we discuss an important element not commonly debated in Brazil: cointegration between the pairs, which guarantees the stability of the spread. We run the Dickey-Fuller test to check for cointegration and then compare the results with those of non-cointegrated pairs. We find that the Sharpe ratio of cointegrated pairs is greater than that of non-cointegrated ones. We also use the Ornstein-Uhlenbeck equation to calculate the half-life of the pairs; again, this improves their performance. Finally, we apply the leverage suggested by the Kelly formula, once again improving the results.
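A minimal sketch of the half-life calculation mentioned above: the spread is treated as a discretized Ornstein-Uhlenbeck (AR(1)) process, the mean-reversion speed is estimated by regressing the change in the spread on its lagged level, and the half-life follows from the implied autoregressive coefficient. The simulated spread and parameter values are assumptions for the example.

```python
# Minimal sketch: half-life of mean reversion for a pairs-trading spread,
# treating the spread as an AR(1)/discretized Ornstein-Uhlenbeck process:
#   ds_t = a + b * s_{t-1} + eps_t,  half-life = -ln(2) / ln(1 + b).
import numpy as np

rng = np.random.default_rng(7)
T, kappa = 1000, 0.05                      # mean-reversion speed per period (assumed)
s = np.zeros(T)
for t in range(1, T):                      # simulate a mean-reverting spread
    s[t] = s[t - 1] - kappa * s[t - 1] + 0.1 * rng.normal()

ds = np.diff(s)
X = np.column_stack([np.ones(T - 1), s[:-1]])
a, b = np.linalg.lstsq(X, ds, rcond=None)[0]
half_life = -np.log(2.0) / np.log(1.0 + b)
# true value is -ln(2)/ln(1 - kappa), about 13.5 periods here
print(f"estimated b = {b:.4f}, half-life = {half_life:.1f} periods")
```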
Abstract:
We construct a model in which a first mover decides on its location before it knows the identity of the second mover; joint location results in a negative externality. Contracts are inherently incomplete since the first mover's initial decision cannot be specified. We analyze several kinds of rights, including damages, injunctions, and rights to exclude (arising from covenants or land ownership). There are cases in which allocating any of these basic rights to the first mover (i.e., first-party rights) is dominated by second-party rights, and cases in which the reverse is true. A Coasian result (efficiency regardless of the rights allocation) holds only under a limited set of conditions. As corollaries of a theorem ranking the basic rights regimes, a number of results emerge that contradict conventional wisdom, including the relative inefficiency of concentrated land ownership and the relevance of the generator's identity. We conclude with a mechanism and a new rights regime that each yield the first best in all cases.
Abstract:
We evaluate the forecasting performance of a number of systems models of US short- and long-term interest rates. Non-linearities, including asymmetries in the adjustment to equilibrium, are shown to result in more accurate short-horizon forecasts. We find that both long and short rates respond to disequilibria in the spread in certain circumstances, which would not be evident from linear representations or from single-equation analyses of the short-term interest rate.
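A minimal sketch of the kind of asymmetric adjustment examined here: the change in the short rate is regressed on the positive and negative parts of the lagged spread separately, so the speed of adjustment is allowed to differ with the sign of the disequilibrium. The simulated rates and the zero threshold are assumptions for the example, not the paper's estimated system.

```python
# Minimal sketch: asymmetric error correction of the short rate toward the
# long rate, with separate adjustment speeds for positive and negative
# deviations of the lagged spread (long - short) from zero.
import numpy as np

rng = np.random.default_rng(8)
T = 500
long_rate = np.cumsum(0.05 * rng.normal(size=T)) + 5.0     # random-walk long rate
short = np.zeros(T); short[0] = 4.5
for t in range(1, T):                                      # faster closing of positive gaps
    gap = long_rate[t - 1] - short[t - 1]
    short[t] = short[t - 1] + (0.3 if gap > 0 else 0.1) * gap + 0.05 * rng.normal()

spread = long_rate - short
d_short = np.diff(short)
pos = np.maximum(spread[:-1], 0.0)                         # positive disequilibria
neg = np.minimum(spread[:-1], 0.0)                         # negative disequilibria
X = np.column_stack([np.ones(T - 1), pos, neg])
coef = np.linalg.lstsq(X, d_short, rcond=None)[0]
print("adjustment to positive / negative spread:", coef[1], coef[2])
```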