148 results for Armington Assumption
Abstract:
This paper is concerned with the realism of mechanisms that implement social choice functions in the traditional sense. Will agents actually play the equilibrium assumed by the analysis? As an example, we study the convergence and stability properties of Sjöström's (1994) mechanism, on the assumption that boundedly rational players find their way to equilibrium using monotonic learning dynamics and also with fictitious play. This mechanism implements most social choice functions in economic environments using as a solution concept the iterated elimination of weakly dominated strategies (only one round of deletion of weakly dominated strategies is needed). There are, however, many sets of Nash equilibria whose payoffs may be very different from those desired by the social choice function. With monotonic dynamics we show that many equilibria in all the sets of equilibria we describe are the limit points of trajectories that have completely mixed initial conditions. The initial conditions that lead to these equilibria need not be very close to the limiting point. Furthermore, even if the dynamics converge to the "right" set of equilibria, they can still converge to quite a poor outcome in welfare terms. With fictitious play, if the agents have completely mixed prior beliefs, beliefs and play converge to the outcome the planner wants to implement.
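The abstract's learning dynamics can be illustrated with a generic fictitious-play loop: each player best-responds to the empirical frequency of the opponent's past actions. This is a minimal sketch for an arbitrary 2x2 game with hypothetical payoffs; it is not Sjöström's (1994) mechanism, whose game form is not given in the abstract.

```python
# Fictitious play in a 2x2 bimatrix game (hypothetical payoffs, illustrative only).
import numpy as np

A = np.array([[3.0, 0.0], [2.0, 2.0]])   # row player's payoffs (hypothetical)
B = np.array([[3.0, 2.0], [0.0, 2.0]])   # column player's payoffs (hypothetical)

# "Completely mixed prior beliefs": initial fictitious counts are strictly positive.
counts_row_about_col = np.array([1.0, 1.0])
counts_col_about_row = np.array([1.0, 1.0])

for t in range(10_000):
    belief_col = counts_row_about_col / counts_row_about_col.sum()
    belief_row = counts_col_about_row / counts_col_about_row.sum()
    a_row = int(np.argmax(A @ belief_col))   # best reply to the empirical frequencies
    a_col = int(np.argmax(belief_row @ B))
    counts_row_about_col[a_col] += 1.0       # beliefs updated with observed play
    counts_col_about_row[a_row] += 1.0

belief_col = counts_row_about_col / counts_row_about_col.sum()
belief_row = counts_col_about_row / counts_col_about_row.sum()
print("limiting empirical frequencies:", belief_row.round(3), belief_col.round(3))
```

With these hypothetical payoffs, play locks into the (2, 2) equilibrium even though a (3, 3) equilibrium exists, which gives a flavor of how a learning process can settle on a welfare-poor limit point.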
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: a) those that use weights that involve area-specific estimates of bias and variance; and b) those that use weights that involve a common variance and a common squared bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
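The composite estimator described above is just a weighted combination of the direct and indirect estimators; a minimal sketch, assuming the direct estimator's MSE is approximated by its variance and the indirect estimator's MSE by its squared bias (the function name and numbers are hypothetical):

```python
# Composite small-area estimator: weight the direct and indirect estimators by the
# (estimated) MSE of the other one. Variant (a) in the abstract uses area-specific
# variance and squared-bias estimates, as below; variant (b) would use common values.
import numpy as np

def composite(direct, indirect, var_direct, sq_bias_indirect):
    direct = np.asarray(direct, float)
    indirect = np.asarray(indirect, float)
    mse_direct = np.asarray(var_direct, float)          # direct: roughly unbiased, noisy
    mse_indirect = np.asarray(sq_bias_indirect, float)  # indirect: stable but biased
    w = mse_indirect / (mse_indirect + mse_direct)      # weight on the direct estimator
    return w * direct + (1.0 - w) * indirect

# Hypothetical values for three small areas:
print(composite(direct=[10.2, 7.9, 12.4],
                indirect=[9.5, 8.3, 11.0],
                var_direct=[1.5, 0.8, 2.0],
                sq_bias_indirect=[0.5, 0.5, 0.5]))
```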
Abstract:
One of the principal aims of the Working Families' Tax Credit in the UK was to increase the participation of single mothers. The literature to date concludes there was approximately a five-percentage-point increase in employment of single mothers. The differences-in-differences methodology that is typically used compares single mothers with single women without children. However, the characteristics of these groups are very different, and changes over time in relative covariates are likely to violate the identifying assumption. We find that when we control for differential trends between women with and without children, the employment effect of the policy falls significantly. Moreover, the effect is driven solely by those working full-time (30 hours or more), with no effect on inducing people into the labor market from inactivity. Looking closely at important covariates over time, we can see sizeable changes in the relative returns to employment between the treatment and control groups.
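The point about differential trends can be made concrete with a generic difference-in-differences specification augmented with a group-specific trend (the notation is illustrative, not taken from the paper):

y_{it} = \alpha + \beta\,\mathrm{Kids}_i + \gamma\,\mathrm{Post}_t + \delta\,(\mathrm{Kids}_i \times \mathrm{Post}_t) + \lambda\,(\mathrm{Kids}_i \times t) + X_{it}'\theta + \varepsilon_{it},

where y_{it} is employment, Kids_i marks single mothers (treatment) against single women without children (control), and \delta is the policy effect. If the two groups trend differently and the term \lambda\,(\mathrm{Kids}_i \times t) is omitted, that divergence is loaded onto \delta, which is the mechanism behind the smaller employment effect found once differential trends are controlled for.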
Abstract:
We study a general equilibrium model in which entrepreneurs finance investment with optimal financial contracts. Because of enforceability problems, contracts are constrained efficient. We show that limited enforceability amplifies the impact of technological innovations on aggregate output. More generally, we show that lower enforceability of contracts will be associated with greater aggregate volatility. A key assumption for this result is that defaulting entrepreneurs are not excluded from the market.
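A generic way to write the limited-enforceability friction (illustrative notation, not necessarily the paper's): a contract is sustainable only if, at every date and state, the entrepreneur prefers honoring it to defaulting,

V^{\mathrm{repay}}_t \;\ge\; V^{\mathrm{default}}_t,

and because defaulting entrepreneurs are not excluded from the market, V^{\mathrm{default}}_t includes the value of signing a new contract after default, which tightens the constraint and limits the insurance the contract can provide.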
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data-generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
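The paper's own exact test is not spelled out in the abstract; the snippet below only runs the competing test it discusses, the Spearman rank correlation test, on hypothetical data with known bounded support (both variables on [0, 1]):

```python
# Spearman rank correlation on simulated, bounded, negatively related data
# (illustrative; this is the competing test discussed in the abstract, not the
# exact test proposed by the paper).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=50)
y = np.clip(1.0 - x + rng.normal(0.0, 0.2, size=50), 0.0, 1.0)

rho, p_value = spearmanr(x, y)
print(f"Spearman rho = {rho:.3f}, p-value = {p_value:.3f}")
```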
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose to either accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and behavior approaches an equilibrium where one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
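"Incentive compatible" is used here in the usual screening sense; in generic notation (not the paper's), a menu offering a contract c_\theta to each type \theta must satisfy, for every pair of types \theta, \theta' and reservation payoff \underline{u},

u(c_\theta; \theta) \;\ge\; u(c_{\theta'}; \theta) \quad \text{(incentive compatibility)}, \qquad u(c_\theta; \theta) \;\ge\; \underline{u} \quad \text{(participation)},

so that each agent weakly prefers the contract intended for its type to the other contract in the menu and to rejecting the menu altogether.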
Abstract:
This paper analyzes the problem of abnormally low tenders in the procurement process. Limited liability causes firms in a bad financial situation to bid more aggressively than good firms in the procurement auction. Therefore, it is more likely that the winning firm is a firm in financial difficulties with a high risk of bankruptcy. The paper analyzes the different regulatory practices to face this problem, with a special emphasis on surety bonds used, e.g., in the US. We characterize the optimal surety bond and show that it does not coincide with the current US regulation. In particular, we show that under a natural assumption the US regulation is too expensive and provides overinsurance to the problem of abnormally low tenders.
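A minimal way to see why limited liability favors aggressive bids (generic notation, not the paper's model): let a winning firm with liquid assets A receive price b and face an uncertain completion cost c. With losses capped at its assets, its payoff from winning is

\max\{\, b - c,\; -A \,\},

whose expectation is weakly decreasing in A. Firms with little to lose therefore value winning more, can profitably submit lower (abnormally low) bids, and carry a higher risk of bankruptcy, which is the selection problem the surety bond is meant to address.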
Abstract:
Applying the competing-risks model to multi-cause mortality, this paper provides a theoretical and empirical investigation of the positive complementarities that occur between disease-specific policy interventions. We argue that since an individual cannot die twice, competing risks imply that individuals will not waste resources on causes that are not the most immediate, but will make health investments so as to equalize cause-specific mortality. However, equal mortality risk from a variety of diseases does not imply that disease-specific public health interventions are a waste. Rather, a cause-specific intervention produces spillovers to other disease risks, so that the overall reduction in mortality will generally be larger than the direct effect measured on the targeted disease. The assumption that mortality from non-targeted diseases remains the same after a cause-specific intervention underestimates the true effect of such programs, since the background mortality is also altered as a result of intervention. Analyzing data from one of the most important public health programs ever introduced, the Expanded Program on Immunization (EPI) of the United Nations, we find evidence for the existence of such complementarities, involving causes that are not biomedically, but behaviorally, linked.
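In generic competing-risks notation (not necessarily the paper's), overall mortality decomposes into cause-specific hazards,

\lambda \;=\; \sum_{j} \lambda_j,

and the naive evaluation of an intervention against cause k holds the other \lambda_j fixed. The abstract's argument is that health investments respond to the intervention, so the \lambda_j for j \neq k also fall, and the total drop in \lambda exceeds the direct reduction in \lambda_k measured on the targeted disease alone.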
Abstract:
A class of composite estimators of small area quantities that exploit spatial (distance-related) similarity is derived. It is based on a distribution-free model for the areas, but the estimators are aimed to have optimal design-based properties. Composition is applied also to estimate some of the global parameters on which the small area estimators depend. It is shown that the commonly adopted assumption of random effects is not necessary for exploiting the similarity of the districts (borrowing strength across the districts). The methods are applied in the estimation of the mean household sizes and the proportions of single-member households in the counties (comarcas) of Catalonia. The simplest version of the estimators is more efficient than the established alternatives, even though the extent of spatial similarity is quite modest.
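One generic way to borrow strength across districts by distance (illustrative; the paper's estimator may differ) is to shrink the direct estimator for area d toward a distance-weighted average of the other areas,

\tilde{\theta}_d \;=\; (1 - b_d)\,\hat{\theta}_d \;+\; b_d \sum_{d' \neq d} w_{dd'}\,\hat{\theta}_{d'}, \qquad w_{dd'} \propto \exp\{-\kappa\, \mathrm{dist}(d, d')\}, \quad \sum_{d' \neq d} w_{dd'} = 1,

with the composition coefficient b_d chosen on design-based (MSE) grounds rather than derived from a random-effects model.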
Abstract:
We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e., star-like or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. We also show that, if an auxiliary assumption holds, the transition between these two opposite structures is sharp and they are the only ones ever to qualify as optimal.
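The congestion threshold has a simple toy analogue (this is only an illustration, not the paper's model): if every problem must pass through a single hub that resolves one problem per period, the hub's backlog stays bounded only while the total arrival rate is below its unit service capacity, and blows up just above it.

```python
# Toy congestion threshold: n peripheral nodes each generate a problem with
# probability p per period; all problems queue at a hub that solves one per period.
# The backlog stays bounded only while n * p < 1 (hypothetical parameters).
import random

def average_backlog(n_nodes, p, periods=20_000, seed=0):
    rng, queue, total = random.Random(seed), 0, 0
    for _ in range(periods):
        queue += sum(rng.random() < p for _ in range(n_nodes))  # new problems this period
        if queue > 0:
            queue -= 1                                          # hub solves one problem
        total += queue
    return total / periods

for p in (0.05, 0.09, 0.099, 0.11):   # with 10 nodes the threshold is at p = 0.1
    print(f"p = {p:.3f}: average hub backlog ~ {average_backlog(10, p):.1f}")
```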
Abstract:
Can we reconcile the predictions of the altruism model of the family with the evidence on inter vivos transfers in the US? This paper expands the altruism model by introducing effort of the child and by relaxing the assumption of perfect information of the parent about the labor market opportunities of the child. First, I solve and simulate a model of altruism under imperfect information. Second, I use cross-sectional data to test a prediction of the model: Are parental transfers especially responsive to the income variations of children who are very attached to the labor market? The results suggest that imperfect information accounts for several patterns of intergenerational transfers in the US.
Abstract:
We analyze a model of conflict with endogenous choice of effort, where subsets of the contenders may force the resolution to be sequential: first the alliance fights it out with the rest and, in case they win, they later fight it out among themselves. For three-player games, we find that it will not be in the interest of any two of them to form an alliance. We obtain this result under two different scenarios: equidistant preferences with varying relative strengths, and vicinity of preferences with equal distribution of power. We conclude that the commonly made assumption of super-additive coalitional worth is suspect.
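The abstract does not specify the conflict technology; a common way to formalize such contests, used here purely for illustration, is a ratio-form (Tullock) success function. With efforts x_A for the alliance and x_3 for the remaining contender, the alliance wins the first stage with probability

\frac{x_A}{x_A + x_3},

and, if it wins, its two members fight a second contest among themselves; whether forming the alliance pays then depends on how the expected value of that internal fight compares with standing alone, which is where the presumption of super-additive coalitional worth can fail.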
Abstract:
Can we reconcile the predictions of the altruism model of the family with the evidence on parental monetary transfers in the US? This paper provides a new assessment of this question. I expand the altruism model by introducing effort of the child and by relaxing the assumption of perfect information of the parent about the labor market opportunities of the child. First, I solve and simulate a model of altruism and labor supply under imperfect information. Second, I use cross-sectional data to test the following prediction of the model: Are parental transfers especially responsive to the income variations of children who are very attached to the labor market? The results of the analysis suggest that imperfect information accounts for many of the patterns of intergenerational transfers in the US.
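The benchmark altruism model that the paper extends can be written generically (illustrative notation): the parent chooses a transfer T to maximize U(y_p - T) + \alpha V(y_k + T) with altruism weight \alpha \in (0, 1), where y_p and y_k are parent and child incomes. With an operative (interior) transfer and full information, this benchmark carries the well-known implication that the transfer derivatives satisfy \partial T/\partial y_p - \partial T/\partial y_k = 1; the paper asks how such predictions change once the child chooses effort and the parent observes the child's labor market opportunities only imperfectly.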
Abstract:
The present paper revisits a property embedded in most dynamic macroeconomic models: the stationarity of hours worked. First, I argue that, contrary to what is often believed, there are many reasons why hours could be nonstationary in those models, while preserving the property of balanced growth. Second, I show that the postwar evidence for most industrialized economies is clearly at odds with the assumption of stationary hours per capita. Third, I examine the implications of that evidence for the role of technology as a source of economic fluctuations in the G7 countries.
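The abstract does not describe the testing procedure; as a generic illustration of how nonstationarity of hours per capita might be assessed, an augmented Dickey-Fuller test can be run on the series (the data below are simulated, not the paper's):

```python
# Unit-root check on a simulated hours-per-capita series (illustrative only).
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
hours = 100.0 + np.cumsum(rng.normal(0.0, 0.3, size=200))  # a random walk, hence nonstationary

stat, pvalue, *_ = adfuller(hours, regression="c")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")  # large p-value: cannot reject a unit root
```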
Abstract:
This paper uses a model of boundedly rational learning to account for the observations of recurrent hyperinflations in the last decade. We study a standard monetary model where the fully rational expectations assumption is replaced by a formal definition of quasi-rational learning. The model under learning is able to match remarkably well some crucial stylized facts observed during the recurrent hyperinflations experienced by several countries in the 80's. We argue that, despite being a small departure from rational expectations, quasi-rational learning does not preclude falsifiability of the model and it does not violate reasonable rationality requirements.
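The formal definition of quasi-rational learning is not reproduced in the abstract; a standard adaptive-learning recursion of the kind used in this literature (illustrative only) updates inflation beliefs toward observed inflation,

\pi^{e}_{t+1} \;=\; \pi^{e}_{t} + \gamma_t \left( \pi_t - \pi^{e}_{t} \right),

with a gain sequence \gamma_t (decreasing for least-squares learning, constant for tracking). Replacing rational expectations with such a rule in an otherwise standard monetary model is the kind of small departure the abstract refers to.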