Abstract:
We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions, a condition not satisfied by standard Wald-type methods based on standard errors, and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
Abstract:
We study the assignment of indivisible objects with quotas (houses, jobs, or offices) to a set of agents (students, job applicants, or professors). Each agent receives at most one object and monetary compensations are not possible. We characterize efficient priority rules by efficiency, strategy-proofness, and reallocation-consistency. Such a rule respects an acyclical priority structure and the allocations can be determined using the deferred acceptance algorithm.
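As background, here is a minimal Python sketch of the agent-proposing deferred acceptance algorithm for objects with quotas and priority orderings; the data layout and names are illustrative assumptions, not taken from the paper:

```python
# Minimal sketch of agent-proposing deferred acceptance with quotas.
# Agents propose to objects in preference order; each object tentatively
# keeps its highest-priority proposers, up to its quota.

def deferred_acceptance(prefs, priority, quota):
    """prefs[a]   : list of objects, best first, for agent a
       priority[o]: list of agents, highest priority first, for object o
       quota[o]   : capacity of object o
       returns    : dict mapping each matched agent to an object"""
    rank = {o: {a: i for i, a in enumerate(pr)} for o, pr in priority.items()}
    next_choice = {a: 0 for a in prefs}          # next object to propose to
    held = {o: [] for o in quota}                # tentatively accepted agents
    free = [a for a in prefs if prefs[a]]        # agents still proposing
    while free:
        a = free.pop()
        if next_choice[a] >= len(prefs[a]):
            continue                             # a has exhausted its list
        o = prefs[a][next_choice[a]]
        next_choice[a] += 1
        held[o].append(o == o and a)             # propose to o
        held[o].sort(key=lambda x: rank[o][x])   # best priority first
        if len(held[o]) > quota[o]:
            free.append(held[o].pop())           # reject lowest priority
    return {a: o for o, agents in held.items() for a in agents}

# Tiny usage example (hypothetical data):
prefs = {1: ['x', 'y'], 2: ['x'], 3: ['x', 'y']}
priority = {'x': [2, 1, 3], 'y': [3, 1, 2]}
quota = {'x': 1, 'y': 2}
print(deferred_acceptance(prefs, priority, quota))  # {2: 'x', 3: 'y', 1: 'y'}
```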
Abstract:
We study a simple model of assigning indivisible objects (e.g., houses, jobs, or offices) to agents. Each agent receives at most one object and monetary compensations are not possible. We completely describe all rules satisfying efficiency and resource-monotonicity. The characterized rules assign the objects in a sequence of steps such that at each step there is either a dictator or two agents who “trade” objects from their hierarchically specified “endowments.”
Abstract:
We consider entry-level medical markets for physicians in the United Kingdom. These markets experienced failures that led to the adoption of centralized market mechanisms in the 1960s. However, different regions introduced different centralized mechanisms. We advise physicians who do not have detailed information about the rank-order lists submitted by the other participants. We demonstrate that, in each of these markets, in a low-information environment it is not beneficial to reverse the true ranking of any two acceptable hospital positions. We further show that (i) in the Edinburgh 1967 market, ranking unacceptable matches as acceptable is not profitable for any participant and (ii) in any other British entry-level medical market, it is possible that only strategies which rank unacceptable positions as acceptable are optimal for a physician.
Abstract:
Social exclusion manifests itself in the lack of an individual’s access to functionings as compared to other members of society. Thus, the concept is closely related to deprivation. We view deprivation as having two basic determinants: the lack of identification with other members of society and the aggregate alienation experienced by an agent with respect to those with fewer functioning failures. We use an axiomatic approach to characterize classes of deprivation and exclusion measures and apply some of them to EU data for the period from 1994 to 2000.
Abstract:
This note reexamines the single-profile approach to social-choice theory. If an alternative is interpreted as a social state of affairs or a history of the world, it can be argued that a multi-profile approach is inappropriate because the information profile is determined by the set of alternatives. However, single-profile approaches are criticized because of the limitations they impose on the possibility of formulating properties such as anonymity. We suggest an alternative definition of anonymity that applies in a single-profile setting and characterize anonymous single-profile welfarism under a richness assumption.
Abstract:
This paper, which is to be published as a chapter in the Oxford Handbook of Political Economy, provides an introduction to social-choice theory with interpersonal comparisons of well-being. We argue that the most promising route of escape from the negative conclusion of Arrow’s theorem is to use a richer informational environment than ordinal measurability and the absence of interpersonal comparability of well-being. We discuss welfarist social evaluation (which requires that the levels of individual well-being in two alternatives are the only determinants of their social ranking) and present characterizations of some important social-evaluation orderings.
Abstract:
We reconsider the following cost-sharing problem: agent $i = 1, \ldots, n$ demands a quantity $x_i$ of good $i$; the corresponding total cost $C(x_1, \ldots, x_n)$ must be shared among the $n$ agents. The Aumann-Shapley prices $(p_1, \ldots, p_n)$ are given by the Shapley value of the game where each unit of each good is regarded as a distinct player. The Aumann-Shapley cost-sharing method assigns the cost share $p_i x_i$ to agent $i$. When goods come in indivisible units, we show that this method is characterized by the two standard axioms of Additivity and Dummy, and the property of No Merging or Splitting: agents never find it profitable to split or merge their demands.
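To make the unit-player construction concrete, the following brute-force Python sketch computes discrete Aumann-Shapley shares by enumerating orderings of unit-players. It is exponential in the total number of units and meant only as an illustration; the function name and example cost function are ours, not from the paper:

```python
# Brute-force discrete Aumann-Shapley: treat every unit of every good
# as a separate player and compute Shapley values of the cost game.
from itertools import permutations

def aumann_shapley(demands, cost):
    """demands: tuple of integer demands (x_1, ..., x_n)
       cost   : function taking a tuple of quantities
       returns: list of per-agent cost shares p_i * x_i"""
    units = [(i, k) for i, x in enumerate(demands) for k in range(x)]
    shares = [0.0] * len(demands)
    perms = list(permutations(units))
    for order in perms:
        counts = [0] * len(demands)
        prev = cost(tuple(counts))
        for (i, _) in order:
            counts[i] += 1
            cur = cost(tuple(counts))
            shares[i] += (cur - prev) / len(perms)  # marginal contribution
            prev = cur
    return shares

# Example: demands (1, 2), cost = square of total quantity.
# Total cost C(1, 2) = 9 splits into shares 3 and 6.
print(aumann_shapley((1, 2), lambda q: (q[0] + q[1]) ** 2))  # [3.0, 6.0]
```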
Abstract:
This paper revisits Diamond’s classical impossibility result regarding the ordering of infinite utility streams. We show that if no representability condition is imposed, there do exist strongly Paretian and finitely anonymous orderings of intertemporal utility streams with attractive additional properties. We extend a possibility theorem due to Svensson to a characterization theorem and we provide characterizations of all strongly Paretian and finitely anonymous rankings satisfying the strict transfer principle. In addition, infinite horizon extensions of leximin and of utilitarianism are characterized by adding an equity preference axiom and finite translation-scale measurability, respectively, to strong Pareto and finite anonymity.
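For reference, the finite-dimensional leximin ordering that the infinite-horizon extension generalizes can be sketched in a few lines of Python (the function name is ours; vectors are assumed to have equal length):

```python
# Finite leximin: compare utility vectors by their sorted (worst-first)
# components, lexicographically. Returns -1, 0, or 1.

def leximin_compare(u, v):
    su, sv = sorted(u), sorted(v)       # worst-off positions first
    for a, b in zip(su, sv):
        if a != b:
            return -1 if a < b else 1   # first difference decides
    return 0                            # equal up to a permutation

print(leximin_compare([1, 5, 3], [2, 1, 9]))  # 1: (1,3,5) beats (1,2,9)
```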
Abstract:
We provide an axiomatization of Yitzhaki’s index of individual deprivation. Our result differs from an earlier characterization due to Ebert and Moyes in the way the reference group of an individual is represented in the model. Ebert and Moyes require the index to be defined for all logically possible reference groups, whereas we employ the standard definition of the reference group as the set of all agents in a society. As a consequence of this modification, some of the axioms used by Ebert and Moyes can no longer be applied and we provide alternative formulations.
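As a point of reference, Yitzhaki's individual deprivation index is, up to normalization, the average income gap between an individual and everyone richer in the reference group. A minimal Python sketch, using our names and one common normalization convention:

```python
# Yitzhaki-style individual deprivation: the (normalized) sum of income
# gaps between individual i and everyone richer than i in the society.

def yitzhaki_deprivation(incomes, i):
    y = incomes[i]
    return sum(z - y for z in incomes if z > y) / len(incomes)

incomes = [10, 20, 40]
print([yitzhaki_deprivation(incomes, i) for i in range(3)])
# [13.33..., 6.66..., 0.0]: the poorest is the most deprived.
```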
Abstract:
This paper proposes a definition of relative uncertainty aversion for decision models under complete uncertainty. It is shown that, for a large class of decision rules characterized by a set of plausible axioms, the new criterion yields a complete ranking of those rules with respect to the relative degree of uncertainty aversion they represent. In addition, we address a combinatorial question that arises in this context, and we examine conditions for the additive representability of our rules.
Abstract:
A group of agents participate in a cooperative enterprise producing a single good. Each participant contributes a particular type of input; output is nondecreasing in these contributions. How should it be shared? We analyze the implications of the axiom of Group Monotonicity: if a group of agents simultaneously decrease their input contributions, not all of them should receive a higher share of output. We show that in combination with other more familiar axioms, this condition pins down a very small class of methods, which we dub nearly serial.
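For intuition about the "serial" benchmark, here is a sketch of the classic Moulin-Shenker serial cost-sharing formula for a single homogeneous good. The paper's setting (heterogeneous input types and output sharing) is more general, so this is background, not the rule characterized above:

```python
# Classic serial cost sharing for one good: with demands sorted in
# increasing order, each incremental cost up to the "capped" aggregate
# level is split equally among the agents still demanding more.

def serial_shares(demands, cost):
    n = len(demands)
    order = sorted(range(n), key=lambda i: demands[i])  # smallest first
    q = [demands[i] for i in order]
    shares = [0.0] * n
    cum, prev_level = 0.0, 0.0
    for k in range(n):
        cum += q[k]
        level = cum + (n - k - 1) * q[k]        # aggregate demand capped at q[k]
        delta = cost(level) - cost(prev_level)  # incremental cost at step k
        for j in range(k, n):                   # split among remaining agents
            shares[order[j]] += delta / (n - k)
        prev_level = level
    return shares

# Example: demands 1 and 3, cost = square of total; shares sum to C(4) = 16.
print(serial_shares([1, 3], lambda t: t ** 2))  # [2.0, 14.0]
```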
Abstract:
Intertemporal social-evaluation rules provide us with social criteria that can be used to assess the relative desirability of utility distributions across generations. The trade-offs between the well-being of different generations implicit in each such rule reflect the underlying ethical position on issues of intergenerational equity or justice. We employ an axiomatic approach in order to identify ethically attractive social-evaluation procedures. In particular, we explore the possibilities of using welfare information and non-welfare information in a model of intertemporal social evaluation. We focus on the individuals’ birth dates and lengths of life as the relevant non-welfare information. As usual, welfare information is given by lifetime utilities. It is assumed that this information is available for each alternative to be ranked. Various weakenings of the Pareto principle are employed in order to allow birth dates or lengths of life (or both) to matter in social evaluation. In addition, we impose standard properties such as continuity and anonymity and we examine the consequences of an intertemporal independence property. For each of the Pareto conditions employed, we characterize all social-evaluation rules satisfying it and our other axioms. The resulting rules are birth-date dependent or lifetime-dependent versions of generalized utilitarianism. Furthermore, we discuss the ethical and axiomatic foundations of geometric discounting in the context of our model.
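To fix ideas, one illustrative member of this family (our notation, not necessarily the paper's exact formulation) weights transformed lifetime utilities geometrically by birth dates:

```latex
% Illustrative birth-date dependent generalized utilitarian rule:
% u_i(x) is i's lifetime utility and b_i(x) is i's birth date in
% alternative x; g is increasing and continuous.
\[
  W(x) = \sum_{i} \beta^{\,b_i(x)}\, g\bigl(u_i(x)\bigr), \qquad \beta \in (0, 1].
\]
% With beta = 1 this reduces to standard generalized utilitarianism;
% with beta < 1 it embodies geometric discounting by birth date.
```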
Abstract:
This article illustrates the applicability of resampling methods in the context of multiple (simultaneous) tests for various econometric problems. Joint hypotheses are a routine consequence of economic theory, so controlling the probability of rejecting combinations of tests is a problem frequently encountered in econometric and statistical settings. In this respect, it is well known that ignoring the joint character of multiple hypotheses can cause the level of the overall procedure to exceed the desired level considerably. While most multiple-inference methods are conservative in the presence of non-independent statistics, the tests we propose aim to control the significance level exactly. To this end, we consider combined test criteria originally proposed for independent statistics. Applying the Monte Carlo test method, we show how these test-combination methods can be applied to such cases without resorting to asymptotic approximations. After reviewing earlier results on this subject, we show how this methodology can be used to construct normality tests based on several moments for the errors of linear regression models. For this problem, we propose a finite-sample valid generalization of the asymptotic test of Kiefer and Salmon (1983), as well as combined tests following the methods of Tippett and of Pearson-Fisher. We find empirically that the test procedures corrected by the Monte Carlo test method do not suffer from the bias (or under-rejection) problem often reported in this literature, notably against platykurtic distributions, and yield substantial power gains relative to the usual combined methods.
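A rough Python sketch of the kind of Monte Carlo combined test described above, under assumptions chosen purely for illustration: two hypothetical moment-based normality statistics, Tippett's minimum-p combination, and calibration of the combined statistic itself by simulation under a N(0,1) null (all names are ours, not the paper's):

```python
# Sketch of a Monte Carlo combined test: combine several (possibly
# dependent) statistics via Tippett's minimum p-value, then calibrate
# the combination by simulation under the null, so no independence
# assumption or asymptotic approximation is needed.
import numpy as np

rng = np.random.default_rng(0)

def stats(sample):
    """Two illustrative normality-type statistics: standardized
       third- and fourth-moment deviations (hypothetical choices)."""
    z = (sample - sample.mean()) / sample.std()
    return np.array([abs((z ** 3).mean()), abs((z ** 4).mean() - 3.0)])

def mc_combined_test(sample, n_rep=999):
    n = len(sample)
    observed = stats(sample)
    # Null distribution of each statistic, simulated under N(0,1) errors.
    null = np.array([stats(rng.standard_normal(n)) for _ in range(n_rep)])
    # Marginal Monte Carlo p-values, then Tippett combination: min p.
    p_obs = (1 + (null >= observed).sum(axis=0)) / (n_rep + 1)
    t_obs = p_obs.min()
    # Calibrate min-p itself: p-value of each null draw against the rest.
    p_null = (1 + (null[None, :, :] >= null[:, None, :]).sum(axis=1)) / (n_rep + 1)
    t_null = p_null.min(axis=1)
    return (1 + (t_null <= t_obs).sum()) / (n_rep + 1)  # combined p-value

print(mc_combined_test(rng.standard_normal(50)))
```

Because the second stage simulates the null distribution of the minimum p-value itself, any dependence between the component statistics is automatically reflected in the combined p-value.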
Abstract:
In practice we often face the problem of assigning indivisible objects (e.g., schools, housing, jobs, offices) to agents (e.g., students, homeless, workers, professors) when monetary compensations are not possible. We show that a rule that satisfies consistency, strategy-proofness, and efficiency must be an efficient generalized priority rule; i.e., it must adapt to an acyclic priority structure, except possibly for up to three agents in each object's priority ordering.