998 results for [JEL:C13] Math
Abstract:
The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete, test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
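The basic MC test idea can be sketched in a few lines (a generic illustration, not the paper's MMC algorithm; the statistic, sample size and simulator below are placeholders introduced for the example):

import numpy as np

def mc_test_pvalue(stat_obs, simulate_stat, n_rep=99, rng=None):
    # Exact Monte Carlo p-value (Dwass/Barnard): rank the observed statistic
    # among n_rep replications simulated under the null hypothesis.
    # Uniform tie-breaking keeps the test exact for discrete statistics.
    rng = np.random.default_rng(rng)
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    larger = np.sum(sims > stat_obs)
    ties = np.sum(sims == stat_obs)
    u = rng.integers(0, ties + 1)          # uniform tie-breaking draw
    return (larger + u + 1) / (n_rep + 1)

# Placeholder illustration: squared sample mean of 20 observations, with the
# null distribution simulated directly from N(0, 1).
def sim_stat(rng, n=20):
    return np.mean(rng.standard_normal(n)) ** 2

x = np.random.default_rng(0).standard_normal(20) + 0.8   # data generated away from the null
print(mc_test_pvalue(np.mean(x) ** 2, sim_stat, n_rep=99, rng=1))

With n_rep = 99 replications, rejecting when the p-value is at most 0.05 yields a test of level exactly 5 percent under the null, whatever the (simulable) null distribution.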
Abstract:
In this paper, we propose exact inference procedures for asset pricing models that can be formulated in the framework of a multivariate linear regression (CAPM), allowing for stable error distributions. The normality assumption on the distribution of stock returns is usually rejected in empirical studies, due to excess kurtosis and asymmetry. To model such data, we propose a comprehensive statistical approach which allows for alternative, possibly asymmetric, heavy-tailed distributions without the use of large-sample approximations. The methods suggested are based on Monte Carlo test techniques. Goodness-of-fit tests are formally incorporated to ensure that the error distributions considered are empirically sustainable, from which exact confidence sets for the unknown tail area and asymmetry parameters of the stable error distribution are derived. Tests for the efficiency of the market portfolio (zero intercepts) which explicitly allow for the presence of (unknown) nuisance parameters in the stable error distribution are derived. The methods proposed are applied to monthly returns on 12 portfolios of the New York Stock Exchange over the period 1926-1995 (5-year subperiods). We find that stable, possibly skewed, distributions provide a statistically significant improvement in goodness-of-fit and lead to fewer rejections of the efficiency hypothesis.
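A hedged sketch of the kind of Monte Carlo test described here, written for a single market-model regression and one fixed pair of stable nuisance parameters (alpha, beta); the paper's maximized version would repeat this over a confidence set for (alpha, beta) and works with the full multivariate regression. The regression setup, scale estimate and statistic are illustrative assumptions, not the paper's exact specification:

import numpy as np
from scipy import stats

def intercept_mc_test(r, m, alpha, beta, n_rep=99, seed=0):
    # Simplified MC test of a zero intercept in r_t = a + b*m_t + e_t with
    # stable(alpha, beta) errors, for one fixed (alpha, beta).  Slope and a
    # crude heavy-tail-robust scale are estimated from the data; the null
    # distribution of |a_hat| is simulated under a = 0.
    rng = np.random.default_rng(seed)
    r = np.asarray(r, dtype=float)
    m = np.asarray(m, dtype=float)
    X = np.column_stack([np.ones_like(m), m])
    a_hat, b_hat = np.linalg.lstsq(X, r, rcond=None)[0]
    resid = r - X @ np.array([a_hat, b_hat])
    scale = np.median(np.abs(resid))
    stat_obs = abs(a_hat)
    sims = []
    for _ in range(n_rep):
        e = stats.levy_stable.rvs(alpha, beta, scale=scale, size=len(m),
                                  random_state=rng)
        r0 = b_hat * m + e                   # data generated under a = 0
        sims.append(abs(np.linalg.lstsq(X, r0, rcond=None)[0][0]))
    return (np.sum(np.array(sims) >= stat_obs) + 1) / (n_rep + 1)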
Abstract:
We consider the problem of testing whether the observations X1, ..., Xn of a time series are independent with unspecified (possibly nonidentical) distributions symmetric about a common known median. Various bounds on the distributions of serial correlation coefficients are proposed: exponential bounds, Eaton-type bounds, Chebyshev bounds and Berry-Esséen-Zolotarev bounds. The bounds are exact in finite samples, distribution-free and easy to compute. The performance of the bounds is evaluated and compared with traditional serial dependence tests in a simulation experiment. The procedures proposed are applied to U.S. data on interest rates (commercial paper rate).
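The exactness argument behind such procedures also has a simple simulation analog, shown below for the lag-1 serial correlation coefficient. This sign-randomization version is not the analytical bounds proposed in the paper; it only illustrates why symmetry about a known median yields finite-sample, distribution-free inference:

import numpy as np

def sign_randomization_test(x, median=0.0, n_rep=999, seed=0):
    # Under the null that X_1, ..., X_n are independent and symmetric about a
    # known median, the signs of the centered data are i.i.d. and independent
    # of their absolute values, so flipping signs at random reproduces the
    # null distribution of the lag-1 serial correlation coefficient exactly.
    rng = np.random.default_rng(seed)
    z = np.asarray(x, dtype=float) - median
    def r1(v):                     # lag-1 serial correlation coefficient
        return np.sum(v[1:] * v[:-1]) / np.sum(v ** 2)
    stat_obs = abs(r1(z))
    sims = np.array([abs(r1(z * rng.choice([-1.0, 1.0], size=z.size)))
                     for _ in range(n_rep)])
    return (np.sum(sims >= stat_obs) + 1) / (n_rep + 1)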
Abstract:
Intertemporal social-evaluation rules provide us with social criteria that can be used to assess the relative desirability of utility distributions across generations. The trade-offs between the well-being of different generations implicit in each such rule reflect the underlying ethical position on issues of intergenerational equity or justice. We employ an axiomatic approach in order to identify ethically attractive social-evaluation procedures. In particular, we explore the possibilities of using welfare information and non-welfare information in a model of intertemporal social evaluation. We focus on the individuals' birth dates and lengths of life as the relevant non-welfare information. As usual, welfare information is given by lifetime utilities. It is assumed that this information is available for each alternative to be ranked. Various weakenings of the Pareto principle are employed in order to allow birth dates or lengths of life (or both) to matter in social evaluation. In addition, we impose standard properties such as continuity and anonymity and we examine the consequences of an intertemporal independence property. For each of the Pareto conditions employed, we characterize all social-evaluation rules satisfying it and our other axioms. The resulting rules are birth-date-dependent or lifetime-dependent versions of generalized utilitarianism. Furthermore, we discuss the ethical and axiomatic foundations of geometric discounting in the context of our model.
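The functional form pointed to in the last sentences can be written out explicitly; the notation below (b_i for agent i's birth date, g for a continuous increasing transform, a_b for birth-date weights) is ours and serves only to illustrate birth-date-dependent generalized utilitarianism and geometric discounting as its special case:

W(u_1, \dots, u_n) \;=\; \sum_{i=1}^{n} a_{b_i}\, g(u_i),
\qquad \text{geometric discounting: } a_b = \delta^{\,b},\ 0 < \delta < 1.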
Abstract:
This paper illustrates the applicability of resampling methods to multiple (simultaneous) testing, for various econometric problems. Simultaneous hypotheses are a usual consequence of economic theory, so that controlling the rejection probability of combinations of tests is a problem frequently encountered in various econometric and statistical settings. In this regard, it is well known that ignoring the joint character of multiple hypotheses may cause the level of the overall procedure to exceed the desired level considerably. While most multiple-inference methods are conservative in the presence of non-independent statistics, the tests we propose aim to control the significance level exactly. To do so, we consider combined test criteria originally proposed for independent statistics. By applying the Monte Carlo test method, we show how these test-combination methods can be applied in such cases, without resorting to asymptotic approximations. After reviewing earlier results on this topic, we show how such a methodology can be used to construct normality tests based on several moments for the errors of linear regression models. For this problem, we propose a finite-sample valid generalization of the asymptotic test proposed by Kiefer and Salmon (1983), as well as combined tests following the methods of Tippett and of Pearson-Fisher. We find empirically that the test procedures corrected by the Monte Carlo test method do not suffer from the bias (or under-rejection) problem often reported in this literature, notably against platykurtic distributions, and yield appreciable power gains relative to the usual combined methods.
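A minimal sketch of the two combination criteria named above, together with the Monte Carlo correction for possibly dependent individual statistics; the individual tests and the simulator of their joint null distribution are placeholders:

import numpy as np

def tippett(pvals):
    # Tippett's criterion: evidence against the joint null is measured by the
    # smallest individual p-value (returned so that larger = stronger evidence).
    return 1.0 - np.min(pvals)

def fisher(pvals):
    # Pearson-Fisher criterion: -2 times the sum of the log p-values.
    return -2.0 * np.sum(np.log(pvals))

def mc_combined_pvalue(p_obs, simulate_pvals, combine, n_rep=99, seed=0):
    # Monte Carlo p-value for a combined criterion: simulate the whole vector
    # of individual p-values under the joint null (preserving their dependence)
    # and rank the observed combined statistic among the simulated ones.
    rng = np.random.default_rng(seed)
    stat_obs = combine(p_obs)
    sims = np.array([combine(simulate_pvals(rng)) for _ in range(n_rep)])
    return (np.sum(sims >= stat_obs) + 1) / (n_rep + 1)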
Abstract:
In practice we often face the problem of assigning indivisible objects (e.g., schools, housing, jobs, offices) to agents (e.g., students, homeless, workers, professors) when monetary compensations are not possible. We show that a rule that satisfies consistency, strategy-proofness, and efficiency must be an efficient generalized priority rule; i.e., it must adapt to an acyclic priority structure, except, possibly, for up to three agents in each object's priority ordering.
Abstract:
In this paper, we study the asymptotic distribution of a simple two-stage (Hannan-Rissanen-type) linear estimator for stationary invertible vector autoregressive moving average (VARMA) models in the echelon form representation. General conditions for consistency and asymptotic normality are given. A consistent estimator of the asymptotic covariance matrix of the estimator is also provided, so that tests and confidence intervals can easily be constructed.
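The two-stage idea is easy to sketch; the code below does it for a univariate ARMA(1,1) only (our simplification), whereas the paper treats echelon-form VARMA systems equation by equation with the same logic:

import numpy as np

def hannan_rissanen_arma11(y, p_long=10):
    # Stage 1: fit a long AR(p_long) by OLS and keep its residuals as proxies
    # for the unobserved innovations.
    y = np.asarray(y, dtype=float)
    X1 = np.column_stack([y[p_long - k - 1:len(y) - k - 1] for k in range(p_long)])
    phi_long = np.linalg.lstsq(X1, y[p_long:], rcond=None)[0]
    e = np.concatenate([np.zeros(p_long), y[p_long:] - X1 @ phi_long])
    # Stage 2: regress y_t on y_{t-1} and the lagged stage-1 residual e_{t-1}
    # by OLS; the two coefficients estimate the AR and MA parameters.
    X2 = np.column_stack([y[:-1], e[:-1]])
    phi, theta = np.linalg.lstsq(X2[p_long:], y[p_long + 1:], rcond=None)[0]
    return phi, theta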
Abstract:
The following properties of the core of a one-to-one matching problem are well-known: (i) the core is non-empty; (ii) the core is a lattice; and (iii) the set of unmatched agents is identical for any two matchings belonging to the core. The literature on two-sided matching focuses almost exclusively on the core and studies its properties extensively. Our main result is the following characterization of (von Neumann-Morgenstern) stable sets in one-to-one matching problems: a set of matchings is a stable set only if it is a maximal set satisfying the following properties: (a) the core is a subset of the set; (b) the set is a lattice; (c) the set of unmatched agents is identical for any two matchings belonging to the set. Furthermore, a set is a stable set if it is the unique maximal set satisfying properties (a), (b) and (c). We also show that our main result does not extend from one-to-one matching problems to many-to-one matching problems.
Abstract:
Statistical tests in vector autoregressive (VAR) models are typically based on large-sample approximations, involving the use of asymptotic distributions or bootstrap techniques. After documenting that such methods can be very misleading even with fairly large samples, especially when the number of lags or the number of equations is not small, we propose a general simulation-based technique that allows one to completely control the level of tests in parametric VAR models. In particular, we show that maximized Monte Carlo tests [Dufour (2002)] can provide provably exact tests for such models, whether they are stationary or integrated. Applications to order selection and causality testing are considered as special cases. The technique developed is applied to quarterly and monthly VAR models of the U.S. economy, comprising income, money, interest rates and prices, over the period 1965-1996.
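A generic sketch of the maximized MC idea used here, with a grid search standing in for the numerical optimization over nuisance parameters; the data simulator and test statistic for a particular VAR hypothesis are left as placeholders:

import numpy as np

def maximized_mc_pvalue(stat_obs, simulate_data, statistic, theta_grid,
                        n_rep=99, seed=0):
    # For each admissible value of the nuisance parameter theta, compute the
    # simulated p-value of the observed statistic, then report the maximum
    # over theta.  Rejecting when this maximized p-value is below alpha gives
    # a test whose level cannot exceed alpha, provided the set of theta values
    # searched covers (or well approximates) the true value.
    rng = np.random.default_rng(seed)
    pmax = 0.0
    for theta in theta_grid:
        sims = np.array([statistic(simulate_data(theta, rng))
                         for _ in range(n_rep)])
        pmax = max(pmax, (np.sum(sims >= stat_obs) + 1) / (n_rep + 1))
    return pmax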
Abstract:
The rationalizability of a choice function on arbitrary domains by means of a transitive relation has been analyzed thoroughly in the literature. Moreover, characterizations of various versions of consistent rationalizability have appeared in recent contributions. However, not much seems to be known when the coherence property of quasi-transitivity or that of P-acyclicity is imposed on a rationalization. The purpose of this paper is to fill this significant gap. We provide characterizations of all forms of rationalizability involving quasi-transitive or P-acyclical rationalizations on arbitrary domains.
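For concreteness, the two coherence properties in question can be checked mechanically on a finite relation; the representation below (a relation as a set of ordered pairs) is only an illustration of the definitions, not anything from the paper:

from itertools import permutations

def strict_part(R):
    # Asymmetric (strict) part P of a relation R given as a set of ordered
    # pairs: x P y iff x R y and not y R x.
    return {(x, y) for (x, y) in R if (y, x) not in R}

def quasi_transitive(R):
    # R is quasi-transitive iff its strict part P is transitive.
    P = strict_part(R)
    return all((x, z) in P for (x, y) in P for (w, z) in P if w == y)

def p_acyclic(R, universe):
    # R is P-acyclical iff its strict part has no cycles; brute-force check
    # over all sequences of distinct elements (small examples only).
    P = strict_part(R)
    for k in range(2, len(universe) + 1):
        for seq in permutations(universe, k):
            if all((seq[i], seq[(i + 1) % k]) in P for i in range(k)):
                return False
    return True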
Abstract:
Pérez-Castrillo and Wettstein (2002) propose a multi-bidding mechanism to determine a winner from a set of possible projects. The winning project is implemented and its surplus is shared among the agents. In the multi-bidding mechanism each agent announces a vector of bids, one for each possible project, that are constrained to sum up to zero. In addition, each agent chooses a favorite project, which is used as a tie-breaker if several projects receive the same highest aggregate bid. Since more desirable projects receive larger bids, it is natural to consider the multi-bidding mechanism without the announcement of favorite projects. We show that the merits of the multi-bidding mechanism appear not to be robust to this natural simplification. Specifically, a Nash equilibrium exists if and only if there are at least two individually optimal projects and all individually optimal projects are efficient.
Abstract:
We derive conditions that must be satisfied by the primitives of the problem in order for an equilibrium in linear Markov strategies to exist in some common property natural resource differential games. These conditions impose restrictions on the admissible form of the natural growth function, given a benefit function, or on the admissible form of the benefit function, given a natural growth function.
Abstract:
This paper constructs and estimates a sticky-price, Dynamic Stochastic General Equilibrium model with heterogeneous production sectors. Sectors differ in price stickiness, capital-adjustment costs and production technology, and use output from each other as material and investment inputs following an Input-Output Matrix and Capital Flow Table that represent the U.S. economy. By relaxing the standard assumption of symmetry, this model allows different sectoral dynamics in response to monetary policy shocks. The model is estimated by Simulated Method of Moments using sectoral and aggregate U.S. time series. Results indicate 1) substantial heterogeneity in price stickiness across sectors, with quantitatively larger differences between services and goods than previously found in micro studies that focus on final goods alone, 2) a strong sensitivity to monetary policy shocks on the part of construction and durable manufacturing, and 3) similar quantitative predictions at the aggregate level by the multi-sector model and a standard model that assumes symmetry across sectors.
Abstract:
The purpose of this paper is to characterize the optimal time paths of production and water usage by an agricultural and an oil sector that have to share a limited water resource. We show that for any given water stock, if the oil stock is sufficiently large, it will become optimal to have a phase during which the agricultural sector is inactive. This may mean having an initial phase during which the two sectors are active, then a phase during which the water is reserved for the oil sector and the agricultural sector is inactive, followed by a phase during which both sectors are active again. The agricultural sector will always be active in the end as the oil stock is depleted and the demand for water from the oil sector decreases. In the case where agriculture is not constrained by the given natural inflow of water once there is no more oil, we show that oil extraction will always end with a phase during which oil production follows a pure Hotelling path, with the implicit price of oil net of extraction cost growing at the rate of interest. If the natural inflow of water does constitute a constraint for agriculture, then oil production never follows a pure Hotelling path, because its full marginal cost must always reflect not only the imputed rent on the finite oil stock, but also the positive opportunity cost of water.
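The pure Hotelling path referred to above has a standard closed form; writing lambda(t) for the implicit price of oil net of extraction cost and r for the rate of interest (our notation, for illustration only):

\lambda(t) \;=\; \lambda(0)\, e^{rt},
\qquad \text{equivalently} \qquad
\dot{\lambda}(t) = r\,\lambda(t),

so that along a pure Hotelling path the rent on the finite oil stock grows exponentially at the rate of interest.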
Abstract:
This paper documents and discusses a dramatic change in the cyclical behavior of aggregate hours worked by individuals with a college degree (skilled workers) since the mid-1980s. Using the CPS outgoing rotation data set for the period 1979:1-2003:4, we find that the volatility of aggregate skilled hours relative to the volatility of GDP has nearly tripled since 1984. In contrast, the cyclical properties of unskilled hours have remained essentially unchanged. We evaluate the extent to which a simple supply/demand model for skilled and unskilled labor with capital-skill complementarity in production can help explain this stylized fact. Within this framework, we identify three effects which would lead to an increase in the relative volatility of skilled hours: (i) a reduction in the degree of capital-skill complementarity, (ii) a reduction in the absolute volatility of GDP (and unskilled hours), and (iii) an increase in the level of capital equipment relative to skilled labor. We provide empirical evidence in support of each of these effects. Our conclusion is that these three mechanisms can jointly explain about sixty percent of the observed increase in the relative volatility of skilled labor. The reduction in the degree of capital-skill complementarity contributes the most to this result.