341 results for "Taille d'échantillon" (sample size)
Abstract:
Master's thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Master's thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Doctoral thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
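A minimal sketch of the split-sample idea behind the two-sided autoregression described above, under simplifying assumptions (a plain AR(1) with simulated data and no covariates; this is illustrative, not the paper's exact procedure):

```python
# Sketch: regress the odd-indexed observations of an AR(1) series on their
# even-indexed neighbours on both sides; classical OLS t/F inference can then
# be applied to this two-sided autoregression (illustrative only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated AR(1) process y_t = rho * y_{t-1} + e_t (hypothetical data).
T, rho = 200, 0.6
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()

# Interior odd-indexed observations as dependent points, conditioned on the
# neighbouring even-indexed observations.
idx = np.arange(1, T - 1, 2)
Y = y[idx]
X = sm.add_constant(np.column_stack([y[idx - 1], y[idx + 1]]))

ols = sm.OLS(Y, X).fit()
print(ols.summary())   # standard t-tests and confidence intervals apply here
```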
Abstract:
The model studies information sharing and the stability of cooperation in cost-reducing Research Joint Ventures (RJVs). In a four-stage game-theoretic framework, firms decide on participation in an RJV, information sharing, R&D expenditures, and output. An important feature of the model is that voluntary information sharing between cooperating firms increases information leakage from the RJV to outsiders. It is found that the spillover from the RJV to outsiders determines whether insiders share information, while the spillover affecting all firms determines the level of information sharing within the RJV. RJVs representing a larger portion of firms in the industry are more likely to share information. It is also found that when sharing information is costless, firms never choose intermediate levels of information sharing: they share all the information or none at all. The size of the RJV depends on three effects: a coordination effect, an information-sharing effect, and a competition effect. Depending on the relative magnitudes of these effects, the size of the RJV may increase or decrease with spillovers. The effect of information sharing on the profitability of firms, as well as on welfare, is also studied.
Abstract:
This paper examines several families of population principles in the light of a set of axioms. In addition to the critical-level utilitarian, number-sensitive critical-level utilitarian and number-dampened families and their generalized counterparts, we consider the restricted number-dampened family (suggested by Hurka) and introduce two new families: the restricted critical-level and restricted number-dependent critical-level families. Subsets of the restricted families have nonnegative critical levels and avoid both the repugnant and sadistic conclusions but fail to satisfy an important independence condition. We defend the critical-level principles with positive critical levels.
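For reference, the value functions behind two of the families named above are usually written as follows (a standard formulation, not quoted from the paper), where $u_1,\dots,u_n$ are the lifetime utilities of an $n$-person population, $\alpha$ is the critical level, and $f$ is a positive function of population size:

\[
V_{\text{CLU}}(u_1,\dots,u_n) \;=\; \sum_{i=1}^{n} (u_i - \alpha),
\qquad
V_{\text{ND}}(u_1,\dots,u_n) \;=\; f(n)\,\frac{1}{n}\sum_{i=1}^{n} u_i .
\]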
Abstract:
In this paper, we study several tests for the equality of two unknown distributions. Two are based on empirical distribution functions, three others on nonparametric probability density estimates, and the last ones on differences between sample moments. We suggest controlling the size of such tests (under nonparametric assumptions) by using permutational versions of the tests jointly with the method of Monte Carlo tests properly adjusted to deal with discrete distributions. We also propose a combined test procedure, whose level is again perfectly controlled through the Monte Carlo test technique and has better power properties than the individual tests that are combined. Finally, in a simulation experiment, we show that the technique suggested provides perfect control of test size and that the new tests proposed can yield sizeable power improvements.
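A minimal sketch of a permutational Monte Carlo test for the equality of two distributions, here using the Kolmogorov-Smirnov statistic as the criterion (one of several statistics the paper combines; the sample sizes are illustrative and the randomized tie-breaking needed for discrete statistics is omitted):

```python
# Permutation-based Monte Carlo p-value for H0: both samples come from the
# same distribution, using the two-sample Kolmogorov-Smirnov statistic.
import numpy as np
from scipy.stats import ks_2samp

def mc_permutation_pvalue(x, y, n_rep=999, seed=0):
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n_x = len(x)
    stat_obs = ks_2samp(x, y).statistic
    stats_sim = np.empty(n_rep)
    for r in range(n_rep):
        perm = rng.permutation(pooled)
        stats_sim[r] = ks_2samp(perm[:n_x], perm[n_x:]).statistic
    # Monte Carlo p-value; ties among discrete statistics would call for the
    # randomized tie-breaking discussed in the paper (not implemented here).
    return (1 + np.sum(stats_sim >= stat_obs)) / (n_rep + 1)

x = np.random.default_rng(1).normal(size=50)
y = np.random.default_rng(2).normal(loc=0.5, size=60)
print(mc_permutation_pvalue(x, y))
```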
Abstract:
Using a sample of 560 households and ordinary least squares, this paper examines the determinants of residential water consumption in Ville Saint-Laurent (Quebec) in 1978. Our results indicate that the assessed value of the property, the number of rooms per dwelling, and household size are three variables that explain the water consumption of households living in single-family residences or in units within duplexes, triplexes, or quadruplexes. Our results are similar to those of American studies and to those obtained for Ville Saint-Léonard (Quebec) in 1977.
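A minimal sketch of the kind of OLS specification described above; the column names and the few rows of data below are invented placeholders, since the 1978 Ville Saint-Laurent sample is not reproduced here:

```python
# Hypothetical illustration: water consumption regressed on assessed property
# value, rooms per dwelling, and household size, by ordinary least squares.
import pandas as pd
import statsmodels.formula.api as smf

# df would hold the 560 household observations (values below are made up).
df = pd.DataFrame({
    "water_use": [210.0, 185.0, 330.0, 150.0, 275.0, 240.0],
    "assessed_value": [45000, 38000, 72000, 31000, 60000, 52000],
    "rooms": [6, 5, 9, 4, 8, 7],
    "household_size": [3, 2, 5, 2, 4, 4],
})

model = smf.ols("water_use ~ assessed_value + rooms + household_size",
                data=df).fit()
print(model.params)
```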
Abstract:
In this paper, we test a version of the conditional CAPM with respect to a local market portfolio, proxied by the Brazilian stock index during the 1976-1992 period. We also test a conditional APT model by using the difference between the 30-day rate (Cdb) and the overnight rate as a second factor, in addition to the market portfolio, in order to capture the large inflation risk present during this period. The conditional CAPM and APT models are estimated by the Generalized Method of Moments (GMM) and tested on a set of size portfolios created from a total of 25 securities traded on the Brazilian markets. The inclusion of this second factor proves to be crucial for the appropriate pricing of the portfolios.
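A minimal sketch of the unconditional version of the two-factor specification (market excess return plus the 30-day/overnight spread) for a single portfolio; the authors' conditional GMM estimation with instruments is not reproduced, and all series below are simulated placeholders:

```python
# Simplified two-factor time-series regression for one portfolio:
# R_it - r_ft = a_i + b_i * (R_mt - r_ft) + c_i * spread_t + e_it.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 180
mkt_excess = rng.normal(0.01, 0.08, T)      # market portfolio excess return
spread = rng.normal(0.002, 0.01, T)         # 30-day rate minus overnight rate
port_excess = 0.9 * mkt_excess + 0.5 * spread + rng.normal(0, 0.05, T)

X = sm.add_constant(np.column_stack([mkt_excess, spread]))
fit = sm.OLS(port_excess, X).fit()
print(fit.params)   # an intercept near zero is consistent with correct pricing
```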
Abstract:
We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allow a gain in degrees of freedom, and on a sample-split technique. For inference about general, possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.
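A minimal sketch of the Anderson-Rubin-type instrument substitution idea: to test H0: beta = beta0 in y = X*beta + u with instruments Z, regress y - X*beta0 on Z and test that all instrument coefficients are zero. The data below are simulated and the setup is simplified relative to the paper:

```python
# Anderson-Rubin-type test of H0: beta = beta0, valid even with weak instruments.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
z = rng.normal(size=(n, 2))                  # instruments
v = rng.normal(size=n)
x = z @ np.array([0.1, 0.05]) + v            # possibly weakly identified regressor
u = 0.8 * v + rng.normal(size=n)             # error correlated with x (endogeneity)
y = 1.5 * x + u

beta0 = 1.5                                  # hypothesized value under H0
resid0 = y - x * beta0
ar_reg = sm.OLS(resid0, sm.add_constant(z)).fit()
# F-test that both instrument coefficients are zero (the AR-type statistic).
print(ar_reg.f_test("x1 = 0, x2 = 0"))
```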
Abstract:
In this paper, we look at how labor market conditions at different points during the tenure of individuals with firms are correlated with current earnings. Using data on individuals from the German Socioeconomic Panel for the 1985-1994 period, we find that both the contemporaneous unemployment rate and prior values of the unemployment rate are significantly correlated with current earnings, contrary to results for the American labor market. The estimated elasticity of earnings with respect to the current unemployment rate lies between 9 and 15 percent, and between 6 and 10 percent with respect to the unemployment rate at the start of current firm tenure. Moreover, whereas local unemployment rates determine levels of earnings, national rates influence contemporaneous variations in earnings. We interpret this result as evidence that German unions do, in fact, bargain over wages and employment, but that models of individualistic contracts, such as the implicit contract model, may explain some of the observed wage drift and longer-term wage movements reasonably well. Furthermore, we explore the heterogeneity of contracts over a variety of worker and job characteristics. In particular, we find evidence that contracts differ across firm size and worker type. Workers at large firms are markedly more insulated from the job market than workers at any other type of firm, indicating the importance of internal job markets.
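A minimal sketch of the type of log-earnings regression behind the reported elasticities; the variable names and data are hypothetical, and the paper's actual specification on the German Socioeconomic Panel includes further controls:

```python
# Hypothetical log-earnings regression on the contemporaneous unemployment rate
# and the unemployment rate prevailing at the start of the current firm tenure.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "log_u_now": np.log(rng.uniform(0.04, 0.12, n)),
    "log_u_start": np.log(rng.uniform(0.04, 0.12, n)),
    "tenure": rng.integers(0, 20, n),
})
df["log_earnings"] = (10 - 0.12 * df["log_u_now"] - 0.08 * df["log_u_start"]
                      + 0.02 * df["tenure"] + rng.normal(0, 0.3, n))

fit = smf.ols("log_earnings ~ log_u_now + log_u_start + tenure", data=df).fit()
print(fit.params)   # coefficients on the log rates are the elasticities
```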
Abstract:
We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to a simulation-based estimate of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture, and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995.
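A minimal sketch of the Monte Carlo test idea for the Gaussian case, using Mardia's multivariate skewness of residuals as the criterion; the paper's exact standardization and its double and multi-stage refinements are not reproduced, and the residuals below are simulated stand-ins:

```python
# Simple Monte Carlo test of multivariate normality based on Mardia's skewness
# of residuals; the null distribution of the statistic is simulated directly.
import numpy as np

def mardia_skewness(E):
    n = E.shape[0]
    Ec = E - E.mean(axis=0)
    S_inv = np.linalg.inv(Ec.T @ Ec / n)
    D = Ec @ S_inv @ Ec.T            # matrix of d_ij = e_i' S^{-1} e_j
    return (D ** 3).sum() / n ** 2

def mc_pvalue(stat_obs, n, p, n_rep=199, seed=0):
    rng = np.random.default_rng(seed)
    sims = np.array([mardia_skewness(rng.normal(size=(n, p)))
                     for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

rng = np.random.default_rng(6)
n, p = 120, 3
resid = rng.normal(size=(n, p))      # stand-in for standardized MLR residuals
print(mc_pvalue(mardia_skewness(resid), n, p))
```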
Abstract:
We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions, a condition not satisfied by standard Wald-type methods based on standard errors, and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
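A minimal sketch of the projection technique mentioned above: once a joint confidence set for the parameter vector has been obtained by inverting a test over a grid, a valid confidence interval for any transformation g(theta) is the range of g over that set. In practice the inverted test would be an identification-robust statistic (e.g., Anderson-Rubin); the quadratic-form acceptance rule below is only a placeholder:

```python
# Projection-based confidence interval for g(theta) = theta1 / theta2,
# obtained from a grid-based joint confidence set for (theta1, theta2).
import numpy as np
from scipy.stats import chi2

def joint_confidence_set(test_stat, grid1, grid2, level=0.95):
    """Collect grid points where the (placeholder) test does not reject."""
    cutoff = chi2.ppf(level, df=2)
    return [(t1, t2) for t1 in grid1 for t2 in grid2
            if test_stat(t1, t2) <= cutoff]

# Placeholder statistic: quadratic form around an illustrative point estimate.
theta_hat = np.array([1.0, 2.0])
V_inv = np.linalg.inv(np.array([[0.04, 0.01], [0.01, 0.09]]))
def stat(t1, t2):
    d = np.array([t1, t2]) - theta_hat
    return d @ V_inv @ d

grid = np.linspace(0.2, 3.0, 120)
cs = joint_confidence_set(stat, grid, grid)
ratios = [t1 / t2 for t1, t2 in cs]
print(min(ratios), max(ratios))      # projection-based CI for theta1/theta2
```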
Abstract:
"Thesis presented to the Faculté des études supérieures for the degree of Master of Laws (LL.M.)"
Abstract:
The injection of bone marrow-derived stem cells is known to improve ventricular function and scar remodeling after myocardial infarction (MI). Stromal cell-derived factor-1 alpha (SDF-1 alpha), a chemokine induced by cardiac ischemia, is of particular importance because of its role in recruiting inflammatory cells and bone marrow stem cells to damaged sites. Although research on the role of SDF-1 alpha in ventricular remodeling is growing, its involvement in the acute phase of remodeling remains unexplored. The aim of the present study is to determine the effect of SDF-1 alpha on scar size, cardiac hypertrophy, and ventricular function in rats and mice one week after MI. The strategy used involves administering AMD3100 (1 mg/kg, starting 24 hours after MI, for 6 days), a selective antagonist of CXCR4, the SDF-1 alpha receptor. This receptor is coupled to a G alpha i protein and induces cell migration and proliferation. In rats of the MI group, expression of the chemokine was detected mainly in smooth muscle cells and endothelial cells of scar vessels. The expression profile of the chemokine in the infarcted heart indicates a concentration gradient toward the scar. One week after MI, treatment with AMD3100 reduced scar size, resulting in improved ventricular function and a smaller increase in ANP mRNA expression in the non-infarcted left ventricle. In mice, AMD3100 treatment produced the same effects, namely a reduction in scar size and an improvement in ventricular function. The reduction in infarct size in AMD3100-treated mice is associated with attenuated neutrophil infiltration into the ischemic region. These results suggest that pharmacological blockade of the SDF-1 alpha/CXCR4 axis during the acute phase of post-MI ventricular remodeling reduces scar size and improves ventricular function, in part through a reduced inflammatory response.