828 results for "inférence à distance finie"


Relevance: 100.00%

Abstract:

We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.
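
A minimal numerical sketch of the Anderson-Rubin-type instrument substitution idea, for the simplest case of a single endogenous regressor and no included exogenous variables; the data-generating setup and variable names below are illustrative assumptions, not the paper's notation. Inverting the test over a grid of hypothesized values yields a confidence set whose validity does not depend on instrument strength, which is the point stressed in the abstract.

```python
# Sketch of an Anderson-Rubin-type test/confidence set for a scalar structural
# parameter beta in y = Y*beta + u, with instruments Z (n x k). Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 200, 3
Z = rng.normal(size=(n, k))                    # instruments
pi = np.array([0.3, 0.2, 0.1])                 # first-stage coefficients
v = rng.normal(size=n)
Y = Z @ pi + v                                 # endogenous regressor
u = 0.8 * v + rng.normal(size=n)               # error correlated with v
beta_true = 1.0
y = beta_true * Y + u

PZ = Z @ np.linalg.solve(Z.T @ Z, Z.T)         # projection onto instrument space

def ar_stat(beta0):
    """Anderson-Rubin F statistic at the hypothesized value beta0."""
    e = y - beta0 * Y
    num = e @ PZ @ e / k
    den = e @ (e - PZ @ e) / (n - k)
    return num / den

crit = stats.f.ppf(0.95, k, n - k)             # exact cutoff under Gaussian errors
grid = np.linspace(-1, 3, 801)
conf_set = grid[[ar_stat(b) <= crit for b in grid]]
print("AR 95%% confidence set: [%.2f, %.2f]" % (conf_set.min(), conf_set.max()))
```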

Relevance: 90.00%

Abstract:

In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied to all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the LR statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
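
The Monte Carlo test technique referred to above can be summarized in a few lines: when the test statistic is pivotal under the null, ranking the observed value among N simulated values gives an exact p-value (Dwass 1957; Barnard 1963). The sketch below uses a Wilks-type LR criterion for H0: B = 0 in a small synthetic MLR system; the dimensions and design are assumptions made for illustration.

```python
# Sketch of a Monte Carlo (Dwass/Barnard) test: exact p-value for a statistic
# that is pivotal (nuisance-parameter-free) under the null hypothesis.
import numpy as np

rng = np.random.default_rng(1)
n, p, q = 50, 2, 3                     # observations, regressors, equations

def lr_stat(Y, X):
    """-n*log(det(unrestricted SSE)/det(restricted SSE)); pivotal under H0: B = 0."""
    B = np.linalg.lstsq(X, Y, rcond=None)[0]
    E1 = Y - X @ B                     # unrestricted residuals
    E0 = Y                             # restricted (B = 0) residuals
    return -n * np.log(np.linalg.det(E1.T @ E1) / np.linalg.det(E0.T @ E0))

X = rng.normal(size=(n, p))
Y = rng.normal(size=(n, q))            # data generated under H0 here
S0 = lr_stat(Y, X)

N = 999                                # number of Monte Carlo replications
sims = np.array([lr_stat(rng.normal(size=(n, q)), X) for _ in range(N)])
p_value = (1 + np.sum(sims >= S0)) / (N + 1)   # exact when the statistic is pivotal
print("MC p-value:", p_value)
```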

Relevance: 90.00%

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses, for which test procedures are commonly proposed, are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
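
As a small illustration of the projection technique mentioned above: once a joint confidence set for the full parameter vector has been obtained (for instance by inverting a pivotal test over a grid), a valid confidence interval for any transformation g(theta) is the range of g over that set. The grid, the stand-in acceptance region, and the transformation below are hypothetical.

```python
# Minimal illustration of projection-based inference: given a joint confidence
# set C for theta = (theta1, theta2) (here a boolean mask over a grid, standing
# in for the result of test inversion), a valid confidence interval for any
# transformation g(theta) is [min over C of g, max over C of g].
import numpy as np

t1 = np.linspace(-2.0, 2.0, 201)
t2 = np.linspace(0.5, 3.0, 201)
T1, T2 = np.meshgrid(t1, t2)

# Hypothetical stand-in for a joint 95% confidence set from test inversion:
# an elliptical acceptance region around (0.4, 1.5).
accepted = ((T1 - 0.4) / 0.6) ** 2 + ((T2 - 1.5) / 0.4) ** 2 <= 1.0

g = T1 / T2                            # a nonlinear transformation of interest
g_accepted = g[accepted]
print("Projection 95%% CI for theta1/theta2: [%.3f, %.3f]"
      % (g_accepted.min(), g_accepted.max()))
```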

Relevance: 90.00%

Abstract:

This article illustrates the applicability of resampling methods to multiple (simultaneous) tests in various econometric problems. Joint hypotheses are a routine consequence of economic theory, so controlling the rejection probability of combinations of tests is a problem frequently encountered in various econometric and statistical settings. In this respect, it is well known that ignoring the joint nature of multiple hypotheses can cause the level of the overall procedure to exceed the nominal level considerably. Whereas most multiple-inference methods are conservative in the presence of non-independent statistics, the tests we propose are designed to control the significance level exactly. To this end, we consider combined test criteria originally proposed for independent statistics. By applying the Monte Carlo test method, we show how these test-combination methods can be applied to such cases without resorting to asymptotic approximations. After reviewing earlier results on this subject, we show how this methodology can be used to construct normality tests based on several moments for the errors of linear regression models. For this problem, we propose a finite-sample valid generalization of the asymptotic test proposed by Kiefer and Salmon (1983), as well as combined tests following the Tippett and Pearson-Fisher methods. We observe empirically that the test procedures corrected by the Monte Carlo test method do not suffer from the bias (or under-rejection) problem often reported in this literature, notably against platykurtic distributions, and yield appreciable power gains relative to the usual combination methods.
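
A rough sketch of the combination-plus-Monte-Carlo idea described above, using skewness- and kurtosis-based components on OLS residuals and the Tippett (minimum p-value) and Fisher (sum of log p-values) combination rules; the component statistics, design matrix, and sample sizes are illustrative choices rather than the exact construction of the article.

```python
# Combined moment-based normality tests on OLS residuals, with the combined
# criteria calibrated by a Monte Carlo test so the joint level is controlled.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)  # residual-maker matrix

def component_pvalues(e):
    """Asymptotic p-values for skewness and excess-kurtosis statistics."""
    e = e - e.mean()
    s = e.std()
    skew = np.mean(e**3) / s**3
    kurt = np.mean(e**4) / s**4 - 3.0
    p_skew = 2 * stats.norm.sf(abs(skew) * np.sqrt(n / 6.0))
    p_kurt = 2 * stats.norm.sf(abs(kurt) * np.sqrt(n / 24.0))
    return np.array([p_skew, p_kurt])

def combine(pvals):
    return {"tippett": pvals.min(),                # reject for small values
            "fisher": -2 * np.sum(np.log(pvals))}  # reject for large values

y = X @ np.array([1.0, 0.5, -0.3]) + rng.standard_t(df=5, size=n)
obs = combine(component_pvalues(M @ y))

N = 999   # simulate the combined criteria under Gaussian errors (pivotal under H0)
sims = [combine(component_pvalues(M @ rng.normal(size=n))) for _ in range(N)]
p_tippett = (1 + sum(s["tippett"] <= obs["tippett"] for s in sims)) / (N + 1)
p_fisher = (1 + sum(s["fisher"] >= obs["fisher"] for s in sims)) / (N + 1)
print("MC-combined p-values: Tippett %.3f, Fisher %.3f" % (p_tippett, p_fisher))
```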

Relevance: 80.00%

Abstract:

This paper proposes finite-sample procedures for testing the SURE specification in multi-equation regression models, i.e. whether the disturbances in different equations are contemporaneously uncorrelated or not. We apply the technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] to obtain exact tests based on standard LR and LM zero-correlation tests. We also suggest a MC quasi-LR (QLR) test based on feasible generalized least squares (FGLS). We show that the latter statistics are pivotal under the null, which provides the justification for applying MC tests. Furthermore, we extend the exact independence test proposed by Harvey and Phillips (1982) to the multi-equation framework. Specifically, we introduce several induced tests based on a set of simultaneous Harvey/Phillips-type tests and suggest a simulation-based solution to the associated combination problem. The properties of the proposed tests are studied in a Monte Carlo experiment, which shows that standard asymptotic tests exhibit important size distortions, while MC tests achieve complete size control and display good power. Moreover, the MC-QLR tests performed best in terms of power, a result of interest from the point of view of simulation-based tests. The MC induced tests improve appreciably on standard Bonferroni tests in terms of power and, in certain cases, outperform the likelihood-based MC tests. The tests are applied to data used by Fischer (1993) to analyze the macroeconomic determinants of growth.
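
A minimal sketch of applying a Monte Carlo test to a standard LM zero-correlation criterion (a Breusch-Pagan-type statistic based on pairwise residual correlations); the system dimensions and regressor matrices below are assumptions for illustration, not the paper's experimental design.

```python
# Monte Carlo version of an LM test for zero contemporaneous correlation across
# equations in a SURE-type system. The statistic is pivotal under the null
# (independent Gaussian errors), which is what justifies the exact MC test.
import numpy as np

rng = np.random.default_rng(3)
n, q = 60, 3                                       # observations, equations
X_list = [np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
          for _ in range(q)]                       # one design per equation
M_list = [np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T) for X in X_list]

def lm_stat(U):
    """LM = n * sum of squared pairwise correlations of equation-wise residuals."""
    E = np.column_stack([M_list[i] @ U[:, i] for i in range(q)])
    R = np.corrcoef(E, rowvar=False)
    iu = np.triu_indices(q, k=1)
    return n * np.sum(R[iu] ** 2)

# Disturbances with cross-equation correlation (the null is false here).
L = np.array([[1.0, 0.0, 0.0], [0.6, 1.0, 0.0], [0.6, 0.6, 1.0]])
U_obs = rng.normal(size=(n, q)) @ L.T
S0 = lm_stat(U_obs)

N = 999
sims = np.array([lm_stat(rng.normal(size=(n, q))) for _ in range(N)])
p_value = (1 + np.sum(sims >= S0)) / (N + 1)
print("MC-LM p-value for zero cross-equation correlation:", p_value)
```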

Relevance: 80.00%

Abstract:

In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
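
A simulation sketch of the transformation described above, for a simple AR(1) series: the even-indexed observations are regressed on their two neighbours, giving a two-sided autoregression to which ordinary OLS t-statistics are applied. This only illustrates the form of the transformed model; the exact conditioning argument and test construction of the paper are not reproduced here.

```python
# Two-sided autoregression sketch: keep the even-indexed observations as the
# dependent variable and regress each on its neighbours y_{t-1} and y_{t+1}.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
T, rho = 201, 0.7
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()           # simulated AR(1) series

even = np.arange(2, T - 1, 2)                      # even indices kept as "dependent"
Ydep = y[even]
X = np.column_stack([np.ones(even.size), y[even - 1], y[even + 1]])

# Standard OLS and t-statistics on the two-sided autoregression.
b, _, _, _ = np.linalg.lstsq(X, Ydep, rcond=None)
e = Ydep - X @ b
dof = even.size - X.shape[1]
s2 = e @ e / dof
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
tstats = b / se
pvals = 2 * stats.t.sf(np.abs(tstats), df=dof)
print("coef:", np.round(b, 3), "t:", np.round(tstats, 2), "p:", np.round(pvals, 3))
```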

Relevance: 30.00%

Abstract:

Data obtained by finely sampling a continuous process (a random field) can be represented as images. A statistical test for detecting a difference between two images can be viewed as a set of tests in which each pixel is compared with the corresponding pixel of the other image. A method controlling the Type I error over the whole set of tests is then used, such as the Bonferroni correction or control of the false discovery rate (FDR). Data-analysis methods have been developed in medical imaging, mainly by Keith Worsley, that use the geometry of random fields to construct a global statistical test over an entire image. The idea is to use the expected Euler characteristic of the excursion set of the random field underlying the sample above a given threshold in order to determine the probability that the random field exceeds that threshold under the null hypothesis (topological inference). We present some notions about random fields, in particular isotropy (the covariance function between two points of the field depends only on the distance separating them). We discuss two methods for analyzing anisotropic fields. The first consists in deforming the field and then using intrinsic volumes and Euler characteristic densities. The second instead uses Lipschitz-Killing curvatures. We then carry out a level and power study of topological inference in comparison with the Bonferroni correction. Finally, we use topological inference to describe the evolution of climate change over the territory of Quebec between 1991 and 2100, using simulated temperature data published by the Ouranos climate simulation team ("Équipe Simulations climatiques") under the Canadian Regional Climate Model.
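
For comparison with the topological approach, the two baseline multiple-testing corrections mentioned above (Bonferroni and FDR control) can be written in a few lines for a pixel-wise two-sample comparison; the image size, group sizes, and signal below are synthetic placeholders, and the expected Euler characteristic threshold itself is not reproduced here.

```python
# Pixel-wise comparison of two sets of images with multiple-testing control:
# Bonferroni (family-wise error) versus Benjamini-Hochberg (FDR).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
shape, n_rep = (32, 32), 20                     # image size, replicates per group
signal = np.zeros(shape)
signal[10:16, 10:16] = 1.0                      # a small activated region

A = rng.normal(size=(n_rep,) + shape)
B = rng.normal(size=(n_rep,) + shape) + signal

t, p = stats.ttest_ind(A, B, axis=0)            # one test per pixel
p = p.ravel()
m = p.size
alpha = 0.05

bonf_detect = p <= alpha / m                    # Bonferroni threshold

order = np.argsort(p)                           # Benjamini-Hochberg step-up
thresh = alpha * np.arange(1, m + 1) / m
passed = p[order] <= thresh
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
bh_detect = np.zeros(m, dtype=bool)
bh_detect[order[:k]] = True

print("Bonferroni detections:", bonf_detect.sum(), "BH detections:", bh_detect.sum())
```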

Relevance: 30.00%

Abstract:

The statistical analyses in this thesis were performed using the MatchIt package, available in the R statistical analysis environment.

Relevance: 20.00%

Abstract:

Approximately 7.2% of the Atlantic rainforest remains in Brazil, with only 16% of this forest remaining in the State of Rio de Janeiro, all of it distributed in fragments. This forest fragmentation can produce biotic and abiotic differences between edges and the fragment interior. In this study, we compared the structure and richness of tree communities in three habitats - an anthropogenic edge (AE), a natural edge (NE) and the fragment interior (FI) - of a fragment of Atlantic forest in the State of Rio de Janeiro, Brazil (22°50'S and 42°28'W). One thousand and seventy-six trees with a diameter at breast height > 4.8 cm, belonging to 132 morphospecies and 39 families, were sampled in a total study area of 0.75 ha. NE had the greatest basal area, and trees in this habitat had the greatest diameter:height allometric coefficient, whereas AE had lower richness and greater variation in the height of the first tree branch. Tree density, diameter, height and the proportion of standing dead trees did not differ among the habitats. There was marked heterogeneity among replicates within each habitat. These results indicate that the forest interior and the fragment edges (natural or anthropogenic) do not differ markedly in the parameters studied. Other factors, such as the age of the edge, the type of matrix and the proximity of gaps, may play a more important role in plant community structure than proximity to the edge.

Relevance: 20.00%

Abstract:

The objective of this study was to evaluate children's respiratory patterns in the mixed dentition, by means of acoustic rhinometry, and their relation to upper arch width development. Fifty patients were examined, 25 females and 25 males, with a mean age of eight years and seven months. All of them underwent acoustic rhinometry, and upper and lower arch impressions were taken to obtain plaster models. The upper arch analysis was carried out by measuring the interdental transverse distances of the upper teeth: deciduous canines (measurement 1), deciduous first molars (measurement 2), deciduous second molars (measurement 3) and first molars (measurement 4). The results showed that an increased left nasal cavity area was associated with increased interdental distances at the deciduous first and second molars in females, and at the deciduous canines and deciduous first and second molars in males. It was concluded that there is a correlation between nasal cavity area and upper arch transverse distance in the anterior and mid maxillary regions for both genders.

Relevance: 20.00%

Abstract:

Rangel EM, Mendes IA, Carnio EC, Marchi Alves LM, Godoy S, Crispim JA. Development, implementation, and assessment of a distance module in endocrine physiology. Adv Physiol Educ 34: 70-74, 2010; doi: 10.1152/advan.00070.2009. This study aimed to develop, implement, and assess a distance module in endocrine physiology in TelEduc for undergraduate nursing students from a public university in Brazil, with a sample of 44 students. Stage 1 consisted of developing the module, following the process of creating a distance course delivered through the Web. Stage 2 was the planning of the module's practical functioning, and stage 3 was the planning of student evaluations. In the experts' assessment, the module complied with pedagogical and technical requirements most of the time. In the practical functioning stage, 10 h were dedicated to on-site activities and 10 h to distance activities. Most students (93.2%) were women between 19 and 23 yr of age (75%). The internet was the means most used to stay up to date for 23 students (59.0%), and 30 students (68.2%) accessed it from the teaching institution. A personal computer was used by 23 students (56.1%), and most of them (58.1%) learned to use it on their own. Access to the forum was more dispersed (variation coefficient: 86.80%) than access to the chat (variation coefficient: 65.14%). Average participation was 30 students in the forums and 22 students in the chat. Students' final grades in the module averaged 8.5 (SD: 1.2). TelEduc was shown to be efficient in supporting the teaching-learning process of endocrine physiology.

Relevance: 20.00%

Abstract:

Context. In April 2004, the first image was obtained of a planetary-mass companion (now known as 2M 1207 b) in orbit around a self-luminous object different from our own Sun (the young brown dwarf 2MASSW J1207334-393254, hereafter 2M 1207 A). That 2M 1207 b probably formed via fragmentation and gravitational collapse offered proof that such a mechanism can form bodies in the planetary mass regime. However, the predicted mass, luminosity, and radius of 2M 1207 b depend on its age, distance, and other observables, such as effective temperature. Aims. To refine our knowledge of the physical properties of 2M 1207 b and its nature, we accurately determined the distance to the 2M 1207 A and b system by measuring its trigonometric parallax at the milliarcsecond level. Methods. With the ESO NTT/SUSI2 telescope, we began a campaign of photometric and astrometric observations in 2006 to measure the trigonometric parallax of 2M 1207 A. Results. An accurate distance (52.4 +/- 1.1 pc) to 2M 1207 A was measured. From the distance and proper motions we derived spatial velocities that are fully compatible with TWA membership. Conclusions. With this new distance estimate, we discuss three scenarios regarding the nature of 2M 1207 b: (1) a cool (1150 +/- 150 K) companion of mass 4 +/- 1 M_Jup; (2) a warmer (1600 +/- 100 K) and heavier (8 +/- 2 M_Jup) companion occulted by an edge-on circumsecondary disk; or (3) a hot protoplanet collision afterglow.
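
For reference, the quoted distance translates directly into an implied trigonometric parallax through d[pc] = 1/parallax[arcsec]; the tiny sketch below only performs that arithmetic and a first-order error propagation on the abstract's own numbers.

```python
# Distance-parallax arithmetic implied by the quoted result: d[pc] = 1/parallax[arcsec].
d_pc, sigma_d = 52.4, 1.1
parallax_mas = 1000.0 / d_pc                       # about 19.1 mas
sigma_parallax = 1000.0 * sigma_d / d_pc**2        # first-order propagation, about 0.4 mas
print("implied parallax: %.1f +/- %.1f mas" % (parallax_mas, sigma_parallax))
```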

Relevance: 20.00%

Abstract:

Context. Observations in the cosmological domain are heavily dependent on the validity of the cosmic distance-duality (DD) relation, η = D_L(z)(1+z)^(-2)/D_A(z) = 1, an exact result required by the Etherington reciprocity theorem, where D_L(z) and D_A(z) are, respectively, the luminosity and angular diameter distances. In the limit of very small redshifts, D_A(z) ≈ D_L(z) and this ratio is trivially satisfied. Measurements of the Sunyaev-Zeldovich effect (SZE) and X-rays combined with the DD relation have been used to determine D_A(z) from galaxy clusters. This combination offers the possibility of testing the validity of the DD relation, as well as determining which physical processes occur in galaxy clusters via their shapes. Aims. We use the WMAP (7-year) results, by fixing the conventional ΛCDM model, to verify the consistency between the validity of the DD relation and different assumptions about galaxy cluster geometries usually adopted in the literature. Methods. We assume that η is a function of the redshift parameterized by two different relations: η(z) = 1 + η_0 z and η(z) = 1 + η_0 z/(1+z), where η_0 is a constant parameter quantifying the possible departure from the strict validity of the DD relation. In order to determine the probability density function (PDF) of η_0, we consider the angular diameter distances from galaxy clusters recently studied by two different groups, assuming elliptical (isothermal) and spherical (non-isothermal) beta models. The strict validity of the DD relation will occur only if the maximum value of the η_0 PDF is centered on η_0 = 0. Results. It was found that the elliptical beta model is in good agreement with the data, showing no violation of the DD relation (PDF peaked close to η_0 = 0 at 1σ), while the spherical (non-isothermal) model is only marginally compatible at 3σ. Conclusions. The present results, derived by combining the SZE and X-ray surface brightness data from galaxy clusters with the latest WMAP (7-year) results, favor the elliptical geometry for galaxy clusters. It is remarkable that a local property like the geometry of galaxy clusters might be constrained by a global argument provided by the cosmic DD relation.

Relevance: 20.00%

Abstract:

In this Letter, we propose a new and model-independent cosmological test for the distance-duality (DD) relation, η = D_L(z)(1+z)^(-2)/D_A(z) = 1, where D_L and D_A are, respectively, the luminosity and angular diameter distances. For D_L we consider two sub-samples of Type Ia supernovae (SNe Ia) taken from the Constitution data, whereas the D_A distances are provided by two samples of galaxy clusters compiled by De Filippis et al. and Bonamente et al. by combining Sunyaev-Zeldovich effect and X-ray surface brightness measurements. The SNe Ia redshifts of each sub-sample were carefully chosen to coincide with those of the associated galaxy cluster sample (Δz < 0.005), thereby allowing a direct test of the DD relation. Since, for very low redshifts, D_A(z) ≈ D_L(z), we have tested the DD relation by assuming that η is a function of the redshift parameterized by two different expressions: η(z) = 1 + η_0 z and η(z) = 1 + η_0 z/(1+z), where η_0 is a constant parameter quantifying a possible departure from the strict validity of the reciprocity relation (η_0 = 0). In the best scenario (linear parameterization), we obtain η_0 = -0.28 ± 0.44 (2σ, statistical + systematic errors) for the De Filippis et al. sample (elliptical geometry), a result only marginally compatible with the DD relation. However, for the Bonamente et al. sample (spherical geometry), the constraint is η_0 = -0.42 ± 0.34 (3σ, statistical + systematic errors), which is clearly incompatible with the distance-duality relation.
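
A minimal sketch of the type of test described above: form η_obs = D_L(1+z)^(-2)/D_A for matched pairs and fit the linear parameterization η(z) = 1 + η_0 z by a χ² scan. The distances, redshifts, and error bars below are synthetic placeholders, not the De Filippis et al. or Bonamente et al. samples.

```python
# Distance-duality test sketch: eta_obs = D_L * (1+z)^(-2) / D_A from matched
# SN Ia / galaxy-cluster pairs, then a chi-square fit of eta(z) = 1 + eta0*z.
import numpy as np

rng = np.random.default_rng(6)
z = np.sort(rng.uniform(0.05, 0.8, size=25))       # matched redshifts

# Placeholder "observed" distances (Mpc) generated with the DD relation holding.
c, H0 = 299792.458, 70.0                           # km/s, km/s/Mpc
D_C = (c / H0) * z * (1 - 0.25 * z)                # crude comoving-distance stand-in
D_A = D_C / (1 + z) * rng.normal(1.0, 0.05, z.size)
D_L = D_C * (1 + z) * rng.normal(1.0, 0.05, z.size)

eta_obs = D_L * (1 + z) ** (-2) / D_A
sigma = 0.07 * eta_obs                             # rough fractional error

eta0_grid = np.linspace(-1.0, 1.0, 2001)
chi2 = np.array([np.sum(((eta_obs - (1 + e0 * z)) / sigma) ** 2)
                 for e0 in eta0_grid])
best = eta0_grid[chi2.argmin()]
within_1sigma = eta0_grid[chi2 <= chi2.min() + 1.0]   # Delta chi2 = 1 for one parameter
print("eta0 = %.3f (1 sigma: %.3f to %.3f)"
      % (best, within_1sigma.min(), within_1sigma.max()))
```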