885 results for Heteroskedasticity-based identification


Relevance: 30.00%

Abstract:

OBJECTIVE: As universal screening for hypertension performs poorly in childhood, targeted screening of children at higher risk of hypertension has been proposed. Our goal was to assess the performance of combined parental history of hypertension and overweight/obesity in identifying children with hypertension. We estimated the sensitivity, specificity, and negative and positive predictive values of overweight/obesity and parental history of hypertension for the identification of hypertension in children. DESIGN AND METHOD: We analyzed data from a school-based cross-sectional study including 5207 children aged 10 to 14 years from all public 6th grade classes in the canton of Vaud, Switzerland. Blood pressure was measured with a clinically validated oscillometric automated device over up to three visits separated by one week. Children were considered hypertensive if they had sustained elevated blood pressure over the three visits. Parents were interviewed about their history of hypertension. RESULTS: The prevalence of hypertension was 2.2%. Fourteen percent of children were overweight or obese, 20% had a positive history of hypertension in either or both parents, and 30% had either or both conditions. After accounting for several potential confounding factors, parental history of hypertension (odds ratio (OR): 2.6; 95% confidence interval (CI): 1.8-4.0), overweight excluding obesity (OR: 2.5; 95% CI: 1.5-4.2) and obesity (OR: 10.1; 95% CI: 6.0-17.0) were associated with hypertension in children. Considered in isolation, the sensitivity and positive predictive values of parental history of hypertension (41% and 5%, respectively) or overweight/obesity (43% and 7%, respectively) were relatively low. When considered together, the sensitivity of targeted screening in children with either overweight/obesity or a parental history of hypertension was higher (65%), but the positive predictive value remained low (5%). The negative predictive value was systematically high. CONCLUSIONS: Restricting screening to children with either overweight/obesity or hypertensive parents would substantially limit the proportion of children to screen (30%) while identifying a relatively large proportion (65%) of hypertensive cases. This could be a valuable alternative to universal screening.
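As a quick sanity check on how these screening metrics fit together, the short Python sketch below recovers the positive predictive value from sensitivity, prevalence, and the fraction of children flagged by the combined criterion. The inputs are the rounded figures reported above, used purely as illustrative numbers, not the underlying study data.

```python
# Illustrative consistency check of the reported screening metrics
# (rounded figures from the abstract, not the study's raw data).

prevalence = 0.022   # prevalence of hypertension among the 5207 children
sensitivity = 0.65   # combined criterion: overweight/obesity or parental history
flagged = 0.30       # fraction of children meeting either condition

# PPV = P(hypertension | flagged)
#     = P(flagged | hypertension) * P(hypertension) / P(flagged)
ppv = sensitivity * prevalence / flagged
print(f"PPV ~ {ppv:.1%}")  # ~4.8%, consistent with the reported 5%
```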

Relevance: 30.00%

Abstract:

A Gram-negative, aerobic, flagellated bacterium with fungal growth inhibitory properties was isolated from a culture of Trichoderma harzianum. According to its cultural characteristics and biochemical properties, it was identified as a strain of Alcaligenes faecalis Castellani and Chalmers. Antisera prepared in Balb/c mice injected with live and heat-killed bacterial cells gave strong reactions with the homologous immunogen and with ATCC 15554, the type strain of A. faecalis, but not with Escherichia coli or Enterobacter aerogenes, in immunoprecipitation and dot immunobinding assays. Growth of Botrytis cinerea Pers. and several other fungi was significantly affected when co-cultured with A. faecalis on solid media. Its detrimental effect on germination and growth of B. cinerea has been found to be associated with antifungal substances produced by the bacterium and released into the growth medium. A biotest for the antibiotic substances, based on their inhibitory effect on germination of B. cinerea conidia, was developed. This biotest was used to study the properties of these substances and the conditions in which they are produced, and to monitor the steps of their separation during extraction procedures. It has been found that at least two substances could be involved in the antagonistic interaction. One of these is a basic volatile substance and has been identified as ammonia. The other is a nonvolatile, dialysable, heat-stable, polar compound released into the growth medium. After separation of growth medium samples by Sephadex G-10 column chromatography, a single peak with a molecular weight below 700 Daltons exhibited inhibitory activity. From its behaviour in electrophoretic separation in agarose gels, it appears to be a neutral or slightly positively charged compound.

Relevance: 30.00%

Abstract:

This study had three purposes related to the effective implementation and practice of computer-mediated online distance education (C-MODE) at the elementary level: (a) to identify a preliminary framework of criteria or guidelines for effective implementation and practice, (b) to identify areas of C-MODE for which criteria or guidelines of effectiveness have not yet been developed, and (c) to develop an implementation and practice criteria questionnaire based on a review of the distance education literature, and to use the questionnaire in an exploratory survey of elementary C-MODE practitioners. Using the survey instrument, the beliefs and attitudes of 16 elementary C-MODE practitioners about what constitutes effective implementation and practice principles were investigated. Respondents, who included both administrators and instructors, provided information about themselves and the program in which they worked. They rated 101 individual criteria statements on a 5-point Likert scale with the values: 1 (Strongly Disagree), 2 (Disagree), 3 (Neutral or Undecided), 4 (Agree), and 5 (Strongly Agree). Respondents also provided qualitative data by commenting on the individual statements or suggesting other statements they considered important. Eighty-two different statements or guidelines related to the successful implementation and practice of computer-mediated online education at the elementary level were endorsed. Responses to a small number of statements differed significantly by gender and years of experience. A new area for investigation, namely the role of parents, which has received little attention in the online distance education literature, emerged from the findings. The study also identified a number of other areas within an elementary context where additional research is necessary. These included: (a) differences in the factors that determine learning in distance education settings and traditional settings, (b) elementary students' ability to function in an online setting, (c) the role and workload of instructors, (d) the importance of effective, timely communication with students and parents, and (e) the use of a variety of media.

Relevance: 30.00%

Abstract:

This study probed for an answer to the question, "How do you identify as early as possible those students who are at risk of failing or dropping out of college so that intervention can take place?" by field testing two diagnostic instruments with a group of first-semester Seneca College Computer Studies students. In some respects, the research approach was such as might be taken in a pilot study. Because of the complexity of the issue, this study did not attempt to go beyond discovery, understanding and description. Although some inferences may be drawn from the results of the study, no attempt was made to establish any causal relationship between or among the factors or variables represented here. Both quantitative and qualitative data were gathered during four research phases: background, early identification, intervention, and evaluation. To gain a better understanding of the problem of student attrition within the School of Computer Studies at Seneca College, several methods were used, including retrospective analysis of enrollment statistics, faculty and student interviews and questionnaires, and tracking of the sample population. The significance of the problem was confirmed by the results of this study. The findings further confirmed the importance of the role of faculty in student retention and support the need to improve the quality of teacher/student interaction. As well, the need for skills assessment followed by supportive counselling and placement was supported by the findings from this study. Strategies for reducing student attrition were identified by faculty and students. As part of this study, a project referred to as "A Student Alert Project" (ASAP) was undertaken at the School of Computer Studies at Seneca College. Two commercial diagnostic instruments, the Noel/Levitz College Student Inventory (CSI) and the Learning and Study Strategies Inventory (LASSI), provided quantitative data which were subsequently analyzed in Phase 4 in order to assess their usefulness as early identification tools. The findings show some support for using these instruments in a two-stage approach to early identification and intervention: the CSI as an early identification instrument and the LASSI as a counselling tool for those students who have been identified as being at risk. The findings from the preliminary attempts at intervention confirmed the need for a structured student advisement program where faculty are selected, trained, and recognized for their advisor role. Based on the finding that very few students acted on the diagnostic results and recommendations, the need for institutional intervention by way of intrusive measures was confirmed.

Relevance: 30.00%

Abstract:

The purpose of the present study was first to determine what influences international students' perceptions of prejudice, and secondly to examine how perceptions of prejudice would affect international students' group identification. Variables such as stigma vulnerability and contact, which have previously been linked with perceptions of prejudice and intergroup relations, were re-examined (Berryman-Fink, 2006; Gilbert, 1998; Nesdale & Todd, 2000), while variables classically linked to prejudicial attitudes, such as right-wing authoritarianism and openness to experience, were explored in relation to perceptions of prejudice. Furthermore, the study examined how perceptions of prejudice might affect the students' identification choices by testing two opposing models. The first model was based on the motivational nature of social identity theory (Tajfel & Turner, 1986), while the second was based on the cognitive nature of self-categorization theory and the rejection-identification model (Turner, Hogg, Oakes, Reicher, & Wetherell, 1987; Schmitt, Spears, & Branscombe, 2003). It was hypothesized that stigma vulnerability, right-wing authoritarianism, openness to experience and contact would predict both personal and group perceptions of prejudice. It was also hypothesized that perceptions of prejudice would predict group identification. If the self-categorization/rejection-identification model was supported, international students would identify with the international students group; if the social mobility strategy was supported, they would identify with the university students group. Participants were 98 international students who filled out questionnaires on the Brock University Psychology Department website. The first hypothesis was supported: the combination of stigma vulnerability, right-wing authoritarianism, openness to experience and contact predicted both personal and group prejudice perceptions of international students. Furthermore, the analyses supported the self-categorization/rejection-identification model: international identification was predicted by the combination of personal and group prejudice perceptions.

Relevance: 30.00%

Abstract:

Remote sensing techniques involving hyperspectral imagery have applications in a number of sciences that study aspects of the planet's surface. The analysis of hyperspectral images is complex because of the large amount of information involved and the noise within the data. Investigating images to identify minerals, rocks, vegetation and other materials is an application of hyperspectral remote sensing in the earth sciences. This thesis evaluates the performance of two classification and clustering techniques on hyperspectral images for mineral identification. Support Vector Machines (SVM) and Self-Organizing Maps (SOM) are applied as the classification and clustering techniques, respectively. Principal Component Analysis (PCA) is used to prepare the data for analysis; its purpose is to reduce the amount of data to be processed by identifying the most important components within the data. A well-studied dataset from Cuprite, Nevada, and a more complex dataset from Baffin Island were used to assess the performance of these techniques. The main goal of this research is to evaluate the advantage of training a classifier on a small amount of data compared to an unsupervised method. Determining the effect of feature extraction on the accuracy of the clustering and classification methods is another goal. The thesis concludes that using PCA increases the learning accuracy, especially in classification: SVM classifies the Cuprite data with high precision, and the SOM challenges SVM on datasets with a high level of noise (like Baffin Island).
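A minimal sketch of the PCA-then-classify pipeline described above, using scikit-learn on synthetic stand-in data (the Cuprite and Baffin Island cubes are not reproduced here; the band count, component count, class labels, and small training fraction are illustrative assumptions, not the thesis's actual configuration):

```python
# Minimal PCA + SVM pipeline sketch for pixel-wise mineral classification.
# Synthetic stand-in data: a real run would use labeled hyperspectral pixels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 1000, 200, 4     # illustrative sizes
X = rng.normal(size=(n_pixels, n_bands))        # stand-in spectra
y = rng.integers(0, n_classes, size=n_pixels)   # stand-in mineral labels
X += y[:, None] * 0.5                           # give classes some separation

# Train on a small labeled subset, mirroring the "small amount of data" goal.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.8, random_state=0)

# PCA reduces hundreds of correlated bands to a few informative components
# before the supervised SVM step.
model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```

An unsupervised SOM comparison would replace the SVM step with a map trained on unlabeled pixels and a post-hoc labeling of its nodes.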

Relevance: 30.00%

Abstract:

Conditional heteroskedasticity is an important feature of many macroeconomic and financial time series. Standard residual-based bootstrap procedures for dynamic regression models treat the regression error as i.i.d. These procedures are invalid in the presence of conditional heteroskedasticity. We establish the asymptotic validity of three easy-to-implement alternative bootstrap proposals for stationary autoregressive processes with m.d.s. errors subject to possible conditional heteroskedasticity of unknown form. These proposals are the fixed-design wild bootstrap, the recursive-design wild bootstrap and the pairwise bootstrap. In a simulation study all three procedures tend to be more accurate in small samples than the conventional large-sample approximation based on robust standard errors. In contrast, standard residual-based bootstrap methods for models with i.i.d. errors may be very inaccurate if the i.i.d. assumption is violated. We conclude that in many empirical applications the proposed robust bootstrap procedures should routinely replace conventional bootstrap procedures for autoregressions based on the i.i.d. error assumption.
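To make the recursive-design wild bootstrap concrete, here is a minimal numpy sketch for an AR(1) with a conditionally heteroskedastic error. It is illustrative only: the Rademacher multipliers, sample sizes, data-generating process, and percentile interval are assumptions for the sketch, not the paper's exact design.

```python
# Recursive-design wild bootstrap for an AR(1): y_t = c + rho*y_{t-1} + e_t,
# where e_t is an m.d.s., possibly conditionally heteroskedastic.
import numpy as np

rng = np.random.default_rng(1)

def ols_ar1(y):
    """OLS estimates (c, rho) and residuals for an AR(1)."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    resid = y[1:] - X @ beta
    return beta, resid

# Simulated data with ARCH-type errors, so the i.i.d. assumption genuinely fails.
T = 200
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    sigma = np.sqrt(0.5 + 0.3 * y[t - 1] ** 2)   # conditional heteroskedasticity
    y[t] = 0.1 + 0.6 * y[t - 1] + sigma * e[t]

(c_hat, rho_hat), resid = ols_ar1(y)

B = 999
rho_star = np.empty(B)
for b in range(B):
    eta = rng.choice([-1.0, 1.0], size=len(resid))  # Rademacher multipliers
    y_star = np.empty(T)
    y_star[0] = y[0]
    for t in range(1, T):
        # Recursive design: regenerate the whole path with sign-flipped
        # residuals, preserving each period's conditional variance.
        y_star[t] = c_hat + rho_hat * y_star[t - 1] + eta[t - 1] * resid[t - 1]
    (_, rho_star[b]), _ = ols_ar1(y_star)

lo, hi = np.percentile(rho_star, [2.5, 97.5])
print(f"rho_hat = {rho_hat:.3f}, 95% bootstrap interval ~ ({lo:.3f}, {hi:.3f})")
```

The fixed-design variant would keep the observed regressors y_{t-1} instead of regenerating the path, and the pairwise bootstrap would resample (y_t, y_{t-1}) pairs directly.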

Relevance: 30.00%

Abstract:

It is well known that standard asymptotic theory is not valid, or is extremely unreliable, in models with identification problems or weak instruments [Dufour (1997, Econometrica), Staiger and Stock (1997, Econometrica), Wang and Zivot (1998, Econometrica), Stock and Wright (2000, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. One possible way out consists in using a variant of the Anderson-Rubin (1949, Ann. Math. Stat.) procedure. The latter, however, allows one to build exact tests and confidence sets only for the full vector of the coefficients of the endogenous explanatory variables in a structural equation, and in general yields no inference on individual coefficients. This problem may in principle be overcome by using projection techniques [Dufour (1997, Econometrica), Dufour and Jasiak (2001, International Economic Review)]. AR-type procedures are emphasized because they are robust to both weak instruments and instrument exclusion. Until now, however, projection could be implemented only through costly numerical methods. In this paper, we provide a complete analytic solution to the problem of building projection-based confidence sets from Anderson-Rubin-type confidence sets. The solution exploits the geometric properties of “quadrics”, and the resulting sets can be viewed as extensions of the usual confidence intervals and ellipsoids; only least squares techniques are required to build the confidence intervals. We also study by simulation how “conservative” projection-based confidence sets are. Finally, we illustrate the proposed methods by applying them to three different examples: the relationship between trade and growth in a cross-section of countries, returns to education, and a study of production functions in the U.S. economy.
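To make the Anderson-Rubin construction concrete, here is a minimal numerical sketch that inverts the AR test over a grid of candidate coefficient values. It uses simulated data and a single endogenous regressor (so the projection step is trivial); sample sizes and parameter values are illustrative assumptions, and the paper's analytic quadric-based solution replaces this brute-force grid.

```python
# Anderson-Rubin confidence set by test inversion over a grid
# (one endogenous regressor; simulated, illustrative data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 500, 3                                  # observations, instruments
Z = rng.normal(size=(n, k))
v = rng.normal(size=n)
x = Z @ np.array([0.3, 0.2, 0.1]) + v          # first stage
u = 0.8 * v + rng.normal(size=n)               # endogeneity: corr(u, v) != 0
y = 1.0 * x + u                                # true beta = 1.0

P = Z @ np.linalg.solve(Z.T @ Z, Z.T)          # projection onto span(Z)

def ar_stat(b0):
    e = y - b0 * x                             # residual under H0: beta = b0
    num = e @ P @ e / k
    den = e @ (e - P @ e) / (n - k)
    return num / den                           # ~ F(k, n-k) under H0

crit = stats.f.ppf(0.95, k, n - k)
grid = np.linspace(-1, 3, 801)
accepted = [b for b in grid if ar_stat(b) <= crit]
if accepted:
    print(f"95% AR confidence set ~ [{min(accepted):.2f}, {max(accepted):.2f}]")
else:
    print("95% AR confidence set is empty at this grid resolution")
```

With several endogenous coefficients, the accepted region is a quadric in beta-space, and projecting it onto each coordinate axis gives the per-coefficient confidence sets that the paper characterizes analytically.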

Relevance: 30.00%

Abstract:

In this paper, we use identification-robust methods to assess the empirical adequacy of a New Keynesian Phillips Curve (NKPC) equation. We focus on Gali and Gertler's (1999) specification, on both U.S. and Canadian data. Two variants of the model are studied: one based on a rational-expectations assumption, and a modification of the latter that uses survey data on inflation expectations. The results based on these two specifications exhibit sharp differences concerning: (i) identification difficulties, (ii) backward-looking behavior, and (iii) the frequency of price adjustments. Overall, we find some support for the hybrid NKPC for the U.S., whereas the model is not suited to Canada. Our findings underscore the need for identification-robust inference methods in the estimation of expectations-based dynamic macroeconomic relations.
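For reference, a standard statement of the hybrid specification at issue (following Gali and Gertler, 1999), with inflation related to real marginal cost and to both expected and lagged inflation. This is a sketch of the usual rational-expectations estimating equation, not necessarily the paper's exact formulation:

```latex
% Hybrid New Keynesian Phillips Curve (Gali-Gertler form),
% pi_t = inflation, mc_t = real marginal cost:
\pi_t = \lambda \, mc_t + \gamma_f \, E_t[\pi_{t+1}] + \gamma_b \, \pi_{t-1}

% Replacing E_t[\pi_{t+1}] by realized \pi_{t+1} plus a forecast error
% orthogonal to date-t information yields the moment condition used for
% (identification-robust) inference with instruments z_t dated t or earlier:
E\left[ \left( \pi_t - \lambda \, mc_t - \gamma_f \, \pi_{t+1} - \gamma_b \, \pi_{t-1} \right) z_t \right] = 0
```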

Relevance: 30.00%

Abstract:

Affiliation: Département de biochimie, Faculté de médecine, Université de Montréal

Relevance: 30.00%

Abstract:

The growth of two thirds of breast tumors depends on estrogens. The gene network responsible for propagating the proliferative signals of estrogens remains poorly understood. DNA microarrays of MCF7 breast carcinoma cells treated with estradiol (E2), with or without the protein synthesis inhibitor cycloheximide (CHX), identified numerous primary and secondary target genes. The promoter sequences of the target genes were screened with a library of 300 matrices modeling the sites recognized by various transcription factors. Estrogen response elements (EREs) are enriched in the promoters of primary target genes, while E2F sites are enriched in the promoters of secondary target genes. A similar enrichment was observed in the regions bound by ERα and E2F1 in ChIP-on-chip experiments for each category of genes. The growth of breast carcinoma cells is inhibited by retinoic acid (RA) treatment. Analysis of DNA microarrays of MCF7 cells treated with RA identified numerous potential target genes. An enrichment of retinoic acid response elements (RAREs) is observable in the promoters of these genes once RAREs located within transposable elements are excluded. RAREs present in primate-specific transposable elements are also bound in vivo in the promoters of known RA targets: BTG2, CASP9 and GPRC5A. Some RA target genes in MCF7 cells are also E2 targets, suggesting that the control these molecules exert over proliferation is partly attributable to opposite effects on a common set of genes.

Relevance: 30.00%

Abstract:

The last decade has seen growing interest in the problems raised by weak instrumental variables in the econometric literature, i.e., situations where the instruments are weakly correlated with the variable to be instrumented. It is well known that when instruments are weak, the distributions of the Student, Wald, likelihood-ratio and Lagrange-multiplier statistics are no longer standard and often depend on nuisance parameters. Several empirical studies, notably on models of returns to education [Angrist and Krueger (1991, 1995), Angrist et al. (1999), Bound et al. (1995), Dufour and Taamouti (2007)] and on asset pricing (C-CAPM) [Hansen and Singleton (1982, 1983), Stock and Wright (2000)], where the instruments are weakly correlated with the instrumented variable, have shown that using these statistics often leads to unreliable results. One remedy is the use of identification-robust tests [Anderson and Rubin (1949), Moreira (2002), Kleibergen (2003), Dufour and Taamouti (2007)]. However, there is no econometric literature on the quality of identification-robust procedures when the available instruments are endogenous, or both endogenous and weak. This raises the question of what happens to identification-robust inference procedures when some instrumental variables assumed to be exogenous are in fact not. More precisely, what happens if an invalid instrument is added to a set of valid ones? Do these procedures behave differently? And if instrument endogeneity poses major difficulties for statistical inference, can one propose test procedures that select instruments that are both strong and valid? Is it possible to propose instrument-selection procedures that remain valid even under weak identification? This thesis focuses on structural models (simultaneous-equations models) and answers these questions through four essays.

The first essay is published in Journal of Statistical Planning and Inference 138 (2008) 2649-2661. In it, we analyze the effects of instrument endogeneity on two identification-robust test statistics, the Anderson and Rubin (AR, 1949) statistic and the Kleibergen (K, 2003) statistic, with or without weak instruments. First, when the parameter controlling instrument endogeneity is fixed (does not depend on the sample size), we show that all these procedures are in general consistent against the presence of invalid instruments (i.e., they detect invalid instruments) regardless of instrument strength. We also describe cases where this consistency may fail, but where the asymptotic distribution is modified in a way that could lead to size distortions even in large samples; this includes, in particular, cases where the two-stage least squares estimator remains consistent but the tests are asymptotically invalid. Next, when the instruments are locally exogenous (i.e., the endogeneity parameter converges to zero as the sample size increases), we show that these tests converge to noncentral chi-square distributions, whether the instruments are strong or weak. We also characterize the situations where the noncentrality parameter is zero and the asymptotic distribution of the statistics remains the same as with valid instruments (despite the presence of invalid instruments).

The second essay studies the impact of weak instruments on Durbin-Wu-Hausman (DWH) specification tests and on the Revankar and Hartley (1973) test. We provide a finite-sample and large-sample analysis of the distribution of these tests under the null hypothesis (size) and the alternative (power), including cases where identification is deficient or weak. Our finite-sample analysis offers several new insights and extensions of earlier procedures: characterizing the finite-sample distribution of these statistics allows the construction of exact Monte Carlo tests of exogeneity even with non-Gaussian errors. We show that these tests are typically robust to weak instruments (size is controlled). Moreover, we provide a characterization of test power that clearly exhibits the factors determining it. We show that the tests have no power when all instruments are weak [similar to Guggenberger (2008)]; power exists, however, as long as at least one instrument is strong. Guggenberger's (2008) conclusion concerns the case where all instruments are weak, a case of minor practical interest, and our asymptotic theory under weakened assumptions confirms the finite-sample theory. We also present a Monte Carlo analysis indicating that: (1) the ordinary least squares estimator is more efficient than two-stage least squares when instruments are weak and endogeneity is moderate [a conclusion similar to Kiviet and Niemczyk (2007)]; (2) pre-test estimators based on exogeneity tests perform very well relative to two-stage least squares. This suggests that the instrumental-variables method should be applied only when one is confident of having strong instruments, so Guggenberger's (2008) conclusions must be qualified and could be misleading. We illustrate our theoretical results through simulation experiments and two empirical applications: the relationship between trade openness and economic growth, and the well-known problem of returns to education.

The third essay extends the Wald-type exogeneity test proposed by Dufour (1987) to cases where the regression errors are non-normal. We propose a new version of the test that is valid even with non-Gaussian errors. Unlike the usual exogeneity test procedures (Durbin-Wu-Hausman and Revankar-Hartley tests), the Wald test makes it possible to address a common problem in empirical work: testing the partial exogeneity of a subset of variables. We propose two new pre-test estimators based on the Wald test that outperform (in mean squared error) the usual IV estimator when the instruments are weak and endogeneity is moderate, and we show that this test can also serve as an instrument-selection procedure. We illustrate the theoretical results with two empirical applications: the well-known wage-equation model [Angrist and Krueger (1991, 1999)] and returns to scale [Nerlove (1963)]. Our results suggest that a mother's education helps explain her son's dropping out of school, that output is an endogenous variable in the estimation of the firm's cost function, and that the price of fuel is a valid instrument for output.

The fourth essay resolves two important problems in the econometric literature. First, although the initial or extended Wald test allows one to build confidence regions and to test linear restrictions on covariances, it assumes that the model parameters are identified; when identification is weak, this test is in general no longer valid. This essay develops an identification-robust inference procedure for building confidence regions for the covariances between the regression errors and the (possibly endogenous) explanatory variables. We provide analytic expressions for the confidence regions and characterize necessary and sufficient conditions under which they are bounded. The proposed procedure remains valid even in small samples and is asymptotically robust to heteroskedasticity and autocorrelation of the errors. The results are then used to develop identification-robust tests of partial exogeneity. Monte Carlo simulations indicate that these tests control size and have power even when instruments are weak, which allows us to propose an instrument-selection procedure that is valid even in the presence of an identification problem. The selection procedure is based on two new pre-test estimators combining the usual IV estimator with partial IV estimators. Our simulations show that: (1) like the ordinary least squares estimator, the partial IV estimators are more efficient than the usual IV estimator when the instruments are weak and endogeneity is moderate; (2) the pre-test estimators perform very well overall compared with the usual IV estimator. We illustrate our theoretical results with two empirical applications: the relationship between trade openness and economic growth, and the returns-to-education model. In the first application, previous studies concluded that the instruments were not too weak [Dufour and Taamouti (2007)], whereas they are very weak in the second [Bound (1995), Doko and Dufour (2009)]. Consistent with our theoretical results, we find unbounded confidence regions for the covariance in the case where the instruments are quite weak.
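As a concrete illustration of the exogeneity tests discussed in the second and third essays, here is a minimal control-function version of the classical Durbin-Wu-Hausman idea on simulated data. This is a sketch of the textbook test only, not the thesis's finite-sample Monte Carlo or identification-robust versions, and, as the thesis stresses, its reliability presumes strong instruments.

```python
# Control-function form of the Durbin-Wu-Hausman exogeneity test:
# regress x on the instruments, then test whether the first-stage residuals
# enter the structural equation. A significant coefficient signals endogeneity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 1000
z = rng.normal(size=(n, 2))                    # instruments
v = rng.normal(size=n)
x = z @ np.array([0.8, 0.5]) + v               # first stage (strong instruments)
u = 0.6 * v + rng.normal(size=n)               # x is endogenous: corr(u, v) > 0
y = 1.0 + 2.0 * x + u

# First stage: v_hat captures the part of x not explained by the instruments.
Z1 = np.column_stack([np.ones(n), z])
v_hat = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]

# Augmented structural regression: y on (1, x, v_hat).
W = np.column_stack([np.ones(n), x, v_hat])
coef, *_ = np.linalg.lstsq(W, y, rcond=None)
resid = y - W @ coef
sigma2 = resid @ resid / (n - W.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(W.T @ W).diagonal())

t_stat = coef[2] / se[2]                       # t-test on the v_hat coefficient
p_val = 2 * stats.t.sf(abs(t_stat), df=n - 3)
print(f"DWH t = {t_stat:.2f}, p = {p_val:.4f}  (small p => reject exogeneity)")
```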

Relevance: 30.00%

Abstract:

Papillomaviruses are DNA viruses that infect the skin and mucosae. They cause warts and can also lead to the development of cancers, including cervical cancer. Replication of their genome requires two viral proteins: the E1 helicase and the E2 transcription factor, which recruits E1 to the viral origin of replication. To facilitate the study of viral genome replication, a quantitative, high-throughput assay based on luciferase expression was developed. In parallel, a transactivation domain was identified in the N-terminal regulatory region of the E1 protein. Characterization of this domain showed that its integrity is important for DNA replication. This study suggests that the E1 transactivation domain is an intrinsically disordered protein region that regulates viral genome replication through its interactions with various proteins.