867 results for Panel-data econometrics


Relevance:

30.00%

Publisher:

Abstract:

Although p53-gene mutations occur with significant frequency in diffuse low-grade and high-grade astrocytomas, and are postulated to play an important role in tumorigenesis in these cases, the role of the p53 gene in pilocytic astrocytomas remains unclear. Published data from DNA-based assays of p53-gene status in these tumors have reported contradictory mutation frequencies (0-14%). It is not known whether these heterogeneous results stem from the biological diversity of this tumor group or from technical problems. To re-evaluate p53-gene status in pilocytic tumors, we analyzed 18 tumors chosen to represent the clinical and biological heterogeneity of this tumor type with respect to anatomical location, patient age, gender, ethnic origin (Caucasian or Japanese) and the concomitant occurrence of neurofibromatosis type 1 (NF1). All primary tumors were histologically diagnosed as pilocytic astrocytoma (WHO grade I), except for one anaplastic pilocytic astrocytoma (WHO grade III) which developed in an NF1 patient and recurred as glioblastoma multiforme (WHO grade IV). p53 mutations were detected using a yeast-based assay that tests the transcriptional activity of p53 proteins synthesized from tumor mRNA-derived p53-cDNA templates. None of the 18 tumors, including the 3 NF1-related tumors, showed p53-gene mutations in exons 4 through 11. We conclude that p53-gene mutations are extremely rare in pilocytic astrocytomas, and are absent even in those exceptional cases in which such tumors have undergone malignant progression.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Biological therapy has dramatically changed the management of Crohn's disease (CD). New data have confirmed the benefit and relative long-term safety of anti-TNF alpha inhibition as part of a regularly scheduled administration programme. The EPACT appropriateness criteria for maintenance treatment after medically-induced remission (MIR) or surgically-induced remission (SIR) of CD thus required updating. Methods: A multidisciplinary international expert panel (EPACT II, Geneva, Switzerland) discussed and anonymously rated detailed, explicit clinical indications based on evidence in the literature and personal expertise. Median ratings (on a 9-point scale) were stratified into three assessment categories: appropriate (7-9), uncertain (4-6 and/or disagreement) and inappropriate (1-3). Experts ranked appropriate medication according to their own clinical practice, without any consideration of cost. Results: Three hundred and ninety-two specific indications for maintenance treatment of CD were rated (200 for MIR and 192 for SIR). Azathioprine, methotrexate and/or anti-TNF alpha antibodies were considered appropriate in 42 indications, corresponding to 68% of all appropriate interventions (97% of MIR and 39% of SIR). The remaining appropriate interventions consisted of mesalazine and a "wait-and-see" strategy. Factors that influenced the panel's voting were patient characteristics and the outcome of previous treatment. The results favour use of anti-TNF alpha agents after failure of any immunosuppressive therapy, while earlier primary use remains controversial. Conclusion: Detailed explicit appropriateness criteria (EPACT) have been updated for maintenance treatment of CD. New expert recommendations for use of the classic immunosuppressors as well as anti-TNF alpha agents are now freely available online (www.epact.ch). The validity of these criteria should now be tested by prospective evaluation.
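
For readers who want to operationalize the stratification rule above, here is a minimal Python sketch that classifies one indication from a list of panel votes. The disagreement criterion used here (at least a third of votes in each extreme band) is an illustrative assumption; the abstract does not spell out the panel's exact definition.

```python
from statistics import median

def classify_indication(ratings, extreme_fraction=1/3):
    """Stratify one clinical indication from a list of panel votes
    on the 9-point EPACT scale.

    The disagreement rule (at least a third of the votes in each
    extreme band) is an illustrative placeholder, not the panel's
    published definition.
    """
    m = median(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    disagreement = min(low, high) >= len(ratings) * extreme_fraction
    if disagreement or 4 <= m <= 6:
        return "uncertain"
    return "appropriate" if m >= 7 else "inappropriate"

print(classify_indication([7, 8, 9, 8, 7, 9, 8, 7, 8]))  # -> appropriate
```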

Relevance:

30.00%

Publisher:

Abstract:

Especially in panel surveys, respondent attrition, respondent learning, and interviewer experience effects play a crucial role with respect to data quality. We examine three interview survey quality indicators in the same survey, both cross-sectionally and longitudinally. In the cross-sectional analysis we compare data quality in the mature original sample with that in a refreshment sample surveyed in the same wave. Because an interviewer survey was conducted in the same wave, collecting interviewers' socio-demographic characteristics, survey attitudes and burden measures, we are able to consider interviewer fixed effects as well. The longitudinal analysis gives more insight into respondent learning effects on the quality indicators by following the very same respondents across waves. The Swiss Household Panel, a CATI survey representative of the Swiss residential population, forms an ideal modelling database: the interviewer-respondent assignment is random, both within and across waves. This design avoids possible confounding with effects stemming from a non-random assignment of interviewers, e.g. area effects or effects from assigning the best interviewers to the hard cases. In order to separate interviewer, respondent and wave effects, we build cross-classified multilevel models.
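
A cross-classified (interviewer × respondent) multilevel model of this kind could be sketched in Python with statsmodels, using the variance-components formulation for crossed random effects. The synthetic data and all variable names below are illustrative, not drawn from the Swiss Household Panel.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per completed interview, with a
# quality indicator and identifiers for the two crossed factors.
rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "quality": rng.normal(size=n),
    "interviewer": rng.integers(0, 30, size=n),
    "respondent": rng.integers(0, 200, size=n),
    "wave": rng.integers(1, 4, size=n),
})

# Crossed (non-nested) random effects in statsmodels: place all rows
# in a single group and declare each factor as a variance component.
df["all"] = 1
vc = {"interviewer": "0 + C(interviewer)", "respondent": "0 + C(respondent)"}
model = smf.mixedlm("quality ~ C(wave)", data=df, groups="all", vc_formula=vc)
print(model.fit().summary())
```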

Relevance:

30.00%

Publisher:

Abstract:

Reference collections of multiple Drosophila lines with accumulating collections of "omics" data have proven especially valuable for the study of population genetics and complex trait genetics. Here we present a description of a resource collection of 84 strains of Drosophila melanogaster whose genome sequences were obtained after 12 generations of full-sib inbreeding. The initial rationale for this resource was to foster development of a systems biology platform for modeling metabolic regulation by using natural polymorphisms as perturbations. As reference lines, they are amenable to repeated phenotypic measurements, and a large collection of metabolic traits has already been assayed. Another key feature of these strains is their widespread geographic origin: Beijing, Ithaca, the Netherlands, Tasmania, and Zimbabwe. After obtaining 12.5× coverage of paired-end Illumina sequence reads, SNP and indel calls were made with the GATK platform. Thorough quality control was enabled by deep sequencing one line to >100×, and single-nucleotide polymorphisms and indels were validated using ddRAD-sequencing as an orthogonal platform. In addition, a series of preliminary population genetic tests were performed with these single-nucleotide polymorphism data to assess data quality. We found 83 segregating inversions among the lines, and as expected these were especially abundant in the African sample. We anticipate that this collection will make a useful addition to the set of reference D. melanogaster strains, thanks to its geographic structuring and unusually high level of genetic diversity.
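
As a small illustration of the kind of preliminary check run on such SNP call sets, the following Python sketch counts segregating sites in a VCF by plain text parsing. The file path and the simplified genotype handling are assumptions for illustration, not part of the published pipeline.

```python
def count_segregating_snps(vcf_path):
    """Count sites that segregate among the sampled lines.

    Minimal VCF reader: skips header lines and treats any site with
    at least two distinct called alleles across samples as segregating.
    """
    segregating = 0
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("#"):
                continue
            fields = line.rstrip("\n").split("\t")
            alleles = set()
            # Per-sample genotype calls start at column 10 (index 9).
            for sample in fields[9:]:
                gt = sample.split(":")[0]
                for a in gt.replace("|", "/").split("/"):
                    if a != ".":
                        alleles.add(a)
            if len(alleles) >= 2:
                segregating += 1
    return segregating

# Hypothetical usage:
# print(count_segregating_snps("dgrp_like_calls.vcf"))
```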

Relevance:

30.00%

Publisher:

Abstract:

Colorectal cancer (CRC) is the second leading cause of cancer-related death in developed countries. Early detection of CRC leads to decreased CRC mortality. A blood-based CRC screening test is highly desirable because of its limited invasiveness and high acceptance rate among patients compared with the currently used fecal occult blood testing and colonoscopy. Here we describe the discovery and validation of a 29-gene panel in peripheral blood mononuclear cells (PBMC) for the detection of CRC and adenomatous polyps (AP). Blood samples were prospectively collected in a multicenter, case-control clinical study. First, we profiled 93 samples with 667 candidate and 3 reference genes by high-throughput real-time PCR (OpenArray system). After analysis, 160 genes were retained and tested again on 51 additional samples. Genes with low or unstable expression were discarded, resulting in a final dataset of 144 samples profiled with 140 genes. To define which genes, alone or in combination, had the highest potential to discriminate AP and/or CRC from controls, the data were analyzed by a combination of univariate and multivariate methods. A list of 29 potentially discriminant genes was compiled and evaluated for its predictive accuracy by penalized logistic regression and bootstrap. This method discriminated AP >1 cm and CRC from controls with sensitivities of 59% and 75%, respectively, at 91% specificity. The behavior of the 29-gene panel was validated on a LightCycler 480 real-time PCR platform, commonly adopted by clinical laboratories. In this work we identified a 29-gene panel expressed in PBMC that can be used to develop a novel minimally invasive test for accurate detection of AP and CRC using a standard real-time PCR platform.
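
The classification step, penalized logistic regression evaluated by bootstrap, might look roughly like the following Python sketch. The data are synthetic placeholders, and the in-sample evaluation is a simplification of the study's actual resampling scheme.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

# Placeholder data: samples x 29 gene-expression values and binary
# labels (1 = CRC/AP, 0 = control). Shapes mimic the study's dataset.
rng = np.random.RandomState(0)
X = rng.normal(size=(144, 29))
y = rng.randint(0, 2, size=144)

n_boot = 200
sens, spec = [], []
for b in range(n_boot):
    # Refit the penalized model on each bootstrap resample.
    X_b, y_b = resample(X, y, random_state=b)
    clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X_b, y_b)
    pred = clf.predict(X)  # naive full-sample check; out-of-bag would be better
    sens.append(((pred == 1) & (y == 1)).sum() / max((y == 1).sum(), 1))
    spec.append(((pred == 0) & (y == 0)).sum() / max((y == 0).sum(), 1))

print(f"sensitivity ~ {np.mean(sens):.2f}, specificity ~ {np.mean(spec):.2f}")
```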

Relevance:

30.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest and weighted by the last-wave weights displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
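
To make the IPCW idea concrete, here is a minimal weighted Kaplan-Meier estimator in Python. Under IPCW, the weights would be design weights multiplied by the inverse of the estimated probability of remaining uncensored at each time. This is a generic sketch, not the estimator code used in the study.

```python
import numpy as np

def weighted_kaplan_meier(time, event, weight):
    """Kaplan-Meier survival estimator with observation weights.

    time: spell durations; event: 1 if the spell ended in the event
    of interest, 0 if censored; weight: per-observation weights
    (e.g. design weight x inverse censoring probability under IPCW).
    Returns (time, survival) pairs at each event time.
    """
    time, event, weight = map(np.asarray, (time, event, weight))
    order = np.argsort(time)
    time, event, weight = time[order], event[order], weight[order]
    surv, points = 1.0, []
    for t in np.unique(time[event == 1]):
        at_risk = weight[time >= t].sum()          # weighted risk set
        died = weight[(time == t) & (event == 1)].sum()
        surv *= 1.0 - died / at_risk
        points.append((float(t), surv))
    return points

# Tiny illustrative example with five spells.
print(weighted_kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0],
                            [1.0, 1.2, 0.8, 1.0, 1.1]))
```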

Relevance:

30.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

The Meese-Rogoff forecasting puzzle states that foreign exchange (FX) rates are unpredictable. Since a country's macroeconomic conditions could affect the price of its national currency, we study the dynamic relations between FX rates and selected macroeconomic accounts. Our research tests whether the predictability of FX rates can be improved with more advanced econometric methods. Improving the predictability of FX rates has important implications for various groups, including investors, business entities and governments. The present thesis examines the dynamic relations between FX rates, savings and investment for a sample of 25 countries from the Organisation for Economic Co-operation and Development, using three decades of quarterly data on FX rates and macroeconomic accounts, including savings and investment. Through preliminary Augmented Dickey-Fuller unit root tests and Johansen cointegration tests, we find that the savings rate and the investment rate are cointegrated with the vector (1,-1). This result is consistent with many previous studies on the savings-investment relation and therefore confirms the validity of the Feldstein-Horioka puzzle. Because of this special cointegrating relation between the savings rate and the investment rate, we introduce the savings-investment rate differential (SID). Investigating each country through a vector autoregression (VAR) model, we find that the coefficient estimates of historical SIDs on present FX rates are highly insignificant, and we report similar findings with the panel VAR approach. We thus conclude that historical SIDs are of no use in forecasting FX rates. Nonetheless, the coefficients of past FX rates on current SIDs are statistically significant in both the country-specific and the panel VAR models. Therefore, we conclude that historical FX rates can conversely predict the SID to some degree: specifically, a depreciation of the domestic currency would cause an increase in the SID.
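
The testing sequence described above, unit-root tests, a Johansen cointegration test, then a VAR in the FX rate and the SID, can be sketched with statsmodels as follows. The synthetic series and variable names are illustrative stand-ins for the thesis data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.vector_ar.vecm import coint_johansen
from statsmodels.tsa.api import VAR

# Synthetic quarterly series standing in for one country's data.
rng = np.random.default_rng(0)
n = 120                                       # 30 years of quarters
common = rng.normal(size=n).cumsum()          # shared stochastic trend
savings = common + rng.normal(size=n)         # cointegrated pair:
investment = common + rng.normal(size=n)      # savings - investment is I(0)
fx = rng.normal(size=n).cumsum()              # random-walk FX rate

# 1. Augmented Dickey-Fuller unit-root tests on the levels.
for name, s in [("savings", savings), ("investment", investment), ("fx", fx)]:
    print(name, "ADF p-value:", round(adfuller(s)[1], 3))

# 2. Johansen cointegration test (constant term, one lagged difference).
jres = coint_johansen(np.column_stack([savings, investment]), 0, 1)
print("trace stats:", jres.lr1, "\n95% critical values:", jres.cvt[:, 1])

# 3. VAR of the FX rate and the savings-investment differential (SID).
data = pd.DataFrame({"fx": fx, "sid": savings - investment})
print(VAR(data).fit(maxlags=4, ic="aic").summary())
```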

Relevance:

30.00%

Publisher:

Abstract:

Background. The ABO and Rh(D) phenotypes of blood donors and of transfused patients are routinely analyzed to ensure full compatibility. These analyses are performed by agglutination following an antibody-antigen reaction. However, because of prohibitive costs and analysis times, blood donations are not routinely tested for the minor blood antigens. This gap can result in allo-immunization of recipient patients against one or more minor antigens and thus lead to severe complications in future transfusions. Study design and Methods. To address this problem, we developed a genetic panel based on Beckman Coulter's GenomeLab SNPstream technology to analyze 22 minor blood antigens simultaneously. The source DNA comes from patients' white blood cells previously isolated on FTA paper. Results. The results show that the genotype discordance rate, measured by correlating the genotyping results from the two DNA strands, as well as the genotyping failure rate, are very low (0.1%). Likewise, the correlation between the phenotypes predicted by genotyping and the actual phenotypes obtained by serology of red blood cells and platelets varies between 97% and 100%. Experimental or database-processing errors, as well as rare polymorphisms influencing antigen conformation, could explain the discrepancies. However, given that phenotype results obtained by genotyping will always be cross-checked before any blood transfusion using the standard technologies approved by government authorities, the correlation rates obtained are far above the project's expected success criteria. Conclusion. Genetic profiling of the minor blood antigens will make it possible to create a centralized databank of donor phenotypes, allowing blood banks to rapidly find compatible profiles between donors and recipients.
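
As a minimal illustration of the validation arithmetic, the following Python sketch computes the concordance between genotype-predicted phenotypes and serology for one donor. The antigen names are illustrative, and failed calls are simply skipped.

```python
def concordance_rate(predicted, serology):
    """Fraction of antigens where the genotype-predicted phenotype
    matches the serological reference, ignoring failed calls (None)."""
    pairs = [(p, s) for p, s in zip(predicted, serology) if p is not None]
    return sum(p == s for p, s in pairs) / len(pairs)

# Illustrative calls for one donor across a few minor antigens.
predicted = ["K-", "Fy(a+)", None, "Jk(b+)"]
serology  = ["K-", "Fy(a+)", "S+", "Jk(b+)"]
print(f"concordance: {concordance_rate(predicted, serology):.0%}")  # 100%
```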

Relevance:

30.00%

Publisher:

Abstract:

The attached file was created with Scientific WorkPlace (LaTeX).

Relevance:

30.00%

Publisher:

Abstract:

This thesis is organized in three chapters. The first two deal with the evaluation, by estimation methods, of causal effects or treatment effects in data-rich environments. The last chapter concerns the economics of education; more precisely, in that chapter I evaluate the effect of high-school specialization on the choice of university field of study and on performance. In the first chapter, I study the efficient estimation of a finite-dimensional parameter in a linear model where the number of instruments may be very large or infinite. Using a large number of moment conditions improves the asymptotic efficiency of instrumental-variables estimators but increases their bias. I propose a regularized version of the LIML estimator based on three different regularization methods, Tikhonov, Landweber-Fridman, and principal components, which reduce the bias. The second chapter extends this work by allowing for the presence of a large number of weak instruments. The weak-instruments problem is the consequence of a very small concentration parameter; to increase the size of the concentration parameter, I propose increasing the number of instruments. I then show that the regularized 2SLS and LIML estimators are consistent and asymptotically normal. The third chapter of this thesis analyzes the effect of high-school specialization on the choice of university field of study. Using American data, I evaluate the relationship between university performance and the different types of courses taken during secondary school. The results suggest that students choose the fields in which they acquired more skills in high school. However, there is a U-shaped relationship between diversification and university performance, suggesting a tension between specialization and diversification. The underlying trade-off is evaluated by estimating a structural model of human-capital acquisition in high school and field choice. Counterfactual analyses imply that one additional quantitative course increases enrollment in science and technology fields by 4 percentage points.
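
To illustrate the regularization idea on the simpler 2SLS case (the thesis's regularized LIML is a related but distinct construction), here is a Python sketch of a Tikhonov-regularized first stage with many instruments. The simulation design is purely illustrative.

```python
import numpy as np

def ridge_2sls(y, x, Z, alpha):
    """Two-stage least squares with a Tikhonov (ridge) regularized
    first stage, for settings with many instruments.

    y: (n,) outcome; x: (n,) endogenous regressor; Z: (n, L)
    instruments, with L possibly large relative to n; alpha:
    regularization strength. Illustrates the regularization idea on
    2SLS, not the thesis's regularized LIML estimator itself.
    """
    n, L = Z.shape
    # First stage: ridge projection of x on the instruments.
    coef = np.linalg.solve(Z.T @ Z + alpha * np.eye(L), Z.T @ x)
    x_hat = Z @ coef
    # Second stage: IV estimate using the regularized fitted values.
    return float((x_hat @ y) / (x_hat @ x))

# Simulated example: 100 observations, 80 instruments, true beta = 2.
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 80))
u = rng.normal(size=100)
x = Z[:, :5].sum(axis=1) + u + rng.normal(size=100)  # endogenous via u
y = 2.0 * x + u
print(ridge_2sls(y, x, Z, alpha=10.0))  # close to 2.0 in this simulation
```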

Relevance:

30.00%

Publisher:

Abstract:

Background: The literature on scoliosis screening is vast; however, because of the observational nature of the available data and methodological flaws, data interpretation is often complex, leading to incomplete and sometimes misleading conclusions. A set of methods for critical appraisal of the literature on scoliosis screening, together with a comprehensive summary and rating of the available evidence, therefore appeared essential. Methods: To address these gaps, the aims of the study were: i) to propose a framework for the assessment of published studies on scoliosis screening effectiveness; ii) to suggest specific questions to be answered on screening effectiveness instead of trying to reach a global position for or against the programs; iii) to contextualize the knowledge through expert panel consultation and meaningful recommendations. The general methodological approach proceeds through the following steps: elaboration of the conceptual framework; formulation of the review questions; identification of the criteria for the review; selection of the studies; critical assessment of the studies; synthesis of the results; and formulation and grading of recommendations in response to the questions. This plan follows, as far as possible, the GRADE group (Grades of Recommendation, Assessment, Development and Evaluation) requirements for systematic reviews, assessing the quality of evidence and grading the strength of recommendations. Conclusions: In this article, the methods developed in support of this work are presented, since they may be of interest for similar reviews in the scoliosis and orthopaedic fields.

Relevance:

30.00%

Publisher:

Abstract:

Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, and primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also that of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace and Gauss, the discipline later witnessed the applied work of Edgeworth and Mitchell. A very significant milestone in its evolution has been the work of Tinbergen, Frisch and Haavelmo in their development of multiple regression and correlation analysis, techniques they used to test different economic theories using time series data.

In spite of the fact that some predictions based on econometric methodology have gone wrong, the sound scientific nature of the discipline cannot be ignored. This is reflected in the economic rationale underlying any econometric model, and in the statistical and mathematical reasoning behind the various inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the interdisciplinary nature of econometrics (a unification of economics, statistics and mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often only economics students are offered the subject, as those of other disciplines might not have an adequate economics background to understand it. In fact, econometrics is also quite relevant for technical courses (such as engineering), business management courses (such as the MBA) and professional accountancy courses, and more relevant still for research students of the various social sciences, commerce and management.

In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy and so on. In this way, the analytical ability of students can be sharpened, their ability to approach socio-economic problems mathematically can be improved, and they can be enabled to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, must be pointed out here: mere learning of the econometric methodology or the underlying theories alone would not have much practical utility for students in their future careers, whether in academia, industry or practice.

This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics, and those of students in learning the subject, including effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.