873 results for power of association


Relevance: 100.00%

Abstract:

A marker that is strongly associated with outcome (or disease) is often assumed to be effective for classifying individuals according to their current or future outcome. However, for this to be true, the associated odds ratio must be of a magnitude rarely seen in epidemiological studies. An illustration of the relationship between odds ratios and receiver operating characteristic (ROC) curves shows, for example, that a marker with an odds ratio as high as 3 is in fact a very poor classification tool. If a marker identifies 10 percent of controls as positive (false positives) and has an odds ratio of 3, then it will only correctly identify 25 percent of cases as positive (true positives). Moreover, the authors illustrate that a single measure of association such as an odds ratio does not meaningfully describe a marker’s ability to classify subjects. Appropriate statistical methods for assessing and reporting the classification power of a marker are described. The serious pitfalls of using more traditional methods based on parameters in logistic regression models are illustrated.
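The abstract's numerical example follows directly from the definition of the odds ratio; a minimal sketch (the only numbers taken from the abstract are OR = 3 and the 10% false-positive rate):

```python
def tpr_from_odds_ratio(odds_ratio, fpr):
    """True-positive rate implied by a marker's odds ratio and its
    false-positive rate (fraction of controls testing positive).

    From OR = [TPR/(1-TPR)] / [FPR/(1-FPR)], the case odds are
    OR * FPR / (1 - FPR), and TPR = odds / (1 + odds).
    """
    case_odds = odds_ratio * fpr / (1.0 - fpr)
    return case_odds / (1.0 + case_odds)

# The abstract's example: OR = 3 with 10% of controls positive
# correctly identifies only 25% of cases.
print(round(tpr_from_odds_ratio(3, 0.10), 3))  # 0.25
```

Even an odds ratio of 10 at the same false-positive rate yields a true-positive rate of only about 53%, which illustrates why a single association measure is a poor summary of classification performance.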

Relevance: 100.00%

Abstract:

Lung function measures are heritable, predict mortality and are relevant in the diagnosis of chronic obstructive pulmonary disease (COPD). COPD and asthma are diseases of the airways with major public health impacts, and each has a heritable component. Genome-wide association studies of SNPs have revealed novel genetic associations with both diseases but account for only a small proportion of the heritability. Complex copy number variation may account for some of the missing heritability. A well-characterised genomic region of complex copy number variation contains beta-defensin genes (DEFB103, DEFB104 and DEFB4), which have a role in the innate immune response. Previous studies have implicated these and related genes in asthma or COPD. We hypothesised that copy number variation of these genes may play a role in lung function in the general population and in COPD and asthma risk. We undertook copy number typing of this locus in 1149 adults and 689 children using a paralogue ratio test and investigated association with COPD, asthma and lung function. Replication of findings was assessed in a larger independent sample of COPD cases and smoking controls. We found evidence for an association of beta-defensin copy number with COPD in the adult cohort (OR = 1.4, 95% CI: 1.02-1.92, P = 0.039), but this finding, and findings from a previous study, were not replicated in a larger follow-up sample (OR = 0.89, 95% CI: 0.72-1.07, P = 0.217). No robust evidence of association with asthma in children was observed. We found no evidence for association between beta-defensin copy number and lung function in the general population. Our findings suggest that previous reports of association of beta-defensin copy number with COPD should be viewed with caution. Suboptimal measurement of copy number can lead to spurious associations. Further beta-defensin copy number measurements in larger samples of COPD cases and children with asthma are needed.
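Effect estimates like those reported above can be computed from a 2x2 table of exposure by disease counts; a sketch of the odds ratio with its Woolf (log-based) 95% confidence interval, using hypothetical counts that are not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 40 of 200 COPD cases and 60 of 400 controls
# carrying a high beta-defensin copy number.
print(odds_ratio_ci(40, 160, 60, 340))
```

A replication sample whose confidence interval spans 1, as in the follow-up sample above, fails to confirm the original association.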

Relevance: 100.00%

Abstract:

Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. Linkage disequilibrium methods can require smaller sample sizes than linkage equilibrium methods, such as the variance component approach, to find loci with a specific effect size. The increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs.

Nine linkage disequilibrium tests were examined by simulation. Five tests involve selecting isolated unrelated individuals while four involve the selection of parent-child trios (TDT). All nine tests were found to identify disequilibrium with the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of given size. The power to detect disequilibrium was not affected by the presence of polygenic effects.

When the trait locus had more than two trait alleles, the power of the tests maximized to less than one. For the simulation methods used here, when there were more than two trait alleles there was a probability equal to 1 minus the heterozygosity of the marker locus that both trait alleles were in disequilibrium with the same marker allele, resulting in the marker being uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. The transmission disequilibrium test (TDT)-based tests were not liable to any increase in error rates.

For all sample ascertainment costs, for recent mutations (<100 generations) linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans saved even more money. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
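The TDT's robustness to admixture, noted above, comes from conditioning on parental genotypes; in its classic form it is a McNemar-type test on transmissions from heterozygous parents. A minimal sketch with illustrative counts:

```python
import math

def tdt(b, c):
    """Transmission disequilibrium test from counts of heterozygous
    parents transmitting (b) vs. not transmitting (c) the candidate
    allele. Under no linkage or no association,
    chi2 = (b - c)^2 / (b + c) has 1 degree of freedom.
    """
    chi2 = (b - c) ** 2 / (b + c)
    # two-sided p-value via the normal approximation (chi2_1 = Z^2)
    p = math.erfc(math.sqrt(chi2 / 2.0))
    return chi2, p

# Illustrative counts: 60 transmissions vs. 40 non-transmissions.
print(tdt(60, 40))  # chi2 = 4.0, p ~ 0.046
```

Because only within-family transmissions enter the statistic, population stratification cannot inflate its error rate, consistent with the simulation results described above.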

Relevance: 100.00%

Abstract:

Compared with term-born infants, preterm infants have increased respiratory morbidity in the first year of life. We investigated whether lung function tests performed near term predict subsequent respiratory morbidity during the first year of life and compared this to standard clinical parameters in preterm infants. The prospective birth cohort included randomly selected preterm infants with and without bronchopulmonary dysplasia. Lung function (tidal breathing and multiple-breath washout) was measured at 44 weeks post-menstrual age during natural sleep. We assessed respiratory morbidity (wheeze, hospitalisation, inhalation and home oxygen therapy) after 1 year using a standardised questionnaire. We first assessed the association between lung function and subsequent respiratory morbidity. Secondly, we compared the predictive power of standard clinical predictors with and without lung function data. In 166 preterm infants, tidal volume, the ratio of time to peak tidal expiratory flow to expiratory time, and respiratory rate were significantly associated with subsequent wheeze. In comparison with standard clinical predictors, lung function did not improve the prediction of later respiratory morbidity in an individual child. Although associated with later wheeze, noninvasive infant lung function shows large physiological variability and does not add to clinically relevant risk prediction for subsequent respiratory morbidity in an individual preterm infant.

Relevance: 100.00%

Abstract:

Objectives: The aim was to investigate the influence of increment thickness on the shear bond strength (SBS) to dentin of a conventional and two bulk fill flowable composites. Methods: A total of 135 specimens of ground human dentin were produced (n=15/group; 3 increment thicknesses; 3 flowable composites) and the dentin surfaces were treated with the adhesive system OptiBond FL (Kerr) according to the manufacturer's instructions. Split Teflon molds (inner diameter: 3.6 mm) of 2 mm, 4 mm, or 6 mm height, allowing three increment thicknesses, were clamped on the dentin surfaces and filled with either the conventional flowable Filtek Supreme XTE ((XTE); 3M ESPE) or the bulk fill flowables Filtek Bulk Fill ((FBF); 3M ESPE) or SDR ((SDR); DENTSPLY Caulk). The flowable composites were light-cured for 20 s (Demi LED; Kerr) and the specimens stored for 24 h (37°C, 100% humidity). Specimens were then subjected to an SBS test in a universal testing machine at a cross-head speed of 1 mm/min (Zwick Z010; Zwick GmbH & Co.). SBS values were statistically analysed with a nonparametric ANOVA followed by exact Wilcoxon rank sum tests (α=0.05). The failure mode of the specimens was determined under a stereomicroscope at 25× magnification. Results: SBS values (MPa) at 2 mm/4 mm/6 mm increment thicknesses (mean [standard deviation]) were for XTE: 18.8 [2.6]/17.6 [1.6]/16.7 [3.1], for FBF: 20.6 [2.7]/17.8 [2.7]/18.7 [2.9], and for SDR: 21.7 [2.6]/18.5 [2.6]/20.3 [3.0]. For all three flowable composites, 2 mm increments yielded the highest SBS values, whereas no differences were detected between increments of 4 mm and 6 mm. All specimens presented failure modes involving cohesive failure in dentin. Conclusion: The influence of increment thickness on dentin SBS was less pronounced than expected. However, the high number of cohesive failures in dentin, reflecting the efficiency of the adhesive system, suggests a limited discriminatory power of the SBS test.
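The pairwise group comparisons above use exact Wilcoxon rank sum tests; the sketch below implements the rank-sum statistic with a normal approximation for the p-value (not the exact-distribution procedure the authors used), with purely illustrative SBS-like values:

```python
import math

def rank_sum_test(x, y):
    """Wilcoxon rank-sum (Mann-Whitney) test, normal approximation.
    Returns the standardized rank-sum statistic and a two-sided p-value."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):                    # assign midranks for ties
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        for k in range(i, j + 1):
            ranks[combined[k][1]] = (i + j) / 2.0 + 1.0
        i = j + 1
    n1, n2 = len(x), len(y)
    w = sum(ranks[:n1])                         # rank sum of the first sample
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))
    return z, p

# Illustrative SBS-like values (MPa) for two hypothetical groups.
z, p = rank_sum_test([18.8, 20.6, 21.7, 19.5, 20.1],
                     [16.7, 17.6, 18.7, 17.1, 16.9])
print(z, p)
```

With the tiny group sizes used in such bond-strength studies, the exact permutation distribution (as in the study) is preferable to the normal approximation sketched here.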

Relevance: 100.00%

Abstract:

BACKGROUND The Valve Academic Research Consortium (VARC) has proposed a standardized definition of bleeding in patients undergoing transcatheter aortic valve interventions (TAVI). The VARC bleeding definition has not been validated or compared to other established bleeding definitions so far. Thus, we aimed to investigate the impact of bleeding and compare the predictivity of VARC bleeding events with established bleeding definitions. METHODS AND RESULTS Between August 2007 and April 2012, 489 consecutive patients with severe aortic stenosis were included into the Bern-TAVI-Registry. Every bleeding complication was adjudicated according to the definitions of VARC, BARC, TIMI, and GUSTO. Periprocedural blood loss was added to the definition of VARC, providing a modified VARC definition. A total of 152 bleeding events were observed during the index hospitalization. Bleeding severity according to VARC was associated with a gradual increase in mortality, which was comparable to the BARC, TIMI, GUSTO, and the modified VARC classifications. The predictive precision of a multivariable model for mortality at 30 days was significantly improved by adding the most serious bleeding of VARC (area under the curve [AUC], 0.773; 95% confidence interval [CI], 0.706 to 0.839), BARC (AUC, 0.776; 95% CI, 0.694 to 0.857), TIMI (AUC, 0.768; 95% CI, 0.692 to 0.844), and GUSTO (AUC, 0.791; 95% CI, 0.714 to 0.869), with the modified VARC definition resulting in the best predictivity (AUC, 0.814; 95% CI, 0.759 to 0.870). CONCLUSIONS The VARC bleeding definition offers a severity stratification that is associated with a gradual increase in mortality and prognostic information comparable to established bleeding definitions. Adding the information of periprocedural blood loss to VARC may increase the sensitivity and the predictive power of this classification.
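The predictivity comparisons above rest on the area under the ROC curve, which can be computed directly via the Mann-Whitney identity. The risk scores below are hypothetical, purely to show how adding a predictor (e.g. bleeding severity) can raise the AUC of a mortality model:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen event (e.g. death) scores
    higher than a randomly chosen non-event, ties counted 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores from a model without / with bleeding severity.
died_base,  alive_base  = [0.4, 0.6, 0.7], [0.2, 0.3, 0.5, 0.6]
died_bleed, alive_bleed = [0.6, 0.8, 0.9], [0.2, 0.3, 0.5, 0.6]
print(auc(died_base, alive_base), auc(died_bleed, alive_bleed))
```

An increase in AUC of the magnitude reported above (0.773 to 0.814) corresponds to the augmented model ranking deaths above survivors more often, exactly what this pairwise count measures.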

Relevance: 100.00%

Abstract:

BACKGROUND Endoscopic carpal tunnel release (ECTR) is a minimally invasive approach for the treatment of carpal tunnel syndrome. There is scepticism regarding the safety of this technique, based on the assumption that it is a rather "blind" procedure and on the high number of severe complications that have been reported in the literature. PURPOSE To evaluate whether there is evidence supporting a higher risk after ECTR in comparison to the conventional open release (OCTR). METHODS We searched MEDLINE (January 1966 to November 2013), EMBASE (January 1980 to November 2013), the Cochrane Neuromuscular Disease Group Specialized Register (November 2013) and CENTRAL (2013, issue 11 in The Cochrane Library). We hand-searched reference lists of included studies. We included all randomized or quasi-randomized controlled trials (e.g. studies using alternation, date of birth, or case record number) that compare any ECTR technique with any OCTR technique. Safety was assessed by the incidence of major and minor complications, the total number of complications, recurrences, and re-operations. The total time needed before return to work or to daily activities was also assessed. We synthesized data using a random-effects meta-analysis in STATA. We conducted a sensitivity analysis for rare events using a binomial likelihood. We judged the conclusiveness of the meta-analysis by calculating its conditional power. CONCLUSIONS ECTR is associated with less time off work or away from daily activities. The assessment of major complications, reoperations and recurrence of symptoms does not favor either intervention. There is an uncertain advantage of ECTR with respect to total minor complications (more transient paresthesia but fewer skin-related complications). Future studies are unlikely to alter these findings because of the rarity of the outcomes. The effect of a learning curve might be responsible for reduced recurrences and reoperations with ECTR in more recent studies, although formal statistical analysis failed to provide evidence for such an association. LEVEL OF EVIDENCE I.
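The random-effects synthesis mentioned in the methods can be sketched with the standard DerSimonian-Laird estimator; this is a generic illustration with made-up study estimates, not the authors' STATA analysis:

```python
import math

def random_effects_meta(effects, ses):
    """DerSimonian-Laird random-effects pooling of study effects
    (e.g. log odds ratios) with their standard errors."""
    w = [1.0 / se**2 for se in ses]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_re = [1.0 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return pooled, se_pooled, tau2

# Hypothetical log-OR estimates and standard errors from three trials.
print(random_effects_meta([0.10, -0.30, 0.05], [0.15, 0.20, 0.25]))
```

When tau-squared is truncated to zero, the pooled estimate reduces to the fixed-effect inverse-variance average; with heterogeneity, the random-effects weights are flatter and the pooled confidence interval wider.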

Relevance: 100.00%

Abstract:

Background. Several studies have proposed a link between type 2 diabetes mellitus (DM2) and hepatitis C virus (HCV) infection, with conflicting results. Since DM2 and HCV both have high prevalence, establishing a link between the two may guide further studies aimed at DM2 prevention. A systematic review was conducted to estimate the magnitude and direction of the association between DM2 and HCV. Temporality was assessed from cohort studies and from case-control studies where such information was available.

Methods. MEDLINE searches were conducted for studies that provided risk estimates and fulfilled criteria regarding the definition of exposure (HCV) and outcome (DM2). HCV was defined in terms of method of diagnosis, laboratory technique and method of data collection; DM2 was defined in terms of the classification used for diagnosis [World Health Organization (WHO) or American Diabetes Association (ADA)], laboratory technique and method of data collection. Standardized searches and data abstraction for construction of tables were performed. Unadjusted or adjusted measures of association for individual studies were obtained or calculated from the full text of the studies. Template designed by Dr. David Ramsey.

Results. Forty-six studies out of one hundred and nine potentially eligible articles met the inclusion and exclusion criteria and were classified separately based on study design as cross-sectional (twenty-four), case-control (fifteen) or cohort studies (seven). The cohort studies showed a three-fold higher occurrence of DM2 (confidence interval 1.66-6.29) in individuals with HCV compared to those unexposed to HCV, and cross-sectional studies had a summary odds ratio of 2.53 (1.96, 3.25). In case-control studies, the summary odds ratio for studies done in subjects with DM2 was 3.61 (1.93, 6.74); in HCV, it was 2.30 (1.56, 3.38); and all fifteen studies together yielded an odds ratio of 2.60 (1.82, 3.73).

Conclusion. The above results support the hypothesis that there is an association between DM2 and HCV. The temporal relationship evident from cohort studies and proposed pathogenic mechanisms also suggest that HCV predisposes patients to the development of DM2. Further cohort or prospective studies are needed, however, to determine whether treatment of HCV infection prevents the development of DM2.

Relevance: 100.00%

Abstract:

Objective. The purpose of this study was to examine the association between perceived stress and passing the fitness test in a cohort of Department of Defense active duty members. Reports of this association have been suggested in numerous articles. Methods. The 2005 DoD Survey of Health Related Behaviors Among Active Duty Military Personnel was used to examine the association between participants' perceived levels of stress from family and/or work related sources and their last required fitness test, taking into account potential confounders of the association. Measures of association were obtained from logistic regression models. Results. Participants who experienced "some" or "a lot" of stress either from work sources (OR 0.69, 95% CI: 0.58-0.87) or from personal/family sources (OR 0.70, 95% CI: 0.57-0.86) were less likely to pass the fitness test than their counterparts who experienced "none" or "a little" stress. Additionally, those who reported "some" or "a lot" of stress either from work sources (OR 0.54, 95% CI: 0.41-0.70) or from personal/family sources (OR 0.54, 95% CI: 0.44-0.67) that interfered with their military duties were also less likely to pass the fitness test. Multivariate adjustment only slightly reduced the unadjusted association. Conclusions. An association exists between perceived stress levels and the outcome of fitness testing: the higher the level of perceived stress, the less likely the person is to pass the fitness test. Stress-related interventions might be useful to help military members achieve the level of fitness needed to perform their duties.

Relevance: 100.00%

Abstract:

Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited to detecting the association of common variants, but are less suitable for rare variants. This raises a great challenge for sequence-based genetic studies of complex diseases.

This research dissertation used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus approach to collectively analyzing genome regions.

In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the associations of the entire spectrum of genetic variation within a segment of genome or a gene, regardless of whether the variants are common or rare.

Classical quantitative genetics methods suffer from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their applications, the functional linear models were applied to five quantitative traits in the Framingham heart studies.

This project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, which led to the discovery of networks significantly associated with psoriasis.

Relevance: 100.00%

Abstract:

Genome-wide association study (GWAS) analytical methods were applied in a large biracial sample of individuals to investigate variation across the genome for its association with a surrogate low-density lipoprotein (LDL) particle size phenotype, the ratio of LDL-cholesterol level to ApoB level. Genotyping was performed on the Affymetrix 6.0 GeneChip with approximately one million single nucleotide polymorphisms (SNPs). The ratio of LDL-cholesterol to ApoB was calculated, and association tests used multivariable linear regression analysis with an additive genetic model after adjustment for the covariates sex, age and BMI. Association tests were performed separately in African Americans and Caucasians. There were 9,562 qualified individuals in the Caucasian group and 3,015 qualified individuals in the African American group. Overall, in Caucasians two statistically significant loci were identified as being associated with the ratio of LDL-cholesterol to ApoB: rs10488699 (p<5x10-8, 11q23.3 near BUD13) and rs964184 (p<5x10-8, 11q23.3 near ZNF259). We also found rs12286037 (p<4x10-7, 11q23.3, near APOA5/A4/C3/A1) with a suggestive association in the Caucasian sample. In exploratory analyses, a difference in the pattern of association between individuals taking and not taking LDL-cholesterol-lowering medications was observed: individuals who were not taking medications had smaller p-values than those taking medication. In the African American group, there were no significant (p<5x10-8) or suggestive (p<4x10-7) associations with the ratio of LDL-cholesterol to ApoB after adjusting for age, BMI and sex and comparing individuals with and without LDL-cholesterol-lowering medication. Conclusions: There were significant and suggestive associations between SNP genotype and the ratio of LDL-cholesterol to ApoB in Caucasians, but these associations may be modified by medication treatment.
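The additive genetic model used in the association tests codes each genotype as a minor-allele count (0, 1, 2) and regresses the phenotype on it; a minimal unadjusted sketch with hypothetical data (the study additionally adjusted for sex, age and BMI):

```python
def additive_gwas_beta(genotypes, phenotypes):
    """Per-SNP effect estimate under an additive genetic model:
    the phenotype (here, an LDL-C/ApoB-like ratio) is regressed by
    ordinary least squares on the minor-allele count (0, 1, 2).
    Returns the intercept and the per-allele effect."""
    n = len(genotypes)
    gx = sum(genotypes) / n
    px = sum(phenotypes) / n
    sxy = sum((g - gx) * (p - px) for g, p in zip(genotypes, phenotypes))
    sxx = sum((g - gx) ** 2 for g in genotypes)
    beta = sxy / sxx                  # additive effect per allele copy
    alpha = px - beta * gx
    return alpha, beta

# Hypothetical allele counts and ratio values for six individuals.
geno = [0, 1, 2, 0, 1, 2]
pheno = [1.10, 1.15, 1.22, 1.08, 1.17, 1.20]
print(additive_gwas_beta(geno, pheno))
```

Under this coding, each additional copy of the minor allele shifts the expected phenotype by the same amount beta, which is the effect the genome-wide p-values above test against zero.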

Relevance: 100.00%

Abstract:

The determination of the size as well as the power of a test is a vital part of clinical trial design. This research focuses on the simulation of clinical trial data with time-to-event as the primary outcome. It investigates the impact of different recruitment patterns and time-dependent hazard structures on the size and power of the log-rank test. A non-homogeneous Poisson process is used to simulate entry times according to the different accrual patterns. A Weibull distribution is employed to simulate survival times according to the different hazard structures. The current study utilizes simulation methods to evaluate the effect of different recruitment patterns on size and power estimates of the log-rank test. The size of the log-rank test is estimated by simulating survival times with identical hazard rates for the treatment and control arms of the study, resulting in a hazard ratio of one. The power of the log-rank test at specific values of the hazard ratio (≠1) is estimated by simulating survival times with different, but proportional, hazard rates for the two arms of the study. Different shapes (constant, decreasing, or increasing) of the hazard function of the Weibull distribution are also considered to assess the effect of hazard structure on the size and power of the log-rank test.
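A bare-bones version of this simulation, assuming immediate accrual with administrative censoring rather than the non-homogeneous Poisson entry process the study uses, can estimate the empirical size of the log-rank test under a true hazard ratio of one; all parameter values below are illustrative:

```python
import math
import random

def logrank_chi2(times, events, groups):
    """Two-sample log-rank chi-square statistic (groups coded 0/1)."""
    data = sorted(zip(times, events, groups))
    at_risk = [groups.count(0), groups.count(1)]
    o1 = e1 = v = 0.0
    i = 0
    while i < len(data):
        j = i
        d = [0, 0]                       # events per group at this time
        while j < len(data) and data[j][0] == data[i][0]:
            if data[j][1]:
                d[data[j][2]] += 1
            j += 1
        n = at_risk[0] + at_risk[1]
        dtot = d[0] + d[1]
        if dtot and n > 1:
            o1 += d[1]                              # observed events, arm 1
            e1 += dtot * at_risk[1] / n             # expected under H0
            v += (dtot * (at_risk[0] * at_risk[1]) / n**2
                  * (n - dtot) / (n - 1))           # hypergeometric variance
        for k in range(i, j):
            at_risk[data[k][2]] -= 1
        i = j
    return (o1 - e1) ** 2 / v

def simulate_size(n_per_arm=50, shape=1.5, scale=1.0,
                  censor=1.2, reps=300, seed=1):
    """Empirical size of the log-rank test when both arms share the same
    Weibull hazard (true hazard ratio 1), with administrative censoring."""
    rng = random.Random(seed)
    crit = 3.841                       # chi-square(1) critical value, alpha = 0.05
    rejections = 0
    for _ in range(reps):
        times, events, groups = [], [], []
        for g in (0, 1):
            for _ in range(n_per_arm):
                # Weibull survival time by inverse-CDF sampling
                t = scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
                times.append(min(t, censor))
                events.append(1 if t <= censor else 0)
                groups.append(g)
        if logrank_chi2(times, events, groups) > crit:
            rejections += 1
    return rejections / reps

print(simulate_size())  # should land near the nominal 0.05
```

Power at a hazard ratio other than one is estimated the same way, by scaling one arm's Weibull scale parameter so the two arms have proportional hazards; a shape parameter above, at, or below 1 gives the increasing, constant, or decreasing hazards the study considers.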

Relevance: 100.00%

Abstract:

The genomic era brought about by recent advances in next-generation sequencing technology makes genome-wide scans of natural selection a reality. Currently, almost all the statistical tests and analytical methods for identifying genes under selection are performed on an individual-gene basis. Although these methods have the power to identify genes subject to strong selection, they have limited power to discover genes targeted by moderate or weak selection forces, which are crucial for understanding the molecular mechanisms of complex phenotypes and diseases. The recent availability and rapidly growing completeness of many gene network and protein-protein interaction databases open avenues for enhancing the power to discover genes under natural selection. The aim of this thesis is to explore and develop normal mixture model based methods for leveraging gene network information to enhance the power of natural selection target gene discovery. The results show that the developed statistical method, which combines the posterior log odds of the standard normal mixture model and the Guilt-By-Association score of the gene network in a naïve Bayes framework, has the power to discover moderately or weakly selected genes that bridge the genes under strong selection, and it aids our understanding of the biology underlying complex diseases and related natural selection phenotypes.
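The combination described above can be sketched as follows, with entirely hypothetical mixture parameters and network scores; the Guilt-By-Association evidence is treated as a log likelihood ratio added to the mixture's posterior log odds under a naïve independence assumption:

```python
import math

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def posterior_log_odds(stat, pi1, mu0, sd0, mu1, sd1):
    """Posterior log odds that a gene is under selection, from a
    two-component normal mixture of a selection statistic
    (pi1 = prior fraction of selected genes)."""
    selected = pi1 * norm_pdf(stat, mu1, sd1)
    neutral = (1.0 - pi1) * norm_pdf(stat, mu0, sd0)
    return math.log(selected / neutral)

def combined_score(stat, gba_log_lr, **mix):
    """Naive-Bayes combination: add the network Guilt-By-Association
    evidence, expressed as a log likelihood ratio, to the mixture's
    posterior log odds (independence of the two sources assumed)."""
    return posterior_log_odds(stat, **mix) + gba_log_lr

# Hypothetical parameters: neutral component N(0, 1), selected N(2, 1),
# 5% of genes under selection a priori.
mix = dict(pi1=0.05, mu0=0.0, sd0=1.0, mu1=2.0, sd1=1.0)
alone = posterior_log_odds(1.2, **mix)            # weak statistic: negative
with_net = combined_score(1.2, gba_log_lr=3.0, **mix)
print(alone < 0 < with_net)  # True
```

This is the mechanism by which network evidence rescues weakly selected genes: a statistic too small to cross the evidence threshold on its own is pushed over it by strong Guilt-By-Association support from neighbouring genes under strong selection.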

Relevance: 100.00%

Abstract:

The central problem of complex inheritance is to map oligogenes for disease susceptibility, integrating linkage and association over samples that differ in several ways. Combination of evidence over multiple samples with 1,037 families supports loci contributing to asthma susceptibility in the cytokine region on 5q [maximum logarithm of odds (lod) = 2.61 near IL-4], but no evidence for atopy. The principal problems with retrospective collaboration on linkage appear to have been solved, providing far more information than a single study. A multipoint lod table evaluated at commonly agreed reference loci is required for both collaboration and meta-analysis, but variations in ascertainment, pedigree structure, phenotype definition, and marker selection are tolerated. These methods are invariant under statistical methods that increase the power of lods and are applicable to all diseases, motivating collaboration rather than competition. In contrast to linkage, positional cloning by allelic association has yet to be extended to multiple samples, a prerequisite for efficient combination with linkage and the greatest current challenge to genetic epidemiology.
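A lod score of the kind summarized above is a base-10 log likelihood ratio against free recombination (theta = 0.5); a minimal two-point sketch for phase-known, fully informative meioses with illustrative counts:

```python
import math

def lod(recombinants, nonrecombinants, theta):
    """Two-point lod score: log10 likelihood ratio of recombination
    fraction theta against free recombination (theta = 0.5), for
    phase-known, fully informative meioses."""
    n = recombinants + nonrecombinants
    log_l = (recombinants * math.log10(theta)
             + nonrecombinants * math.log10(1.0 - theta))
    log_l0 = n * math.log10(0.5)
    return log_l - log_l0

# Hypothetical counts: 2 recombinants in 20 informative meioses,
# evaluated at theta = 0.1 (the maximum-likelihood estimate, 2/20).
print(round(lod(2, 18, 0.1), 2))
```

Because lods are log likelihood ratios, evidence combines across samples by simple addition at the agreed reference loci, which is exactly what makes the multipoint lod table the natural currency for the retrospective collaboration described above.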

Relevance: 100.00%

Abstract:

Recent measurements of sedimentation equilibrium and sedimentation velocity have shown that the bacterial cell division protein FtsZ self-associates to form indefinitely long rod-like linear aggregates in the presence of GDP and Mg2+. In the present study, the newly developed technique of non-ideal tracer sedimentation equilibrium was used to measure the effect of high concentrations—up to 150 g/liter—of each of two inert “crowder” proteins, cyanmethemoglobin or BSA, on the thermodynamic activity and state of association of dilute FtsZ under conditions inhibiting (−Mg2+) and promoting (+Mg2+) FtsZ self-association. Analysis of equilibrium gradients of both FtsZ and crowder proteins indicates that, under the conditions of the present experiment, FtsZ interacts with each of the two crowder proteins essentially entirely via steric repulsion, which may be accounted for quantitatively by a simple model in which hemoglobin, albumin, and monomeric FtsZ are modeled as effective spherical hard particles, and each oligomeric species of FtsZ is modeled as an effective hard spherocylinder. The functional dependence of the sedimentation of FtsZ on the concentrations of FtsZ and either crowder indicates that, in the presence of high concentrations of crowder, both the weight-average degree of FtsZ self-association and the range of FtsZ oligomer sizes present in significant abundance are increased substantially.