974 results for variance-ratio tests


Relevance: 30.00%

Abstract:

mgof computes goodness-of-fit tests for the distribution of a discrete (categorical, multinomial) variable. The default is to perform classical large sample chi-squared approximation tests based on Pearson's X2 statistic and the log likelihood ratio (G2) statistic or a statistic from the Cressie-Read family. Alternatively, mgof computes exact tests using Monte Carlo methods or exhaustive enumeration. A Kolmogorov-Smirnov test for discrete data is also provided. The moremata package, also available from SSC, is required.
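
As an illustration of the classical tests described (and not the -mgof- implementation itself), the following Python sketch computes Pearson's X2 and the log likelihood ratio G2 statistic for a discrete variable against hypothesised cell probabilities; the counts and probabilities are invented for the example.

```python
# Minimal sketch of the classical large-sample goodness-of-fit tests
# described above (Pearson's X2 and the likelihood ratio G2), written in
# Python for illustration; the counts and null probabilities are invented
# and are not taken from the package documentation.
import numpy as np
from scipy import stats

observed = np.array([18, 25, 32, 25])        # counts of a categorical variable
p0 = np.array([0.25, 0.25, 0.25, 0.25])      # hypothesised cell probabilities
expected = observed.sum() * p0

# Pearson's X2
x2 = ((observed - expected) ** 2 / expected).sum()

# Log likelihood ratio G2 (cells with zero observed counts contribute 0)
mask = observed > 0
g2 = 2 * (observed[mask] * np.log(observed[mask] / expected[mask])).sum()

df = len(observed) - 1
print("X2 =", x2, "p =", stats.chi2.sf(x2, df))
print("G2 =", g2, "p =", stats.chi2.sf(g2, df))
```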

Relevance: 30.00%

Abstract:

A new Stata command called -mgof- is introduced. The command is used to compute distributional tests for discrete (categorical, multinomial) variables. Apart from classic large sample $\chi^2$-approximation tests based on Pearson's $X^2$, the likelihood ratio, or any other statistic from the power-divergence family (Cressie and Read 1984), large sample tests for complex survey designs and exact tests for small samples are supported. The complex survey correction is based on the approach by Rao and Scott (1981) and parallels the survey design correction used for independence tests in -svy:tabulate-. The exact tests are computed using Monte Carlo methods or exhaustive enumeration. An exact Kolmogorov-Smirnov test for discrete data is also provided.
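
The Monte Carlo exact test mentioned above can be sketched as follows: simulate multinomial samples under the null hypothesis and take the proportion of simulated statistics at least as extreme as the observed one. The Python code below illustrates the idea only; it is not the -mgof- code, and the counts are again invented.

```python
# Sketch of the Monte Carlo exact-test idea: simulate multinomial samples
# under the null and estimate the p-value as the share of simulated X2
# statistics at least as large as the observed one.  Illustration only;
# -mgof- itself is a Stata/Mata implementation.
import numpy as np

rng = np.random.default_rng(1)
observed = np.array([18, 25, 32, 25])
p0 = np.array([0.25, 0.25, 0.25, 0.25])
n = observed.sum()
expected = n * p0

def pearson_x2(counts):
    return ((counts - expected) ** 2 / expected).sum()

x2_obs = pearson_x2(observed)
reps = 10_000
x2_sim = np.array([pearson_x2(s) for s in rng.multinomial(n, p0, size=reps)])

# Monte Carlo p-value with the +1 correction commonly used for exact tests
p_mc = (1 + (x2_sim >= x2_obs).sum()) / (reps + 1)
print("Monte Carlo p-value:", p_mc)
```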

Relevance: 30.00%

Abstract:

Targeting hard-to-reach and marginalized populations is essential for preventing HIV transmission. A unique opportunity to identify such populations in Switzerland is provided by a database of all genotypic resistance tests from Switzerland, including both sequences from the Swiss HIV Cohort Study (SHCS) and non-cohort sequences. A phylogenetic tree was built using 11,127 SHCS and 2,875 Swiss non-SHCS sequences. Demographics were imputed for non-SHCS patients using a phylogenetic proximity approach. Factors associated with non-cohort outbreaks were determined using logistic regression. Non-B subtype (univariable odds ratio (OR): 1.9; 95% confidence interval (CI): 1.8-2.1), female gender (OR: 1.6; 95% CI: 1.4-1.7), black ethnicity (OR: 1.9; 95% CI: 1.7-2.1) and heterosexual transmission group (OR: 1.8; 95% CI: 1.6-2.0) were all associated with underrepresentation in the SHCS. We found 344 purely non-SHCS transmission clusters; however, these outbreaks were small (median 2, maximum 7 patients) and overlapped strongly with the SHCS: 65% of non-SHCS sequences were part of clusters composed of >= 50% SHCS sequences. Our data suggest that marginalized populations are underrepresented in the SHCS. However, the limited size of outbreaks among non-SHCS patients in care implies that no major HIV outbreak in Switzerland was missed by SHCS surveillance. This study demonstrates the potential of sequence data to assess and extend the scope of infectious disease surveillance.
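
For readers unfamiliar with the reported measures, the sketch below shows how a univariable odds ratio with a 95% Wald confidence interval is computed from a 2x2 table; the counts are invented for illustration, and the study's own estimates came from logistic regression on its data.

```python
# Illustration of a univariable odds ratio with a 95% Wald confidence
# interval, the kind of estimate reported above (e.g. OR 1.9 for non-B
# subtype).  The 2x2 counts are invented for the example.
import numpy as np
from scipy import stats

# rows: exposed / unexposed; columns: outside SHCS / inside SHCS
a, b = 300, 400   # exposed:   not in cohort, in cohort
c, d = 200, 500   # unexposed: not in cohort, in cohort

or_hat = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
z = stats.norm.ppf(0.975)
ci = np.exp(np.log(or_hat) + np.array([-z, z]) * se_log_or)
print(f"OR = {or_hat:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```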

Relevance: 30.00%

Abstract:

New osmium (Os) isotope and platinum group element (PGE) concentration data are used in conjunction with published 3He and Th isotope data to determine the relative proportions of lithogenic, extraterrestrial and hydrogenous iridium (Ir) in a Pacific pelagic carbonate sequence from Ocean Drilling Program (ODP) Site 806 on the Ontong Java Plateau (OJP). These calculations demonstrate that lithogenic and extraterrestrial contributions to the sedimentary Ir budget are minor, while hydrogenous Ir accounts for roughly 85% of the total Ir. Application of analogous partitioning calculations to previously reported data from a North Pacific red clay sequence (LL44-GPC3) yields very similar results. Total Ir burial fluxes at Site 806 and LL44-GPC3 are also similar, 45 and 30 pg/cm2/kyr, respectively. Average Ir/3He and Ir/xs230Th_initial ratios calculated from the entire Site 806 data set are similar to those reported earlier for Pacific sites. In general, down-core profiles of Ir, 3He and xs230Th_initial are not well correlated with one another. However, all three data sets show similar variance and yield sediment mass accumulation rate estimates that agree within a factor of two. While these results indicate that Ir concentration has potential as a point-paleoflux tracer in pelagic carbonates, Ir-based paleoflux estimates are likely subject to uncertainties similar to those associated with Co-based paleoflux estimates. Consequently, local calibration of Ir flux in space and time will be required to fully assess the potential of Ir as a point-paleoflux tracer. Measured 187Os/188Os ratios of the OJP sediments are systematically lower than the inferred 187Os/188Os of contemporaneous seawater, and a clear glacial-interglacial 187Os/188Os variation is lacking. Mixing calculations suggest that Os contributions from lithogenic sources are insufficient to explain the observed 187Os/188Os variations. The difference between the 187Os/188Os of bulk sediment and that of seawater is interpreted in terms of subtle contributions of unradiogenic Os carried by particulate extraterrestrial material. Down-core variations of 187Os/188Os with Pt/Ir and Os/Ir also point to contributions from extraterrestrial particles. Mixing calculations for each set of several triplicate analyses suggest that the unradiogenic Os end member cannot be characterized by primary extraterrestrial particles of chondritic composition. It is noteworthy that, for determining the effect of extraterrestrial contributions, the 187Os/188Os of pelagic carbonates has greater potential than PGE abundances. An attempt has been made, for the first time, to estimate sediment mass accumulation rates based on the amount of extraterrestrial Os in the OJP samples and previously reported extraterrestrial Os flux. Throughout most of the OJP record, Os isotope-based paleoflux estimates are within a factor of two of those derived using other constant-flux tracers. Meaningful flux estimates cannot be made during glacial maxima because the OJP sediments do not record the low 187Os/188Os reported previously. We speculate that this discrepancy may be related to focusing of extraterrestrial particles at the OJP, as has been suggested to explain down-core 3He variations.
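
A much-simplified sketch of the kind of mixing calculation described: estimating the fraction of sedimentary Os contributed by an unradiogenic (roughly chondritic) end member from the bulk 187Os/188Os. The end-member and measured values below are assumptions for illustration, not the paper's calibrated inputs, and the simple ratio mixing ignores second-order weighting by 188Os abundance.

```python
# Simplified two end-member mixing sketch: what fraction of the sedimentary
# Os must come from an unradiogenic (roughly chondritic) source to pull the
# bulk 187Os/188Os below that of seawater?  All values here are assumptions
# for illustration.
r_seawater = 1.00    # assumed hydrogenous (seawater-derived) 187Os/188Os
r_chondrite = 0.127  # assumed extraterrestrial end member
r_measured = 0.90    # assumed bulk-sediment measurement

f_extraterrestrial = (r_seawater - r_measured) / (r_seawater - r_chondrite)
print(f"fraction of Os from the unradiogenic end member ~ {f_extraterrestrial:.2f}")
```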

Relevance: 30.00%

Abstract:

Prototype scale tests of the mooring load and wave transmission characteristics of a floating tire breakwater were conducted in the large wave tank at the Coastal Engineering Research Center. Standard Goodyear Tire and Rubber Co. 18-tire modules connected to form breakwaters, 4 and 6 modules (8.5 and 12.8 meters, 28 and 42 feet) wide in the direction of wave advance, were tested in water depths of 2 and 4 meters (6.56 and 13.12 feet). Monochromatic waves with a 2.64- to 8.25-second period range and heights up to 1.4 meters (4.6 feet) were used in the tests. Test results indicate that wave transmission is mainly a function of the breakwater width to incident wavelength ratio with a slight dependence on the incident wave height. However, the mooring forces are mainly a function of the incident wave height with only a slight dependence on the incident wavelength and breakwater width. Recommended design curves for the wave transmission coefficient versus breakwater width to wavelength ratio and mooring load as a function of incident wave height are presented.
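
Since the transmission results are organised by the ratio of breakwater width to incident wavelength, the sketch below estimates the wavelength from wave period and water depth using the standard linear-wave dispersion relation (not a formula taken from the report) and forms that ratio for one of the tested layouts.

```python
# Estimate wavelength from the small-amplitude dispersion relation
# L = (g T^2 / 2*pi) * tanh(2*pi*d / L), solved by fixed-point iteration,
# then form the width-to-wavelength ratio used in the design curves.
# Standard linear wave theory, shown for illustration only.
import math

def wavelength(period_s, depth_m, g=9.81, iterations=50):
    L = g * period_s**2 / (2 * math.pi)          # deep-water first guess
    for _ in range(iterations):
        L = g * period_s**2 / (2 * math.pi) * math.tanh(2 * math.pi * depth_m / L)
    return L

L = wavelength(period_s=5.0, depth_m=4.0)        # example: 5 s wave in 4 m depth
width = 12.8                                     # 6-module breakwater width, m
print(f"wavelength ~ {L:.1f} m, width/wavelength ~ {width / L:.2f}")
```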

Relevance: 30.00%

Abstract:

We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question on the number of clusters in the data can be formulated in terms of a test on the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets, as considered previously in the bioinformatics literature.
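
A minimal sketch of the resampling approach described: fit g- and (g+1)-component normal mixtures, compute the likelihood ratio statistic, and assess its null distribution by parametric bootstrap. The Python/scikit-learn code below is illustrative, not the authors' implementation, and the data are simulated.

```python
# Bootstrap likelihood ratio test for the number of normal mixture
# components: compare g and g+1 components and simulate the null
# distribution of -2 log lambda from the fitted g-component model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(3, 1, (40, 2))])

def lrt_stat(data, g):
    small = GaussianMixture(g, n_init=5, random_state=0).fit(data)
    big = GaussianMixture(g + 1, n_init=5, random_state=0).fit(data)
    return 2 * (big.score(data) - small.score(data)) * len(data), small

obs_stat, null_model = lrt_stat(X, 1)

boot = []
for _ in range(99):                       # small B to keep the sketch fast
    Xb, _ = null_model.sample(len(X))     # simulate under the g-component null
    boot.append(lrt_stat(Xb, 1)[0])

p = (1 + sum(s >= obs_stat for s in boot)) / (1 + len(boot))
print(f"-2 log lambda = {obs_stat:.1f}, bootstrap p = {p:.2f}")
```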

Relevance: 30.00%

Abstract:

CD4-CD8 ratio is an important diagnostic measure of immune system functioning. In particular, CD4-CD8 ratio predicts the time taken for progression of HIV infection to acquired immune deficiency syndrome (AIDS) and the long-term survival of AIDS patients. To map genes that regulate differences between healthy individuals in CD4-CD8 ratio, we typed 757 highly polymorphic microsatellite markers at an average spacing of ~5 cM across the genome in 405 pairs of dizygotic twins at ages 12, 14 and 16. We used multipoint variance components linkage analysis to test for linkage between marker loci and CD4-CD8 ratio at each age. We found suggestive evidence of linkage on chromosome 11p in 12-year-old twins (LOD=2.55, P=0.00031) and even stronger evidence of linkage in the same region at age 14 (LOD=3.51, P=0.00003). Possible candidate genes include CD5 and CD6, which encode cell membrane proteins involved in the positive selection of thymocytes. We also found suggestive evidence of linkage at other areas of the genome including regions on chromosomes 1, 3, 4, 5, 6, 12, 13, 15, 17 and 22.
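
The reported LOD scores and p-values are consistent with the usual asymptotics for variance components linkage, in which the test statistic behaves under the null like a 50:50 mixture of a point mass at zero and a 1-df chi-square. The sketch below shows this approximate conversion, which reproduces the reported pairings; it is an illustration of the relationship, not the analysis pipeline used in the study.

```python
# Approximate LOD-to-p conversion for a variance-components linkage scan,
# assuming the usual asymptotic null (50:50 mixture of a point mass at zero
# and a 1-df chi-square).  Reproduces the reported pairings approximately:
# LOD 2.55 ~ P 0.0003 and LOD 3.51 ~ P 0.00003.
import math
from scipy import stats

def lod_to_p(lod):
    chi2 = 2 * math.log(10) * lod           # LOD and chi-square are proportional
    return 0.5 * stats.chi2.sf(chi2, df=1)  # half weight on the chi2_1 component

for lod in (2.55, 3.51):
    print(f"LOD {lod}: p ~ {lod_to_p(lod):.5f}")
```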

Relevance: 30.00%

Abstract:

In this study, we examined genetic and environmental influences on covariation among two reading tests used in neuropsychological assessment (Cambridge Contextual Reading Test [CCRT], [Beardsall, L., and Huppert, F. A. (1994). J. Clin. Exp. Neuropsychol. 16: 232-242], Schonell Graded Word Reading Test [SGWRT], [Schonell, F. J., and Schonell, P. E. (1960). Diagnostic and attainment testing. Edinburgh: Oliver and Boyd.]) and among a selection of IQ subtests from the Multidimensional Aptitude Battery (MAB) [Jackson, D. N. (1984). Multidimensional aptitude battery, Ontario: Research Psychologists Press.] and the Wechsler Adult Intelligence Scale-Revised (WAIS-R) [Wechsler, D. (1981). Manual for the Wechsler Adult Intelligence Scale-Revised (WAIS-R). San Antonio: The Psychological Corporation]. Participants were 225 monozygotic and 275 dizygotic twin pairs aged from 15 to 18 years (mean, 16 years). For Verbal IQ subtests, phenotypic correlations with the reading tests ranged from 0.44 to 0.65. For Performance IQ subtests, phenotypic correlations with the reading tests ranged from 0.23 to 0.34. Results of Structural Equation Modeling (SEM) supported a model with one genetic General factor and three genetic group factors (Verbal, Performance, Reading). Reading performance was influenced by the genetic General factor (accounting for 13% and 20% of the variance for the CCRT and SGWRT, respectively), the genetic Verbal factor (explaining 17% and 19% of variance for the CCRT and SGWRT), and the genetic Reading factor (explaining 21% of the variance for both the CCRT and SGWRT). A common environment factor accounted for 25% and 14% of the CCRT and SGWRT variance, respectively. Genetic influences accounted for more than half of the phenotypic covariance between the reading tests and each of the IQ subtests. The heritabilities of the CCRT and SGWRT were 0.54 and 0.65, respectively. Observable covariance between reading assessments used by neuropsychologists to estimate IQ and IQ subtests appears to be largely due to genetic effects.
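
As a quick arithmetic check on the reported decomposition, summing the three genetic variance components for each reading test approximately recovers the quoted heritabilities (0.51 vs 0.54 for the CCRT, 0.60 vs 0.65 for the SGWRT); the small shortfall presumably reflects rounding and any test-specific genetic variance in the published model. The snippet below simply performs that sum.

```python
# Quick arithmetic check of the variance components quoted above: the three
# genetic factors for each reading test should sum to roughly the reported
# heritability (remaining differences reflect rounding and any residual
# test-specific genetic variance in the published model).
components = {
    "CCRT":  {"genetic": [0.13, 0.17, 0.21], "common_env": 0.25, "reported_h2": 0.54},
    "SGWRT": {"genetic": [0.20, 0.19, 0.21], "common_env": 0.14, "reported_h2": 0.65},
}

for test, v in components.items():
    h2_from_components = sum(v["genetic"])
    print(f"{test}: sum of genetic components = {h2_from_components:.2f} "
          f"(reported heritability {v['reported_h2']:.2f})")
```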

Relevance: 30.00%

Abstract:

Objective: To evaluate whether the introduction of a national, co-ordinated screening program using the faecal occult blood test represents 'value-for-money' from the perspective of the Australian Government as third-party funder. Methods: The annual equivalent costs and consequences of a biennial screening program in 'steady-state' operation were estimated for the Australian population using 1996 as the reference year. Disability-adjusted life years (DALYs) and the years of life lost (YLLs) averted, and the health service costs were modelled, based on the epidemiology and the costs of colorectal cancer in Australia together with the mortality reduction achieved in randomised controlled trials. Uncertainty in the model was examined using Monte Carlo simulation methods. Results: We estimate a minimum or 'base program' of screening those aged 55 to 69 years could avert 250 deaths per annum (95% uncertainty interval 99-400), at a gross cost of $A55 million (95% UI $A46 million to $A96 million) and a gross incremental cost-effectiveness ratio of $A17,000/DALY (95% UI $A13,000/DALY to $A52,000/DALY). Extending the program to include 70 to 74-year-olds is a more effective option (cheaper and higher health gain) than including the 50 to 54-year-olds. Conclusions: The findings of this study support the case for a national program directed at the 55 to 69-year-old age group with extension to 70 to 74-year-olds if there are sufficient resources. The pilot tests recently announced in Australia provide an important opportunity to consider the age range for screening and the sources of uncertainty, identified in the modelled evaluation, to assist decisions on implementing a full national program.
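
The sketch below illustrates the kind of Monte Carlo uncertainty analysis described: draw the programme's gross cost and the DALYs averted from assumed distributions, form the cost-effectiveness ratio in each draw, and report percentile uncertainty intervals. The distributions are placeholders chosen only to sit near the reported point estimates, not the evaluation's actual inputs.

```python
# Monte Carlo uncertainty sketch for a cost-effectiveness ratio.  The input
# distributions are placeholders loosely matching the reported figures
# (gross cost ~A$55 million, ~A$17,000 per DALY averted), not the model's
# actual parameters.
import numpy as np

rng = np.random.default_rng(42)
draws = 10_000

cost = rng.normal(55e6, 8e6, draws)      # gross annual cost (A$), assumed spread
dalys = rng.normal(3200, 600, draws)     # DALYs averted per annum, assumed

icer = cost / dalys
lo, mid, hi = np.percentile(icer, [2.5, 50, 97.5])
print(f"cost per DALY averted: median ~ A${mid:,.0f} "
      f"(95% interval A${lo:,.0f} to A${hi:,.0f})")
```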

Relevance: 30.00%

Abstract:

This article is aimed primarily at eye care practitioners who are undertaking advanced clinical research, and who wish to apply analysis of variance (ANOVA) to their data. ANOVA is a data analysis method of great utility and flexibility. This article describes why and how ANOVA was developed, the basic logic which underlies the method and the assumptions that the method makes for it to be validly applied to data from clinical experiments in optometry. The application of the method to the analysis of a simple data set is then described. In addition, the methods available for making planned comparisons between treatment means and for making post hoc tests are evaluated. The problem of determining the number of replicates or patients required in a given experimental situation is also discussed.
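
For readers who want to try the method, a minimal one-way ANOVA on invented data for three treatment groups is sketched below in Python; the article's own worked example and software are not reproduced here.

```python
# Minimal one-way ANOVA of the kind the article walks through, with
# invented data for three treatment groups.
from scipy import stats

group_a = [12.1, 11.8, 12.5, 12.0, 11.9]
group_b = [12.9, 13.2, 12.7, 13.1, 13.0]
group_c = [11.5, 11.9, 11.2, 11.8, 11.6]

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```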

Relevance: 30.00%

Abstract:

To carry out an analysis of variance, several assumptions are made about the nature of the experimental data which have to be at least approximately true for the tests to be valid. One of the most important of these assumptions is that a measured quantity must be a parametric variable, i.e., a member of a normally distributed population. If the data are not normally distributed, then one method of approach is to transform the data to a different scale so that the new variable is more likely to be normally distributed. An alternative method, however, is to use a non-parametric analysis of variance. There are a limited number of such tests available, but two useful tests are described in this Statnote, viz., the Kruskal-Wallis test and Friedman's analysis of variance.
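
The two tests named above can be run as follows in Python (scipy) on invented data: Kruskal-Wallis for independent groups and the Friedman test for related measurements on the same subjects.

```python
# The two non-parametric alternatives named above, on invented data:
# Kruskal-Wallis for independent groups, Friedman for repeated measures.
from scipy import stats

# Kruskal-Wallis: three independent groups
g1, g2, g3 = [7, 9, 6, 8, 7], [10, 12, 11, 13, 12], [5, 6, 4, 7, 5]
h, p_kw = stats.kruskal(g1, g2, g3)

# Friedman: three treatments measured on the same five subjects
t1, t2, t3 = [7, 9, 6, 8, 7], [8, 10, 7, 9, 8], [6, 8, 5, 7, 6]
chi2, p_fr = stats.friedmanchisquare(t1, t2, t3)

print(f"Kruskal-Wallis H = {h:.2f}, p = {p_kw:.4f}")
print(f"Friedman chi2 = {chi2:.2f}, p = {p_fr:.4f}")
```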

Relevance: 30.00%

Abstract:

In any investigation in optometry involving more than two treatment or patient groups, an investigator should be using ANOVA to analyse the results, assuming that the data conform reasonably well to the assumptions of the analysis. Ideally, specific null hypotheses should be built into the experiment from the start so that the treatment variation can be partitioned to test these effects directly. If 'post hoc' tests are used, then an experimenter should examine the degree of protection offered by the test against the possibilities of making either a type 1 or a type 2 error. All experimenters should be aware of the complexity of ANOVA. The present article describes only one common form of the analysis, viz., that which applies to a single classification of the treatments in a randomised design. There are many different forms of the analysis, each of which is appropriate to the analysis of a specific experimental design. The uses of some of the most common forms of ANOVA in optometry have been described in a further article. If in any doubt, an investigator should consult a statistician with experience of the analysis of experiments in optometry, since once embarked upon an experiment with an unsuitable design, there may be little that a statistician can do to help.
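
As an illustration of one common post hoc procedure that follows a significant one-way ANOVA, the sketch below runs Tukey's HSD via statsmodels on invented data; the article discusses the type 1 and type 2 trade-offs of such tests in general rather than prescribing this particular one.

```python
# One common post hoc procedure after a one-way ANOVA: Tukey's HSD on
# invented data for three groups, using statsmodels.  Shown for
# illustration; it is not a recommendation taken from the article.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

values = np.array([12.1, 11.8, 12.5, 12.0, 11.9,
                   12.9, 13.2, 12.7, 13.1, 13.0,
                   11.5, 11.9, 11.2, 11.8, 11.6])
groups = np.repeat(["A", "B", "C"], 5)

result = pairwise_tukeyhsd(values, groups, alpha=0.05)
print(result.summary())
```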

Relevance: 30.00%

Abstract:

The key to the correct application of ANOVA is careful experimental design and matching the correct analysis to that design. The following points should therefore be considered before designing any experiment:

1. In a single-factor design, ensure that the factor is identified as a 'fixed' or 'random effect' factor.

2. In more complex designs with more than one factor, there may be a mixture of fixed and random effect factors present, so ensure that each factor is clearly identified.

3. Where replicates can be grouped or blocked, the advantages of a randomised blocks design should be considered. There should be evidence, however, that blocking can sufficiently reduce the error variation to counter the loss of DF compared with a fully randomised design.

4. Where different treatments are applied sequentially to a patient, the advantages of a three-way design in which the different orders of the treatments are included as an 'effect' should be considered.

5. Combining different factors to make a more efficient experiment and to measure possible factor interactions should always be considered (a sketch of such a factorial analysis follows this list).

6. The effect of 'internal replication' should be taken into account in a factorial design when deciding the number of replications to be used. Where possible, each error term of the ANOVA should have at least 15 DF.

7. Consider carefully whether a particular factorial design should be treated as a split-plot or a repeated measures design. If such a design is appropriate, consider how to continue the analysis, bearing in mind the problem of using post hoc tests in this situation.
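
A sketch of point 5 above, assuming a simple two-factor ('factorial') layout with invented data: both main effects and their interaction are estimated with statsmodels. Factor coding and the fixed/random distinction would follow the considerations listed.

```python
# Two-factor ANOVA with interaction (point 5), using statsmodels on
# invented data: a 2 x 2 layout with four replicates per cell.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "response":  [12.1, 11.8, 12.5, 12.0, 13.0, 13.2, 12.7, 13.1,
                  11.5, 11.9, 11.2, 11.8, 12.4, 12.6, 12.2, 12.5],
    "treatment": ["a"] * 8 + ["b"] * 8,
    "block":     (["x"] * 4 + ["y"] * 4) * 2,
})

model = ols("response ~ C(treatment) * C(block)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))
```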

Relevance: 30.00%

Abstract:

Purpose: To investigate the correlation between tests of visual function and perceived visual ability recorded with a 'quality-of-life' questionnaire for patients with central field loss. Method: 12 females and 7 males (mean age = 53.1 years; range = 23-80 years) with subfoveal neovascular membranes underwent a comprehensive assessment of visual function. Tests included unaided distance vision, high and low contrast distance logMAR visual acuity (VA), Pelli-Robson contrast sensitivity (at 1 m), near logMAR word VA and text reading speed. All tests were done both monocularly and binocularly. The patients also completed a 28-point questionnaire separated into a 'core' section, consisting of general questions about perceived visual function, and a 'module' section, with specific questions on reading function. Results: Step-wise multiple regression analysis was used to determine which visual function tests were correlated with the patients' perceived visual function and to rank them in order of importance. The visual function test that explains most of the variance in both the 'core' score (66%) and the 'module' score (68%) of the questionnaire is low contrast VA in the better eye (P<0.001 in both cases). Further, distance logMAR VA in both the better and worse eye, and near logMAR VA in the better eye and binocularly, also accounted for a significant proportion of the variance in the module score (P<0.01). Conclusions: The best predictor of both perceived reading ability and of general perceived visual ability in this study is low contrast logMAR VA. The results highlight that distance VA is not the only relevant measure of visual function in relation to a patient's perceived visual performance and should not be considered the sole determinant of surgical or management success.
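
A rough sketch of the step-wise selection idea, under assumed variable names and invented data: starting from an empty model, repeatedly add the candidate test that most increases R-squared until no candidate gives a meaningful improvement. The study's actual predictors, sample and selection criteria may differ.

```python
# Forward step-wise selection sketch: add the visual-function test that
# most increases R^2 at each step.  Variable names, data and the stopping
# threshold are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 19
tests = pd.DataFrame({
    "low_contrast_va": rng.normal(0.6, 0.2, n),
    "high_contrast_va": rng.normal(0.3, 0.15, n),
    "contrast_sensitivity": rng.normal(1.4, 0.3, n),
})
core_score = 40 - 30 * tests["low_contrast_va"] + rng.normal(0, 3, n)

selected, remaining, best_r2 = [], list(tests.columns), 0.0
while remaining:
    r2 = {v: sm.OLS(core_score, sm.add_constant(tests[selected + [v]])).fit().rsquared
          for v in remaining}
    v, new_r2 = max(r2.items(), key=lambda kv: kv[1])
    if new_r2 - best_r2 < 0.02:    # arbitrary improvement threshold for the sketch
        break
    selected.append(v)
    remaining.remove(v)
    best_r2 = new_r2

print("selected predictors:", selected, f"R^2 = {best_r2:.2f}")
```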