982 results for Proportional


Relevance:

20.00%

Publisher:

Abstract:

Female gender and low income are two markers for groups that have been historically disadvantaged within most societies. The study explores two research questions related to their political representation: (1) ‘Are parties biased towards the ideological preferences of male and rich citizens?’; and (2) ‘Does the proportionality of the electoral system moderate the degree of under-representation of women and poor citizens in the party system?’ A multilevel analysis of survey data from 24 parliamentary democracies indicates that there is some bias against those with low income and, to a much smaller degree, against women. This has systemic consequences for the quality of representation, as the preferences of the complementary groups differ. The proportionality of the electoral system influences the degree of under-representation: specifically, larger district magnitudes help to close the considerable gap between rich and poor.
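A minimal sketch of a multilevel (random-intercept) model of the kind described, assuming respondents nested in countries. The outcome, predictors, effect sizes, and randomly generated data below are placeholders for illustration only, not the article's actual specification or survey data.

    # Illustrative random-intercept model: respondents nested in countries.
    # All variable names (distance, female, low_income, country) and the
    # simulated data are placeholders, not the article's data or results.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    n, countries = 2400, 24
    df = pd.DataFrame({
        "country": rng.integers(0, countries, n),
        "female": rng.integers(0, 2, n),
        "low_income": rng.integers(0, 2, n),
    })
    country_effect = rng.normal(0, 0.5, countries)
    df["distance"] = (1.0 + 0.1 * df["female"] + 0.3 * df["low_income"]
                      + country_effect[df["country"]] + rng.normal(0, 1, n))

    # Multilevel linear model with a random intercept per country.
    model = smf.mixedlm("distance ~ female + low_income", data=df,
                        groups=df["country"])
    print(model.fit().summary())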

Relevance:

20.00%

Publisher:

Abstract:

Hierarchically clustered populations are often encountered in public health research, but the traditional methods used in analyzing this type of data are not always adequate. In the case of survival time data, more appropriate methods have only begun to surface in the last couple of decades. Such methods include multilevel statistical techniques, which are more complicated to implement than traditional methods but better suited to the data structure.

One population that is known to exhibit a hierarchical structure is that of patients who use the health care system of the Department of Veterans Affairs, where patients are grouped not only by hospital but also by geographic network (VISN). This project analyzes survival time data sets housed at the Houston Veterans Affairs Medical Center Research Department using two different Cox proportional hazards regression models: a traditional model and a multilevel model. VISNs that exhibit significantly higher or lower survival rates than the rest are identified separately for each model.

In this particular case, although there are differences in the results of the two models, they are not large enough to warrant using the more complex multilevel technique. This is shown by the small estimates of variance associated with levels two and three in the multilevel Cox analysis. Much of the difference exhibited in the identification of VISNs with high or low survival rates is attributable to computer hardware difficulties rather than to any significant improvement in the model.
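A rough sketch of the kind of comparison described above, using the lifelines library: an ordinary Cox fit versus a fit with VISN-cluster-robust standard errors. The column names, the synthetic data, and the 22 networks are illustrative assumptions, and the cluster adjustment is only a simple stand-in for the full multilevel (frailty) model used in the project.

    # Minimal sketch: ordinary Cox fit vs. a VISN-cluster-adjusted fit.
    # Column names (survival_days, died, visn, age) and the synthetic data
    # are illustrative stand-ins for the VA data sets described above.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 400
    df = pd.DataFrame({
        "age": rng.normal(65, 10, n),
        "visn": rng.integers(1, 23, n),            # 22 hypothetical networks
    })
    df["survival_days"] = rng.exponential(2000 * np.exp(-0.01 * (df["age"] - 65)))
    df["died"] = (df["survival_days"] < 1825).astype(int)   # 5-year follow-up
    df["survival_days"] = df["survival_days"].clip(upper=1825)

    # Traditional Cox model: ignores the hospital/VISN hierarchy entirely.
    cph_plain = CoxPHFitter()
    cph_plain.fit(df[["survival_days", "died", "age"]],
                  duration_col="survival_days", event_col="died")

    # Cluster-adjusted Cox model: same coefficients, VISN-robust (sandwich)
    # standard errors -- a crude proxy for the multilevel formulation.
    cph_cluster = CoxPHFitter()
    cph_cluster.fit(df[["survival_days", "died", "age", "visn"]],
                    duration_col="survival_days", event_col="died",
                    cluster_col="visn")

    print(cph_plain.summary[["coef", "se(coef)"]])
    print(cph_cluster.summary[["coef", "se(coef)"]])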

Relevance:

20.00%

Publisher:

Abstract:

The standard analyses of survival data involve the assumption that survival and censoring are independent. When censoring and survival are related, the phenomenon is known as informative censoring. This paper examines the effects of an informative censoring assumption on the hazard function and the estimated hazard ratio provided by the Cox model.

The limiting factor in all analyses of informative censoring is the problem of non-identifiability. Non-identifiability implies that it is impossible to distinguish a situation in which censoring and death are independent from one in which there is dependence. However, it is possible that informative censoring occurs. Examination of the literature indicates how others have approached the problem and covers the relevant theoretical background.

Three models are examined in detail. The first model uses conditionally independent marginal hazards to obtain the unconditional survival function and hazards. The second model is based on the Gumbel Type A method for combining independent marginal distributions into bivariate distributions using a dependency parameter. Finally, a formulation based on a compartmental model is presented and its results described. For the latter two approaches, the resulting hazard is used in the Cox model in a simulation study.

The unconditional survival distribution formed from the first model involves dependency, but the crude hazard resulting from this unconditional distribution is identical to the marginal hazard, and inferences based on the hazard are valid. The hazard ratios formed from two distributions following the Gumbel Type A model are biased by a factor dependent on the amount of censoring in the two populations and the strength of the dependency of death and censoring in the two populations. The Cox model estimates this biased hazard ratio. In general, the hazard resulting from the compartmental model is not constant, even if the individual marginal hazards are constant, unless censoring is non-informative. The hazard ratio tends to a specific limit.

Methods of evaluating situations in which informative censoring is present are described, and the relative utility of the three models examined is discussed.
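The effect of dependent censoring on a fitted Cox hazard ratio can be illustrated with a small simulation. The sketch below uses a shared gamma frailty to make death and censoring times dependent; this is a simple substitute construction, not the Gumbel Type A or compartmental formulations examined in the paper, and all rates and parameters are illustrative.

    # Minimal simulation sketch: informative censoring via a shared gamma
    # frailty; compare the fitted Cox hazard ratio with the true value of 2.0.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(42)
    n = 5000
    group = rng.integers(0, 2, n)                 # two populations, true HR = 2.0
    frailty = rng.gamma(shape=2.0, scale=0.5, size=n)

    death_rate = 0.01 * np.exp(np.log(2.0) * group) * frailty
    censor_rate = 0.01 * frailty                  # censoring shares the frailty -> informative

    t_death = rng.exponential(1.0 / death_rate)
    t_censor = rng.exponential(1.0 / censor_rate)

    obs_time = np.minimum(t_death, t_censor)
    event = (t_death <= t_censor).astype(int)

    cph = CoxPHFitter()
    cph.fit(pd.DataFrame({"T": obs_time, "E": event, "group": group}),
            duration_col="T", event_col="E")
    print("estimated HR:", np.exp(cph.params_["group"]), "vs true HR 2.0")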

Relevance:

20.00%

Publisher:

Abstract:

This dissertation develops and explores the methodology for the use of cubic spline functions in assessing time-by-covariate interactions in Cox proportional hazards regression models. These interactions indicate violations of the proportional hazards assumption of the Cox model. Use of cubic spline functions allows for the investigation of the shape of a possible covariate time-dependence without having to specify a particular functional form. Cubic spline functions yield both a graphical method and a formal test for the proportional hazards assumption as well as a test of the nonlinearity of the time-by-covariate interaction. Five existing methods for assessing violations of the proportional hazards assumption are reviewed and applied along with cubic splines to three well-known two-sample datasets. An additional dataset with three covariates is used to explore the use of cubic spline functions in a more general setting.
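A simplified sketch of the idea: regress a covariate's scaled Schoenfeld residuals on a natural cubic spline basis of event time, so a non-flat fit signals a time-dependent effect, i.e. a violation of proportional hazards. This stand-in does not reproduce the dissertation's exact spline time-by-covariate formulation; the Rossi recidivism data, the 4 spline degrees of freedom, and the choice of covariate are illustrative only.

    # Sketch: assess the PH assumption for one covariate by regressing its
    # scaled Schoenfeld residuals on a cubic spline basis of event time.
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter
    from lifelines.datasets import load_rossi
    from patsy import dmatrix

    df = load_rossi()
    cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

    # Scaled Schoenfeld residuals are defined only at observed event times.
    resid = cph.compute_residuals(df, kind="scaled_schoenfeld")
    event_times = df.loc[resid.index, "week"].values

    # Natural cubic spline basis of event time (4 df) plus an intercept.
    spline = dmatrix("cr(t, df=4)", {"t": event_times}, return_type="dataframe")

    # If the spline terms add nothing beyond a constant, the effect of 'fin'
    # does not drift with time and proportional hazards looks acceptable.
    ols = sm.OLS(resid["fin"].values, spline).fit()
    print(ols.f_pvalue)     # overall test of time-dependence for 'fin'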

Relevance:

20.00%

Publisher:

Abstract:

Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests performed at nominal 5% size expectations, but the F, Score, and Mantel tests exceeded 5% size confidence limits for 1/3 of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests are dependent on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing survival curves. Guidelines for the appropriate selection of two-sample tests are given.
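A scaled-down version of such a size simulation is sketched below for the logrank test alone: two equal exponential samples (true hazard ratio 1) with unequal exponential censoring, recording how often the test rejects at the 5% level. The sample size, censoring rates, and trial count are illustrative choices, not the study's full design.

    # Minimal size simulation: two equal exponential samples with unequal
    # random exponential censoring (roughly 20% vs 60% censored).
    import numpy as np
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(0)
    n, trials = 16, 1000
    censor_rate_a, censor_rate_b = 0.25, 1.5

    rejections = 0
    for _ in range(trials):
        ta, tb = rng.exponential(1.0, n), rng.exponential(1.0, n)
        ca = rng.exponential(1.0 / censor_rate_a, n)
        cb = rng.exponential(1.0 / censor_rate_b, n)
        oa, ea = np.minimum(ta, ca), (ta <= ca).astype(int)
        ob, eb = np.minimum(tb, cb), (tb <= cb).astype(int)
        res = logrank_test(oa, ob, event_observed_A=ea, event_observed_B=eb)
        rejections += res.p_value < 0.05

    print("empirical size:", rejections / trials)   # should be near 0.05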

Relevance:

20.00%

Publisher:

Abstract:

Genetic and phenotypic instability are hallmarks of cancer cells, but their cause is not clear. The leading hypothesis suggests that a poorly defined gene mutation generates genetic instability and that some of many subsequent mutations then cause cancer. Here we investigate the hypothesis that genetic instability of cancer cells is caused by aneuploidy, an abnormal balance of chromosomes. Because symmetrical segregation of chromosomes depends on exactly two copies of mitosis genes, aneuploidy involving chromosomes with mitosis genes will destabilize the karyotype. The hypothesis predicts that the degree of genetic instability should be proportional to the degree of aneuploidy. Thus it should be difficult, if not impossible, to maintain the particular karyotype of a highly aneuploid cancer cell on clonal propagation. This prediction was confirmed with clonal cultures of chemically transformed, aneuploid Chinese hamster embryo cells. It was found that the higher the ploidy factor of a clone, the more unstable was its karyotype. The ploidy factor is the modal chromosome number of a clone divided by the normal chromosome number of the species. Transformed Chinese hamster embryo cells with a ploidy factor of 1.7 were estimated to change their karyotype at a rate of about 3% per generation, compared with 1.8% for cells with a ploidy factor of 0.95. Because the background noise of karyotyping is relatively high, the cells with the low ploidy factor may be more stable than our method suggests. The karyotype instability of human colon cancer cell lines, recently analyzed by Lengauer et al. [Lengauer, C., Kinzler, K. W. & Vogelstein, B. (1997) Nature (London) 386, 623–627], also corresponds exactly to their degree of aneuploidy. We conclude that aneuploidy is sufficient to explain genetic instability and the resulting karyotypic and phenotypic heterogeneity of cancer cells, independent of gene mutation. Because aneuploidy has also been proposed to cause cancer, our hypothesis offers a common, unique mechanism of altering and simultaneously destabilizing normal cellular phenotypes.
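The ploidy factor calculation is simple enough to state explicitly; the sketch below uses the normal Chinese hamster complement of 2n = 22, with modal chromosome counts chosen purely to illustrate the two ploidy factors quoted in the abstract.

    # Ploidy factor = modal chromosome number / normal chromosome number.
    # Normal Chinese hamster complement is 2n = 22; the modal counts below
    # are illustrative, chosen to reproduce the quoted ploidy factors.
    NORMAL_CHROMOSOMES = 22

    def ploidy_factor(modal_count, normal_count=NORMAL_CHROMOSOMES):
        return modal_count / normal_count

    print(round(ploidy_factor(37), 2))   # ~1.7, a highly aneuploid clone
    print(round(ploidy_factor(21), 2))   # ~0.95, a near-diploid clone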

Relevance:

20.00%

Publisher:

Abstract:

Pumpkin leaves grown under high light (500–700 µmol photons·m⁻²·s⁻¹) were illuminated under photon flux densities ranging from 6.5 to 1500 µmol·m⁻²·s⁻¹ in the presence of lincomycin, an inhibitor of chloroplast protein synthesis. The illumination at all light intensities caused photoinhibition, measured as a decrease in the ratio of variable to maximum fluorescence (Fv/Fm). Loss of photosystem II (PSII) electron transfer activity correlated with the decrease in the fluorescence ratio. The rate constant of photoinhibition, determined from first-order fits, was directly proportional to photon flux density at all light intensities studied. The fluorescence ratio did not decrease if the leaves were illuminated in low light in the absence of lincomycin or incubated in darkness in the presence of lincomycin. The constancy of the quantum yield of photoinhibition under different photon flux densities strongly suggests that photoinhibition in vivo occurs by one dominant mechanism under all light intensities. This mechanism probably is not the acceptor-side mechanism characterized in the anaerobic case in vitro. Furthermore, there was an excellent correlation between the loss of PSII activity and the loss of the D1 protein from thylakoid membranes under low light. At low light, photoinhibition occurs so slowly that inactive PSII centers with the D1 protein waiting to be degraded do not accumulate. The kinetic agreement between D1 protein degradation and the inactivation of PSII indicates that the turnover of the D1 protein depends on photoinhibition under both low and high light.
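The first-order analysis mentioned above can be sketched as follows: fit an exponential decay to the Fv/Fm time course at one light intensity, then repeat at each photon flux density and check that the fitted rate constant scales linearly with the flux density. The time points and fluorescence values below are invented placeholders, not measured data.

    # Sketch of a first-order fit of the Fv/Fm decline during photoinhibition.
    # The time points and fluorescence ratios are invented placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def first_order(t, k, f0):
        return f0 * np.exp(-k * t)          # simple exponential decay

    t_hours = np.array([0, 1, 2, 4, 6, 8], dtype=float)
    fv_fm = np.array([0.80, 0.71, 0.63, 0.50, 0.40, 0.32])   # placeholder values

    (k, f0), _ = curve_fit(first_order, t_hours, fv_fm, p0=[0.1, 0.8])
    print(f"rate constant k = {k:.3f} per hour")
    # Repeating the fit at each photon flux density and plotting k against the
    # flux density tests the direct proportionality reported in the abstract.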

Relevance:

20.00%

Publisher:

Abstract:

The aim of this research is to identify aspects that support the development of prospective mathematics teachers’ professional noticing in a b-learning (blended learning) context. The study presented here investigates the extent to which prospective secondary mathematics teachers attend to and interpret secondary school students’ proportional reasoning and decide how to respond. Results show that interactions in an online discussion improve prospective mathematics teachers’ ability to identify and interpret important aspects of secondary school students’ mathematical thinking.

Relevance:

20.00%

Publisher:

Abstract:

Voters try to avoid wasting their votes even in proportional representation (PR) systems. In this paper we make the case that this type of strategic voting can be observed and predicted in such systems. Contrary to the literature, we do not regard weak institutional incentive structures as making the study of strategic voting a hopeless endeavor. The crucial question for strategic voting is how institutional incentives constrain an individual’s decision-making process. Based on expected utility maximization, we put forward a micro-logic of an individual’s expectation-formation process driven by institutional and dispositional incentives. All of the well-known institutional incentives to vote strategically that are channelled through district magnitude are moderated by dispositional factors in order to become relevant for voting decisions. Employing data from Finland, whose electoral system makes it a particularly hard testing ground, we find considerable evidence for the observable implications of our theory.

Relevance:

20.00%

Publisher:

Abstract:

"LADC 148"

Relevance:

20.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

20.00%

Publisher:

Abstract:

"Reprinted from the Physical review, n. s., vol. X, no. 5, November, 1917."

Relevance:

20.00%

Publisher:

Abstract:

"Work performed under Contract No. AT-30-2-Gen-16."