536 results for Collectionwise Normality
Abstract:
Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed. Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells. Data transformations such as log, square root, and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions with abnormal kurtosis values, also did not produce normality. Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and three variables (DNA concentration, shape, and elongation) showed the strongest evidence of bimodality and were studied further. Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components, 1 or 2. These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results of changes in the mean levels and proportions of the components at the lower severity levels.
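Two of the screening steps described above translate directly into code. The following is a minimal sketch (not the dissertation's implementation; function and variable names are assumptions): Shapiro-Wilk tests on common transformations, and the bimodality coefficient, for which values above 5/9 are conventionally read as evidence of bimodality.

```python
# Illustrative sketch only; names are assumed, not taken from the dissertation.
import numpy as np
from scipy import stats

def shapiro_on_transforms(x):
    """Shapiro-Wilk p-values for the raw data and common transformations."""
    x = np.asarray(x, dtype=float)
    candidates = {
        "identity": x,
        "log": np.log(x) if (x > 0).all() else None,
        "sqrt": np.sqrt(x) if (x >= 0).all() else None,
        "1/x": 1.0 / x if (x != 0).all() else None,
    }
    return {name: stats.shapiro(t).pvalue
            for name, t in candidates.items() if t is not None}

def bimodality_coefficient(x):
    """BC = (skew^2 + 1) / (excess kurtosis + small-sample correction).
    Values above 5/9 (~0.555) are conventionally read as bimodality."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = stats.skew(x)
    k = stats.kurtosis(x)  # Fisher (excess) kurtosis
    return (g**2 + 1.0) / (k + 3.0 * (n - 1)**2 / ((n - 2) * (n - 3)))
```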
Abstract:
The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs, often labeled nested, hierarchical, or multilevel, and are characterized by the randomization of intact social units or groups rather than individuals. The application of random effects models to group-randomized trials requires the specification of fixed and random components of the model. The underlying assumption is usually that these random components are normally distributed. This research is intended to determine whether the Type I error rate and power are affected when the assumption of normality for the random component representing the group effect is violated. In this study, simulated data are used to examine the Type I error rate, power, bias, and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect possesses distributions with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g., number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with random effects specified in a RANDOM statement and restricted maximum likelihood (REML) estimation. The results from the non-normally distributed data are compared to those obtained from the analysis of data with similar design characteristics but normally distributed random effects. The results suggest that violating the normality assumption for the group component with a skewed or heavy-tailed distribution does not influence the estimation of the fixed effect, the Type I error rate, or power. Negative biases were detected when estimating the sample ICC and increased dramatically in magnitude as the true ICC increased. These biases were less pronounced when the true ICC was within the range observed in most group-randomized trials (i.e., 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05; however, this may be a result of higher correlation within the data.
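The original simulations were run in SAS PROC MIXED; as a rough Python analogue of the same idea (all parameters and names here are illustrative, and a simple one-way ANOVA moment estimator stands in for REML), one can draw the group effect from a centered skewed distribution and check the recovered ICC:

```python
# Hedged sketch of the simulation design; not the study's SAS code.
import numpy as np

rng = np.random.default_rng(0)

def simulate_icc(n_groups=10, n_per_group=50, icc=0.05, skewed=True):
    sigma_b2 = icc          # between-group (school) variance
    sigma_w2 = 1.0 - icc    # within-group variance
    if skewed:
        # centered, scaled exponential -> heavy right skew in the group effect
        b = (rng.gamma(shape=1.0, size=n_groups) - 1.0) * np.sqrt(sigma_b2)
    else:
        b = rng.normal(0.0, np.sqrt(sigma_b2), n_groups)
    y = b[:, None] + rng.normal(0.0, np.sqrt(sigma_w2), (n_groups, n_per_group))
    # one-way ANOVA moment estimator of the ICC (balanced design)
    msb = n_per_group * y.mean(axis=1).var(ddof=1)
    msw = y.var(axis=1, ddof=1).mean()
    return (msb - msw) / (msb + (n_per_group - 1) * msw)

# average recovered ICC over replications, to be compared with the true 0.05
print(np.mean([simulate_icc() for _ in range(1000)]))
```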
Abstract:
Several tests exist for the comparison of different groups in the randomized complete block design. However, there is a lack of robust estimators for the location difference between one group and all the others on the original scale. The relative marginal effects are commonly used in this situation, but they are harder for less experienced users to interpret and apply because of their different scale. In this paper, two nonparametric estimators for the comparison of one group against the others in the randomized complete block design are presented. Theoretical results such as asymptotic normality, consistency, translation invariance, scale preservation, unbiasedness, and median unbiasedness are derived. The finite-sample behavior of these estimators is examined through simulations of different scenarios. In addition, possible confidence intervals based on these estimators are discussed, and their behavior is likewise examined by simulation.
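The paper's own estimators are not reproduced in this abstract; purely as an illustration of what a nonparametric, scale-preserving location estimate in a randomized complete block design can look like, the sketch below takes the median of all within-block differences between the target group and every other group (a Hodges-Lehmann-style analogue, an assumption rather than the authors' method):

```python
# Illustrative analogue only; not the estimators proposed in the paper.
import numpy as np

def block_median_difference(data, target=0):
    """data: (blocks x groups) array of responses on the original scale.
    Returns the median of within-block differences between the target
    group and each of the other groups."""
    data = np.asarray(data, dtype=float)
    others = np.delete(data, target, axis=1)
    diffs = data[:, [target]] - others   # differences within each block
    return np.median(diffs)
```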
Abstract:
Introduction: According to the ecological view, coordination is established by virtue of the social context. Affordances, thought of as situational opportunities to interact, are assumed to represent the guiding principles underlying the decisions involved in interpersonal coordination. It is generally agreed that affordances are not an objective part of the (social) environment but depend on the constructive perception of the subjects involved. Theory and empirical data hold that cognitive operations enabling domain-specific efficacy beliefs are involved in the perception of affordances. The aim of the present study was to test the effects of these cognitive concepts on the subjective construction of local affordances and their influence on decision making in football. Methods: 71 football players (M = 24.3 years, SD = 3.3, 21% women) from different divisions participated in the study. Participants were presented with scenarios of offensive game situations. They were asked to take the perspective of the player on the ball and to indicate where they would pass the ball in each situation. The participants stated their decisions in two conditions with different game scores (1:0 vs. 0:1). The playing fields of all scenarios were then divided into ten zones. For each zone, participants were asked to rate their confidence in being able to pass the ball there (self-efficacy), the likelihood of the group staying in ball possession if the ball were passed into the zone (group-efficacy I), the likelihood of the ball being safely controlled by a team member (pass control / group-efficacy II), and whether a pass would establish a better initial position to attack the opponents' goal (offensive convenience). Answers were reported on visual analog scales ranging from 1 to 10. Data were analyzed by specifying general linear models for binomially distributed data (Mplus). Maximum likelihood with non-normality-robust standard errors was chosen to estimate parameters. Results: Analyses showed that zone- and domain-specific efficacy beliefs significantly affected passing decisions. Because of collinearity with self-efficacy and group-efficacy I, group-efficacy II was excluded from the models to ease interpretation of the results. Generally, zones with high values in the subjective ratings had a higher probability of being chosen as the passing destination (βself-efficacy = 0.133, p < .001, OR = 1.142; βgroup-efficacy I = 0.128, p < .001, OR = 1.137; βoffensive convenience = 0.057, p < .01, OR = 1.059). There were, however, characteristic differences between the two score conditions. While group-efficacy I was the only significant predictor in condition 1 (βgroup-efficacy I = 0.379, p < .001), only self-efficacy and offensive convenience contributed to passing decisions in condition 2 (βself-efficacy = 0.135, p < .01; βoffensive convenience = 0.120, p < .001). Discussion: The results indicate that subjectively distinct attributes projected onto playing-field zones affect passing decisions. The study proposes a probabilistic alternative to Lewin's (1951) hodological and deterministic field theory and offers insight into how dimensions of the psychological landscape afford passing behavior. For a member of a team, this psychological landscape is constituted not only by probabilities referring to the potential and consequences of individual behavior, but also by those referring to the group system of which the individual is part. Hence, in regulating action decisions in group settings, the informational basis extends to aspects of the group level.
References: Lewin, K. (1951). Field theory in social science: Selected theoretical papers (D. Cartwright, Ed.). New York: Harper & Brothers.
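The original analysis was fitted in Mplus with the MLR estimator; a rough Python analogue, assuming a hypothetical long-format data frame with one row per player-by-zone rating (all column names are assumptions), is a binomial logit with player-clustered robust standard errors:

```python
# Sketch of an analogous model, not the study's Mplus specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_passing_model(df: pd.DataFrame):
    """df columns (illustrative): chosen (0/1 zone picked as pass target),
    self_eff, group_eff1, off_conv (zone ratings), player (cluster id)."""
    model = smf.logit("chosen ~ self_eff + group_eff1 + off_conv", data=df)
    res = model.fit(cov_type="cluster", cov_kwds={"groups": df["player"]})
    print(np.exp(res.params))  # odds ratios, comparable to the reported ORs
    return res
```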
Abstract:
BACKGROUND The noble gas xenon is considered a neuroprotective agent, but availability of the gas is limited. Studies on neuroprotection with the abundant noble gases helium and argon have demonstrated mixed results, and data regarding neuroprotection after cardiac arrest are scant. We tested the hypothesis that administration of 50% helium or 50% argon for 24 h after resuscitation from cardiac arrest improves clinical and histological outcome in our 8 min rat cardiac arrest model. METHODS Forty animals had cardiac arrest induced with intravenous potassium/esmolol and were randomized to post-resuscitation ventilation with helium/oxygen, argon/oxygen, or air/oxygen for 24 h. Eight additional animals without cardiac arrest served as a reference; these animals were neither randomized nor included in the statistical analysis. The primary outcome was the assessment of neuronal damage in histology of region CA1 of the hippocampus proper in the animals surviving until day 5. Secondary outcomes were the evaluation of neurobehavior by daily testing with a Neurodeficit Score (NDS), the Tape Removal Test (TRT), a simple vertical pole test (VPT), and the Open Field Test (OFT). Because of the non-parametric distribution of the data, the histological assessments were compared with the Kruskal-Wallis test. The treatment effect in the repeated-measures assessments was estimated with linear regression with clustered robust standard errors (SE), for which normality is less important. RESULTS Twenty-nine of 40 rats survived until day 5, with significant initial deficits in neurobehavioral testing but rapid improvement within all groups randomized to cardiac arrest. There were no statistically significant differences between groups in either the histological or the neurobehavioral assessments. CONCLUSIONS The replacement of air with either helium or argon in a 50:50 air/oxygen mixture for 24 h did not improve histological or clinical outcome in rats subjected to 8 min of cardiac arrest.
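A minimal sketch of the two analyses named in the methods, with made-up data structures and column names (the authors' actual code and software are not specified here): Kruskal-Wallis for the histology scores, and a linear model with rat-clustered robust standard errors for the repeated behavioral measures.

```python
# Illustrative analysis skeleton; column names and groupings are assumed.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def analyze(histo: dict, behav: pd.DataFrame):
    # histo: {"helium": [...], "argon": [...], "air": [...]} CA1 damage scores
    _, p_histo = stats.kruskal(histo["helium"], histo["argon"], histo["air"])
    # behav columns (illustrative): score, group, day, rat_id
    res = smf.ols("score ~ C(group) * day", data=behav).fit(
        cov_type="cluster", cov_kwds={"groups": behav["rat_id"]})
    return p_histo, res
```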
Abstract:
A Monte Carlo simulation was conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), the Randomized Pólya Urn (RPU), the Birth and Death Urn with Immigration (BDUI), and the Drop-the-Loser Urn (DL). Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), the relative risk (ORR), and the odds ratio (OOR), respectively. The log-likelihood ratio test and three Wald-type tests (simple difference, log relative risk, log odds ratio) are compared across the adaptive procedures. Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics under SMLE have better normality and a lower Type I error rate, and the power of hypothesis testing is more comparable with that of equal randomization. Usually, RSIHR has the highest power among the three optimal allocation ratios. However, the ORR allocation has better power and a lower Type I error rate when the log relative risk is the test statistic. The expected number of failures under ORR is smaller than under RSIHR. It is also shown that the simple difference of response rates has the worst normality among the four test statistics, and the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log-likelihood ratio test statistic is robust against changes in the adaptive randomization procedure.
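Of the four urn models, the Randomized Play-the-Winner rule is the simplest to state in code. The sketch below is an illustrative simulation of one RPW trial (success probabilities and the initial urn composition are arbitrary choices, not the study's settings): a success adds a ball for the drawn arm, a failure adds a ball for the other arm.

```python
# Illustrative RPW simulation; parameters are assumptions, not the study's.
import numpy as np

rng = np.random.default_rng(1)

def rpw_trial(n=100, p=(0.7, 0.5), init=(1, 1)):
    """Simulate one RPW trial with n patients and two arms.
    Returns per-arm allocation counts and observed successes."""
    urn = list(init)
    counts, successes = [0, 0], [0, 0]
    for _ in range(n):
        arm = rng.choice(2, p=np.asarray(urn) / sum(urn))  # draw by urn shares
        counts[arm] += 1
        if rng.random() < p[arm]:
            successes[arm] += 1
            urn[arm] += 1          # success rewards the drawn arm
        else:
            urn[1 - arm] += 1      # failure favors the other arm
    return counts, successes

print(rpw_trial())  # allocation drifts toward the better arm on average
```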
Abstract:
In the epidemiology literature, it is often necessary to investigate relationships between means where the levels of the experiment are monotone sets forming a partition of the range of sampling values. The analysis of these group means is generally performed using classical analysis of variance (ANOVA); however, the validity of this practice has not been challenged. In this dissertation, we formulate and present an examination of its validity. First, the classical assumptions of normality and constant variance do not always hold. Second, under the null hypothesis of equal means, the test statistic of the classical ANOVA technique is still valid. Third, when the hypothesis of equal means is rejected, the classical analysis techniques for hypotheses of contrasts are not valid. Fourth, under the alternative hypothesis, we can show that the monotone property of the levels leads to the conclusion that the means are monotone. Fifth, we propose an appropriate method for handling data in this situation.
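The first point above is easy to see numerically: when the "levels" are monotone sets partitioning the range of a continuous variable, the within-level distributions are truncated pieces of the parent distribution, hence non-normal with unequal variances. A small demonstration (the cut points and sample sizes are arbitrary, chosen only for illustration):

```python
# Demonstration that monotone partition levels violate ANOVA assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)
cuts = [-np.inf, -1.0, 0.0, 1.0, np.inf]                 # monotone partition
levels = [x[(x > lo) & (x <= hi)] for lo, hi in zip(cuts[:-1], cuts[1:])]
print([round(l.var(), 3) for l in levels])                # unequal variances
print([round(stats.shapiro(l[:500]).pvalue, 4) for l in levels])  # non-normal
```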
Abstract:
Triglyceride levels are a component of plasma lipids thought to be an important risk factor for coronary heart disease; they are influenced by genetic and environmental factors such as single nucleotide polymorphisms (SNPs), alcohol intake, and smoking. This study used longitudinal data from the Bogalusa Heart Study, a biracial community-based survey of cardiovascular disease risk factors. A sample of 1191 individuals, 4 to 38 years of age, was measured multiple times from 1973 to 2000. The study sample consisted of 730 white and 461 African American participants. Individual growth models were developed to assess gene-environment interactions affecting plasma triglycerides over time. After testing for the inclusion of significant covariates and interactions, final models, each accounting for the effects of a different SNP, were assessed for fit and normality. After adjustment for all other covariates and interactions, LIPC -514C/T was found to interact with age³, age², and age, and a non-significant interaction of CETP -971G/A genotype with smoking status was found (p = 0.0812). Ever-smokers had higher triglyceride levels than never-smokers, but persons heterozygous at this locus, about half of both races, had higher triglyceride levels after smoking cessation compared with current smokers. Since tobacco products increase the free fatty acids circulating in the bloodstream, smoking cessation programs have the potential to reduce triglyceride levels for many persons. However, the effect of smoking cessation on the triglyceride levels of CETP -971G/A heterozygotes also demonstrates the need for smoking prevention programs. Both smoking cessation and prevention programs could have a substantial public health impact by minimizing triglyceride levels and ultimately reducing heart disease.
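The exact model specification is not given in the abstract; as a hedged sketch of an individual growth model in this spirit (polynomial age terms interacting with a SNP genotype, a smoking-by-genotype term, and a random intercept per participant; all column names are hypothetical), one could write:

```python
# Sketch under stated assumptions; not the study's actual model.
import statsmodels.formula.api as smf

def fit_growth_model(df):
    """df columns (illustrative): trig, age, lipc_514, cetp_971, smoker,
    subject_id (grouping variable for repeated measures)."""
    formula = ("trig ~ (age + I(age**2) + I(age**3)) * C(lipc_514) "
               "+ C(smoker) * C(cetp_971)")
    model = smf.mixedlm(formula, data=df, groups=df["subject_id"])
    return model.fit(reml=True)
```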
Abstract:
Background. Research into methods of recovery from exercise-induced fatigue is a popular topic in sports medicine, kinesiology, and physical therapy. However, both the quantity and the quality of studies are lacking, and no clear recovery solution has emerged. An analysis of the statistical methods in the existing literature on performance recovery can enhance the quality of research and provide guidance for future studies. Methods: A literature review was performed using the SCOPUS, SPORTDiscus, MEDLINE, CINAHL, Cochrane Library, and Science Citation Index Expanded databases to extract studies of performance recovery from exercise in humans. Original studies and their statistical analyses were examined for the recovery methods Active Recovery, Cryotherapy/Contrast Therapy, Massage Therapy, Diet/Ergogenics, and Rehydration. Results: The review produced a Research Design and Statistical Method Analysis Summary. Conclusion: Research design and statistical methods can be improved by following this summary table, which lists potential issues and suggested solutions, such as sample size calculation, sport-specific and research design considerations, selection of populations and outcome markers, statistical methods for different analytical requirements, checks of equality of variance and normality of data, post hoc analyses, and effect size calculation.
Abstract:
Interaction effects are of important scientific interest in many areas of research. A common approach to investigating the interaction effect of two continuous covariates on a response variable is a cross-product term in multiple linear regression. In epidemiological studies, the two-way analysis of variance (ANOVA) type of method has also been used to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and statistical validation has focused only on the general method, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on their mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when the interaction effect exists, because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels; therefore, naïve application of the ANOVA method may lead to incorrect conclusions. Given these problems, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. The analytical expression of the interaction effect was derived from the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least squares regression and the ANOVA method in detecting the interaction effect when the data come from a trivariate normal distribution. The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found to be significant in predicting the change from baseline in NIHSS at Month 3 among patients who received t-PA therapy.
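The two ingredients discussed above can be sketched briefly. The dissertation's exact combination statistic is not given in the abstract, so Fisher's method stands in below for the general idea of pooling per-level p-values; all names are hypothetical.

```python
# Sketch of the conventional cross-product test and a p-value combination.
import numpy as np
from scipy import stats
import statsmodels.formula.api as smf

def cross_product_test(df):
    """Conventional approach: cross-product interaction term in regression.
    df columns (illustrative): y, x1, x2."""
    res = smf.ols("y ~ x1 * x2", data=df).fit()
    return res.pvalues["x1:x2"]

def combined_level_test(pvalues):
    """Combine p-values obtained at each discretized covariate level
    (Fisher's method as a stand-in for the proposed procedure)."""
    _, p = stats.combine_pvalues(np.asarray(pvalues), method="fisher")
    return p

print(combined_level_test([0.04, 0.20, 0.01]))
```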
Abstract:
The history of the logistic function since its introduction in 1838 is reviewed, and the logistic model for a polychotomous response variable is presented with a discussion of the assumptions involved in its derivation and use. Following this, the maximum likelihood estimators for the model parameters are derived, along with a Newton-Raphson iterative procedure for their evaluation. A rigorous mathematical derivation of the limiting distribution of the maximum likelihood estimators is then presented using a characteristic function approach. An appendix with theorems, and proofs, on the asymptotic normality of sample sums when the observations are not identically distributed supports the presentation on the asymptotic properties of the maximum likelihood estimators. Finally, two applications of the model are presented using data from the Hypertension Detection and Follow-up Program, a prospective, population-based, randomized trial of treatment for hypertension. The first application compares the risk of five-year mortality from cardiovascular causes with that from noncardiovascular causes; the second compares risk factors for fatal or nonfatal coronary heart disease with those for fatal or nonfatal stroke.
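The Newton-Raphson iteration mentioned above is standard; the sketch below shows it in the simpler binary case for brevity (the abstract treats the polychotomous model, so this is an illustrative reduction, not the dissertation's derivation):

```python
# Newton-Raphson (equivalently, iteratively reweighted least squares)
# for binary logistic regression; binary case shown for brevity.
import numpy as np

def logistic_mle(X, y, iters=25, tol=1e-10):
    """X: (n, p) design matrix (include an intercept column); y: (n,) 0/1."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        W = mu * (1.0 - mu)                    # Bernoulli variance weights
        grad = X.T @ (y - mu)                  # score vector
        hess = X.T @ (X * W[:, None])          # information matrix
        step = np.linalg.solve(hess, grad)
        beta += step
        if np.max(np.abs(step)) < tol:         # converged
            break
    return beta
```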
Abstract:
The Mendoza River feeds the northern oasis, the most important in the province. Urban growth has advanced over originally agricultural areas, surrounding the network of canals and drains, which also receives urban stormwater produced by convective storms. Anthropogenic activity uses the resource for drinking, sanitation, irrigation, recreation, etc., and discharges its surpluses into the network, polluting it. To assess the water quality of this basin, 15 sampling sites were strategically selected: 3 along the river starting at the Cipolletti diversion dam (R_I to R_III), 5 in the canal network (C_I to C_V), and 7 in the drainage collectors (D_I to D_VII). The following physico-chemical and microbiological analyses were performed in the river and the canal network: electrical conductivity, temperature, pH, anions and cations (calculation of the sodium adsorption ratio, SAR), dissolved oxygen (DO), settleable solids, chemical oxygen demand (COD), aerobic mesophilic bacteria (AMB), total and fecal coliforms, and heavy metals. In the drainage network, only the first four were performed. The analytical results were entered into a database and subjected to descriptive and inferential statistical analysis. The latter consisted of applying various tests for possible differences between sampling sites, for each response variable, at α = 0.05. Fixed-effects and random-effects analyses of variance were performed, and the assumptions of homoscedasticity and normality of the errors were tested. Where the assumptions were violated, the Kruskal-Wallis test was used. The following sampling sites were compared with one another: the river sites, R_I versus the canals, and the drains. It was concluded that there is a significant increase in salinity and sodicity at R_II, and that the quality changes between R_II and R_III could be due to the contribution of other waters. Regarding the comparison between the head of the system (R_I) and the canal network, the inputs from urban runoff west of the Cacique Guaymallén canal, together with the discharges from Campo Espejo (detected at C_II), significantly increase the salinity (+55%) and sodicity (+95%) of the water relative to R_I, although the sodicity value remains low. Increases in salinity (+80%), COD (+1159%), and AMB (+2873%), with the expected decrease in DO (-58%), were also found at C_V (Auxiliar Tulumaya canal) relative to R_I, caused by urban inputs (Gran Mendoza) together with the pollutant load of the Pescara canal. Heavy metals did not differ greatly between sampling sites.
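The decision rule described above (ANOVA first, Kruskal-Wallis when the assumptions fail) is easy to sketch; the following is an illustration in Python rather than the authors' software, with assumed data structures:

```python
# Sketch of the assumption-checking workflow; not the study's actual code.
import numpy as np
from scipy import stats

def compare_sites(groups, alpha=0.05):
    """groups: list of 1-D numpy arrays of a response variable, one per site."""
    _, p_anova = stats.f_oneway(*groups)
    residuals = np.concatenate([g - g.mean() for g in groups])
    _, p_norm = stats.shapiro(residuals)       # normality of the errors
    _, p_var = stats.levene(*groups)           # homoscedasticity
    if p_norm < alpha or p_var < alpha:        # assumptions violated
        _, p = stats.kruskal(*groups)
        return "Kruskal-Wallis", p
    return "ANOVA", p_anova
```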
Abstract:
The aim of this work is to investigate the history of female monstrosity in ancient Greek literature in order to recover some archetypal structures of thought that bear on the ancient and modern collective consciousness of the problem of evil: its nature, its reasons, and also its lack of reasons. To this end, I traced the "parallel lives" of three famous femmes fatales of classical mythology who, placed at decisive points of horrific genealogical trees full of 'genetic curses', form a well-defined triptych of "medallions" framed by a fil rouge of uninterrupted monstrosity. Paradigmatic of the ambiguous dialectic between man and woman, good and evil, victim and executioner, normality and deviance, and of the uncontrollable dynamics between crimes and punishments, ancestral fears and the desire for discovery, good and evil demons, the myths of Lamia, Circe, and Empusa highlight the irrational attraction with which, in so rationalist a culture as that of ancient Greece, the female personifications of evil are imagined to move in order to influence human conduct at the main stages of life.