41 results for Multiple-regression Analysis
in DigitalCommons@The Texas Medical Center
Abstract:
The history of the logistic function since its introduction in 1838 is reviewed, and the logistic model for a polychotomous response variable is presented with a discussion of the assumptions involved in its derivation and use. Following this, the maximum likelihood estimators for the model parameters are derived along with a Newton-Raphson iterative procedure for evaluation. A rigorous mathematical derivation of the limiting distribution of the maximum likelihood estimators is then presented using a characteristic function approach. An appendix with theorems on the asymptotic normality of sample sums when the observations are not identically distributed, with proofs, supports the presentation on asymptotic properties of the maximum likelihood estimators. Finally, two applications of the model are presented using data from the Hypertension Detection and Follow-up Program, a prospective, population-based, randomized trial of treatment for hypertension. The first application compares the risk of five-year mortality from cardiovascular causes with that from noncardiovascular causes; the second application compares risk factors for fatal or nonfatal coronary heart disease with those for fatal or nonfatal stroke.
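A minimal sketch of the Newton-Raphson iteration for logistic maximum likelihood estimation follows; for brevity it uses the dichotomous special case with simulated data (the polychotomous model iterates a block version of the same score and information equations) and is illustrative only, not the dissertation's actual procedure.

```python
import numpy as np

# Minimal sketch: Newton-Raphson maximum likelihood estimation for the
# dichotomous logistic model, with simulated data. The polychotomous model
# iterates a block version of the same score and information equations.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 covariates
true_beta = np.array([-0.5, 1.0, -0.8])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))

beta = np.zeros(X.shape[1])                        # starting values
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-(X @ beta)))         # fitted probabilities
    score = X.T @ (y - mu)                         # gradient of the log-likelihood
    info = X.T @ (X * (mu * (1.0 - mu))[:, None])  # Fisher information matrix
    step = np.linalg.solve(info, score)
    beta = beta + step
    if np.max(np.abs(step)) < 1e-8:                # converged on the parameter scale
        break

# Asymptotic (large-sample) standard errors from the inverse information matrix
se = np.sqrt(np.diag(np.linalg.inv(info)))
print("estimates:", beta, "standard errors:", se)
```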
Abstract:
The relative influence of race, income, education, and Food Stamp Program participation/nonparticipation on the food and nutrient intake of 102 fecund women ages 18-45 years in a Florida urban clinic population was assessed using the technique of multiple regression analysis. Study subgroups were defined by race and Food Stamp Program participation status. Education was found to have the greatest influence on food and nutrient intake. Race was the next most influential factor, followed in order by Food Stamp Program participation and income. The combined effect of the four independent variables explained no more than 19 percent of the variance for any of the food and nutrient intake variables. This would indicate that a more complex model of influences is needed if variations in food and nutrient intake are to be fully explained. A socioeconomic questionnaire was administered to investigate other factors of influence. The influence of the mother, frequency and type of restaurant dining, and perceptions of food intake and weight were found to be factors deserving further study. Dietary data were collected using the 24-hour recall and food frequency checklist. Descriptive dietary findings indicated that iron and calcium were nutrients whose adequacy was of concern for all study subgroups. White Food Stamp Program participants had the greatest number of mean nutrient intake values falling below the 1980 Recommended Dietary Allowances (RDAs). When Food Stamp Program participants were contrasted with nonparticipants, mean intakes of six nutrients (kilocalories, calcium, iron, vitamin A, thiamin, and riboflavin) were below the 1980 RDA, compared to five mean nutrient intakes (kilocalories, calcium, iron, thiamin, and riboflavin) for the nonparticipants. Use of the Index of Nutritional Quality (INQ), however, revealed that the quality of the diet of Food Stamp Program participants per 1000 kilocalories was adequate with the exception of calcium and iron. Intakes of these nutrients were also not adequate on a 1000-kilocalorie basis for the nonparticipant group. When mean nutrient intakes of the groups were compared using Student's t-test, oleic acid intake was the only significant difference found. Being a nonparticipant in the Food Stamp Program was found to be associated with more frequent consumption of cookies, sweet rolls, doughnuts, and honey. The findings of this study contradict the negative image of the Food Stamp Program participant and emphasize the importance of education.
Abstract:
Traditional comparison of standardized mortality ratios (SMRs) can be misleading if the age-specific mortality ratios are not homogeneous. For this reason, a regression model has been developed which incorporates the mortality ratio as a function of age. This model is then applied to mortality data from an occupational cohort study. The nature of the occupational data necessitates the investigation of mortality ratios which increase with age. These occupational data are used primarily to illustrate and develop the statistical methodology. The age-specific mortality ratio (MR) for the covariates of interest can be written as $MR_{ij\ldots m} = \mu_{ij\ldots m}/\theta_{ij\ldots m} = r \cdot \exp(Z'_{ij\ldots m}\beta)$, where $\mu_{ij\ldots m}$ and $\theta_{ij\ldots m}$ denote the force of mortality in the study and chosen standard populations in the $ij\ldots m$th stratum, respectively, $r$ is the intercept, $Z_{ij\ldots m}$ is the vector of covariables associated with the $i$th age interval, and $\beta$ is a vector of regression coefficients associated with these covariables. A Newton-Raphson iterative procedure has been used for determining the maximum likelihood estimates of the regression coefficients. This model provides a statistical method for a logical and easily interpretable explanation of an occupational cohort mortality experience. Since it gives a reasonable fit to the mortality data, it can also be concluded that the model is fairly realistic. The traditional statistical method for the analysis of occupational cohort mortality data is to present a summary index such as the SMR under the assumption of constant (homogeneous) age-specific mortality ratios. Since the mortality ratios for occupational groups usually increase with age, the homogeneity assumption of the age-specific mortality ratios is often untenable. The traditional method of comparing SMRs under the homogeneity assumption is a special case of this model, without age as a covariate. This model also provides a statistical technique to evaluate the relative risk between two SMRs or a dose-response relationship among several SMRs. The model presented has application in the medical, demographic, and epidemiologic areas. The methods developed in this thesis are suitable for future analyses of mortality or morbidity data when the age-specific mortality/morbidity experience is a function of age or when an interaction effect between confounding variables needs to be evaluated.
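As a hedged illustration of how a model of this form can be fit with standard software (not the thesis's own Newton-Raphson implementation), the sketch below treats the observed stratum deaths as Poisson counts and enters the log of the expected deaths as an offset; the stratum counts are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative sketch: a mortality-ratio model MR = r * exp(Z'beta) is
# equivalent to a Poisson regression for the observed deaths with
# log(expected deaths) as an offset:
#   log E[observed_i] = log(expected_i) + log(r) + beta * age_i
observed = np.array([3.0, 7.0, 15.0, 28.0, 40.0])     # deaths in the study cohort (hypothetical)
expected = np.array([2.1, 4.0, 7.9, 13.5, 18.2])      # deaths expected from the standard population
age_mid  = np.array([32.5, 42.5, 52.5, 62.5, 72.5])   # age-stratum midpoints

age_c = age_mid - age_mid.mean()                      # center age so exp(intercept) = MR at mean age
X = sm.add_constant(age_c)
fit = sm.GLM(observed, X, family=sm.families.Poisson(),
             offset=np.log(expected)).fit()

print(fit.params)               # [log r, beta]
print(np.exp(fit.params[0]))    # r: mortality ratio at the mean age
print(np.exp(fit.params[1]))    # multiplicative change in the MR per year of age
# Dropping age from X recovers the traditional constant-SMR (homogeneity) model.
```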
Abstract:
Hepatitis B virus (HBV) is a significant cause of liver diseases and related complications worldwide. Both injecting and non-injecting drug users are at increased risk of contracting HBV infection. Scientific evidence suggests that drug users have a subnormal response to HBV vaccination and that seroprotection rates are lower than in the general population, potentially due to vaccine factors, host factors, or both. The purpose of this systematic review is to examine the rates of seroprotection following HBV vaccination in drug-using populations and to conduct a meta-analysis to identify the factors associated with varying seroprotection rates. Seroprotection is defined as developing an anti-HBs antibody level of ≥ 10 mIU/ml after receiving the HBV vaccine. Original research articles were searched using online databases and reference lists of shortlisted articles. HBV vaccine intervention studies reporting seroprotection rates in drug users and published in English during or after 1989 were eligible. Out of 235 citations reviewed, 11 studies were included in this review. The reported seroprotection rates ranged from 54.5% to 97.1%. Combination vaccine (HAV and HBV) (risk ratio 12.91, 95% CI 2.98-55.86, p = 0.003), measurement of anti-HBs with microparticle immunoassay (risk ratio 3.46, 95% CI 1.11-10.81, p = 0.035), and anti-HBs antibody measurement at 2 months after the last HBV vaccine dose (RR 4.11, 95% CI 1.55-10.89, p = 0.009) were significantly associated with higher seroprotection rates. Although statistically nonsignificant, the variables mean age > 30 years, higher prevalence of anti-HBc antibody and anti-HIV antibody in the sample population, and current drug use (not in drug rehabilitation treatment) were strongly associated with decreased seroprotection rates. Proportion of injecting drug users, vaccine dose, and accelerated vaccine schedule were not predictors of heterogeneity across studies. Studies examined in this review were significantly heterogeneous (Q = 180.850, p < 0.001), and the factors identified should be considered when comparing immune response across studies. The combination vaccine showed promising results; however, its effectiveness compared to the standard HBV vaccine needs to be examined systematically. Immune response in drug users can possibly be improved by the use of bivalent vaccines, booster doses, and improving vaccine completion rates through integrated public programs and incentives.
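The pooled risk ratios and the heterogeneity statistic Q reported above come from standard inverse-variance meta-analysis arithmetic; the sketch below shows that arithmetic on hypothetical study-level estimates, not the review's data.

```python
import numpy as np
from scipy import stats

# Minimal sketch of fixed-effect, inverse-variance pooling of study-level
# risk ratios plus Cochran's Q heterogeneity test. All numbers are hypothetical.
rr      = np.array([1.8, 3.4, 2.2, 4.1])      # study risk ratios
ci_low  = np.array([0.9, 1.1, 1.0, 1.5])      # lower 95% confidence limits
ci_high = np.array([3.6, 10.5, 4.8, 11.2])    # upper 95% confidence limits

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE recovered from the CI width
w = 1.0 / se**2                                       # inverse-variance weights

pooled_log = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print("pooled RR:", np.exp(pooled_log),
      "95% CI:", np.exp(pooled_log - 1.96 * pooled_se), np.exp(pooled_log + 1.96 * pooled_se))

# Cochran's Q with k - 1 degrees of freedom
Q = np.sum(w * (log_rr - pooled_log) ** 2)
print("Q:", Q, "p:", stats.chi2.sf(Q, df=len(rr) - 1))
```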
Abstract:
Background. Accurate measurement of attitudes toward participation in cancer treatment trials (CTs) and cancer prevention trials (CPTs) across varied groups could assist health researchers and educators when addressing attitudinal barriers to participation in these trials. Methods. The Attitudes toward Cancer Trials Scales (ACTS) instrument development was based on a conceptual model developed from research literature, clinical practice experience, and empirical testing of items with a sample of 312 respondents. The ACTS contains two scales, the Cancer Trials (CT) scale (4 components; 18 items) and the Cancer Prevention Trials (CPT) scale (3 components; 16 items). Cronbach's alpha values for the CT and CPT scales, respectively, were 0.86 and 0.89. These two scales, along with sociodemographic and cancer trial history variables, were distributed in a mail survey of former patients of a large cancer research center. The disproportionate stratified probability sampling procedure yielded 925 usable responses (54% response rate). Results. Prevalence of favorable attitudes toward CTs and CPTs was 66% and 69%, respectively. There were no significant differences in mean scale scores by cancer site or gender, but African Americans had more favorable attitudes toward CTs than European Americans. Multiple regression analysis indicated that older age, lower education level, and prior CT participation history were associated with more favorable attitudes toward CTs. Prior CT participation and prior CPT participation were associated with more favorable attitudes toward CPTs. Results also provided evidence of reliability and construct validity for both scales. Conclusions. Middle age, higher education, and European American ethnicity are associated with less positive attitudes about participating in cancer treatment trials. Availability of a psychometrically sound instrument to measure attitudes may facilitate a better understanding of decision making regarding participation in CTs and CPTs. It is this author's intention that the ACTS scales will be used by other investigators to measure attitudes toward CTs and CPTs in various groups of persons, and that the many issues regarding participation in trials might become more explicit.
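Cronbach's alpha, reported above for both scales, is the ratio-of-variances reliability coefficient sketched below; the item responses are simulated for illustration, not ACTS data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n respondents x k items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for an 18-item scale, 312 respondents
rng = np.random.default_rng(1)
latent = rng.normal(size=(312, 1))
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(312, 18))), 1, 5)
print(cronbach_alpha(responses))
```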
Abstract:
Background. The increasing emphasis on medical outcomes and cost containment has made it imperative to identify patient populations in which aggressive nutritional care can improve quality of care. The aim of this prospective study was to implement a standardized early jejunal feeding protocol for patients undergoing small and large bowel resection, and to evaluate its effect on patient outcome and cost. Methods. Treatment patients (n = 81) who met protocol inclusion criteria had a jejunal feeding tube inserted at the time of surgery. Feeding was initiated at 10 cc/hour within 12 hours after bowel resection and progressed if hemodynamically stable. The control group (n = 159) received usual care. Outcome measures included postoperative length of stay, total direct cost, nosocomial infection rate and health status (SF-36) scores. Results. By postoperative day 4, the use of total parenteral nutrition (TPN) was significantly greater in the control group compared to the treatment group; however, total nutritional intake was significantly less. Multiple regression analysis indicated an increased likelihood of infection with the use of TPN. A reduction of 3.5 postoperative days (p = .013) with 4.3 fewer TPN days per patient (p = .001) and a 9.6% reduction in infection rate (p = .042) was demonstrated in the treatment group. There was no difference in health status scores between groups at discharge and 3 months post-discharge. Conclusion. These positive outcomes and an average total cost savings of $4,145 per treatment patient indicate that the treatment protocol was effective.
Abstract:
Objective. This study examines post-crisis family stress, coping, communication, and adaptation using the Double ABC-X Model of Family Adaptation in families with a pregnant or postpartum adolescent living at home. Methods. Ninety-eight pregnant and parenting adolescents between ages 14 and 18 years (Group 1 at 20 or more weeks gestation; Group 2 at delivery and 8 weeks postpartum) and their parent(s) completed instruments congruent with the model to measure family stress, coping, communication, and adaptation. Descriptive family data were obtained. Mother-daughter data were analyzed for differences between subjects and within subjects using paired t-tests. Correlational analysis was used to examine relationships among variables. Results. More than 90% of families were Hispanic. There were no significant differences between mother and daughter mean scores for family stress or communication. Adolescent coping was not significantly correlated with family coping at any interval. Adolescent family adaptation scores were significantly lower than mothers' scores at delivery and 8 weeks postpartum. Mean individual ratings of family variables did not differ significantly between delivery and 8 weeks postpartum. Simultaneous multiple regression analysis showed that stress, coping, and communication significantly influenced adaptation for mothers and daughters at all three intervals. The relative contributions of the three independent variables exhibited different patterns for mothers and daughters. Parent-adolescent communication accounted for most of the variability in adaptation for daughters at all three intervals. Daughters' family stress ratings were significant for adaptability (p = .01) during the pregnancy and for cohesion (p = .03) at delivery. Adolescent coping (p = .03) was significant for cohesion at 8 weeks postpartum. Family stress was a significant influence at all three intervals for mothers' ratings of family adaptation. Parent-adolescent communication was significant for mothers' perception of both family cohesion (p < .001) and adaptability (p < .001) at delivery and 8 weeks, but not during pregnancy. Conclusions. Mothers' and daughters' ratings of family processes were similar regarding family stress and communication, but were significantly different for family adaptation. Adolescent coping may not reflect family coping. Family communication is a powerful component in family functioning and may be an important focus for interventions with adolescents and parents.
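The within-subject comparisons described above rest on the paired t-test; a minimal sketch with hypothetical mother-daughter adaptation scores follows.

```python
import numpy as np
from scipy import stats

# Hypothetical mother and daughter family-adaptation scores for the same
# 98 families; the paired t-test compares the within-family differences.
rng = np.random.default_rng(2)
mother = rng.normal(loc=150, scale=20, size=98)
daughter = mother - rng.normal(loc=8, scale=15, size=98)  # daughters rate lower on average

t_stat, p_value = stats.ttest_rel(mother, daughter)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, mean difference = {np.mean(mother - daughter):.1f}")
```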
Abstract:
Background/significance. The scarcity of reliable and valid Spanish-language instruments for health-related research has hindered research with the Hispanic population. Research suggests that fatalistic attitudes are related to poor cancer screening behaviors and may be one reason for low participation of Mexican-Americans in cancer screening. This problem is of major concern because Mexican-Americans constitute the largest Hispanic subgroup in the U.S. Purpose. The purposes of this study were: (1) to translate the Powe Fatalism Inventory (PFI) into Spanish and culturally adapt the instrument to the Mexican-American culture as found along the U.S.-Mexico border, and (2) to test the equivalence between the Spanish translated, culturally adapted version of the PFI and the English version of the PFI with respect to clarity, content validity, reading level, and reliability. Design. Descriptive, cross-sectional. Methods. The Spanish language translation used a translation model which incorporates a cultural adaptation process. The SPFI was administered to 175 bilingual participants residing in a midsize U.S.-Mexico border city. Data analysis included estimation of Cronbach's alpha, factor analysis, paired-samples t-test comparison, and multiple regression analysis using SPSS software, as well as measurement of content validity and reading level of the SPFI. Findings. The reliability estimate using Cronbach's alpha coefficient was 0.81 for the SPFI compared to 0.80 for the PFI in this study. Factor analysis extracted four factors which explained 59% of the variance. Paired t-test comparison revealed no statistically significant differences between the SPFI and PFI total or individual item scores. The Content Validity Index was determined to be 1.0. Reading level was assessed to be below the 6th-grade level. The correlation coefficient between the SPFI and PFI was 0.95. Conclusions. This study provided strong psychometric evidence that the Spanish translated, culturally adapted SPFI is an equivalent tool to the English version of the PFI in measuring cancer fatalism. This indicates that the two forms of the instrument can be used interchangeably in a single study to accommodate the reading and speaking abilities of respondents.
Abstract:
The main objective of this study was to attempt to develop some indicators for measuring the food safety status of a country. A conceptual model was put forth by the investigator. The assumption was that food safety status is multifactorially influenced by medico-health levels, food-nutrition programs, and consumer protection activities; all of these, in turn, depend upon the socio-economic status of the country. Twenty-six indicators were reviewed and examined. Seventeen were first screened, and three were finally selected by stepwise multiple regression analysis to reflect the food safety status. Sixty-one countries/areas were included in this study. The three indicators were life expectancy at birth (multiple correlation coefficient R² = 34.62%), adult literacy rate (R² = 29.66%), and child mortality rate for ages 1-4 (R² = 9.99%). Together they showed a cumulative R² of 57.79%.
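A stepwise (here, forward) multiple regression of the kind described above adds at each step the indicator that most increases R²; the sketch below uses simulated country-level data, not the study's indicators.

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 from an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def forward_select(X: np.ndarray, y: np.ndarray, n_keep: int):
    """Greedy forward selection: add the indicator giving the largest R^2 gain."""
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(n_keep):
        best_r2, best_j = max((r_squared(X[:, chosen + [j]], y), j) for j in remaining)
        chosen.append(best_j)
        remaining.remove(best_j)
        print(f"added column {best_j}: cumulative R^2 = {best_r2:.3f}")
    return chosen

# Hypothetical data: 17 candidate indicators for 61 countries/areas and a
# food-safety index as the response.
rng = np.random.default_rng(3)
X = rng.normal(size=(61, 17))
y = 0.6 * X[:, 0] + 0.5 * X[:, 3] + 0.3 * X[:, 9] + rng.normal(scale=0.8, size=61)
forward_select(X, y, n_keep=3)
```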
Abstract:
The purpose of this study was to assess the impact of the Arkansas Long-Term Care Demonstration Project upon Arkansas' Medicaid expenditures and upon the clients it serves. A retrospective Medicaid expenditure study component used analysis of variance techniques to test for the Project's effects upon aggregated expenditures for 28 demonstration and control counties representing 25 percent of the State's population over four years, 1979-1982. A second approach to the study question utilized a 1982 prospective sample of 458 demonstration and control clients from the same 28 counties. The disability level or need for care of each patient was established a priori. The extent to which an individual's variation in Medicaid utilization and costs was explained by patient need, the presence or absence of the channeling project's placement decision, or some other patient characteristic was examined by multiple regression analysis. Long-term and acute care Medicaid, Medicare, third-party, self-pay, and the grand total of all Medicaid claims were analyzed for project effects and explanatory relationships. The main project effect was to increase personal care costs without reducing nursing home or acute care costs (prospective study). Expansion of clients appeared to occur in personal care (prospective study) and minimum-care nursing home (retrospective study) for the project areas. Cost-shifting between Medicaid and Medicare in the project areas and two different patterns of utilization in the North and South projects tended to offset each other, such that no differences in total costs occurred between the demonstration and control areas. The project was significant (β = .22, p < .001) only for personal care costs. The explanatory power of this personal care regression model (R² = .36) was comparable to other reported health services utilization models. Other variables (Medicare buy-in, level of disability, Social Security Supplemental Income (SSI), net monthly income, North/South areas, and age) explained more variation in the other twelve cost regression models.
Abstract:
The National Health Planning and Resources Development Act of 1974 (Public Law 93-641) requires that health systems agencies (HSAs) plan for their health service areas by the use of existing data to the maximum extent practicable. Health planning is based on the identification of health needs; however, HSAs are, at present, identifying health needs in their service areas only in approximate terms. This lack of specificity has greatly reduced the effectiveness of health planning. The intent of this study is, therefore, to explore the feasibility of predicting community levels of hospitalized morbidity by diagnosis with the use of existing data so as to allow health planners to plan for the services associated with specific diagnoses. The specific objectives of this study are (a) to obtain by means of multiple regression analysis a prediction equation for hospital admission by diagnosis, i.e., to select the variables that are related to demand for hospital admissions; (b) to examine how pertinent the selected variables are; and (c) to see whether each equation obtained predicts well for health service areas. The existing data on hospital admissions by diagnosis are those collected from the National Hospital Discharge Surveys, and are available in a form aggregated to the nine census divisions. When the equations established with such data are applied to local health service areas for prediction, the application is subject to the criticism of the theory of ecological fallacy. Since HSAs have to rely on the availability of existing data, it is imperative to examine whether or not the theory of ecological fallacy holds true in this case. The results of the study show that the equations established are highly significant and that the independent variables in the equations explain the variation in the demand for hospital admission well. The predictability of these equations is good when they are applied to areas at the same ecological level but becomes poor, predominantly due to ecological fallacy, when they are applied to health service areas. It is concluded that HSAs cannot predict hospital admissions by diagnosis without the primary data collection discouraged by Public Law 93-641.
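The ecological-fallacy concern above can be illustrated by fitting a prediction equation on data aggregated to a few large regions and then applying it to many small areas; the sketch below uses simulated data and is only a schematic of the idea, not the study's equations.

```python
import numpy as np

# Schematic of the ecological-fallacy check: fit on region-level aggregates,
# then apply the equation to small areas and compare prediction error at
# each level. All numbers are simulated for illustration.
rng = np.random.default_rng(4)

# Small-area data: admissions per 1,000 driven by age and income mix
n_areas = 200
age65 = rng.uniform(0.08, 0.25, n_areas)       # share of population 65+
income = rng.normal(30, 8, n_areas)            # median income, $1,000s
admits = 40 + 300 * age65 - 0.5 * income + rng.normal(0, 5, n_areas)

# Aggregate the areas into 9 "census divisions" by averaging
division = np.arange(n_areas) % 9
agg = lambda v: np.array([v[division == d].mean() for d in range(9)])
Xd = np.column_stack([np.ones(9), agg(age65), agg(income)])
beta, *_ = np.linalg.lstsq(Xd, agg(admits), rcond=None)

# Apply the division-level equation back to the small areas
Xa = np.column_stack([np.ones(n_areas), age65, income])
rmse_division = np.sqrt(np.mean((Xd @ beta - agg(admits)) ** 2))
rmse_area = np.sqrt(np.mean((Xa @ beta - admits) ** 2))
print(f"division-level RMSE: {rmse_division:.2f}, area-level RMSE: {rmse_area:.2f}")
```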
Abstract:
The efficacy of waste stabilization lagoons for the treatment of five priority pollutants and two widely used commercial compounds was evaluated in laboratory model ponds. Three ponds were designed to simulate a primary anaerobic lagoon, a secondary facultative lagoon, and a tertiary aerobic lagoon. Biodegradation, volatilization, and sorption losses were quantified for bis(2-chloroethyl) ether, benzene, toluene, naphthalene, phenanthrene, ethylene glycol, and ethylene glycol monoethyl ether. A statistical model using a log normal transformation indicated biodegradation of bis(2-chloroethyl) ether followed first-order kinetics. Additionally, multiple regression analysis indicated biochemical oxygen demand was the water quality variable most highly correlated with bis(2-chloroethyl) ether effluent concentration.
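First-order kinetics with a log transformation, as described above, reduces to a linear regression of log concentration on time; the sketch below uses hypothetical concentrations, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Under first-order kinetics C(t) = C0 * exp(-k t), ln C is linear in t,
# so a log transformation followed by simple linear regression recovers k.
t = np.array([0, 2, 4, 8, 12, 16, 20], dtype=float)           # days (hypothetical)
conc = np.array([100, 74, 57, 31, 18, 10, 5.6], dtype=float)  # ug/L (hypothetical)

slope, intercept, r_value, p_value, stderr = stats.linregress(t, np.log(conc))
k = -slope
half_life = np.log(2) / k
print(f"k = {k:.3f} per day, half-life = {half_life:.1f} days, r^2 = {r_value**2:.3f}")
```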
Abstract:
The purpose of this study is to investigate the effects of predictor variable correlations and patterns of missingness with dichotomous and/or continuous data in small samples when missing data is multiply imputed. Missing data of predictor variables is multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data and the general location model for mixed dichotomous and continuous data. Subsequent to the multiple imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data and patterns of missing data. The distributional properties of average mean, variance and correlations among the predictor variables are assessed after the multiple imputation process. For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 resulted in more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, which in part is due to the sparseness of the data. The correlation structure for the predictor variables is not well retained on multiply-imputed data from small samples with more than 50% missing data with this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data. With all data types, a fully-observed variable included with variables subject to missingness in the multiple imputation process and subsequent statistical analysis provided liberal (larger than nominal values) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
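A much-simplified sketch of this kind of simulation is shown below: a continuous predictor with 30% of values missing completely at random is multiply imputed from a normal regression model (a simplification of the multivariate normal and general location machinery used in the study), a logistic regression is refit on each completed dataset, estimates are pooled with Rubin's rules, and the rejection rate of a truly null coefficient estimates the Type I error. All parameters are illustrative.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, m, reps, alpha = 100, 5, 200, 0.05
rejections = 0

for _ in range(reps):
    x2 = rng.normal(size=n)                            # fully observed predictor
    x1 = 0.5 * x2 + rng.normal(size=n)                 # predictor subject to missingness
    y = rng.binomial(1, 1 / (1 + np.exp(-0.3 * x2)))   # x1 truly has a null effect
    miss = rng.random(n) < 0.3                         # 30% missing completely at random
    cc = ~miss

    # Simplified imputation model: regression of x1 on x2 among complete cases
    slope, intercept, *_ = stats.linregress(x2[cc], x1[cc])
    resid_sd = np.std(x1[cc] - (intercept + slope * x2[cc]), ddof=2)

    est, var = [], []
    for _ in range(m):
        x1_imp = x1.copy()                             # keep observed values
        x1_imp[miss] = intercept + slope * x2[miss] + rng.normal(0, resid_sd, miss.sum())
        X = sm.add_constant(np.column_stack([x1_imp, x2]))
        fit = sm.Logit(y, X).fit(disp=0)
        est.append(fit.params[1])
        var.append(fit.bse[1] ** 2)

    # Rubin's rules: total variance = within + (1 + 1/m) * between
    qbar, ubar, b = np.mean(est), np.mean(var), np.var(est, ddof=1)
    z = qbar / np.sqrt(ubar + (1 + 1 / m) * b)
    if abs(z) > stats.norm.ppf(1 - alpha / 2):         # normal reference, for brevity
        rejections += 1

print("estimated Type I error rate:", rejections / reps)
```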
Abstract:
We conducted a nested case-control study to determine the significant risk factors for developing encephalitis from West Nile virus (WNV) infection. The purpose of this research project was to expand the previously published Houston study of 2002–2004 patients to include data on Houston patients from four additional years (2005–2008) and to determine whether the larger sample size revealed any differences in the risk factors associated with developing the more severe outcomes of WNV infection, encephalitis and death. A re-analysis of the risk factors for encephalitis and death was conducted on all of the patients from 2002–2008 and was the focus of this research. This analysis showed that, with the increased sample size, there are differences in the risk factors for encephalitis and death. Retrospective medical chart reviews were completed for the 265 confirmed WNV hospitalized patients; 153 patients had encephalitis (WNE), 112 had either viral syndrome with fever (WNF) or meningitis (WNM); a total of 22 patients died. Univariate logistic regression analyses of demographic, comorbidity, and social risk factors were conducted in a similar manner as in the previously conducted study to determine the risk factors for developing encephalitis from WNV. A multivariate model was developed by using model-building strategies for the multivariate logistic regression analysis. The hypothesis of this study was that additional risk factors would be shown to be significant with the increase in sample size of the dataset. This analysis, with a greater sample size and increased power, supports the hypothesis: additional risk factors were shown to be statistically associated with the more severe outcomes of WNV infection (WNE or death). Based on univariate logistic regression results, these data showed that although age 20–44 years had a statistically significant protective effect against developing WNE in the original study, it was not significant in the expanded sample. This study identified chronic alcohol abuse as a significant WNE risk factor, although it was not significant in the original analysis. Other WNE risk factors that were significant in this analysis but not in the original analysis were cancer not in remission > 5 years, history of stroke, and chronic renal disease. When comparing the two analyses with death as an outcome, two risk factors that were significant in the original analysis but not in the expanded dataset analysis were diabetes mellitus and immunosuppression. Three risk factors that were significant in this expanded analysis but not in the original study were illicit drug use, heroin or opiate use, and injection drug use. However, in the multiple logistic regression models, the same independent risk factors for developing encephalitis, age and history of hypertension (including drug-induced hypertension), were consistent in both studies.
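The univariate-then-multivariate logistic regression strategy described above can be sketched as follows on simulated patient-level data (not the Houston WNV dataset); the variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated patient-level data with a binary outcome (encephalitis vs. not)
# and three candidate risk factors; names and effects are illustrative only.
rng = np.random.default_rng(6)
n = 265
df = pd.DataFrame({
    "age_65plus": rng.binomial(1, 0.35, n),
    "hypertension": rng.binomial(1, 0.5, n),
    "chronic_alcohol": rng.binomial(1, 0.15, n),
})
logit_p = -1.0 + 0.9 * df["age_65plus"] + 0.7 * df["hypertension"]
df["encephalitis"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Univariate screen: one logistic model per candidate risk factor
for col in ["age_65plus", "hypertension", "chronic_alcohol"]:
    X = sm.add_constant(df[[col]])
    fit = sm.Logit(df["encephalitis"], X).fit(disp=0)
    print(f"{col}: OR = {np.exp(fit.params[col]):.2f}, p = {fit.pvalues[col]:.3f}")

# Multivariate model with the retained covariates
X = sm.add_constant(df[["age_65plus", "hypertension", "chronic_alcohol"]])
fit = sm.Logit(df["encephalitis"], X).fit(disp=0)
print(np.exp(fit.params))   # adjusted odds ratios
```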
Abstract:
The ascertainment and analysis of adverse reactions to investigational agents presents a significant challenge because of the infrequency of these events, their subjective nature, and the low priority of safety evaluations in many clinical trials. A one-year review of antibiotic trials published in medical journals demonstrates the lack of standards in identifying and reporting these potentially fatal conditions. This review also illustrates the low probability of observing and detecting rare events in typical clinical trials, which include fewer than 300 subjects. Uniform standards for ascertainment and reporting are suggested, including operational definitions of study subjects. Meta-analysis of selected antibiotic trials using multivariate regression analysis indicates that meaningful conclusions may be drawn from data from multiple studies that are pooled in a scientifically rigorous manner.