24 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS
in DigitalCommons@The Texas Medical Center
Abstract:
This paper reports a comparison of three modeling strategies for the analysis of hospital mortality in a sample of general medicine inpatients in a Department of Veterans Affairs medical center. Logistic regression, a Markov chain model, and longitudinal logistic regression were evaluated on predictive performance, as measured by the c-index, and on the accuracy of expected versus observed numbers of deaths. The logistic regression used patient information collected at admission; the Markov model comprised two absorbing states, for discharge and death, and three transient states reflecting increasing severity of illness as measured by laboratory data collected during the hospital stay; the longitudinal regression employed Generalized Estimating Equations (GEE) to model the covariance structure of the repeated binary outcome. Results showed that the logistic regression predicted hospital mortality as well as the alternative methods did, but was limited in its scope of application. The Markov chain provides insight into how day-to-day changes in illness severity lead to discharge or death. The longitudinal logistic regression showed that an increasing illness trajectory is associated with hospital mortality. The conclusion is that for standard applications in modeling hospital mortality, logistic regression is adequate, but for the new challenges facing health services research today, the alternative methods are equally predictive, practical, and can provide new insights.
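The Markov formulation above, three transient severity states feeding two absorbing states (discharge and death), can be sketched numerically. The daily transition probabilities below are illustrative placeholders, not estimates from the paper; the eventual outcome probabilities follow from the standard absorbing-chain identity B = (I - Q)^-1 R.

```python
# Absorbing Markov chain for in-hospital illness severity: three transient
# states (mild, moderate, severe) and two absorbing states (discharge, death).
# All transition probabilities here are illustrative placeholders, not
# estimates from the study.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Daily transition probabilities among transient states (mild, moderate, severe)...
Q = [[0.70, 0.15, 0.00],
     [0.20, 0.55, 0.15],
     [0.00, 0.25, 0.55]]
# ...and from each transient state into the absorbing states (discharge, death).
R = [[0.15, 0.00],
     [0.05, 0.05],
     [0.05, 0.15]]

# Eventual absorption probabilities: B = (I - Q)^-1 R, one column per outcome.
I_minus_Q = [[(i == j) - Q[i][j] for j in range(3)] for i in range(3)]
p_discharge = solve(I_minus_Q, [row[0] for row in R])
p_death = solve(I_minus_Q, [row[1] for row in R])

for name, pc, pd in zip(("mild", "moderate", "severe"), p_discharge, p_death):
    print(f"{name:8s} P(discharge)={pc:.4f}  P(death)={pd:.4f}")
```

Each row of Q and R together carries the day's total transition mass, so the discharge and death probabilities sum to one from every starting state, a handy internal check on any fitted version of such a model.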
Abstract:
Despite the availability of hepatitis B vaccine for over two decades, drug users and other high-risk adult populations have experienced low vaccine coverage. Poor compliance has limited efforts to reduce transmission of hepatitis B infection in this population. Evidence suggests that the immunological response in drug users is impaired compared to the general population, in terms of both lower seroprotection rates and lower antibody levels.

The current study investigated the effectiveness of the multi-dose hepatitis B vaccine and compared the effects of the standard and accelerated vaccine schedules in a not-in-treatment, drug-using adult population in Houston, USA.

A population of drug users from two Houston communities, susceptible to hepatitis B, was sampled by outreach workers and referral methodology. Subjects were randomized to either the standard hepatitis B vaccine schedule (0, 1, 6 months) or an accelerated schedule (0, 1, 2 months). Antibody levels were measured by laboratory analysis at various time points. Participants were followed for two years, and seroconversion rates were calculated to determine immune response.

A four-percentage-point difference in overall compliance was observed between the standard (73%) and accelerated (77%) schedules. Logistic regression analyses showed that drug users living on the streets were twice as likely not to complete all three vaccine doses (p=0.028), and current speedball use was also associated with non-completion (p=0.002). In the multivariate analysis, completion of all three vaccinations was also correlated with older age. Drug users on the accelerated schedule were 26% more likely to achieve completion, although this factor was only marginally significant (p=0.085).

Cumulative adequate protective response was attained by 65% of the HBV-susceptible subgroup by 12 months and was identical for the standard and accelerated schedules. Excess protective response (≥100 mIU/mL) occurred more frequently at the later time point for the standard schedule (36% at 12 months compared to 14% at six months), while the greater proportion of excess protective response for the accelerated schedule occurred earlier (34% at 6 months compared to 18% at 12 months). Seroconversion at the adequate protective response level of 10 mIU/mL was reached more quickly by the accelerated-schedule group (62% vs. 49%), and with a higher mean titer (104.8 vs. 64.3 mIU/mL), when measured at six months. Multivariate analyses indicated a 63% increased risk of non-response with older age and confirmed an accelerating decline in immune response to vaccination after age 40 (p=0.001). Injecting more than daily was also strongly associated with the risk of non-response (p=0.016).

The substantial increase in the seroprotection rate at six months may be worth the trade-off against the faster decline in antibody titer, and the accelerated schedule is recommended for enhancing compliance and seroconversion. Using the accelerated schedule with the primary objective of increasing compliance and seroconversion rates during the six months after the first dose may confer early protective immunity and reduce the HBV vulnerability of drug users who continue, or have recently initiated, high-risk drug use and sexual behaviors.
Abstract:
Chronic β-blocker treatment improves survival and left ventricular ejection fraction (LVEF) in patients with systolic heart failure (HF). Whether the improvement in LVEF after β-blocker therapy is sustained over the long term, or whether there is a loss in LVEF after an initial gain, is not known. Our study sought to determine the prevalence and prognostic role of a secondary decline in LVEF in chronic systolic HF patients on β-blocker therapy and to characterize these patients. A retrospective chart review of HF hospitalizations fulfilling the Framingham criteria was performed at the MEDVAMC between April 2000 and June 2006. Follow-up vital status and recurrent hospitalizations were ascertained through May 2010. Three groups of patients were identified based on LVEF response to β-blockers: group A, with a secondary decline in LVEF following an initial increase; group B, with a progressive increase in LVEF; and group C, with a progressive decline in LVEF. Covariate-adjusted Cox proportional hazards models were used to examine differences in HF re-hospitalizations and all-cause mortality between the groups. Twenty-five percent (n=27) of patients had a secondary decline in LVEF following an initial gain. The baseline, peak, and final LVEF in this group were 27.6±12%, 40.1±14%, and 27.4±13%, respectively. The mean nadir LVEF after decline was 27.4±13%, and this decline occurred at a mean interval of 2.8±1.9 years from the day of β-blocker initiation. These patients were older, more likely to be white, and more often had advanced heart failure (NYHA class III/IV) of non-ischemic etiology compared to groups B and C. They were also more likely to be treated with metoprolol (p=0.03) than the other two groups. No significant difference was observed in the combined risk of all-cause mortality and HF re-hospitalization (hazard ratio 0.80, 95% CI 0.47 to 1.38, p=0.42), and no significant difference was observed in survival estimates between the groups. In conclusion, a late decline in LVEF does occur in a significant proportion of heart failure patients treated with β-blockers, more so in patients treated with metoprolol.
Abstract:
The standard analyses of survival data involve the assumption that survival and censoring are independent. When censoring and survival are related, the phenomenon is known as informative censoring. This paper examines the effects of an informative censoring assumption on the hazard function and on the estimated hazard ratio provided by the Cox model.

The limiting factor in all analyses of informative censoring is the problem of non-identifiability: from the observed data alone, it is impossible to distinguish a situation in which censoring and death are independent from one in which they are dependent. It remains possible, however, that informative censoring occurs. A review of the literature indicates how others have approached the problem and covers the relevant theoretical background.

Three models are examined in detail. The first uses conditionally independent marginal hazards to obtain the unconditional survival function and hazards. The second is based on the Gumbel Type A method for combining independent marginal distributions into bivariate distributions using a dependency parameter. Finally, a formulation based on a compartmental model is presented and its results described. For the latter two approaches, the resulting hazard is used in the Cox model in a simulation study.

The unconditional survival distribution formed from the first model involves dependency, but the crude hazard resulting from this unconditional distribution is identical to the marginal hazard, and inferences based on the hazard are valid. The hazard ratios formed from two distributions following the Gumbel Type A model are biased by a factor that depends on the amount of censoring in the two populations and on the strength of the dependency between death and censoring in them; the Cox model estimates this biased hazard ratio. In general, the hazard resulting from the compartmental model is not constant, even if the individual marginal hazards are constant, unless censoring is non-informative, and the hazard ratio tends to a specific limit.

Methods of evaluating situations in which informative censoring is present are described, and the relative utility of the three models is discussed.
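The bias that informative censoring induces can be seen in a toy simulation (illustrative only, not one of the paper's three models). True event times are exponential with hazard 0.10; under independent censoring the crude estimator, events divided by person-time, recovers the true hazard, while censoring tied to the latent death time biases it downward. As the paper notes, the two observed data sets are indistinguishable in structure:

```python
# Toy illustration of informative censoring. Assumed setup, not the study's
# models: exponential event times with true hazard 0.10, 20,000 subjects.
import random

random.seed(42)
TRUE_HAZARD = 0.10
N = 20_000

def crude_hazard(times, events):
    """MLE of a constant hazard: total events / total person-time."""
    return sum(events) / sum(times)

# Independent censoring: the censoring time C is drawn separately from T.
t_ind, d_ind = [], []
for _ in range(N):
    t = random.expovariate(TRUE_HAZARD)
    c = random.expovariate(0.05)
    t_ind.append(min(t, c))
    d_ind.append(1 if t <= c else 0)

# Informative censoring: half the subjects are withdrawn at half their
# (unobserved) death time, so censoring and death are strongly dependent.
t_inf, d_inf = [], []
for _ in range(N):
    t = random.expovariate(TRUE_HAZARD)
    if random.random() < 0.5:
        t_inf.append(0.5 * t)   # censored early; no event recorded
        d_inf.append(0)
    else:
        t_inf.append(t)
        d_inf.append(1)

print("independent censoring:", round(crude_hazard(t_ind, d_ind), 4))  # near 0.10
print("informative censoring:", round(crude_hazard(t_inf, d_inf), 4))  # biased low
```

With this dependence structure the expected estimate falls to about 0.067 even though the true hazard is 0.10, yet nothing in the observed (time, event) pairs flags the violation, which is the non-identifiability problem in miniature.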
Abstract:
We investigated cross-sectional associations between intakes of zinc, magnesium, heme and non-heme iron, beta-carotene, vitamin C, and vitamin E and inflammation and subclinical atherosclerosis in the Multi-Ethnic Study of Atherosclerosis (MESA). We also investigated prospective associations between those micronutrients and incident metabolic syndrome (MetS), type 2 diabetes (T2D), and cardiovascular disease (CVD). Participants aged 45-84 years at baseline were followed between 2000 and 2007. Dietary intake was assessed at baseline using a 120-item food frequency questionnaire. Multivariable linear regression and Cox proportional hazards regression models were used to evaluate the associations of interest. Dietary intakes of non-heme iron and Mg were inversely associated with tHcy concentrations (geometric means across quintiles: 9.11, 8.86, 8.74, 8.71, and 8.50 µmol/L for non-heme iron, and 9.20, 9.00, 8.65, 8.76, and 8.33 µmol/L for Mg; p-trends <0.001). Mg intake was inversely associated with high CC-IMT; odds ratio (95% CI) for extreme quintiles 0.76 (0.58, 1.01), p-trend 0.002. Dietary Zn and heme iron were positively associated with CRP (geometric means: 1.73, 1.75, 1.78, 1.88, and 1.96 mg/L for Zn and 1.72, 1.76, 1.83, 1.86, and 1.94 mg/L for heme iron). In the prospective analysis, dietary vitamin E intake was inversely associated with incident MetS and with incident CVD (HR [CI] for extreme quintiles - MetS: 0.78 [0.62-0.97], p-trend=0.01; CVD: 0.69 [0.46-1.03], p-trend=0.04). Intakes of heme iron from red meat and Zn from red meat, but not from other sources, were each positively associated with risk of CVD (HR [CI] - heme iron from red meat: 1.65 [1.10-2.47], p-trend=0.01; Zn from red meat: 1.51 [1.02-2.24], p-trend=0.01) and MetS (HR [CI] - heme iron from red meat: 1.25 [0.99-1.56], p-trend=0.03; Zn from red meat: 1.29 [1.03-1.61], p-trend=0.04). All associations evaluated were similar across strata of gender, race-ethnicity, and alcohol intake. Most of the micronutrients investigated were not associated with the outcomes of interest in this multi-ethnic cohort. These observations do not provide consistent support for the hypothesized associations of individual nutrients with inflammatory markers, MetS, T2D, or CVD. However, nutrients consumed in red meat, or consumption of red meat as a whole, may increase the risk of MetS and CVD.
Abstract:
Multiple studies have shown an association between periodontitis and coronary heart disease (CHD), attributed to the chronic inflammatory nature of periodontitis, and studies have indicated similar risk factors and pathophysiologic mechanisms for the two conditions. Among these factors, smoking has been the most discussed common risk factor, and some studies have suggested that the periodontitis-CHD association is largely a result of confounding by smoking or inadequate adjustment for it. We conducted a secondary analysis of data from the Dental ARIC Study, an ancillary study to the ARIC Study, to evaluate the effect of smoking on the periodontitis-CHD association using three periodontitis classifications: BGI, AAP-CDC, and the Dental ARIC classification (Beck et al. 2001). We also compared these results with edentulous ARIC participants. Using Cox proportional hazards models, we found that individuals with the most severe form of periodontitis in each of the three classifications (BGI: HR = 1.56, 95% CI 1.15-2.13; AAP-CDC: HR = 1.42, 95% CI 1.13-1.79; Dental ARIC: HR = 1.49, 95% CI 1.22-1.83) were at significantly higher risk of incident CHD in the unadjusted models, whereas only BGI-P3 showed a statistically significant increased risk in the smoking-adjusted models (HR = 1.43, 95% CI 1.04-1.96). However, none of the categories in any of the classifications showed a significant association when a list of traditional CHD risk factors was introduced into the models. On the other hand, edentulous participants showed significant results compared to dentate ARIC participants in the crude (HR = 1.56, 95% CI 1.34-1.82), smoking-adjusted (HR = 1.39, 95% CI 1.18-1.64), age-, race-, and sex-adjusted (HR = 1.52, 95% CI 1.30-1.77), and ARIC traditional risk factors (except smoking) adjusted (HR = 1.27, 95% CI 1.02-1.57) models. The risk also remained significantly higher when smoking was introduced into the age-, sex-, and race-adjusted model (HR = 1.38, 95% CI 1.17-1.63); smoking did not reduce the hazard ratio by more than 8% when included in any of the Cox models. This is the first study to include the three most recent case definitions of periodontitis simultaneously while examining the association with incident CHD. We found that smoking partially confounds the periodontitis-CHD association, and that edentulism is significantly associated with incident CHD even after adjusting for smoking and the ARIC traditional risk factors. The differences among the three periodontitis classifications were not statistically significant when tested for equality of the areas under their ROC curves, but this should not be confused with their clinical significance.
Abstract:
Based on the World Health Organization's (1965) definition of health, an understanding of health requires an understanding of positive psychological states. Subjective well-being (SWB) is a major indicator of positive psychological states. To date, most studies of SWB have focused on its distributions and determinants; study of its consequences, especially health consequences, is lacking. This dissertation research examined SWB, as operationally defined by constructs drawn from the framework of Positive Psychology, and its sub-scores (Positive Feelings and Negative Feelings) as predictors of three major health outcomes: mortality, heart disease, and obesity. The research used prospective data from the Alameda County Study over 29 years (1965-1994), based on a stratified, randomized, representative sample of the general public in Alameda County, California (baseline N = 6,928).

Multivariate analyses (survival analyses using sequential Cox proportional hazards models for mortality and heart disease, and sequential logistic regression analyses for obesity) were performed as the main methods to evaluate the associations between the predictors and the health outcomes. The results revealed that SWB reduced the risks of all-cause mortality, natural-cause mortality, and cardiovascular mortality. Positive feelings not only had an even stronger protective effect against all-cause, natural-cause, and cardiovascular mortality, but also predicted decreased unnatural-cause mortality, which includes deaths from suicide, homicide, accidents, mental disorders, drug dependency, and alcohol-related liver disease. These effects remained significant after adjustment for age, gender, education, and various physical health measures, and, in the case of cardiovascular mortality, obesity and health practices (alcohol consumption, smoking, and physical activity). However, these two positive psychological indicators, SWB and positive feelings, did not predict obesity, and negative feelings had no significant effect on any of the health outcomes evaluated (all-cause mortality, natural- and unnatural-cause mortality, cardiovascular mortality, or obesity) after covariates were controlled. These findings are discussed (1) in comparison with relevant existing studies, (2) in terms of their implications for health research and promotion, (3) in terms of the independence of positive and negative feelings, and (4) from a Positive Psychology perspective and its significance for public health research and practice.
Abstract:
Methicillin (meticillin)-susceptible Staphylococcus aureus (MSSA) strains producing large amounts of type A beta-lactamase (Bla) have been associated with cefazolin failures, but the frequency and impact of these strains have not been well studied. Here we examined 98 MSSA clinical isolates and found that 26% produced type A Bla, 15% type B, 46% type C, and none type D, and that 13% lacked blaZ. The cefazolin MIC90 was 2 µg/ml at a standard inoculum and 32 µg/ml at a high inoculum, with 19% of isolates displaying a pronounced inoculum effect (MICs of ≥16 µg/ml with 10^7 CFU/ml; 9 type A and 10 type C Bla producers). At the high inoculum, type A producers displayed higher cefazolin MICs than type B or C producers, while type B and C producers displayed higher cefamandole MICs. Among isolates from hemodialysis patients with MSSA bacteremia, three from the six patients who experienced cefazolin failure showed a cefazolin inoculum effect, while none from the six patients successfully treated with cefazolin showed an inoculum effect, suggesting an association between these strains and cefazolin failure (P = 0.09 by Fisher's exact test). In summary, 19% of MSSA clinical isolates showed a pronounced inoculum effect with cefazolin, a phenomenon that could explain the cases of cefazolin failure previously reported for hemodialysis patients with MSSA bacteremia. These results suggest that for serious MSSA infections, the presence of a significant inoculum effect with cefazolin could be associated with clinical failure in patients treated with this cephalosporin, particularly when it is used at low doses.
Abstract:
A UV-induced mutation of the enzyme glyceraldehyde-3-phosphate dehydrogenase (GAPD) was characterized in the CHO clone A24. The asymmetric four-banded zymogram and an in vitro GAPD activity equal to that of wild-type cells were not consistent with models of a mutant heterozygote producing equal amounts of wild-type and either catalytically active or inactive mutant subunits that interact randomly. Cumulative evidence indicated that the site of the mutation was the GAPD structural locus expressed in CHO wild-type cells, and that the mutant allele coded for a subunit differing from the wild-type subunit in stability and kinetics. The evidence included the appearance of a fifth band, the putative mutant homotetramer, after addition of the substrate glyceraldehyde-3-phosphate (GAP) to the gel matrix; dilution experiments indicating stability differences between the subunits; experiments with subsaturating levels of GAP indicating differences in affinity for the substrate; GAPD zymograms of A24 × mouse hybrids consistent with the presence of two distinct A24 subunits; independent segregation of A24 wild-type and mutant electrophoretic bands from the hybrids, which was inconsistent with models of mutation of a locus involved in posttranslational modification; the mapping of both wild-type and mutant forms of GAPD to chromosome 8; and the failure to detect any evidence of posttranslational modification (of other A24 isozymes, or through mixing of homogenates of A24 and mouse). The extent of skewing of the zymogram toward the wild-type band, and the unreduced in vitro activity, were inconsistent with models based solely on differences in activity of the two subunits. Comparison of wild-type homotetramer bands in wild-type cells and A24 suggested that the latter had a preponderance of wild-type subunits over mutant subunits and more GAPD tetramers than CHO controls. Two CHO linkages, GAPD-triose phosphate isomerase and acid phosphatase 2-adenosine deaminase, were reported provisionally, and several others were confirmed.
Abstract:
The objective of this study was to determine the impact of different follow-up cystoscopy frequencies on time to development of invasive bladder cancer in a cohort of 3,658 eligible patients aged 65 and older with an initial diagnosis of superficial bladder cancer between 1994 and 1998. Bladder cancer patients in the Surveillance, Epidemiology, and End Results (SEER)-Medicare database served as the study population. It was hypothesized that superficial bladder cancer patients receiving less frequent cystoscopy follow-up would develop invasive bladder cancer sooner after initial diagnosis and treatment than patients seen more frequently. Cox proportional hazards regression revealed that patients seen for cystoscopy every 3 or more months were 83-89% less likely to develop invasive cancer than patients seen every 1 to 2 months. A comparison of the two groups (1 to 2 months vs. ≥3 months) suggested that the 1-to-2-month group may have had more aggressive disease and was seen more frequently as a result. These findings suggest that there are two groups of superficial bladder cancer patients: those at high risk of developing invasive bladder cancer and those at low risk. Patients who developed invasive bladder cancer sooner after initial diagnosis and treatment were seen more frequently for cystoscopy follow-up. The recommendation is that cystoscopy frequency should be based on disease status at 3 months. Standardized schedules give all patients the same number of cystoscopies regardless of their risk factors, which could lead to unnecessary cystoscopies in low-risk patients and fewer than optimal cystoscopies in high-risk patients.
Abstract:
Ordinal logistic regression models are used to analyze dependent variables with multiple outcomes that can be ranked, but they have been underutilized. In this methodological study, we describe four logistic regression models for analyzing an ordinal response variable: the multinomial logistic model, the adjacent-category logit model, the proportional odds model, and the continuation-ratio model. We illustrate and compare the fit of these models using data from the survey designed by The University of Texas School of Public Health research project PCCaSO (Promoting Colon Cancer Screening in people 50 and Over), which studied patients' confidence in the completion of colorectal cancer screening (CRCS). The purpose of this study is twofold: first, to provide a synthesized review of models for analyzing data with an ordinal response, and second, to evaluate their usefulness in epidemiological research, with particular emphasis on model formulation, interpretation of model coefficients, and their implications. The four ordinal logistic models used in this study are (1) the multinomial logistic model, (2) the adjacent-category logistic model [9], (3) the continuation-ratio logistic model [10], and (4) the proportional odds logistic model [11]. We recommend that the analyst perform (1) goodness-of-fit tests and (2) sensitivity analysis by fitting and comparing different models.
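The proportional odds model named above can be sketched mechanically: one set of ordered cutpoints plus a single slope shared across every cumulative logit, which is exactly the "proportional odds" restriction. The cutpoints and slope below are made-up illustration values, not estimates from the PCCaSO survey.

```python
# Minimal sketch of the proportional-odds (cumulative logit) model for an
# ordered K-category outcome. Cutpoints and slope are illustrative only.
import math

def expit(x):
    return 1.0 / (1.0 + math.exp(-x))

def proportional_odds_probs(x, cutpoints, beta):
    """P(Y = k | x) for k = 1..K, from cumulative logits
    logit P(Y <= k) = cutpoint_k - beta * x.
    The same beta at every cutpoint is the proportional-odds restriction."""
    cum = [expit(a - beta * x) for a in cutpoints] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

cutpoints = [-1.0, 0.5, 2.0]   # K = 4 ordered categories -> K-1 cutpoints
beta = 0.8                     # one slope shared across all cutpoints

for x in (0.0, 1.0):
    p = proportional_odds_probs(x, cutpoints, beta)
    print(f"x={x}: " + ", ".join(f"{q:.3f}" for q in p))
# x=0.0: 0.269, 0.354, 0.258, 0.119
```

Raising x shifts probability mass toward the higher categories while the category probabilities always sum to one; the continuation-ratio model differs only in conditioning each logit on Y ≥ k rather than cumulating.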
Abstract:
Hereditary nonpolyposis colorectal cancer (HNPCC) is an autosomal dominant disease caused by germline mutations in DNA mismatch repair (MMR) genes. The nucleotide excision repair (NER) pathway also plays an important role in cancer development. We systematically studied interactions between NER and MMR genes to identify NER gene single nucleotide polymorphism (SNP) risk factors that modify the effect of MMR mutations on cancer risk in HNPCC. We analyzed data on polymorphisms in 10 NER genes that had been genotyped in HNPCC patients carrying MSH2 and MLH1 gene mutations. The influence of the NER gene SNPs on time to onset of colorectal cancer (CRC) was assessed using survival analysis and a semiparametric proportional hazards model. We found that the median age of onset of CRC among MMR mutation carriers with the ERCC1 variant was 3.9 years earlier than in patients with wild-type ERCC1 (median 47.7 vs. 51.6; log-rank test p=0.035). The influence of the Rad23B A249V SNP on age of onset of HNPCC is age dependent (likelihood ratio test p=0.0056). Interestingly, using the likelihood ratio test, we also found evidence of genetic interactions between the MMR gene mutations and SNPs in the ERCC1 gene (C8092A) and the XPG/ERCC5 gene (D1104H), with p-values of 0.004 and 0.042, respectively. An assessment using tree-structured survival analysis (TSSA) showed distinct gene interactions in MLH1 mutation carriers and MSH2 mutation carriers: ERCC1 SNP genotypes greatly modified the age of onset of HNPCC in MSH2 mutation carriers, while no effect was detected in MLH1 mutation carriers. Given that the NER genes in this study play different roles in the NER pathway, they may have distinct influences on the development of HNPCC. The findings of this study are important for elucidating the molecular mechanism of colon cancer development and for understanding why some MSH2 and MLH1 mutation carriers develop CRC early while others never develop CRC. The findings also have important implications for the development of early detection and prevention strategies, as well as for understanding the mechanism of colorectal carcinogenesis in HNPCC.
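Median-onset comparisons of the kind reported above rest on the product-limit (Kaplan-Meier) estimator, which handles carriers who were still cancer-free at last follow-up as censored. A minimal sketch with made-up onset ages, not the study's data:

```python
# Pure-Python Kaplan-Meier estimator and median. Ages and event indicators
# below are toy values for illustration, not data from the study.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.
    events[i] = 1 for an observed onset, 0 for a censored observation."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            n += 1
            d += data[i][1]
            i += 1
        if d:                                      # survival drops only at events
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n
    return curve

def km_median(curve):
    """Smallest time with S(t) <= 0.5, or None if never reached."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

ages  = [38, 41, 44, 47, 47, 50, 52, 55, 60, 63]
onset = [1,  1,  0,  1,  1,  1,  0,  1,  1,  0]   # 0 = censored (no CRC yet)
curve = kaplan_meier(ages, onset)
print("median age of onset:", km_median(curve))   # prints 50
```

Computing this curve separately for each genotype group and comparing the resulting medians is what the quoted 47.7 vs. 51.6 contrast summarizes; the log-rank test then weighs the full curves, not just their medians.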
Abstract:
Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma (RCC), yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, three stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death.

Methods. In this retrospective follow-up study, the records of 97 deceased RCC patients were reviewed between September and October 2006. Patients with TNM stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. TNM stage, Fuhrman grade, age, tumor size, tumor volume, histology, and patient gender were analyzed in relation to time to metastasis; these factors plus time from nephrectomy to metastasis were tested for significance in relation to time from metastasis to death. Finally, laboratory values at the time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, the UCLA Integrated Staging System (UISS), and the same factors were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazards models (univariate and multivariate) were used to test significance, and the Kaplan-Meier log-rank test was used to detect differences between groups at various endpoints.

Results. Compared to negative lymph nodes at the time of nephrectomy, a single positive lymph node was associated with a significantly shorter time to metastasis (p<0.0001). Compared to other histological types, clear cell histology was associated with significantly longer metastasis-free survival (p=0.003). Clear cell histology compared to other types (p=0.0002 univariate, p=0.038 multivariate) and log-transformed time to metastasis (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals of greater than one year and greater than two years, compared to metastasis before one and two years, conferred statistically significant survival benefits (p=0.004 and p=0.0318). Time from evaluation to death was affected by a metastasis-free interval greater than one year (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS stratified the patient population for survival in a statistically significant manner (p=0.001). No other factors were found to be significant.

Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at the time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.
Abstract:
Introduction. 3-Hydroxy-3-methylglutaryl CoA reductase inhibitors ("statins") have been widely used for hypercholesterolemia, and statin-induced myopathy is well known. Whether statins contribute to exacerbations of myasthenia gravis (MG) requiring hospitalization is not well known.

Objectives. To determine the frequency of statin use in patients with MG seen at the neuromuscular division of the University of Alabama at Birmingham (UAB) and to evaluate any association between statin use and MG exacerbations requiring hospitalization in patients with an established diagnosis of MG.

Methods. We reviewed the records of all current MG patients at the UAB neuromuscular department to obtain details on statin use and any hospitalizations due to MG exacerbation over the period from January 1, 2003 to December 31, 2006.

Results. Of the 113 MG patients with available information for this period, 40 were on statins during at least one clinic visit. Statin users were more likely to be older (mean age 60.2 vs. 53.8, p = 0.029) and male (70.0% vs. 43.8%, p = 0.008), and had a later onset of myasthenia gravis (mean age at onset 49.8 vs. 42.9 years, p = 0.051). Neither the total number of hospitalizations nor the proportion of subjects with at least one hospitalization during the study period differed between the statin and no-statin groups. However, when hospitalizations attributable to a suspected precipitant were excluded ("events"), the proportion of subjects with at least one event during the study period was higher in the statin group. In the final Cox proportional hazards model for cumulative time to event, statin use (OR = 6.44, p < 0.01) and baseline immunosuppression (OR = 3.03, p = 0.07) were found to increase the odds of an event.

Conclusions. Statin use may increase the rate of hospitalizations due to MG exacerbation when exacerbations precipitated by other suspected factors are excluded.
Abstract:
Objectives. To predict who will develop an aortic dissection, and to create male and female prediction models using the risk factors age, ethnicity, hypertension, high cholesterol, smoking, alcohol use, diabetes, heart attack, congestive heart failure, congenital and non-congenital heart disease, Marfan syndrome, and bicuspid aortic valve.

Methods. Using 572 patients diagnosed with aortic aneurysms, a model was developed for males and another for females using 80% of the data; each model was then validated on the remaining 20%.

Results. The male model predicted the probability of a male having a dissection (p=0.076), and the female model predicted the probability of a female having a dissection (p=0.054). The validation models did not support the choice of the developmental models.

Conclusions. The best models obtained suggest that those at greater risk of dissection are males with non-congenital heart disease who drink alcohol, and females with non-congenital heart disease and a bicuspid aortic valve.