18 results for RISK PATIENTS
in DigitalCommons@The Texas Medical Center
Abstract:
OBJECTIVE: The objective of this study was to evaluate the impact of newer therapies on the highest risk patients with congenital diaphragmatic hernia (CDH), those with agenesis of the diaphragm. SUMMARY BACKGROUND DATA: CDH remains a significant cause of neonatal mortality. Many novel therapeutic interventions have been used in these infants. Those children with large defects or agenesis of the diaphragm have the highest mortality and morbidity. METHODS: Twenty centers from 5 countries collected data prospectively on all liveborn infants with CDH over a 10-year period. The treatment and outcomes in these patients were examined. Patients were followed until death or hospital discharge. RESULTS: A total of 1,569 patients with CDH were seen between January 1995 and December 2004 in 20 centers. A total of 218 patients (14%) had diaphragmatic agenesis and underwent repair. The overall survival for all patients was 68%, while survival was 54% in patients with agenesis. When patients with diaphragmatic agenesis from the first 2 years were compared with similar patients from the last 2 years, there was significantly less use of ECMO (75% vs. 52%) and an increased use of inhaled nitric oxide (iNO) (30% vs. 80%). There was a trend toward improved survival in patients with agenesis from 47% in the first 2 years to 59% in the last 2 years. The survivors with diaphragmatic agenesis had prolonged hospital stays compared with patients without agenesis (median, 68 vs. 30 days). For the last 2 years of the study, 36% of the patients with agenesis were discharged on tube feedings and 22% on oxygen therapy. CONCLUSIONS: There has been a change in the management of infants with CDH with less frequent use of ECMO and a greater use of iNO in high-risk patients with a potential improvement in survival. However, the mortality, hospital length of stay, and morbidity in agenesis patients remain significant.
Abstract:
The increased use of vancomycin in hospitals has made it standard practice to monitor serum vancomycin levels because of possible nephrotoxicity. However, routine monitoring of vancomycin serum concentrations has come under criticism, and the cost-effectiveness of such monitoring is in question because frequent monitoring neither increases efficacy nor decreases nephrotoxicity. The purpose of the present study was to determine factors that may place patients at increased risk of developing vancomycin-induced nephrotoxicity and for whom monitoring may be most beneficial. From September to December 1992, 752 consecutive inpatients at The University of Texas M. D. Anderson Cancer Center, Houston, were prospectively evaluated for nephrotoxicity in order to describe predictive risk factors for developing vancomycin-related nephrotoxicity. Ninety-five patients (13 percent) developed nephrotoxicity. A total of 299 patients (40 percent) were considered monitored (vancomycin serum levels determined during the course of therapy), and 346 patients (46 percent) were receiving concurrent moderately to highly nephrotoxic drugs. Factors significantly associated with nephrotoxicity in univariate analysis were gender, baseline serum creatinine greater than 1.5 mg/dl, monitoring status, leukemia, concurrent moderately to highly nephrotoxic drugs, and APACHE III scores of 40 or more. Significant factors from the univariate analysis were then entered into a stepwise logistic regression analysis to determine independent predictive risk factors for vancomycin-induced nephrotoxicity. Factors selected by stepwise logistic regression analysis as predictive of vancomycin-induced nephrotoxicity, with their corresponding odds ratios and 95% confidence limits, were concurrent therapy with moderately to highly nephrotoxic drugs (2.89; 1.76-4.74), APACHE III scores of 40 or more (1.98; 1.16-3.38), and male gender (1.98; 1.04-2.71). Subgroup (monitored and non-monitored) analysis showed that male gender (OR = 1.87; 95% CI = 1.01, 3.45) and moderately to highly nephrotoxic drugs (OR = 4.58; 95% CI = 2.11, 9.94) were significant for nephrotoxicity in monitored patients, whereas only APACHE III score (OR = 2.67; 95% CI = 1.13, 6.29) was significant in non-monitored patients. The conclusion drawn from this study is that not every patient receiving vancomycin therapy needs frequent monitoring of vancomycin serum levels. Such routine monitoring may be appropriate in patients with one or more of the identified risk factors, while low-risk patients need not be subjected to the discomfort and added cost of multiple blood sampling. Such prudent selection of patients to monitor may decrease costs to patients and the hospital.
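As an illustration only, the following is a minimal sketch of the multivariable logistic regression step described in this abstract, using statsmodels. The file name and column names (nephrotoxicity, male, apache_ge40, concurrent_nephrotoxins) are hypothetical placeholders, not the study's actual variables.

```python
# Minimal sketch of a multivariable logistic regression like the one described above.
# File and column names are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("vancomycin_cohort.csv")  # hypothetical file

# Candidate predictors retained from univariate screening
predictors = ["male", "apache_ge40", "concurrent_nephrotoxins"]
X = sm.add_constant(df[predictors])
y = df["nephrotoxicity"]  # 1 = developed nephrotoxicity, 0 = did not

model = sm.Logit(y, X).fit()

# Odds ratios and 95% confidence intervals from the fitted coefficients
or_table = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(model.conf_int()[0]),
    "97.5%": np.exp(model.conf_int()[1]),
})
print(or_table.drop(index="const"))
```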
Abstract:
The objective of this study was to determine the impact of different follow-up cystoscopy frequencies on time to development of invasive bladder cancer in a cohort of 3,658 eligible patients aged 65 and older with an initial diagnosis of superficial bladder cancer between 1994 and 1998. Bladder cancer patients in the Surveillance, Epidemiology, and End Results (SEER)-Medicare database were used as the study population. It was hypothesized that superficial bladder cancer patients receiving less frequent cystoscopy follow-up would develop invasive bladder cancer sooner after initial diagnosis and treatment than patients seen more frequently for cystoscopy follow-up. Cox proportional hazards regression revealed that patients seen for cystoscopy every 3 or more months were 83-89% less likely to develop invasive cancer than patients seen every 1 to 2 months. A comparison of the two groups (1 to 2 months vs. ≥3 months) suggested that the 1-to-2-month group may have had more aggressive disease and was therefore seen more frequently. These findings suggest that there are two groups of superficial bladder cancer patients: those at high risk of developing invasive bladder cancer and those at low risk. Patients who developed invasive bladder cancer sooner after initial diagnosis and treatment were seen more frequently for cystoscopy follow-up. The recommendation is that cystoscopy frequency should be based on disease status at 3 months. Standardized schedules give all patients the same number of cystoscopies regardless of their risk factors, which could lead to unnecessary cystoscopies in low-risk patients and fewer than optimal cystoscopies in high-risk patients.
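For illustration, a sketch of the Cox proportional hazards comparison described above, fit with the lifelines library. The data file, column names, and covariates are assumptions for the example, not the study's actual dataset.

```python
# Sketch of a Cox proportional hazards model like the one described above (lifelines).
# The data file and column names are hypothetical, for illustration only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("seer_medicare_superficial_bladder.csv")  # hypothetical file
# time_to_invasive: months from initial diagnosis to invasive cancer or censoring
# invasive: 1 if invasive bladder cancer developed, 0 if censored
# freq_3plus_months: 1 if cystoscopy follow-up every >=3 months, 0 if every 1-2 months

cph = CoxPHFitter()
cph.fit(df[["time_to_invasive", "invasive", "freq_3plus_months", "age", "grade"]],
        duration_col="time_to_invasive", event_col="invasive")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```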
Abstract:
According to the Joint United Nations Programme on HIV/AIDS (UNAIDS, 2008), in 2007 about 67 percent of all HIV-infected patients in the world were in Sub-Saharan Africa, with 35% of new infections and 38% of AIDS deaths occurring in Southern Africa. Globally, the number of children younger than 15 years of age infected with HIV increased from 1.6 million in 2001 to 2.0 million in 2007, and almost 90% of these were in Sub-Saharan Africa (UNAIDS, 2008). Both clinical and laboratory monitoring of children on Highly Active Anti-Retroviral Therapy (HAART) are important and necessary to optimize outcomes. Laboratory monitoring of HIV viral load and genotype resistance testing, which are important in patient follow-up to optimize treatment success, are both generally expensive and beyond the healthcare budgets of most developing countries. This is especially true for the impoverished Sub-Saharan African nations. It is therefore important to identify the factors associated with virologic failure in HIV-infected Sub-Saharan African children. This will inform practitioners in these countries so that they can predict which patients are more likely to develop virologic failure and target the limited laboratory monitoring budgets toward these at-risk patients. The objective of this study was to examine the factors associated with virologic failure in HIV-infected children taking Highly Active Anti-Retroviral Therapy in Botswana, a developing Sub-Saharan African country. We examined these factors in a case-control study using medical records of HIV-infected children and adolescents on HAART at the Botswana-Baylor Children's Clinical Center of Excellence (BBCCCOE) in Gaborone, Botswana. Univariate and multivariate regression analyses were performed to identify predictors of virologic failure in these children. The study population comprised 197 cases (those with virologic failure) and 544 controls (those with virologic success), with ages ranging from 3 months to 16 years at baseline. Poor adherence (pill count <95% on at least 3 consecutive occasions) was the strongest independent predictor of virologic failure (adjusted OR = 269.97, 95% CI = 104.13 to 699.92; P < 0.001). Other independent predictors of virologic failure were first-line NNRTI regimen with nevirapine (OR = 2.99, 95% CI = 1.19 to 7.54; P = 0.020), baseline HIV-1 viral load >750,000 copies/ml (OR = 2.57, 95% CI = 1.47 to 8.63; P = 0.005), positive history of PMTCT (OR = 11.65, 95% CI = 3.04 to 44.57; P < 0.001), multiple caregivers (≥3) (OR = 2.56, 95% CI = 1.06 to 6.19; P = 0.036), and residence in a village (OR = 2.85, 95% CI = 1.36 to 5.97; P = 0.005). The results of this study may help to improve virologic outcomes and reduce the costs of caring for HIV-infected children in resource-limited settings. Keywords: virologic failure, Highly Active Anti-Retroviral Therapy, Sub-Saharan Africa, children, adherence.
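The adjusted odds ratios above come from multivariable models, but for a single exposure in a case-control design a crude odds ratio and its Wald 95% confidence interval can be computed directly from the 2x2 table of counts. A short sketch follows; the counts are hypothetical placeholders, not the study's data.

```python
# Crude odds ratio with a Wald 95% CI from a 2x2 case-control table.
# The counts below are hypothetical placeholders, not the study's data.
import math

a, b = 150, 47    # cases: exposed (e.g., poor adherence), unexposed
c, d = 60, 484    # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```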
Abstract:
Coronary reperfusion with thrombolytic therapy and selective reperfusion by percutaneous transluminal coronary angioplasty (PTCA) were examined in the Corpus Christi Heart Project, a population-based surveillance program for hospitalized acute myocardial infarction (MI) patients in a biethnic community of Mexican-Americans (MAs) and non-Hispanic whites (NHWs). Results were based on 250 (12.4%) patients who received thrombolytic therapy in a cohort of 2,011 acute MI cases. Of these, 107 (42.8%) underwent PTCA, with a mean follow-up of 25 months. There were 186 (74.4%) men and 64 (25.6%) women; 148 (59.2%) were NHWs and 86 (34.4%) were MAs. Thrombolysis and PTCA were performed less frequently in women than in men, and less frequently in MAs than in NHWs. According to the coronary reperfusion intervention used, patients were divided into two groups: those who did not undergo PTCA (57.2%) and those who underwent PTCA (42.8%) after thrombolysis. The case-fatality rate was higher in the no-PTCA patients than in the PTCA patients (7.7% versus 5.6%), as was mortality at one year (16.2% versus 10.5%). Reperfusion was successful in 48.0% of the entire cohort, and in 51.4% versus 45.6% of the PTCA and no-PTCA groups, respectively. Mortality among patients with successful reperfusion was 5.0%, compared with 22.3% in the unsuccessful reperfusion group (p = 0.00016, 95% CI: 1.98-11.6). Cardiac catheterization was performed in 86.4% of thrombolytic patients. Severe stenosis (>75% obstruction) was present most commonly in the left anterior descending artery (52.8%) and in the right coronary artery (52.8%). The occurrence of adverse in-hospital clinical events was higher in the no-PTCA group than in the PTCA and catheterized patients, with the exception of reperfusion arrhythmias (p = 0.140; Fisher's exact test p = 0.129). Cox regression analysis was used to study the relationship between selected variables and mortality. Apart from successful reperfusion, age group (p = 0.028, 95% CI: 2.1-12.42), site of the index acute MI (p = 0.050), and ejection fraction (p = 0.052) were predictors of long-term survival. The ejection fraction in the PTCA group was higher than in the no-PTCA group (median 78% versus 53%). Assessed by logistic regression analysis, history of high cholesterol (>200 mg/dl) and diabetes mellitus had significant prognostic value (p = 0.0233; p = 0.0318) for long-term survival irrespective of treatment status. In conclusion, the results of this study support the idea that the use of PTCA as a selective intervention following thrombolysis improves survival of patients with acute MI, and the use of PTCA in this setting appears to be safe. However, we cannot exclude the possibility that some of these results may be due to the exclusion of high-risk patients from PTCA (selection bias).
Abstract:
OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting. DESIGN. A historical analysis of existing clinical data utilizing an interrupted time series design. SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX, from January 2005 through December 2010. INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month intervention implementation period culminating in the implementation of active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in incidence of healthcare-associated MRSA during the 29-month pre-intervention period compared with the 43-month post-intervention period. RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), contributing to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P < 0.001). The intervention and the percentage of S. aureus isolates susceptible to oxacillin were both statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI = 0.31-0.80 and IRR = 0.004, 95% CI = 0.00003-0.40, respectively). CONCLUSIONS. Aggressively targeting patients at high risk for MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
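By way of illustration, a sketch of a segmented Poisson regression for an interrupted time series like the one described, using monthly MRSA counts with patient-days as an exposure offset. The file name, column names, and model terms are assumptions for the example.

```python
# Sketch of segmented Poisson regression for an interrupted time series,
# with patient-days as the exposure offset. Variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("monthly_mrsa_counts.csv")  # hypothetical file
# Assumed columns: month_index (1..72), post (0 pre-intervention, 1 post),
# months_since_intervention (0 before, then 1, 2, ...), mrsa_cases, patient_days

X = sm.add_constant(df[["month_index", "post", "months_since_intervention"]])
model = sm.GLM(df["mrsa_cases"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["patient_days"])).fit()

# exp(coefficients) gives incidence rate ratios: a level change ('post')
# and a trend change ('months_since_intervention')
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```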
Abstract:
Cancer is a chronic disease that often necessitates recurrent hospitalizations, a costly pattern of medical care utilization. In chronically ill patients, most readmissions are for treatment of the same condition that caused the preceding hospitalization. There is concern that, rather than reducing costs, earlier discharge may shift costs from the initial hospitalization to emergency center visits. This is the first descriptive study to measure the incidence of emergency center visits (ECVs) after hospitalization at The University of Texas M. D. Anderson Cancer Center (UTMDACC), to identify the risk factors for and outcomes of these ECVs, and to compare 30-day all-cause mortality and costs for episodes of care with and without ECVs. We identified all hospitalizations at UTMDACC with admission dates from September 1, 1993 through August 31, 1997 that met the inclusion criteria. Data were obtained electronically, primarily from UTMDACC's institutional database. Demographic factors, clinical factors, duration of the index hospitalization, method of payment for care, and year of hospitalization were the variables determined for each hospitalization. The overall incidence of ECVs was 18%. Forty-five percent of ECVs resulted in hospital readmission (8% of all hospitalizations). In 1% of ECVs the patient died in the emergency center, and in the remaining 54% of ECVs the patient was discharged home. Risk factors for ECVs were marital status, type of index hospitalization, cancer type, and duration of the index hospitalization. The overall 30-day all-cause mortality rate was 8.6% for hospitalizations with an ECV and 5.3% for those without an ECV. In all subgroups, the 30-day all-cause mortality rate was higher for groups with ECVs than for those without ECVs. The most important factor increasing cost was having an ECV. In all patient subgroups, the cost per episode of care with an ECV was at least 1.9 times the cost per episode without an ECV. The higher costs and poorer outcomes of episodes of care with ECVs and hospital readmissions suggest that interventions to avoid these ECVs or mitigate their costs are needed. Further research is needed to improve understanding of the methodological issues involved in health care for cancer patients.
Abstract:
Background. Nosocomial invasive aspergillosis, a highly fatal disease, is an increasing problem for immunocompromised patients. Aspergillus spp. can be transmitted via air (most commonly) and by water. The hypothesis for this prospective study was that there is an association between patient occupancy, housekeeping practices, patients, visitors, and Aspergillus spp. loading. Rooms were sampled as not terminally cleaned ("dirty") and terminally cleaned ("clean"). The secondary hypothesis was that Aspergillus spp.-positive samples collected from more than one sampling location within the same patient room represent the same isolate. Methods. Between April and October 2004, 2,873 environmental samples (713 air, 607 water, 1,256 surface, and 297 spore traps) were collected in and around 209 "clean" and "dirty" patient rooms in a large cancer center hospital. Water sources included aerosolized water from patient room showerheads, sinks, drains, and toilets. Bioaerosol samples were collected from the patient room, the running shower, the flushing toilet, and outside the building. The surface samples included sink and shower drains, showerheads, and air grills. Aspergillus spp.-positive samples were also sent for PCR-based molecular typing (n = 89). Results. All water samples were negative for Aspergillus spp. There were a total of 130 positive culturable samples (5.1%). The predominant species found was Aspergillus niger. Of the positive culturable samples, 106 (14.9%) were air samples and 24 (3.8%) were surface samples. There were 147 spore trap samples, of which 49.5% were positive for Aspergillus/Penicillium spp. Of the culturable positive samples sent for PCR, 16 were indistinguishable matches. There was no significant relationship between air and water samples and positive samples from the same room. Conclusion. Patients, visitors, and staff are the primary sources bringing Aspergillus spp. into the hospital. The high number of A. niger samples suggests the spores are entering the hospital from outdoors. Eliminating materials brought to the patient floors from outside, requiring employees, staff, and visitors to wear cover-ups over their street clothes, and improving cleaning procedures could further reduce positive samples. Mold strains change frequently; it is probably more important to understand the pathogenicity of viable spores than to commit resources to molecular strain testing of environmental samples alone.
Abstract:
Background. Clostridium difficile is the leading cause of hospital-associated infectious diarrhea and colitis. About 3 million cases of Clostridium difficile diarrhea occur each year, with an annual cost of $1 billion. About 20% of patients acquire C. difficile during hospitalization. Infection with Clostridium difficile can result in serious complications, posing a threat to the patient's life. Purpose. The aim of this research was to demonstrate the unique characteristics of C. difficile-positive nosocomial diarrhea cases compared with C. difficile-negative nosocomial diarrhea controls admitted to a local hospital. Methods. One hundred and ninety patients with a positive test and one hundred and ninety with a negative test for Clostridium difficile nosocomial diarrhea, selected from patients tested between January 1, 2002 and December 31, 2003, comprised the study population. Demographic and clinical data were collected from medical records. Logistic regression analyses were conducted to determine the odds associated with selected variables for the outcome of Clostridium difficile nosocomial diarrhea. Results. Among the antibiotic classes, cephalosporins (OR, 1.87; 95% CI, 1.23 to 2.85), penicillins (OR, 1.57; 95% CI, 1.04 to 2.37), fluoroquinolones (OR, 1.65; 95% CI, 1.09 to 2.48), and antifungals (OR, 2.17; 95% CI, 1.20 to 3.94) were significantly associated with Clostridium difficile nosocomial diarrhea. Ceftazidime (OR, 1.95; 95% CI, 1.25 to 3.03, p = 0.003), gatifloxacin (OR, 1.97; 95% CI, 1.31 to 2.97, p = 0.001), clindamycin (OR, 3.13; 95% CI, 1.99 to 4.93, p < 0.001), and vancomycin (OR, 1.77; 95% CI, 1.18 to 2.66, p = 0.006) were also significantly associated with the disease, although vancomycin was not statistically significant when analyzed in a multivariable model. Other significantly associated drugs were antacids, laxatives, narcotics, and ranitidine. Prolonged use of antibiotics and an increased number of comorbid conditions were also associated with C. difficile nosocomial diarrhea. Conclusion. The etiology of C. difficile diarrhea is multifactorial. Exposure to antibiotics and other drugs, prolonged antibiotic use, the presence and severity of comorbid conditions, and prolonged hospital stay were shown to contribute to the development of the disease. It is imperative that any attempt to prevent the disease, or contain its spread, be made on several fronts.
Abstract:
Objective. The objective of this study was to determine the prevalence of MRSA colonization in adult patients admitted to intensive care units at an urban tertiary care hospital in Houston, Texas, and to evaluate the risk factors associated with colonization during a three-month active-screening pilot project. Design. This study used secondary data from a small cross-sectional pilot project. Methods. All patients admitted to the seven specialty ICUs were screened for MRSA by nasal culture. Results were obtained using the BD GeneOhm™ IDI-MRSA in vitro diagnostic assay for rapid MRSA detection. Statistical analysis was performed using STATA 10, Epi Info, and JavaStat. Results. 1,283 of 1,531 (83.4%) adult ICU admissions were screened for nasal MRSA colonization. Of those screened, demographic and risk factor data were available for 1,260 of 1,283 (98.2%). Unresolved results were obtained for 73 patients; therefore, a total of 1,187 of 1,531 (77.5%) of all ICU admissions during the three-month study period are described in this analysis. Risk factors associated with colonization included hospitalization within the last six months (odds ratio 2.48 [95% CI, 1.70-3.63], p < 0.001), hospitalization within the last 12 months (odds ratio 2.27 [95% CI, 1.57-3.80], p < 0.001), and diabetes mellitus (odds ratio 1.63 [95% CI, 1.14-2.32], p = 0.007). Conclusion. Based on the literature, the prevalence of MRSA in this population is typical of other prevalence studies conducted in the United States and is consistent with the continuing upward trend of MRSA colonization. Significant risk factors were similar to those found in previous studies. Overall, the active surveillance screening pilot project has provided valuable information on a population not widely addressed. These findings can aid in future interventions for the education, control, prevention, and treatment of MRSA.
Abstract:
Background. Over 39.9% of the adult population aged forty or older in the United States has refractive error, yet little is known about the etiology of this condition, its associated risk factors, and the underlying mechanisms, because of the paucity of data on changes in refractive error in the adult population over time. Aim. To evaluate risk factors over a long-term, 5-year period in refractive error changes among persons aged 43 or older, by testing the hypothesis that age, gender, systemic diseases, nuclear sclerosis, and baseline refractive error are all significantly associated with refractive error changes in patients at a Dallas, Texas private optometric office. Methods. A retrospective chart review of subjective refraction, eye health, and self-reported health history was done on patients at a private optometric office who were 43 or older in 2000 and who had eye examinations in both 2000 and 2005. Aphakic and pseudophakic eyes were excluded, as were eyes with best-corrected Snellen visual acuity of 20/40 or worse. After exclusions, refraction was obtained on 114 right eyes and 114 left eyes. Spherical equivalent (sphere plus half the cylinder) was used as the measure of refractive error. Results. Similar changes in refractive error were observed for the two eyes. The 5-year change in spherical power was in a hyperopic direction for younger age groups and in a myopic direction for older subjects, P < 0.0001. The gender-adjusted mean change in refractive error in right eyes of persons aged 43 to 54, 55 to 64, 65 to 74, and 75 or older at baseline was +0.43 D, +0.46 D, -0.09 D, and -0.23 D, respectively. Refractive change was strongly related to baseline nuclear cataract severity; grades 4 to 5 were associated with a myopic shift (-0.38 D, P < 0.0001). The mean age-adjusted change in refraction was +0.27 D for hyperopic eyes, +0.56 D for emmetropic eyes, and +0.26 D for myopic eyes. Conclusions. This report documents refractive error changes in an older population and confirms reported trends of a hyperopic shift before age 65 and a myopic shift thereafter, associated with the development of nuclear cataract.
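The spherical equivalent used as the outcome measure above is sphere plus half the cylinder, and the 5-year change is simply the 2005 value minus the 2000 value. A tiny worked example follows; the refraction values are hypothetical.

```python
# Spherical equivalent = sphere + (cylinder / 2); 5-year change = SE_2005 - SE_2000.
# The example refraction values are hypothetical.
def spherical_equivalent(sphere: float, cylinder: float) -> float:
    return sphere + cylinder / 2.0

se_2000 = spherical_equivalent(+1.00, -0.50)   # +0.75 D
se_2005 = spherical_equivalent(+1.50, -0.50)   # +1.25 D
print(f"5-year change: {se_2005 - se_2000:+.2f} D")  # +0.50 D (hyperopic shift)
```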
Abstract:
Pneumonia is a well-documented and common respiratory infection in patients with acute traumatic spinal cord injury, and it may recur during the course of acute care. Using data from the North American Clinical Trials Network (NACTN) for Spinal Cord Injury, the incidence, timing, and recurrence of pneumonia were analyzed. The two main objectives were (1) to investigate the time to, and potential risk factors for, the first occurrence of pneumonia using the Cox proportional hazards model, and (2) to investigate pneumonia recurrence and its risk factors using a counting process model, a generalization of the Cox proportional hazards model. The results of the survival analysis suggested that surgery, intubation, American Spinal Injury Association (ASIA) grade, direct admission to a NACTN site, and age (older than 65 or not) were significant risk factors both for the first pneumonia event and for multiple pneumonia events. The significance of this research is that it has the potential to identify, at the time of admission, patients who are at high risk for the incidence and recurrence of pneumonia. Knowledge of the occurrence and timing of pneumonia is an important factor in the development of prevention strategies and may also provide some insight into the selection of emerging therapies that compromise the immune system.
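As an illustration of the counting process (start/stop) formulation for recurrent events described above, the sketch below fits such a model with lifelines. The long-format data layout, file name, and column names are assumptions for the example, not the NACTN dataset.

```python
# Sketch of a counting-process (start/stop) model for recurrent pneumonia events,
# fit with lifelines. The data layout and column names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per at-risk interval per patient: (start, stop], with event = 1 if a
# pneumonia occurred at 'stop'. Assumed covariate columns: surgery, intubation,
# asia_grade, direct_admission, age_ge65.
long_df = pd.read_csv("nactn_pneumonia_intervals.csv")  # hypothetical file

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="patient_id", start_col="start", stop_col="stop",
        event_col="event")
ctv.print_summary()  # hazard ratios covering first and recurrent pneumonia events
```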
Abstract:
Venous thromboembolism (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is the third most preventable cardiovascular disease and a growing public health problem in the United States. The incidence of VTE remains high, with an annual estimate of more than 600,000 symptomatic events. DVT affects an estimated 2 million Americans each year, with a death toll of 300,000 persons per year from DVT-related PE. Leukemia patients are at high risk for both hemorrhage and thrombosis; however, little is known about thrombosis among acute leukemia patients. The ultimate goal of this dissertation was to obtain a deep understanding of thrombosis among acute leukemia patients. The dissertation is presented as three papers. The first paper examined the distribution of, and risk factors associated with, the development of VTE among patients with acute leukemia prior to leukemia treatment. The second paper examined the incidence, risk factors, and impact of VTE on survival of patients with acute lymphoblastic leukemia during treatment. The third paper examined recurrence and risk factors for VTE recurrence among acute leukemia patients with an initial episode of VTE. Descriptive statistics, chi-squared or Fisher's exact tests, median tests, Mann-Whitney tests, logistic regression analysis, and nonparametric Kaplan-Meier estimation with a log-rank test or Cox models were used where appropriate. The analyses indicated that acute leukemia patients have a high prevalence, incidence, and recurrence rate of VTE. Prior history of VTE, obesity, older age, low platelet count, Philadelphia chromosome-positive ALL, use of oral contraceptives or hormone replacement therapy, presence of malignancies, and comorbidities may place leukemia patients at an increased risk of VTE development or recurrence. Interestingly, development of VTE was not associated with a higher risk of death among hospitalized acute leukemia patients.
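For illustration, a sketch of the Kaplan-Meier estimation with a log-rank test mentioned above, comparing survival of patients with and without VTE using lifelines. The file name and column names are hypothetical.

```python
# Sketch of Kaplan-Meier estimation with a log-rank test (lifelines).
# The data file and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("acute_leukemia_vte.csv")  # hypothetical file
vte = df[df["vte"] == 1]
no_vte = df[df["vte"] == 0]

kmf_vte = KaplanMeierFitter().fit(vte["survival_months"], vte["died"], label="VTE")
kmf_no = KaplanMeierFitter().fit(no_vte["survival_months"], no_vte["died"], label="No VTE")
print(kmf_vte.median_survival_time_, kmf_no.median_survival_time_)

# Log-rank test for a difference between the two survival curves
result = logrank_test(vte["survival_months"], no_vte["survival_months"],
                      event_observed_A=vte["died"], event_observed_B=no_vte["died"])
print(f"log-rank p-value: {result.p_value:.4f}")
```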
Abstract:
Trastuzumab is a humanized monoclonal antibody developed specifically for patients with HER2/neu-overexpressing breast cancer. Although highly effective and well tolerated, it has been reported to be associated with congestive heart failure (CHF) in clinical trial settings (rates of up to 27%). This leaves a gap: the rate of trastuzumab-related CHF in the general population, especially among older breast cancer patients on long-term trastuzumab treatment, remains unknown. This thesis examined the rates of, and risk factors associated with, trastuzumab-related CHF in a large population of older breast cancer patients. A retrospective cohort study using the existing Surveillance, Epidemiology, and End Results (SEER)-Medicare linked de-identified database was performed. The study cohort comprised breast cancer patients ≥66 years old, stage I-IV, diagnosed in 1998-2007, fully covered by Medicare with no HMO enrollment within 1 year before and after the first diagnosis month, who received their first chemotherapy no earlier than 30 days prior to diagnosis. The primary outcome of this study was a diagnosis of CHF after starting chemotherapy, with no CHF claims on or before the cancer diagnosis date. ICD-9 and HCPCS codes were used to pool the claims for trastuzumab use, chemotherapy, comorbidities, and CHF. Statistical analyses included comparison of characteristics, Kaplan-Meier estimates of CHF rates over long-term follow-up, and a multivariable Cox regression model treating trastuzumab as a time-dependent variable. Of the 17,684 patients in the selected cohort, 2,037 (12%) received trastuzumab. Among them, 35% (714 of 2,037) were diagnosed with CHF, compared with a 31% (4,784 of 15,647) CHF rate in other chemotherapy recipients (p < .0001). After 10 years of follow-up, 65% of trastuzumab users had developed CHF, compared with 47% of their counterparts. After adjusting for patient demographic, tumor, and clinical characteristics, older breast cancer patients who used trastuzumab showed a significantly higher risk of developing CHF than other chemotherapy recipients (HR 1.69, 95% CI 1.54-1.85), and this risk increased with age (p < .0001). Among trastuzumab users, the following covariates also significantly increased the risk of CHF: older age, stage IV disease, non-Hispanic black race, unmarried status, comorbidities, anthracycline use, taxane use, and lower educational level. It is concluded that trastuzumab users among older breast cancer patients had a 69% higher risk of developing CHF than non-users, much higher than the 27% increase reported in younger clinical trial patients. Older age, non-Hispanic black race, unmarried status, comorbidity, and combined use with an anthracycline or taxane also significantly increase the risk of CHF development in older patients treated with trastuzumab.
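Treating trastuzumab as a time-dependent variable means follow-up before the first trastuzumab claim is counted as unexposed and only time after it as exposed, which avoids immortal-time bias. The sketch below shows one way to split a single patient's follow-up into start/stop intervals for such a model; all values and field names are hypothetical.

```python
# Sketch of splitting follow-up into unexposed/exposed (start, stop] intervals
# so trastuzumab can enter a Cox model as a time-dependent covariate.
# All values and field names are hypothetical.
import pandas as pd

patient = {"id": 1, "follow_up_months": 60.0, "chf": 1,
           "trastuzumab_start_month": 4.0}  # None if never treated

rows = []
tx_start = patient["trastuzumab_start_month"]
if tx_start is not None and tx_start < patient["follow_up_months"]:
    # Unexposed interval before the first trastuzumab claim
    rows.append({"id": patient["id"], "start": 0.0, "stop": tx_start,
                 "trastuzumab": 0, "event": 0})
    # Exposed interval from trastuzumab start to end of follow-up
    rows.append({"id": patient["id"], "start": tx_start,
                 "stop": patient["follow_up_months"],
                 "trastuzumab": 1, "event": patient["chf"]})
else:
    rows.append({"id": patient["id"], "start": 0.0,
                 "stop": patient["follow_up_months"],
                 "trastuzumab": 0, "event": patient["chf"]})

long_df = pd.DataFrame(rows)
print(long_df)  # ready for a start/stop (time-dependent) Cox model
```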
Abstract:
Bisphosphonates represent a unique class of drugs that effectively treat and prevent a variety of bone-related disorders, including metastatic bone disease and osteoporosis. High tolerability and high efficacy rates quickly made bisphosphonates the standard of care for bone-related diseases. However, in the early 2000s, case reports began to surface linking bisphosphonates with osteonecrosis of the jaw (ONJ), and subsequent studies have corroborated the link. As with most disease states, many factors can contribute to the onset of disease. The aim of this study was to determine which comorbid factors presented an increased risk for developing ONJ in cancer patients. Using a case-control study design, investigators used a combination of ICD-9 codes and chart review to identify confirmed cases of ONJ at The University of Texas M. D. Anderson Cancer Center (MDACC). Each case was then matched to five controls based on age, gender, race/ethnicity, and primary cancer diagnosis. Data querying and chart review provided information on the variables of interest: bisphosphonate exposure, glucocorticoid exposure, smoking history, obesity, and diabetes. Statistical analysis was conducted using PASW (Predictive Analytics Software) Statistics, Version 18 (SPSS Inc., Chicago, Illinois). One hundred twelve (112) cases were identified as confirmed cases of ONJ. Variables were screened with univariate logistic regression to determine significance (p < .05); significant variables were included in the final conditional logistic regression model. Concurrent use of bisphosphonates and glucocorticoids (OR, 18.60; CI, 8.85 to 39.12; p < .001), current smoking (OR, 2.52; CI, 1.21 to 5.25; p = .014), and presence of diabetes (OR, 1.84; CI, 1.06 to 3.20; p = .030) were found to increase the risk of developing ONJ. Obesity was not significantly associated with ONJ development. In this study, cancer patients who received bisphosphonates as part of their therapeutic regimen concurrently with glucocorticoids were found to have an 18-fold increase in their risk of developing ONJ; smoking and diabetes were additional risk factors. Further studies examining the concurrent use of glucocorticoids and bisphosphonates may strengthen these associations.
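As an illustration of the matched (conditional) logistic regression described above, with each case grouped with its five matched controls, the sketch below uses statsmodels' ConditionalLogit, assuming a recent statsmodels release. The file name and column names are hypothetical placeholders.

```python
# Sketch of conditional logistic regression for 1:5 matched case-control sets.
# Assumes a recent statsmodels release; file and column names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("onj_matched_sets.csv")  # hypothetical file
# Assumed columns: match_set (a case and its 5 controls share an id), onj (1/0),
# bp_plus_steroid, current_smoker, diabetes

exog = df[["bp_plus_steroid", "current_smoker", "diabetes"]]
model = ConditionalLogit(df["onj"], exog, groups=df["match_set"]).fit()

print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```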