950 results for "Increasing hazard ratio"
Abstract:
BACKGROUND Of the approximately 2.4 million American women with a history of breast cancer, 43% are aged ≥ 65 years and are at risk for developing subsequent malignancies. METHODS Five-year breast cancer survivors (N = 1361) from 6 geographically diverse sites, diagnosed between 1990 and 1994 at age ≥ 65 years with stage I or II disease, were compared with women without breast cancer (N = 1361) who were age-matched and site-matched to survivors on the date of breast cancer diagnosis. Follow-up began 5 years after the index date (survivor diagnosis date or comparison enrollment date) and continued until death, disenrollment, or 15 years after the index date. Data were collected from medical records and electronic sources (cancer registry, administrative, clinical, National Death Index). Analyses included descriptive statistics, crude incidence rates, and Cox proportional hazards regression models for estimating the risk of incident malignancy, adjusted for death as a competing risk. RESULTS Survivors and women in the comparison group were similar: >82% were white, 55% had a Charlson Comorbidity Index of 0, and ≥ 73% had a body mass index ≤ 30 kg/m². Of all 306 women (N = 160 in the survivor group, N = 146 in the comparison group) who developed a first incident malignancy during follow-up, the mean time to malignancy was similar (4.37 ± 2.81 years vs 4.03 ± 2.76 years, respectively; P = .28), whereas unadjusted incidence rates were slightly higher in survivors (1882 vs 1620 per 100,000 person-years). The adjusted hazard of developing a first incident malignancy was slightly elevated in survivors relative to women in the comparison group, but the difference was not statistically significant (hazard ratio, 1.17; 95% confidence interval, 0.94-1.47). CONCLUSIONS Older women who survived 5 years after an early-stage breast cancer diagnosis were not at elevated risk for developing subsequent incident malignancies up to 15 years after their breast cancer diagnosis.
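The estimate above comes from Cox proportional hazards regression with death handled as a competing risk. The sketch below is a minimal illustration, not the study's code: it fits a cause-specific Cox model on simulated data (all column names and values are invented), in which death before malignancy simply censors follow-up, a common simplification of a full competing-risk analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
survivor = rng.integers(0, 2, n)                  # 1 = 5-year breast cancer survivor
# Simulated time to first incident malignancy with a true HR of ~1.17
t = rng.exponential(scale=1 / (0.03 * 1.17 ** survivor))
censor = rng.uniform(0, 10, n)                    # death/disenrollment ends follow-up
df = pd.DataFrame({
    "time": np.minimum(t, censor),
    "event": (t <= censor).astype(int),           # 1 = malignancy observed
    "survivor": survivor,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # exp(coef) for "survivor" estimates the hazard ratio
```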
Abstract:
BACKGROUND: HCV coinfection remains a major cause of morbidity and mortality among HIV-infected individuals, and its incidence has increased dramatically in HIV-infected men who have sex with men (MSM). METHODS: Hepatitis C virus (HCV) coinfection in the Swiss HIV Cohort Study (SHCS) was studied by combining clinical data with HIV-1 pol sequences from the SHCS Drug Resistance Database (DRDB). We inferred maximum-likelihood phylogenetic trees, identified Swiss HIV transmission pairs as monophyletic patient pairs, and then considered the distribution of HCV on those pairs. RESULTS: Among the 9748 patients in the SHCS-DRDB with known HCV status, 2768 (28%) were HCV-positive. Focusing on subtype B (7644 patients), we identified 1555 potential HIV-1 transmission pairs. Even after controlling for transmission group, calendar year, age and sex, the odds of an HCV coinfection were increased by an odds ratio (OR) of 3.2 [95% confidence interval (CI) 2.2-4.7] if a patient clustered with another HCV-positive case. This strong association persisted when the transmission groups of intravenous drug users (IDU), MSM and heterosexuals (HET) were considered separately (in all cases OR >2). Finally, we found that HCV incidence was increased by a hazard ratio of 2.1 (1.1-3.8) for individuals paired with an HCV-positive partner. CONCLUSIONS: Patients whose HIV-1 is closely related to that of HIV/HCV-coinfected patients have a higher risk of carrying or acquiring HCV themselves. This indicates the occurrence of domestic and sexual HCV transmission and allows the identification of patients at high risk of HCV infection.
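The adjusted odds ratio reported here is the kind of quantity a logistic regression yields directly. A hedged sketch on simulated data follows; variable names such as partner_pos and idu are invented, and the model is far simpler than the SHCS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
partner_pos = rng.integers(0, 2, n)        # clusters with an HCV-positive case
idu = rng.integers(0, 2, n)                # crude transmission-group indicator
logit_p = -2.0 + np.log(3.2) * partner_pos + 1.5 * idu
hcv = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame({"hcv": hcv, "partner_pos": partner_pos, "idu": idu})

fit = smf.logit("hcv ~ partner_pos + idu", data=df).fit(disp=False)
print(np.exp(fit.params["partner_pos"]))                 # adjusted OR, ~3.2
print(np.exp(fit.conf_int().loc["partner_pos"]).values)  # 95% confidence interval
```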
Abstract:
INTRODUCTION Current literature suggesting a higher bleeding risk during combination therapy compared with oral anticoagulation alone is primarily based on retrospective studies or specific populations. We aimed to prospectively evaluate whether unselected medical patients on oral anticoagulation have an increased risk of bleeding when on concomitant antiplatelet therapy. MATERIAL AND METHODS We prospectively studied consecutive adult medical patients who were discharged on oral anticoagulants between 01/2008 and 03/2009 from a Swiss university hospital. The primary outcome was the time to a first major bleed on oral anticoagulation within 12 months; analyses were adjusted for age, international normalized ratio target, number of medications, and history of myocardial infarction and major bleeding. RESULTS Among the 515 included anticoagulated patients, the incidence rate of a first major bleed was 8.2 per 100 patient-years. Overall, 161 patients (31.3%) were on both anticoagulant and antiplatelet therapy, and these patients had an incidence rate of major bleeding similar to that of patients on oral anticoagulation alone (7.6 vs. 8.4 per 100 patient-years, P=0.81). In a multivariate analysis, the association of concomitant antiplatelet therapy with the risk of major bleeding was not statistically significant (hazard ratio 0.89; 95% confidence interval, 0.37-2.10). CONCLUSIONS The risk of bleeding in patients receiving oral anticoagulants combined with antiplatelet therapy was similar to that in patients receiving oral anticoagulants alone, suggesting that the incremental bleeding risk of combination therapy might not be clinically significant.
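The incidence rate quoted above is simply events divided by accumulated person-time, scaled to 100 patient-years. A toy check of that arithmetic follows; the event count and person-years are hypothetical, chosen only to reproduce the published 8.2.

```python
# Incidence rate = events / total person-time at risk, per 100 patient-years.
events = 33               # hypothetical count of first major bleeds
person_years = 402.4      # hypothetical total follow-up time at risk
rate = events / person_years * 100
print(f"{rate:.1f} major bleeds per 100 patient-years")  # 8.2
```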
Abstract:
BACKGROUND There is debate over using tenofovir or zidovudine alongside lamivudine in second-line antiretroviral therapy (ART) following stavudine failure. We analyzed outcomes in cohorts from South Africa, Zambia and Zimbabwe. METHODS Patients aged ≥16 years who switched from a first-line regimen including stavudine to a ritonavir-boosted lopinavir-based second-line regimen with lamivudine or emtricitabine and zidovudine or tenofovir in seven ART programs in southern Africa were included. We estimated the causal effect of receiving tenofovir or zidovudine on mortality and virologic failure using Cox proportional hazards marginal structural models, with parameters estimated using inverse probability of treatment weights. Baseline covariates were age, sex, calendar year and country; CD4 cell count, creatinine and hemoglobin levels were included as time-dependent confounders. RESULTS 1,256 patients on second-line ART, including 958 on tenofovir, were analyzed. Patients on tenofovir were more likely to have switched to second-line ART in recent years, had spent more time on first-line ART (33 vs. 24 months) and had lower CD4 cell counts (172 vs. 341 cells/μl) at initiation of second-line ART. The adjusted hazard ratio comparing tenofovir with zidovudine was 1.00 (95% confidence interval 0.59-1.68) for virologic failure and 1.40 (0.57-3.41) for death. CONCLUSIONS We did not find any difference in treatment outcomes between patients on tenofovir or zidovudine; however, the precision of our estimates was limited. There is an urgent need for randomized trials to inform second-line ART strategies in resource-limited settings.
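A marginal structural model of this kind is typically estimated in two steps: fit a treatment model to derive inverse probability of treatment weights, then fit a weighted Cox model. The sketch below shows a baseline-only version of this on simulated data; the study used time-updated weights for CD4, creatinine and hemoglobin, which this simplification omits, and all names and values are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1200
cd4 = rng.normal(250, 100, n)                        # baseline confounder (toy)
p_tdf = 1 / (1 + np.exp(-(0.5 - 0.004 * cd4)))       # treatment depends on CD4
tdf = rng.binomial(1, p_tdf)                         # 1 = tenofovir, 0 = zidovudine
t = rng.exponential(scale=1 / (0.02 * np.exp(-0.001 * cd4)))  # true HR for tdf = 1.0
df = pd.DataFrame({"time": np.minimum(t, 5.0),
                   "event": (t <= 5.0).astype(int),
                   "tdf": tdf, "cd4": cd4})

# Step 1: propensity scores and unstabilized IPT weights
ps = smf.logit("tdf ~ cd4", data=df).fit(disp=False).predict(df)
df["iptw"] = np.where(df["tdf"] == 1, 1 / ps, 1 / (1 - ps))

# Step 2: weighted Cox model with treatment as the only covariate
cph = CoxPHFitter()
cph.fit(df[["time", "event", "tdf", "iptw"]], duration_col="time",
        event_col="event", weights_col="iptw", robust=True)
cph.print_summary()   # exp(coef) for "tdf" approximates the causal hazard ratio
```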
Abstract:
AIMS Our aim was to evaluate the invasive haemodynamic indices of high-risk symptomatic patients presenting with 'paradoxical' low-flow, low-gradient severe aortic stenosis (AS) (PLF-LG) or low-ejection-fraction, low-gradient severe AS (LEF-LG) and to compare clinical outcomes following transcatheter aortic valve implantation (TAVI) among these challenging AS subgroups. METHODS AND RESULTS Of 534 symptomatic patients undergoing TAVI, 385 had a full pre-procedural right and left heart catheterization. A total of 208 patients had high-gradient severe AS [HGAS; mean gradient (MG) ≥40 mmHg], 85 had PLF-LG [MG ≤40 mmHg, indexed aortic valve area (iAVA) ≤0.6 cm² m⁻², stroke volume index ≤35 mL/m², ejection fraction (EF) ≥50%], and 61 had LEF-LG (MG ≤40 mmHg, iAVA ≤0.6 cm² m⁻², EF ≤40%). Compared with HGAS, PLF-LG and LEF-LG had higher systemic vascular resistances (HGAS: 1912 ± 654 vs. PLF-LG 2006 ± 586 vs. LEF-LG 2216 ± 765 dyn s cm⁻⁵, P = 0.007) but lower valvulo-arterial impedances (HGAS: 7.8 ± 2.7 vs. PLF-LG 6.9 ± 1.9 vs. LEF-LG 7.7 ± 2.5 mmHg mL⁻¹ m⁻², P = 0.027). At 30 days, no differences in cardiac death (6.5 vs. 4.9 vs. 6.6%, P = 0.90) or death (8.4 vs. 6.1 vs. 6.6%, P = 0.88) were observed among the HGAS, PLF-LG, and LEF-LG groups, respectively. At 1 year, New York Heart Association functional improvement occurred in most surviving patients (HGAS: 69.2% vs. PLF-LG 71.7% vs. LEF-LG 89.3%, P = 0.09) and no significant differences in overall mortality were observed (17.6 vs. 20.5 vs. 24.5%, P = 0.67). Compared with HGAS, LEF-LG had higher 1-year cardiac mortality (adjusted hazard ratio 2.45, 95% confidence interval 1.04-5.75, P = 0.04). CONCLUSION TAVI in PLF-LG or LEF-LG patients is associated with overall mortality rates comparable to those of HGAS patients, and all groups benefit symptomatically to a similar extent.
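Valvulo-arterial impedance (Zva), the global afterload measure compared across subgroups above, is commonly defined as (systolic blood pressure + mean gradient) divided by stroke volume index. A worked example with hypothetical input values (not taken from the study):

```python
def valvulo_arterial_impedance(sbp_mmhg: float, mean_gradient_mmhg: float,
                               svi_ml_per_m2: float) -> float:
    """Zva = (systolic BP + mean transvalvular gradient) / stroke volume index,
    returned in mmHg per mL per m^2."""
    return (sbp_mmhg + mean_gradient_mmhg) / svi_ml_per_m2

# Hypothetical high-gradient patient: SBP 115 mmHg, MG 40 mmHg, SVI 20 mL/m^2
print(valvulo_arterial_impedance(115, 40, 20))  # 7.75, in the range reported above
```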
Abstract:
BACKGROUND The value of standard two-dimensional transthoracic echocardiographic (TTE) parameters for risk stratification in patients with arrhythmogenic right ventricular cardiomyopathy/dysplasia (ARVC/D) is controversial. METHODS AND RESULTS We investigated the impact of right ventricular fractional area change (FAC) and tricuspid annulus plane systolic excursion (TAPSE) for prediction of major adverse cardiovascular events (MACE), defined as the occurrence of cardiac death, heart transplantation, survived sudden cardiac death, ventricular fibrillation, sustained ventricular tachycardia or arrhythmogenic syncope. Among 70 patients who fulfilled the 2010 ARVC/D Task Force Criteria and underwent baseline TTE, 37 (53%) patients experienced a MACE during a median follow-up period of 5.3 (IQR 1.8-9.8) years. Average values for FAC, TAPSE, and TAPSE indexed to body surface area (BSA) decreased over time (p=0.03 for FAC, p=0.03 for TAPSE and p=0.01 for TAPSE/BSA, each vs. baseline). In contrast, median right ventricular end-diastolic area (RVEDA) increased (p=0.001 vs. baseline). Based on Kaplan-Meier estimates, the time between baseline TTE and MACE was significantly shorter for patients with FAC <23% (p<0.001), TAPSE <17 mm (p=0.02) or right atrial (RA) short axis/BSA ≥25 mm/m² (p=0.04) at baseline. A reduced FAC constituted the strongest predictor of MACE (hazard ratio 1.08 per 1% decrease; 95% confidence interval 1.04-1.12; p<0.001) on bivariable analysis. CONCLUSIONS This long-term observational study indicates that TAPSE and dilation of right-sided cardiac chambers are associated with an increased risk for MACE in ARVC/D patients with advanced disease and a high risk for adverse events. However, FAC is the strongest echocardiographic predictor of adverse outcome in these patients. Our data advocate a role for TTE in risk stratification in patients with ARVC/D, although our results may not be generalizable to lower-risk ARVC/D cohorts.
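The FAC < 23% comparison rests on Kaplan-Meier estimation and a log-rank test. A sketch on simulated times-to-MACE follows; group sizes and event times are invented, not the study data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_low = rng.exponential(3.0, 35)            # years to MACE, FAC < 23% (toy)
t_high = rng.exponential(9.0, 35)           # years to MACE, FAC >= 23% (toy)
e_low = np.ones_like(t_low)                 # all events observed in this toy group
e_high = (t_high < 10).astype(int)          # administrative censoring at 10 years
t_high = np.minimum(t_high, 10)

kmf = KaplanMeierFitter()
kmf.fit(t_low, e_low, label="FAC < 23%")
print(kmf.median_survival_time_)            # median MACE-free time in the low group

res = logrank_test(t_low, t_high, event_observed_A=e_low, event_observed_B=e_high)
print(res.p_value)                          # small p-value: curves differ
```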
Abstract:
The role of the electrophysiologic (EP) study for risk stratification in patients with arrhythmogenic right ventricular cardiomyopathy is controversial. We investigated the role of inducible sustained monomorphic ventricular tachycardia (SMVT) for the prediction of an adverse outcome, defined as the occurrence of cardiac death, heart transplantation, sudden cardiac death, ventricular fibrillation, ventricular tachycardia with hemodynamic compromise or syncope. Of 62 patients who fulfilled the 2010 Arrhythmogenic Right Ventricular Cardiomyopathy Task Force criteria and underwent an EP study, 30 (48%) experienced an adverse outcome during a median follow-up of 9.8 years. SMVT was inducible in 34 patients (55%), 22 (65%) of whom had an adverse outcome. In contrast, of the 28 patients without inducible SMVT, 8 (29%) had an adverse outcome. Kaplan-Meier analysis showed an event-free survival benefit for patients without inducible SMVT (log-rank p = 0.008), with a cumulative survival free of an adverse outcome after 10 years of 72% (95% confidence interval [CI] 56% to 92%) in the group without inducible SMVT compared with 26% (95% CI 14% to 50%) in the other group. The inducibility of SMVT during the EP study (hazard ratio [HR] 2.99, 95% CI 1.23 to 7.27), nonadherence (HR 2.74, 95% CI 1.3 to 5.77), and heart failure New York Heart Association functional class II and III (HR 2.25, 95% CI 1.04 to 4.87) were associated with an adverse outcome on univariate Cox regression analysis. The inducibility of SMVT (HR 2.52, 95% CI 1.03 to 6.16, p = 0.043) and nonadherence (HR 2.34, 95% CI 1.1 to 4.99, p = 0.028) remained significant predictors on multivariate analysis. These long-term observational data suggest that SMVT inducibility during an EP study might predict an adverse outcome in patients with arrhythmogenic right ventricular cardiomyopathy, advocating a role for the EP study in risk stratification.
Abstract:
INTRODUCTION Data concerning outcome after management of acetabular fractures by anterior approaches, with a focus on age and on fractures associated with roof impaction, central dislocation and/or quadrilateral plate displacement, are rare. METHODS Between October 2005 and April 2009, a series of 59 patients (mean age 57 years, range 13-91) with fractures involving the anterior column was treated using the modified Stoppa approach alone or, for reduction of displaced iliac wing or low anterior column fractures, in combination with the first window of the ilioinguinal approach or the modified Smith-Petersen approach, respectively. Surgical data, accuracy of reduction, clinical and radiographic outcome at mid-term, and the need for endoprosthetic replacement in the postoperative course (defined as failure) were assessed; uni- and multivariate regression analyses were performed to identify independent predictive factors (e.g. age, nonanatomical reduction, acetabular roof impaction, central dislocation, quadrilateral plate displacement) for failure. Outcome was assessed for all patients overall and by age in particular; patients were subdivided into two groups according to their age (group "<60 yrs", group "≥60 yrs"). RESULTS Forty-three of 59 patients (mean age 54 yrs, range 13-89) were available for evaluation. Of these, anatomic reduction was achieved in 72% of cases. Nonanatomical reduction was identified as the only multivariate predictor of subsequent total hip replacement (adjusted hazard ratio 23.5; p<0.01). A significantly higher rate of nonanatomical reduction was observed in the presence of acetabular roof impaction (p=0.01). In 16% of all patients, total hip replacement was performed, and in 69% of patients with preserved hips the clinical results were excellent or good at a mean follow-up of 35±10 months (range: 24-55). No statistically significant differences were observed between the two groups. CONCLUSION Nonanatomical reconstruction of the articular surfaces puts joint-preserving management of acetabular fractures through an isolated or combined modified Stoppa approach at risk of failure, resulting in total joint replacement at mid-term. In the elderly, joint-preserving surgery is worth considering, as promising clinical and radiographic results can be obtained at mid-term.
Abstract:
BACKGROUND Marfan syndrome (MFS) is a variable, autosomal-dominant disorder of the connective tissue. In MFS, serious ventricular arrhythmias and sudden cardiac death (SCD) can occur. The aim of this prospective study was to reveal underlying risk factors and to prospectively investigate the association between MFS and SCD in a long-term follow-up. METHODS 77 patients with MFS were included. At baseline, serum N-terminal pro-brain natriuretic peptide (NT-proBNP) was measured, and a transthoracic echocardiogram, a 12-lead resting ECG, a signal-averaged ECG (SAECG) and a 24-h Holter ECG with time- and frequency-domain analyses were performed. The primary composite endpoint was defined as SCD, ventricular tachycardia (VT), ventricular fibrillation (VF) or arrhythmogenic syncope. RESULTS The median follow-up (FU) time was 868 days. Among all risk stratification parameters, NT-proBNP remained the exclusive predictor (hazard ratio [HR]: 2.34, 95% confidence interval [CI]: 1.1 to 4.62, p=0.01) of the composite endpoint. With an optimal cut-off point at 214.3 pg/ml, NT-proBNP predicted the composite primary endpoint accurately (AUC 0.936, p=0.00046, sensitivity 100%, specificity 79.0%). During FU, seven patients in Group 2 (NT-proBNP ≥ 214.3 pg/ml) reached the composite endpoint, and 2 of these patients died of SCD. In five patients, sustained VT was documented. All patients with NT-proBNP <214.3 pg/ml (Group 1) experienced no events. Group 2 patients had a significantly higher risk of experiencing the composite endpoint (log-rank test, p<0.001). CONCLUSIONS In contrast to non-invasive electrocardiographic parameters, NT-proBNP independently predicts adverse arrhythmogenic events in patients with MFS.
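An "optimal" biomarker cut-off with an accompanying AUC, sensitivity and specificity is typically read off an ROC curve, for example by maximizing the Youden index. A hedged sketch on simulated NT-proBNP values follows; the 7/70 event split mirrors the cohort, but every value is invented.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
event = np.r_[np.ones(7), np.zeros(70)].astype(int)      # 7 events, 70 event-free
ntprobnp = np.r_[rng.normal(400, 120, 7),                # toy values, pg/ml
                 rng.normal(150, 80, 70)].clip(5)

fpr, tpr, thr = roc_curve(event, ntprobnp)
j = tpr - fpr                                            # Youden's J per threshold
print("AUC:", roc_auc_score(event, ntprobnp))
print("optimal cutoff:", thr[j.argmax()], "pg/ml")       # maximizes sens + spec - 1
```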
Abstract:
The objective of the study was to determine whether there are sex-based differences in the prevalence and clinical outcomes of subclinical peripheral artery disease (PAD). We evaluated the sex-specific associations of ankle-brachial index (ABI) with clinical cardiovascular disease outcomes in 2797 participants without prevalent clinical PAD and with a baseline ABI measurement in the Health, Aging, and Body Composition study. The mean age was 74 years, 40% were black, and 52% were women. Median follow-up was 9.37 years. Women had a similar prevalence of ABI < 0.9 (12% in women versus 11% in men; P = 0.44), but a higher prevalence of ABI 0.9-1.0 (15% versus 10%, respectively; P < 0.001). In a fully adjusted model, ABI < 0.9 was significantly associated with higher coronary heart disease (CHD) mortality, incident clinical PAD and incident myocardial infarction in both women and men. ABI < 0.9 was significantly associated with incident stroke only in women. ABI 0.9-1.0 was significantly associated with CHD death in both women (hazard ratio 4.84, 95% CI 1.53-15.31) and men (hazard ratio 3.49, 95% CI 1.39-8.72). However, ABI 0.9-1.0 was significantly associated with incident clinical PAD (hazard ratio 3.33, 95% CI 1.44-7.70) and incident stroke (hazard ratio 2.45, 95% CI 1.38-4.35) only in women. Subclinical PAD was strongly associated with adverse cardiovascular events in both women and men, but women had a higher prevalence of subclinical PAD.
Abstract:
Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis. In that meta-analysis, patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most currently available data for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I² = 51%; 38 trials) without evidence of a meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I² = 0%; 37 trials) without a difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores changed by −0.6 in control arms (95% CI, −6.4 to 5.2; I² = 0%) and by 2.1 in ESA arms (95% CI, −3.9 to 8.1; I² = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I² = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher.
There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I² = 38%; 44 trials), but many trials did not include an overall survival endpoint, and potential time-dependent confounding was not considered. Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use, but the magnitude of the difference was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
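The "one additional death per 59 (or 588) treated patients" figures follow from converting the pooled on-study hazard ratio into an absolute risk difference at a given control-arm mortality. The check below reproduces that arithmetic under a proportional-hazards assumption; the small discrepancy at 10 percent control-arm mortality reflects the report's use of the underlying trial data rather than this closed-form approximation.

```python
def nnh(control_risk: float, hr: float) -> float:
    """Number needed to harm: invert the risk difference implied by an HR,
    using treated_risk = 1 - (1 - control_risk) ** hr under proportional hazards."""
    treated_risk = 1 - (1 - control_risk) ** hr
    return 1 / (treated_risk - control_risk)

print(round(nnh(0.10, 1.17)))   # 63, close to the reported 1 in 59
print(round(nnh(0.01, 1.17)))   # 592, close to the reported 1 in 588
```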
Abstract:
Reporting and publication bias is a well-known problem in meta-analysis and healthcare research. In 2002 we conducted a meta-analysis on the effects of erythropoiesis-stimulating agents (ESAs) on overall survival in cancer patients, which suggested some evidence for improved survival in patients receiving ESAs compared with controls. However, a meta-analysis of individual patient data conducted several years later showed the opposite of our first meta-analysis, that is, evidence for increased on-study mortality and reduced overall survival in cancer patients receiving ESAs. We aimed to determine whether the results of our first meta-analysis could have been affected by publication and reporting biases and, if so, whether timely access to clinical study reports and individual patient data could have prevented this. We conducted a hypothetical meta-analysis for overall survival including all studies and study data that could have been available in 2002, at the time when we conducted our first meta-analysis. Compared with our original meta-analysis, which suggested an overall survival benefit for cancer patients receiving ESAs [hazard ratio (HR) 0.81, 95% confidence interval (CI) 0.67-0.99], our hypothetical meta-analysis based on the results of all studies conducted at the time of the first analysis did not show evidence for a beneficial effect of ESAs on overall survival (HR 0.97, 95% CI 0.83-1.12). We therefore conclude that our first meta-analysis showed misleading overall survival benefits due to publication and reporting biases, and that this could have been prevented by timely access to clinical study reports and individual patient data. Unrestricted access to clinical study protocols, including amendments, clinical study reports and individual patient data is needed to ensure timely detection of both beneficial and harmful effects of healthcare interventions.
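The pooled hazard ratios contrasted above are inverse-variance summaries of per-study log hazard ratios, so which studies enter the pool determines the answer. A minimal fixed-effect pooling sketch follows; the three HR/CI inputs are invented, not the 2002 study set.

```python
import numpy as np

def pool_hrs(hrs, lowers, uppers):
    """Fixed-effect inverse-variance pooling of hazard ratios given 95% CIs."""
    log_hr = np.log(hrs)
    se = (np.log(uppers) - np.log(lowers)) / (2 * 1.96)  # SE from CI width
    w = 1 / se**2                                        # inverse-variance weights
    pooled = (w * log_hr).sum() / w.sum()
    pooled_se = np.sqrt(1 / w.sum())
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Three hypothetical trials; adding or omitting one shifts the pooled HR
print(pool_hrs([0.78, 0.85, 1.10], [0.60, 0.65, 0.80], [1.01, 1.11, 1.51]))
```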
Abstract:
BACKGROUND Drinking eight glasses of fluid or water each day is widely believed to improve health, but evidence is sparse and conflicting. We aimed to investigate the association between fluid consumption and long-term mortality and kidney function. METHODS We conducted a longitudinal analysis within a prospective, population-based cohort study of 3858 men and women aged 49 years or older residing in Australia. Daily fluid intake from food and beverages, not including water, was measured using a food frequency questionnaire. We fitted multivariable-adjusted Cox proportional hazards models for all-cause and cardiovascular mortality and used a bootstrapping procedure for estimated glomerular filtration rate (eGFR). RESULTS Upper and lower quartiles of daily fluid intake corresponded to >3 L and <2 L, respectively. During a median follow-up of 13.1 years (total 43 093 years at risk), 1127 deaths (26.1 per 1000 years at risk), including 580 cardiovascular deaths (13.5 per 1000 years at risk), occurred. Daily fluid intake (per 250 mL increase) was not associated with all-cause [adjusted hazard ratio (HR) 0.99 (95% CI 0.98-1.01)] or cardiovascular mortality [HR 0.98 (95% CI 0.95-1.01)]. Overall, eGFR declined by 2.2 mL/min per 1.73 m² (SD 10.9) in the 1207 (31%) participants who had repeat creatinine measurements, and this was not associated with fluid intake [adjusted regression coefficient 0.06 mL/min/1.73 m² per 250 mL increase (95% CI -0.03 to 0.14)]. CONCLUSIONS Fluid intake from food and beverages excluding water is not associated with improved kidney function or reduced mortality.
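The bootstrapping procedure mentioned for eGFR can be illustrated with a percentile bootstrap of a regression slope. The sketch below uses simulated data whose coefficient and spread echo the reported values; it is not the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1207
fluid = rng.normal(10, 3, n)                        # intake in 250 mL units (toy)
egfr_change = -2.2 + 0.06 * fluid + rng.normal(0, 10.9, n)

boots = []
for _ in range(2000):
    idx = rng.integers(0, n, n)                     # resample rows with replacement
    slope, intercept = np.polyfit(fluid[idx], egfr_change[idx], 1)
    boots.append(slope)

print(np.percentile(boots, [2.5, 97.5]))            # percentile bootstrap 95% CI
```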
Abstract:
BACKGROUND Patients with isolated locoregional recurrences (ILRR) of breast cancer have a high risk of distant metastasis and death from breast cancer. We aimed to establish whether adjuvant chemotherapy improves the outcome of such patients. METHODS The CALOR trial was a pragmatic, open-label, randomised trial that accrued patients with histologically proven and completely excised ILRR after unilateral breast cancer who had undergone a mastectomy or lumpectomy with clear surgical margins. Eligible patients were enrolled from hospitals worldwide and were centrally randomised (1:1) to chemotherapy (type selected by the investigator; multidrug for at least four courses recommended) or no chemotherapy, using permuted blocks, and stratified by previous chemotherapy, oestrogen-receptor and progesterone-receptor status, and location of ILRR. Patients with oestrogen-receptor-positive ILRR received adjuvant endocrine therapy, radiation therapy was mandated for patients with microscopically involved surgical margins, and anti-HER2 therapy was optional. The primary endpoint was disease-free survival. All analyses were by intention to treat. This study is registered with ClinicalTrials.gov, number NCT00074152. FINDINGS From Aug 22, 2003, to Jan 31, 2010, 85 patients were randomly assigned to receive chemotherapy and 77 were assigned to no chemotherapy. At a median follow-up of 4·9 years (IQR 3·6-6·0), 24 (28%) patients had disease-free survival events in the chemotherapy group compared with 34 (44%) in the no-chemotherapy group. 5-year disease-free survival was 69% (95% CI 56-79) with chemotherapy versus 57% (44-67) without chemotherapy (hazard ratio 0·59 [95% CI 0·35-0·99]; p=0·046). Adjuvant chemotherapy was significantly more effective for women with oestrogen-receptor-negative ILRR (p(interaction)=0·046), but analyses of disease-free survival according to the oestrogen-receptor status of the primary tumour were not statistically significant (p(interaction)=0·43). Of the 81 patients who received chemotherapy, 12 (15%) had serious adverse events. The most common adverse events were neutropenia, febrile neutropenia, and intestinal infection. INTERPRETATION Adjuvant chemotherapy should be recommended for patients with completely resected ILRR of breast cancer, especially if the recurrence is oestrogen-receptor negative. FUNDING US Department of Health and Human Services, Swiss Group for Clinical Cancer Research (SAKK), Frontier Science and Technology Research Foundation, Australian and New Zealand Breast Cancer Trials Group, Swedish Cancer Society, Oncosuisse, Cancer Association of South Africa, Foundation for Clinical Research of Eastern Switzerland (OSKK), Grupo Español de Investigación en Cáncer de Mama (GEICAM), and the Dutch Breast Cancer Trialists' Group (BOOG).
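Central randomisation with permuted blocks, as described here, guarantees balanced arms within each stratum after every completed block. Below is a toy allocator; the block size of 4 and the stratum label are assumptions for illustration, as the trial's actual block sizes are not stated in the abstract.

```python
import random

def permuted_block_allocator(block_size: int = 4, seed: int = 42):
    """Return a function that allocates 1:1 within strata using permuted blocks."""
    rng = random.Random(seed)
    blocks = {}                                   # one running block per stratum

    def allocate(stratum: str) -> str:
        if not blocks.get(stratum):               # start a fresh shuffled block
            block = ["chemo"] * (block_size // 2) + ["no chemo"] * (block_size // 2)
            rng.shuffle(block)
            blocks[stratum] = block
        return blocks[stratum].pop()              # next assignment in this stratum

    return allocate

allocate = permuted_block_allocator()
for _ in range(6):
    print(allocate("ER-positive / prior chemo"))  # balanced after each block of 4
```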
Abstract:
BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so based on differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we questioned whether differences in time to progression predict survival differences. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 compared immediate ADT (n=492) with deferred ADT (n=493), given as orchiectomy or a luteinizing hormone-releasing hormone analog, with deferred ADT initiated upon symptomatic disease progression or life-threatening complications, in randomly assigned T0-4 N0-2 M0 PCa patients. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases, ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as were PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p[noninferiority]=0.72, p[difference]=0.0085). CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.