107 results for "Modèle de Cox pondéré" (weighted Cox model)
Abstract:
Aim of the study: The aerial parts of Baccharis dracunculifolia D.C., popularly known as "alecrim do campo", are used in folk medicine as an anti-inflammatory. The aim of the present study was to evaluate the anti-inflammatory and antinociceptive activities of the crude hydroalcoholic extract obtained from leaves of Baccharis dracunculifolia (BdE), which had not been reported. Materials and methods: BdE was analyzed by HPLC and evaluated in vivo (doses ranging from 50 to 400 mg/kg, p.o.) using acetic acid-induced abdominal constrictions, paw oedema induced by carrageenan or histamine, overt nociception models using capsaicin, glutamate or phorbol myristate acetate (PMA), formalin-induced nociception, and mechanical hypernociception induced by carrageenan or complete Freund's adjuvant (CFA). Paracetamol (acetic acid and formalin tests), dipyrone (capsaicin-, glutamate- and PMA-induced nociception) and indomethacin (CFA- and carrageenan-induced hypernociception models) were used as positive controls. In addition, the in vitro effects of BdE on COX-2 activity and on the activation of NF-kappa B were also evaluated. Results: BdE (50-400 mg/kg, p.o.) significantly diminished the nociceptive responses induced by acetic acid, glutamate and CFA. Furthermore, BdE also inhibited the nociceptive responses in both phases of formalin-induced nociception. Orally administered BdE also produced a long-lasting anti-hypernociceptive effect in the acute model of inflammatory pain induced by carrageenan. Inhibition of COX-2 activity by BdE was also observed. Conclusion: In summary, the data reported in this work confirm the traditional anti-inflammatory indications of Baccharis dracunculifolia leaves and provide biological evidence that Baccharis dracunculifolia, like Brazilian green propolis, possesses antinociceptive and anti-inflammatory activities. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The aim of this study was to define the immunoregulatory role of prostaglandins in a mouse model of Strongyloides venezuelensis infection. Strongyloides venezuelensis induced an increase of eosinophils and mononuclear cells in the blood, peritoneal cavity fluid, and bronchoalveolar lavage fluid. Treatment with the dual cyclooxygenase (COX-1/-2) inhibitors indomethacin and ibuprofen, and the COX-2-selective inhibitor celecoxib partially blocked these cellular responses and was associated with enhanced numbers of infective larvae in the lung and adult worms in the duodenum. However, the drugs did not interfere with worm fertility. Cyclooxygenase inhibitors also inhibited the production of the T-helper type 2 (Th2) mediators IL-5, IgG1, and IgE, while indomethacin alone also inhibited IL-4, IL-10, and IgG2a. Cyclooxygenase inhibitors tended to enhance the Th1 mediators IL-12 and IFN-gamma. This shift away from Th2 immunity in cyclooxygenase inhibitor-treated mice correlated with reduced prostaglandin E(2) (PGE(2)) production in infected duodenal tissue. As PGE(2) is a well-characterized driver of Th2 immunity, we speculate that reduced production of this lipid might be involved in the shift toward a Th1 phenotype, favoring parasitism by S. venezuelensis. These findings provide new evidence that cyclooxygenase-derived lipids play a role in regulating host defenses against Strongyloides, and support the exploration of eicosanoid signaling for identifying novel preventive and therapeutic modalities against these infections.
Abstract:
The aim of this study was to evaluate a possible synergism between melatonin and meloxicam in up-regulating the immune response in male Wistar rats infected with Trypanosoma cruzi during the immunosuppression that characterizes the acute phase of Chagas' disease. Male Wistar rats were infected with the Y strain of T. cruzi. Experiments were performed on days 7, 14 and 21 post-infection. Several immunological parameters were evaluated, including gamma-interferon (IFN-gamma), interleukin-2 (IL-2), nitric oxide (NO) and prostaglandin E(2) (PGE(2)). The combined treatment with melatonin and meloxicam significantly enhanced the release of IL-2 and IFN-gamma into the animals' serum when compared with the infected control groups during the course of infection. Furthermore, the blockade of PGE(2) synthesis and the increased release of NO by macrophage cells from T. cruzi-infected animals contributed to regulating the production of Th1-subset cytokines, significantly reducing parasitaemia in animals treated with the combination of both substances. Therefore, our results suggest that the association of melatonin and meloxicam was more effective in protecting animals against the harmful actions of T. cruzi infection than treatment with meloxicam or melatonin alone.
Abstract:
Objectives We sought to determine whether the quantitative assessment of myocardial fibrosis (MF), either by histopathology or by contrast-enhanced magnetic resonance imaging (ce-MRI), could help predict long-term survival after aortic valve replacement. Background Severe aortic valve disease is characterized by progressive accumulation of interstitial MF. Methods Fifty-four patients scheduled to undergo aortic valve replacement were examined by ce-MRI. Delayed-enhanced images were used for the quantitative assessment of MF. In addition, interstitial MF was quantified by histological analysis of myocardial samples obtained during open-heart surgery and stained with picrosirius red. The ce-MRI study was repeated 27 +/- 22 months after surgery to assess left ventricular functional improvement, and all patients were followed for 52 +/- 17 months to evaluate long-term survival. Results There was a good correlation between the amount of MF measured by histopathology and by ce-MRI (r = 0.69, p < 0.001). In addition, the amount of MF demonstrated a significant inverse correlation with the degree of left ventricular functional improvement after surgery (r = -0.42, p = 0.04 for histopathology; r = -0.47, p = 0.02 for ce-MRI). Kaplan-Meier analyses revealed that higher degrees of MF accumulation were associated with worse long-term survival (chi-square = 6.32, p = 0.01 for histopathology; chi-square = 5.85, p = 0.02 for ce-MRI). On multivariate Cox regression analyses, patient age and the amount of MF were found to be independent predictors of all-cause mortality. Conclusions The amount of MF, either by histopathology or by ce-MRI, is associated with the degree of left ventricular functional improvement and all-cause mortality late after aortic valve replacement in patients with severe aortic valve disease. (J Am Coll Cardiol 2010; 56: 278-87) (c) 2010 by the American College of Cardiology Foundation
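This abstract (and several of the following ones) summarizes follow-up with Kaplan-Meier analysis. As a reminder of what that estimator actually computes, here is a minimal product-limit sketch in plain Python; the toy follow-up times and event flags below are invented for illustration, not data from any of the studies listed here:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function S(t).

    times  : follow-up time per subject
    events : 1 if the event (e.g. death) occurred at that time, 0 if censored
    Returns a list of (time, S(t)) pairs at each distinct event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    survival, s = [], 1.0
    for t in sorted(deaths):
        at_risk = sum(1 for ti in times if ti >= t)  # subjects still under observation at t
        s *= 1.0 - deaths[t] / at_risk               # conditional probability of surviving past t
        survival.append((t, s))
    return survival
```

With times `[1, 2, 3, 4, 5]` and event flags `[1, 1, 0, 1, 0]` (subject 3 and 5 censored), the curve steps down only at the event times 1, 2 and 4, and each censored subject simply drops out of later risk sets.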
Abstract:
Background: The presence of coronary artery calcium (CAC) is an independent marker of increased risk of cardiovascular disease (CVD) events and mortality. However, the predictive value of thoracic aorta calcification (TAC), which can be additionally identified without further scanning during assessment of CAC, is unknown. Methods: We followed a cohort of 8401 asymptomatic individuals (mean age: 53 +/- 10 years, 69% men) undergoing cardiac risk factor evaluation and TAC and CAC testing with electron beam computed tomography. Multivariable Cox proportional hazards models were developed to predict all-cause mortality based on the presence of TAC. Results: During a median follow-up period of 5 years, 124 (1.5%) deaths were observed. Overall survival was 96.9% and 98.9% for those with and without detectable TAC, respectively (p < 0.0001). Compared to those with no TAC, the hazard ratio for mortality in the presence of TAC was 3.25 (95% CI: 2.28-4.65, p < 0.0001) in unadjusted analysis. After adjusting for age, gender, hypertension, dyslipidemia, diabetes mellitus, smoking, family history of premature coronary artery disease, and the presence of CAC, the relationship remained robust (HR 1.61, 95% CI: 1.10-2.27, p = 0.015). Likelihood ratio chi-square statistics demonstrated that the addition of TAC contributed significantly to the prediction of mortality beyond traditional risk factors alone (chi-square = 13.62, p = 0.002) as well as beyond risk factors plus CAC (chi-square = 5.84, p = 0.02). Conclusion: In conclusion, the presence of TAC was associated with all-cause mortality in our study; this relationship was independent of conventional CVD risk factors as well as the presence of CAC. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
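The hazard ratios in studies like this one come from maximizing the Cox partial likelihood. As a stripped-down illustration of that machinery, here is a one-covariate Newton-Raphson fit in plain Python (Breslow handling of ties); this is a didactic sketch with invented data, not the multivariable, risk-factor-adjusted models the studies themselves used:

```python
import math

def cox_fit(times, events, x, iters=50):
    """Fit a one-covariate Cox proportional-hazards model by
    Newton-Raphson on the partial log-likelihood (Breslow ties).
    Returns the log hazard ratio beta; the hazard ratio is exp(beta)."""
    n = len(times)
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0
        for i in range(n):
            if not events[i]:
                continue                      # censored subjects contribute only to risk sets
            risk = [j for j in range(n) if times[j] >= times[i]]
            w = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * x[j] for wj, j in zip(w, risk))
            s2 = sum(wj * x[j] ** 2 for wj, j in zip(w, risk))
            score += x[i] - s1 / s0           # first derivative of the log partial likelihood
            info += s2 / s0 - (s1 / s0) ** 2  # minus the second derivative
        step = score / info
        beta += step
        if abs(step) < 1e-10:
            return beta
    return beta
```

When the exposed group (x = 1) tends to fail earlier than the unexposed group, the fitted beta is positive, i.e. exp(beta) > 1, which is exactly the direction of the TAC hazard ratios quoted above.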
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and the high sensitivity and negative predictive value found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios, suggestive of small-study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography Using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
Abstract:
Background: The accuracy of multidetector computed tomographic (CT) angiography involving 64 detectors has not been well established. Methods: We conducted a multicenter study to examine the accuracy of 64-row, 0.5-mm multidetector CT angiography as compared with conventional coronary angiography in patients with suspected coronary artery disease. Nine centers enrolled patients who underwent calcium scoring and multidetector CT angiography before conventional coronary angiography. In 291 patients with calcium scores of 600 or less, segments 1.5 mm or more in diameter were analyzed by means of CT and conventional angiography at independent core laboratories. Stenoses of 50% or more were considered obstructive. The area under the receiver-operating-characteristic curve (AUC) was used to evaluate diagnostic accuracy relative to that of conventional angiography and subsequent revascularization status, whereas disease severity was assessed with the use of the modified Duke Coronary Artery Disease Index. Results: A total of 56% of patients had obstructive coronary artery disease. The patient-based diagnostic accuracy of quantitative CT angiography for detecting or ruling out stenoses of 50% or more according to conventional angiography revealed an AUC of 0.93 (95% confidence interval [CI], 0.90 to 0.96), with a sensitivity of 85% (95% CI, 79 to 90), a specificity of 90% (95% CI, 83 to 94), a positive predictive value of 91% (95% CI, 86 to 95), and a negative predictive value of 83% (95% CI, 75 to 89). CT angiography was similar to conventional angiography in its ability to identify patients who subsequently underwent revascularization: the AUC was 0.84 (95% CI, 0.79 to 0.88) for multidetector CT angiography and 0.82 (95% CI, 0.77 to 0.86) for conventional angiography. A per-vessel analysis of 866 vessels yielded an AUC of 0.91 (95% CI, 0.88 to 0.93). Disease severity ascertained by CT and conventional angiography was well correlated (r=0.81; 95% CI, 0.76 to 0.84). 
Two patients had important reactions to contrast medium after CT angiography. Conclusions: Multidetector CT angiography accurately identifies the presence and severity of obstructive coronary artery disease and subsequent revascularization in symptomatic patients. The negative and positive predictive values indicate that multidetector CT angiography cannot replace conventional coronary angiography at present. (ClinicalTrials.gov number, NCT00738218.).
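The per-patient accuracy figures quoted above (sensitivity, specificity, positive and negative predictive value) all derive from the same 2x2 table of test results against the reference standard. A small helper makes the definitions explicit; the confusion-matrix counts in the example are invented for illustration, not the CORE-64 data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics, returned as fractions in [0, 1].

    tp/fp/fn/tn : true-positive, false-positive, false-negative and
                  true-negative counts against the reference standard.
    """
    return {
        "sensitivity": tp / (tp + fn),  # P(test positive | disease present)
        "specificity": tn / (tn + fp),  # P(test negative | disease absent)
        "ppv": tp / (tp + fp),          # P(disease present | test positive)
        "npv": tn / (tn + fn),          # P(disease absent | test negative)
    }
```

Note that sensitivity and specificity are properties of the test, while PPV and NPV also depend on disease prevalence in the study population (56% in the CORE-64 cohort above).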
Abstract:
Proteinuria has been associated with cardiovascular events and mortality in community-based cohorts, but its association with mortality and cardiovascular events in patients undergoing percutaneous coronary intervention (PCI) had not been established. The association of urinary dipstick proteinuria with mortality and cardiovascular events (a composite of death, myocardial infarction, or nonhemorrhagic stroke) in 5,835 subjects of the EXCITE trial was evaluated. Dipstick urinalysis was performed before PCI, and proteinuria was defined as trace or greater. Subjects were followed up for 210 days (7 months) after enrollment for the occurrence of events. Multivariate Cox regression analysis evaluated the independent association of proteinuria with each outcome. Mean age was 59 years, 21% were women, 18% had diabetes mellitus, and mean estimated glomerular filtration rate was 90 ml/min/1.73 m(2). Proteinuria was present in 750 patients (13%). During follow-up, 22 subjects (2.9%) with proteinuria and 54 subjects (1.1%) without proteinuria died (adjusted hazard ratio 2.83, 95% confidence interval [CI] 1.65 to 4.84, p <0.001). The association with mortality after PCI increased with the severity of proteinuria (low-grade proteinuria, hazard ratio 2.67, 95% CI 1.50 to 4.75; high-grade proteinuria, hazard ratio 3.76, 95% CI 1.24 to 11.37). No significant association was present for cardiovascular events during the relatively short follow-up, but high-grade proteinuria tended toward an increased risk of cardiovascular events (hazard ratio 1.45, 95% CI 0.81 to 2.61). In conclusion, proteinuria was strongly and independently associated with mortality in patients undergoing PCI. These data suggest that a tool as simple and clinically easy to use as the urinary dipstick may help identify, and guide treatment of, patients at high risk of mortality at the time of PCI. (C) 2008 Elsevier Inc. All rights reserved. (Am J Cardiol 2008;102:1151-1155)
Abstract:
Purpose: To evaluate the influence of cross-sectional arc calcification on the diagnostic accuracy of computed tomography (CT) angiography compared with conventional coronary angiography for the detection of obstructive coronary artery disease (CAD). Materials and Methods: Institutional Review Board approval and written informed consent were obtained from all centers and participants for this HIPAA-compliant study. Overall, 4511 segments from 371 symptomatic patients (279 men, 92 women; median age, 61 years [interquartile range, 53-67 years]) with clinical suspicion of CAD from the CORE-64 multi-center study were included in the analysis. Two independent blinded observers evaluated the percentage of diameter stenosis and the circumferential extent of calcium (arc calcium). The accuracy of quantitative multidetector CT angiography in depicting substantial (>50%) stenoses was assessed by using quantitative coronary angiography (QCA). Cross-sectional arc calcium was rated on a segment level as follows: noncalcified or mild (<90 degrees), moderate (90 degrees-180 degrees), or severe (>180 degrees) calcification. Univariable and multivariable logistic regression, receiver operating characteristic curve, and clustering methods were used for statistical analyses. Results: A total of 1099 segments had mild calcification, 503 had moderate calcification, 338 had severe calcification, and 2571 segments were noncalcified. Calcified segments were highly associated (P < .001) with disagreement between CT angiography and QCA in multivariable analysis after controlling for sex, age, heart rate, and image quality. The prevalence of CAD was 5.4% in noncalcified segments, 15.0% in mildly calcified segments, 27.0% in moderately calcified segments, and 43.0% in severely calcified segments. A significant difference was found in the areas under the receiver operating characteristic curves (noncalcified: 0.86, mildly calcified: 0.85, moderately calcified: 0.82, severely calcified: 0.81; P < .05).
Conclusion: In a symptomatic patient population, segment-based coronary artery calcification significantly decreased agreement between multidetector CT angiography and QCA to detect a coronary stenosis of at least 50%.
Abstract:
Aortic valve calcium (AVC) can be quantified on the same computed tomographic scan as coronary artery calcium (CAC). Although CAC is an established predictor of cardiovascular events, limited evidence is available for an independent predictive value of AVC. We studied a cohort of 8,401 asymptomatic subjects (mean age 53 +/- 10 years, 69% men), who were free of known coronary heart disease and were undergoing electron beam computed tomography for assessment of subclinical atherosclerosis. The patients were followed for a median of 5 years (range 1 to 7) for the occurrence of mortality from any cause. Multivariate Cox regression models were developed to predict all-cause mortality according to the presence of AVC. A total of 517 patients (6%) had AVC on electron beam computed tomography. During follow-up, 124 patients (1.5%) died, for overall survival rates of 96.1% and 98.7% in those with and without AVC, respectively (hazard ratio 3.39, 95% confidence interval 2.09 to 5.49). After adjustment for age, gender, hypertension, dyslipidemia, diabetes mellitus, smoking, and a family history of premature coronary heart disease, AVC remained a significant predictor of mortality (hazard ratio 1.82, 95% confidence interval 1.11 to 2.98). Likelihood ratio chi-square statistics demonstrated that the addition of AVC contributed significantly to the prediction of mortality in a model adjusted for traditional risk factors (chi-square = 5.03, p = 0.03) as well as traditional risk factors plus the presence of CAC (chi-square = 3.58, p = 0.05). In conclusion, AVC was associated with increased all-cause mortality, independent of the traditional risk factors and the presence of CAC. (C) 2010 Published by Elsevier Inc. (Am J Cardiol 2010;106:1787-1791)
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age < 15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load > 5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness.
To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
Abstract:
Background and objectives Fibroblast growth factor 23 (FGF-23) has emerged as a new factor in mineral metabolism in chronic kidney disease (CKD). An important regulator of phosphorus homeostasis, FGF-23 has been shown to independently predict CKD progression in nondiabetic renal disease. We analyzed the relation between FGF-23 and renal outcome in diabetic nephropathy (DN). Design, setting, participants, & measurements DN patients participating in a clinical trial (enalapril+placebo versus enalapril+losartan) had baseline data collected and were followed until June 2009 or until the primary outcome was reached. Four patients were lost to follow-up. The composite primary outcome was defined as death, doubling of serum creatinine, and/or dialysis need. Results At baseline, serum FGF-23 showed a significant association with serum creatinine, intact parathyroid hormone, proteinuria, urinary fractional excretion of phosphate, male sex, and race. Interestingly, FGF-23 was not related to calcium, phosphorus, 25OH-vitamin D, or 24-hour urinary phosphorus. Mean follow-up time was 30.7 +/- 10 months. Cox regression showed that FGF-23 was an independent predictor of the primary outcome, even after adjustment for creatinine clearance and intact parathyroid hormone (hazard ratio per 10 pg/ml FGF-23 increase, 1.09; 95% CI, 1.01 to 1.16; P = 0.02). Finally, Kaplan-Meier analysis showed a significantly higher risk of the primary outcome in patients with FGF-23 values of >70 pg/ml. Conclusions FGF-23 is a significant independent predictor of renal outcome in patients with macroalbuminuric DN. Further studies should clarify whether this relation is causal and whether FGF-23 should be a new therapeutic target for CKD prevention. Clin J Am Soc Nephrol 6: 241-247, 2011. doi: 10.2215/CJN.04250510
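The hazard ratio in this abstract is expressed per 10 pg/ml increase in FGF-23. Because the Cox model is linear on the log-hazard scale, a hazard ratio for a continuous predictor rescales between unit sizes by exponentiation (HR per k units = HR per unit raised to the power k). A one-line helper, with arbitrary example numbers rather than the study's coefficients:

```python
import math

def rescale_hr(hr_per_unit, units):
    """Convert a Cox hazard ratio per 1-unit increase of a continuous
    predictor into the ratio per `units`-unit increase: hr_per_unit**units.
    Valid because the log hazard is linear in the covariate."""
    return math.exp(units * math.log(hr_per_unit))
```

For instance, a hazard ratio of 2.0 per unit corresponds to 8.0 per 3 units; the confidence-interval limits rescale the same way.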
Abstract:
Objective. To evaluate the beneficial effect of antimalarial treatment on lupus survival in a large, multiethnic, international longitudinal inception cohort. Methods. Socioeconomic and demographic characteristics, clinical manifestations, classification criteria, laboratory findings, and treatment variables were examined in patients with systemic lupus erythematosus (SLE) from the Grupo Latino Americano de Estudio del Lupus Eritematoso (GLADEL) cohort. The diagnosis of SLE, according to the American College of Rheumatology criteria, was assessed within 2 years of cohort entry. Cause of death was classified as active disease, infection, cardiovascular complications, thrombosis, malignancy, or other cause. Patients were subdivided by antimalarial use, grouped according to those who had received antimalarial drugs for at least 6 consecutive months (user) and those who had received antimalarial drugs for <6 consecutive months or who had never received antimalarial drugs (nonuser). Results. Of the 1,480 patients included in the GLADEL cohort, 1,141 (77%) were considered antimalarial users, with a mean duration of drug exposure of 48.5 months (range 6-98 months). Death occurred in 89 patients (6.0%). A lower mortality rate was observed in antimalarial users compared with nonusers (4.4% versus 11.5%; P < 0.001). Seventy patients (6.1%) had received antimalarial drugs for 6-11 months, 146 (12.8%) for 1-2 years, and 925 (81.1%) for >2 years. Mortality rates among users by duration of antimalarial treatment (per 1,000 person-months of followup) were 3.85 (95% confidence interval [95% CI] 1.41-8.37), 2.7 (95% CI 1.41-4.76), and 0.54 (95% CI 0.37-0.77), respectively, while for nonusers, the mortality rate was 3.07 (95% CI 2.18-4.20) (P for trend < 0.001). After adjustment for potential confounders in a Cox regression model, antimalarial use was associated with a 38% reduction in the mortality rate (hazard ratio 0.62, 95% CI 0.39-0.99). Conclusion. 
Antimalarial drugs were shown to have a protective effect, possibly in a time-dependent manner, on SLE survival. These results suggest that the use of antimalarial treatment should be recommended for patients with lupus.
Abstract:
Drug provocation tests (DPTs) are considered the gold standard for identifying adverse drug reactions (ADRs). The aim of this study was to analyze DPT results and discuss severe systemic reactions associated with them. This was a retrospective analysis of 500 patients with ADRs who sought treatment and were submitted to DPTs when indicated between 2006 and 2010. We performed DPTs according to the European Network for Drug Allergy recommendations. Single-blind, placebo-controlled DPTs were performed with antibiotics, local anesthetics, and nonsteroidal anti-inflammatory drugs, as well as with other drugs. Patient characteristics, DPT results, and reactions were analyzed. The sample comprised 198 patients (80.8% of whom were female patients) submitted to 243 DPTs. Ages ranged from 9 to 84 years (mean, 39.9 years). The 243 DPTs were performed with local anesthetics (n = 93), antibiotics (n = 19), acetaminophen (n = 44), benzydamine (n = 33), COX-2 inhibitors (n = 26), dipyrone (n = 7), aspirin (n = 4), or other drugs (n = 17). The results of 4 tests (1.6%) were inconclusive, whereas those of 10 (4.1%) revealed positive reactions to antibiotics (2/19), COX-2 inhibitors (2/26), acetaminophen (3/44), and local anesthetics (3/93). Two severe reactions were observed: cephalexin-induced anaphylactic shock and bupivacaine-induced anaphylaxis without shock. Four patients (2.0%) reacted to the placebo before administration of the drug. Drug provocation tests are safe for use in clinical practice but they should be placebo-controlled and should be performed under the supervision of an allergist. To confirm a presumptive diagnosis and to manage allergies appropriately, it is crucial to perform DPTs. (Allergy Asthma Proc 32:301-306, 2011; doi: 10.2500/aap.2011.32.3450)
Abstract:
Objective: To evaluate whether the number of diseased vessels has an impact on clinical outcomes, as well as on therapeutic results according to medical, percutaneous, or surgical treatment, in chronic coronary artery disease. Methods: We evaluated 825 individuals enrolled in the MASS study, a randomized study comparing treatment options for single- or multivessel coronary artery disease with preserved left ventricular function, prospectively followed for 5 years. The incidence of overall mortality and of the composite end-point of death, myocardial infarction, and refractory angina were compared in three groups: single-vessel disease (SVD, n = 214), two-vessel disease (2VD, n = 253) and three-vessel disease (3VD, n = 358). The relationship between baseline variables and the composite end-point was assessed using a Cox proportional hazards survival model. Results: Most baseline characteristics were similar among groups, except age (younger in SVD and older in 3VD, p < 0.001), a lower incidence of hypertension in SVD (p < 0.0001), and lower levels of total and LDL-cholesterol in 3VD (p = 0.004 and p = 0.005, respectively). There were no statistical differences in the composite end-point at 5 years among groups, independent of the kind of treatment; however, there was a higher mortality rate in 3VD (p < 0.001). When we stratified our analysis by treatment option, bypass surgery was associated with a lower number of composite end-point events in all groups (SVD p < 0.001, 2VD p = 0.002, 3VD p < 0.001). In multivariate analysis, we found a higher mortality risk in 3VD compared with SVD (p = 0.005, HR 3.14, 95% CI 1.4-7.0). Conclusion: Three-vessel disease was associated with worse prognosis compared with single- or two-vessel disease in patients with stable coronary disease and preserved ventricular function at 5-year follow-up. In addition, event-free survival rates were higher after bypass surgery, independent of the number of diseased vessels in these subsets of patients.
(c) 2008 European Association for Cardio-Thoracic Surgery. Published by Elsevier B.V. All rights reserved.