107 results for mortality analysis
Abstract:
BACKGROUND In patients with cardiogenic shock, data on the comparative safety and efficacy of drug-eluting stents (DESs) vs. bare metal stents (BMSs) are lacking. We sought to assess the performance of DESs compared with BMSs among patients with cardiogenic shock undergoing percutaneous coronary intervention (PCI). METHODS Out of 236 patients with acute coronary syndromes complicated by cardiogenic shock, 203 were included in the final analysis. The primary endpoint was death, and the secondary endpoint, major adverse cardiac and cerebrovascular events (MACCEs), was the composite of death, myocardial infarction, any repeat revascularization and stroke. Patients were followed for a minimum of 30 days and up to 4 years. As stent assignment was not random, we performed a propensity score analysis to minimize potential bias. RESULTS Among patients treated with DESs, there was a lower risk of the primary and secondary endpoints compared with BMSs at 30 days (29 vs. 56%, P < 0.001; 34 vs. 58%, P = 0.001, respectively) and during long-term follow-up [hazard ratio 0.43, 95% confidence interval (CI) 0.29-0.65, P < 0.001; hazard ratio 0.49, 95% CI 0.34-0.71, P < 0.001, respectively]. After propensity score adjustment, all-cause mortality was reduced among patients treated with DESs compared with BMSs both at 30 days [adjusted odds ratio (OR) 0.26, 95% CI 0.11-0.62; P = 0.002] and during long-term follow-up (adjusted hazard ratio 0.40, 95% CI 0.22-0.72; P = 0.002). The rate of MACCE was lower among patients treated with DESs compared with those treated with BMSs at 30 days (adjusted OR 0.42, 95% CI 0.19-0.95; P = 0.036). The difference in MACCEs between devices approached significance during long-term follow-up (adjusted hazard ratio 0.60, 95% CI 0.34-1.01; P = 0.052). CONCLUSION DESs appear to be associated with improved clinical outcomes, including a reduction in all-cause mortality, compared with BMSs among patients undergoing PCI for cardiogenic shock, possibly because of pacification of the infarct-related artery by the anti-inflammatory drug. The results of this observational study require confirmation in an appropriately powered randomized trial.
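For readers who want to see the kind of adjustment described above in code, the sketch below illustrates propensity-score adjustment: a logistic model estimates each patient's probability of receiving a DES from baseline covariates, and that score is then entered as a covariate when estimating the odds ratio for 30-day death. The covariates, coefficients and data are entirely hypothetical and are not taken from the study.

```python
# Minimal sketch of propensity-score adjustment for a non-randomized stent
# comparison; column names and the synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 203
df = pd.DataFrame({
    "age": rng.normal(68, 10, n),
    "diabetes": rng.integers(0, 2, n),
    "lvef": rng.normal(35, 10, n),
})
# Treatment assignment (DES=1, BMS=0) depends on baseline covariates.
logit_ps = -2 + 0.02 * df["age"] - 0.5 * df["diabetes"] + 0.03 * df["lvef"]
df["des"] = rng.binomial(1, 1 / (1 + np.exp(-logit_ps)))
# 30-day mortality with a protective treatment effect built in.
logit_y = -1 + 0.03 * df["age"] - 0.8 * df["des"]
df["death_30d"] = rng.binomial(1, 1 / (1 + np.exp(-logit_y)))

# Step 1: estimate the propensity score from baseline covariates.
ps_model = sm.Logit(df["des"], sm.add_constant(df[["age", "diabetes", "lvef"]])).fit(disp=0)
df["ps"] = ps_model.predict()

# Step 2: estimate the treatment effect on 30-day death, adjusting for the score.
outcome_model = sm.Logit(df["death_30d"], sm.add_constant(df[["des", "ps"]])).fit(disp=0)
or_des = np.exp(outcome_model.params["des"])
ci_low, ci_high = np.exp(outcome_model.conf_int().loc["des"])
print(f"Adjusted OR for DES vs BMS: {or_des:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```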
Abstract:
OBJECTIVES This study aimed to update the Logistic Clinical SYNTAX score to predict 3-year survival after percutaneous coronary intervention (PCI) and compare the performance with the SYNTAX score alone. BACKGROUND The SYNTAX score is a well-established angiographic tool to predict long-term outcomes after PCI. The Logistic Clinical SYNTAX score, developed by combining clinical variables with the anatomic SYNTAX score, has been shown to perform better than the SYNTAX score alone in predicting 1-year outcomes after PCI. However, the ability of this score to predict long-term survival is unknown. METHODS Patient-level data (N = 6,304, 399 deaths within 3 years) from 7 contemporary PCI trials were analyzed. We revised the overall risk and the predictor effects in the core model (SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) using Cox regression analysis to predict mortality at 3 years. We also updated the extended model by combining the core model with additional independent predictors of 3-year mortality (i.e., diabetes mellitus, peripheral vascular disease, and body mass index). RESULTS The revised Logistic Clinical SYNTAX models showed better discriminative ability than the anatomic SYNTAX score for the prediction of 3-year mortality after PCI (c-index: SYNTAX score, 0.61; core model, 0.71; and extended model, 0.73 in a cross-validation procedure). The extended model in particular performed better in differentiating low- and intermediate-risk groups. CONCLUSIONS Risk scores combining clinical characteristics with the anatomic SYNTAX score substantially better predict 3-year mortality than the SYNTAX score alone and should be used for long-term risk stratification of patients undergoing PCI.
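The following sketch illustrates how the discrimination of two Cox models can be compared with Harrell's c-index on a held-out split, in the spirit of the cross-validation described above. It uses the lifelines package; the simulated data and coefficients are assumptions for illustration only, not the trial data.

```python
# Minimal sketch: compare the c-index of a SYNTAX-only Cox model against a
# "core" model that adds clinical variables. Data and column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "syntax": rng.gamma(2.0, 8.0, n),
    "age": rng.normal(65, 10, n),
    "creat_clearance": rng.normal(80, 20, n),
    "lvef": rng.normal(50, 10, n),
})
risk = 0.02 * df["syntax"] + 0.04 * df["age"] - 0.02 * df["creat_clearance"] - 0.03 * df["lvef"]
df["time"] = rng.exponential(np.exp(-risk) * 36)          # months to death
df["death"] = (df["time"] < 36).astype(int)               # event within 3 years
df["time"] = df["time"].clip(upper=36)                    # administrative censoring

train, test = df.iloc[: n // 2], df.iloc[n // 2 :]

def c_index(covariates):
    cph = CoxPHFitter().fit(train[covariates + ["time", "death"]], "time", "death")
    # Higher partial hazard means higher risk, so negate it for the c-index.
    return concordance_index(test["time"], -cph.predict_partial_hazard(test), test["death"])

print("SYNTAX only :", round(c_index(["syntax"]), 3))
print("Core model  :", round(c_index(["syntax", "age", "creat_clearance", "lvef"]), 3))
```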
Abstract:
OBJECTIVE To investigate the long-term prognostic implications of coronary calcification in patients undergoing percutaneous coronary intervention for obstructive coronary artery disease. METHODS Patient-level data from 6296 patients enrolled in seven clinical drug-eluting stent trials were analysed; the presence of severe coronary calcification was identified on angiographic images by an independent academic research organisation (Cardialysis, Rotterdam, The Netherlands). Clinical outcomes at 3 years of follow-up, including all-cause mortality, death-myocardial infarction (MI), and the composite end-point of all-cause death-MI-any revascularisation, were compared between patients with and without severe calcification. RESULTS Severe calcification was detected in 20% of the studied population. Patients with severe lesion calcification were less likely to have undergone complete revascularisation (48% vs 55.6%, p<0.001) and had an increased mortality compared with those without severely calcified arteries (10.8% vs 4.4%, p<0.001). The event rate was also higher in patients with severely calcified lesions for the combined end-points death-MI (22.9% vs 10.9%; p<0.001) and death-MI-any revascularisation (31.8% vs 22.4%; p<0.001). On multivariate Cox regression analysis, including the Syntax score, the presence of severe coronary calcification was an independent predictor of poor prognosis (HR 1.33, 95% CI 1.00 to 1.77, p=0.047 for death; HR 1.23, 95% CI 1.02 to 1.49, p=0.031 for death-MI; and HR 1.18, 95% CI 1.01 to 1.39, p=0.042 for death-MI-any revascularisation), but it was not associated with an increased risk of stent thrombosis. CONCLUSIONS Patients with severely calcified lesions have worse clinical outcomes than those without severe coronary calcification. Severe coronary calcification appears to be an independent predictor of worse prognosis and should be considered a marker of advanced atherosclerosis.
Abstract:
BACKGROUND Recently, it has been suggested that the type of stent used in primary percutaneous coronary interventions (pPCI) might impact upon the outcomes of patients with acute myocardial infarction (AMI). Indeed, drug-eluting stents (DES) reduce neointimal hyperplasia compared to bare-metal stents (BMS). Moreover, later-generation DES, owing to their biocompatible polymer coatings and stent design, allow for greater deliverability and improved endothelial healing, and therefore less restenosis and thrombus generation. However, data on the safety and performance of DES in large cohorts of AMI are still limited. AIM To compare the early outcome of DES vs. BMS in AMI patients. METHODS This was a prospective, multicentre analysis of patients with AMI undergoing pPCI at 64 hospitals in Switzerland between 2005 and 2013. The primary endpoint was in-hospital all-cause death, whereas the secondary endpoint was a composite of major adverse cardiac and cerebrovascular events (MACCE): death, reinfarction, and cerebrovascular event. RESULTS Of 20,464 patients with a primary diagnosis of AMI enrolled in the AMIS Plus registry, 15,026 were referred for pPCI and 13,442 received stent implantation. 10,094 patients were implanted with DES and 2,260 with BMS. The overall in-hospital mortality was significantly lower in patients with DES compared to those with BMS implantation (2.6% vs. 7.1%, p < 0.001). The overall in-hospital MACCE rate was likewise lower after DES than after BMS (3.5% vs. 7.6%, p < 0.001). After adjusting for all confounding covariables, DES remained an independent predictor of lower in-hospital mortality (OR 0.51, 95% CI 0.40-0.67, p < 0.001). Because the groups differed with regard to baseline characteristics and pharmacological treatment, we performed propensity score matching (PSM) to limit potential biases. Even after PSM, DES implantation remained independently associated with a reduced risk of in-hospital mortality (adjusted OR 0.54, 95% CI 0.39-0.76, p < 0.001). CONCLUSIONS In unselected patients from a nationwide, real-world cohort, we found that DES, compared with BMS, was associated with lower in-hospital mortality and MACCE. The identification of optimal treatment strategies for patients with AMI needs further randomised evaluation; however, our findings suggest a potential benefit with DES.
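The sketch below shows one common way to build the kind of propensity-score-matched comparison reported above: 1:1 nearest-neighbour matching (with replacement, for brevity) on the logit of the propensity score within a caliper, followed by a comparison of in-hospital mortality in the matched groups. The covariates and data are hypothetical, not the registry variables.

```python
# Minimal sketch of 1:1 nearest-neighbour propensity-score matching on the
# logit of the score with a caliper; variable names and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(65, 12, n),
    "killip_3_4": rng.integers(0, 2, n),
    "des": rng.integers(0, 2, n),            # 1 = DES, 0 = BMS
})
df["died_in_hospital"] = rng.binomial(1, 0.03 + 0.04 * df["killip_3_4"] - 0.01 * df["des"])

X = df[["age", "killip_3_4"]]
ps = LogisticRegression(max_iter=1000).fit(X, df["des"]).predict_proba(X)[:, 1]
df["logit_ps"] = np.log(ps / (1 - ps))

treated = df[df["des"] == 1]
control = df[df["des"] == 0]
caliper = 0.2 * df["logit_ps"].std()

# For each DES patient, find the closest BMS patient on the logit scale
# (matching with replacement, so a control may be reused).
nn = NearestNeighbors(n_neighbors=1).fit(control[["logit_ps"]])
dist, idx = nn.kneighbors(treated[["logit_ps"]])
keep = dist[:, 0] <= caliper
matched_treated = treated[keep]
matched_control = control.iloc[idx[keep, 0]]

print("Matched pairs:", keep.sum())
print("DES in-hospital mortality:", matched_treated["died_in_hospital"].mean().round(3))
print("BMS in-hospital mortality:", matched_control["died_in_hospital"].mean().round(3))
```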
Abstract:
CONTEXT Subclinical hypothyroidism has been associated with increased risk of coronary heart disease (CHD), particularly with thyrotropin levels of 10.0 mIU/L or greater. The measurement of thyroid antibodies helps predict the progression to overt hypothyroidism, but it is unclear whether thyroid autoimmunity independently affects CHD risk. OBJECTIVE The objective of the study was to compare the CHD risk of subclinical hypothyroidism with and without thyroid peroxidase antibodies (TPOAbs). DATA SOURCES AND STUDY SELECTION A MEDLINE and EMBASE search from 1950 to 2011 was conducted for prospective cohorts, reporting baseline thyroid function, antibodies, and CHD outcomes. DATA EXTRACTION Individual data of 38 274 participants from six cohorts for CHD mortality followed up for 460 333 person-years and 33 394 participants from four cohorts for CHD events. DATA SYNTHESIS Among 38 274 adults (median age 55 y, 63% women), 1691 (4.4%) had subclinical hypothyroidism, of whom 775 (45.8%) had positive TPOAbs. During follow-up, 1436 participants died of CHD and 3285 had CHD events. Compared with euthyroid individuals, age- and gender-adjusted risks of CHD mortality in subclinical hypothyroidism were similar among individuals with and without TPOAbs [hazard ratio (HR) 1.15, 95% confidence interval (CI) 0.87-1.53 vs HR 1.26, CI 1.01-1.58, P for interaction = .62], as were risks of CHD events (HR 1.16, CI 0.87-1.56 vs HR 1.26, CI 1.02-1.56, P for interaction = .65). Risks of CHD mortality and events increased with higher thyrotropin, but within each stratum, risks did not differ by TPOAb status. CONCLUSIONS CHD risk associated with subclinical hypothyroidism did not differ by TPOAb status, suggesting that biomarkers of thyroid autoimmunity do not add independent prognostic information for CHD outcomes.
Abstract:
BACKGROUND Because computed tomography (CT) has advantages for visualizing the manifestation of necrosis and local complications, a series of scoring systems based on CT manifestations have been developed for assessing the clinical outcomes of acute pancreatitis (AP), including the CT severity index (CTSI), modified CTSI, etc. Despite the internationally accepted CTSI having been used successfully to predict the overall mortality and disease severity of AP, recent literature has revealed the limitations of the CTSI. Using the Delphi method, we established a new scoring system based on retrocrural space involvement (RCSI) and compared its effectiveness at evaluating the mortality and severity of AP with that of the CTSI. METHODS We reviewed CT images of 257 patients with AP taken within 3-5 days of admission in 2012. The RCSI scoring system, which includes assessment of infectious conditions involving the retrocrural space and the adjacent pleural cavity, was established using the Delphi method. Two radiologists independently assessed the RCSI and CTSI scores. The predictive performance of the RCSI and CTSI scoring systems in evaluating the mortality and severity of AP was estimated using receiver operating characteristic (ROC) curves. PRINCIPAL FINDINGS The RCSI score accurately predicted mortality and disease severity. The area under the ROC curve for the RCSI versus CTSI score was 0.962±0.011 versus 0.900±0.021 for predicting mortality, and 0.888±0.025 versus 0.904±0.020 for predicting the severity of AP. Applying ROC analysis to our data showed that an RCSI score of 4 was the best cutoff value, above which mortality could be identified. CONCLUSION The Delphi method was innovatively adopted to establish a scoring system to predict the clinical outcome of AP. The RCSI scoring system can predict the mortality of AP better than the CTSI system, and the severity of AP equally well.
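As an illustration of the ROC methodology described above, the sketch below compares two severity scores by area under the ROC curve and selects a mortality cutoff with the Youden index. The simulated scores and outcome labels are assumptions, not study data.

```python
# Minimal sketch of ROC AUC comparison and Youden-index cutoff selection;
# the scores and outcomes below are simulated for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
n = 257
died = rng.binomial(1, 0.1, n)
# Hypothetical severity scores: higher in non-survivors, "rcsi" slightly more so.
rcsi = rng.normal(2 + 4 * died, 1.5)
ctsi = rng.normal(3 + 3 * died, 2.0)

print("AUC RCSI:", round(roc_auc_score(died, rcsi), 3))
print("AUC CTSI:", round(roc_auc_score(died, ctsi), 3))

# Youden index: the cutoff maximizing sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(died, rcsi)
best = np.argmax(tpr - fpr)
print("Best RCSI cutoff:", round(thresholds[best], 2),
      "sensitivity:", round(tpr[best], 2),
      "specificity:", round(1 - fpr[best], 2))
```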
Abstract:
BACKGROUND: Antiviral therapy for the hepatitis C virus (HCV) reduces all-cause and liver-related morbidity and mortality. Few studies are available from populations with multiple medical and psychiatric comorbidities where the impact of successful antiviral therapy might be limited. AIM: The purpose of this study was to determine the effect of sustained virologic response (SVR) on all-cause and liver-related mortality in a cohort of HCV patients treated in an integrated hepatitis/mental health clinic. METHODS: This was a retrospective review of all patients who initiated antiviral treatment for chronic HCV between January 1, 1997 and December 31, 2009. Cox regression analysis was used to determine factors involved in all-cause mortality, liver-related events and hepatocellular carcinoma. RESULTS: A total of 536 patients were included in the analysis. Median follow-up was 7.5 years. Liver and non-liver-related mortality occurred in 2.7 and 5.0 % of patients with SVR and in 17.8 and 6.4 % of patients without SVR. In a multivariate analysis, SVR was the only factor associated with reduced all-cause mortality (HR 0.47; 95 % CI 0.26-0.85; p = 0.012) and reduced liver-related events (HR 0.23; 95 % CI 0.08-0.66, p = 0.007). Having stage 4 liver fibrosis increased all-cause mortality (HR 2.50; 95 % CI 1.23-5.08; p = 0.011). Thrombocytopenia at baseline (HR 2.66; 95 % CI 1.22-5.79; p = 0.014) and stage 4 liver fibrosis (HR 4.87; 95 % CI 1.62-14.53; p = 0.005) increased liver-related events. CONCLUSIONS: Despite significant medical and psychiatric comorbidities, SVR markedly reduced liver-related outcomes without a significant change in non-liver-related mortality after a median follow-up of 7.5 years.
Abstract:
Sequence analysis and optimal matching are useful heuristic tools for the descriptive analysis of heterogeneous individual pathways such as educational careers, job sequences or patterns of family formation. However, to date it remains unclear how to handle the inevitable problems caused by missing values with regard to such analysis. Multiple Imputation (MI) offers a possible solution for this problem but it has not been tested in the context of sequence analysis. Against this background, we contribute to the literature by assessing the potential of MI in the context of sequence analyses using an empirical example. Methodologically, we draw upon the work of Brendan Halpin and extend it to additional types of missing value patterns. Our empirical case is a sequence analysis of panel data with substantial attrition that examines the typical patterns and the persistence of sex segregation in school-to-work transitions in Switzerland. The preliminary results indicate that MI is a valuable methodology for handling missing values due to panel mortality in the context of sequence analysis. MI is especially useful in facilitating a sound interpretation of the resulting sequence types.
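The core of optimal matching is an edit distance between categorical state sequences. The toy implementation below computes that distance with indel and substitution costs via dynamic programming; it is a minimal stand-in for dedicated sequence-analysis software, and the state labels and costs are invented for illustration.

```python
# Minimal sketch of the optimal-matching (edit) distance between two monthly
# state sequences, with indel and substitution costs.
def optimal_matching(seq_a, seq_b, indel=1.0, sub=2.0):
    """Dynamic-programming edit distance between two state sequences."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[n][m]

# Two hypothetical school-to-work trajectories coded as monthly states.
a = ["school"] * 6 + ["apprenticeship"] * 6
b = ["school"] * 4 + ["apprenticeship"] * 4 + ["unemployed"] * 4
print(optimal_matching(a, b))  # pairwise distances would then feed a clustering step
```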
Abstract:
BACKGROUND A low or high body mass index (BMI) has been associated with increased mortality risk in older subjects without taking fat mass index (FMI) and fat-free mass index (FFMI) into account. This information is essential because FMI is modulated through different healthcare strategies than is FFMI. OBJECTIVE We aimed to determine the relation between body composition and mortality in older subjects. DESIGN We included all adults ≥65 y old who were living in Switzerland and had a body-composition measurement by bioelectrical impedance analysis at the Geneva University Hospitals between 1990 and 2011. FMI and FFMI were divided into sex-specific quartiles. Quartile 1 (i.e., the reference category) corresponded to the lowest FMI or FFMI quartile. Mortality data were retrieved from the hospital database, the Geneva death register, and the Swiss National Cohort until December 2012. Comorbidities were assessed by using the Cumulative Illness Rating Scale. RESULTS Of 3181 subjects included, 766 women and 1007 men died at a mean age of 82.8 and 78.5 y, respectively. Sex-specific Cox regression models adjusted for age, BMI, smoking, ambulatory or hospitalized state, and calendar time showed that body composition did not predict mortality in women, irrespective of whether comorbidities were taken into account. In men, risk of mortality was lower with FFMI in quartiles 3 and 4 [HR: 0.78 (95% CI: 0.62, 0.98) and 0.64 (95% CI: 0.49, 0.85), respectively] but was not affected by FMI. When comorbidities were adjusted for, FFMI in quartile 4 (>19.5 kg/m²) still predicted a lower risk of mortality (HR: 0.72; 95% CI: 0.54, 0.96). CONCLUSIONS Low FFMI is a stronger predictor of mortality than is BMI in older men but not older women. FMI had no impact on mortality. These results suggest potential benefits of preventive interventions aimed at maintaining muscle mass in older men. This trial was registered at clinicaltrials.gov as NCT01472679.
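A minimal sketch of the quartile-based Cox modelling described above: FFMI is cut into quartiles with pandas, the quartiles are dummy-coded against quartile 1, and hazard ratios are estimated with lifelines while adjusting for age and BMI. Column names and the simulated data are hypothetical, not the Geneva cohort.

```python
# Minimal sketch: sex-specific FFMI quartiles as a categorical predictor in a
# Cox model (quartile 1 as reference); data and column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1500
men = pd.DataFrame({
    "age": rng.normal(75, 6, n),
    "bmi": rng.normal(26, 4, n),
    "ffmi": rng.normal(18.5, 2.0, n),
})
men["ffmi_q"] = pd.qcut(men["ffmi"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
risk = 0.05 * men["age"] - 0.08 * men["ffmi"]
men["time"] = rng.exponential(np.exp(-risk) * 2e3).clip(max=20)   # years of follow-up
men["died"] = (men["time"] < 20).astype(int)

# Dummy-code quartiles against Q1 and adjust for age and BMI.
X = pd.get_dummies(men[["age", "bmi", "ffmi_q", "time", "died"]],
                   columns=["ffmi_q"], drop_first=True, dtype=float)
cph = CoxPHFitter().fit(X, duration_col="time", event_col="died")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```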
Abstract:
BACKGROUND Low bispectral index values frequently reflect EEG suppression and have been associated with postoperative mortality. This study investigated whether intraoperative EEG suppression was an independent predictor of 90-day postoperative mortality and explored risk factors for EEG suppression. METHODS This observational study included 2662 adults enrolled in the B-Unaware or BAG-RECALL trials. A suppressed cohort was defined by >5 cumulative minutes of EEG suppression and was 1:2 propensity-matched to a non-suppressed cohort (≤5 min suppression). We evaluated the association between EEG suppression and mortality using multivariable logistic regression, and examined risk factors for EEG suppression using zero-inflated mixed effects analysis. RESULTS Ninety-day postoperative mortality was 3.9% overall, 6.3% in the suppressed cohort, and 3.0% in the non-suppressed cohort {odds ratio (OR) [95% confidence interval (CI)]=2.19 (1.48-3.26)}. After matching and multivariable adjustment, EEG suppression was not associated with mortality [OR (95% CI)=0.83 (0.55-1.25)]; however, the interaction between EEG suppression and mean arterial pressure (MAP) <55 mm Hg was associated with mortality [OR (95% CI)=2.96 (1.34-6.52)]. Risk factors for EEG suppression were older age, number of comorbidities, chronic obstructive pulmonary disease, and higher intraoperative doses of benzodiazepines, opioids, or volatile anaesthetics. EEG suppression was less likely in patients with cancer, preoperative alcohol, opioid or benzodiazepine consumption, and intraoperative nitrous oxide exposure. CONCLUSIONS Although EEG suppression was associated with increasing anaesthetic administration and comorbidities, the hypothesis that intraoperative EEG suppression is a predictor of postoperative mortality was supported only when it was coincident with low MAP. CLINICAL TRIAL REGISTRATION NCT00281489 and NCT00682825.
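The interaction finding above can be probed with a logistic model that includes a product term between EEG suppression and hypotension. The sketch below shows the general pattern using the statsmodels formula interface; the variable names and simulated data are assumptions for illustration, not the trial data.

```python
# Minimal sketch of a logistic model with an interaction term, testing whether
# the suppression-mortality association depends on hypotension (MAP < 55 mm Hg).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 2662
df = pd.DataFrame({
    "suppressed": rng.integers(0, 2, n),    # >5 min cumulative EEG suppression
    "low_map": rng.integers(0, 2, n),       # any MAP < 55 mm Hg
    "age": rng.normal(60, 12, n),
})
# Simulated outcome: mortality rises only when suppression and low MAP co-occur.
logit = -4 + 0.03 * df["age"] + 1.0 * df["suppressed"] * df["low_map"]
df["died_90d"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("died_90d ~ suppressed * low_map + age", data=df).fit(disp=0)
print(np.exp(model.params))                     # odds ratios
print(model.pvalues["suppressed:low_map"])      # p-value of the interaction term
```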
Abstract:
INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P < 0.001) and renal function (total daily urine output: HR = 1.02; 95% CI, 1.01 to 1.03; P < 0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and renal component of the SOFA score), GCS component of the SOFA score, total SOFA score and worsening thrombocytopaenia were also independently associated with secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5. Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements, were not retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.
Abstract:
PURPOSE The impact of cardiopulmonary bypass in level III-IV tumor thrombectomy on surgical and oncologic outcomes is unknown. We determine the impact of cardiopulmonary bypass on overall and cancer specific survival, as well as surgical complication rates and immediate outcomes in patients undergoing nephrectomy and level III-IV tumor thrombectomy with or without cardiopulmonary bypass. MATERIALS AND METHODS We retrospectively analyzed 362 patients with renal cell cancer and with level III or IV tumor thrombus from 1992 to 2012 at 22 U.S. and European centers. Cox proportional hazards models were used to compare overall and cancer specific survival between patients with and without cardiopulmonary bypass. Perioperative mortality and complication rates were assessed using logistic regression analyses. RESULTS Median overall survival was 24.6 months in noncardiopulmonary bypass cases and 26.6 months in cardiopulmonary bypass cases. Overall survival and cancer specific survival did not differ significantly in both groups on univariate analysis or when adjusting for known risk factors. On multivariate analysis no significant differences were seen in hospital length of stay, Clavien 1-4 complication rate, intraoperative or 30-day mortality and cancer specific survival. Limitations include the retrospective nature of the study. CONCLUSIONS In our multi-institutional analysis the use of cardiopulmonary bypass did not significantly impact cancer specific survival or overall survival in patients undergoing nephrectomy and level III or IV tumor thrombectomy. Neither approach was independently associated with increased mortality on multivariate analysis. Greater surgical complications were not independently associated with the use of cardiopulmonary bypass.
Abstract:
Leptospiral pulmonary haemorrhage syndrome (LPHS) is a particularly severe form of leptospirosis. LPHS is increasingly recognized in both humans and animals and is characterized by rapidly progressive intra-alveolar haemorrhage leading to high mortality. The pathogenic mechanisms of LPHS are poorly understood, which hampers the application of effective treatment regimes. In this study, a 2-D guinea pig lung proteome map was created and used to investigate the pathogenic mechanisms of LPHS. Comparison of lung proteomes from infected and non-infected guinea pigs via differential in-gel electrophoresis revealed highly significant differences in abundance of proteins contained in 130 spots. Acute phase proteins were the largest functional group amongst proteins with increased abundance in LPHS lung tissue, and likely reflect a local and/or systemic host response to infection. The observed decrease in abundance of proteins involved in cytoskeletal and cellular organization in LPHS lung tissue further suggests that infection with pathogenic Leptospira induces changes in the abundance of host proteins involved in cellular architecture and adhesion, contributing to the dramatically increased alveolar septal wall permeability seen in LPHS. BIOLOGICAL SIGNIFICANCE The recent completion of the complete genome sequence of the guinea pig (Cavia porcellus) provides innovative opportunities to apply proteomic technologies to an important animal model of disease. In this study, the comparative proteomic analysis of lung tissue from experimentally infected guinea pigs with leptospiral pulmonary haemorrhage syndrome (LPHS) revealed a decrease in abundance of proteins involved in cellular architecture and adhesion, suggesting that loss or down-regulation of cytoskeletal and adhesion molecules plays an important role in the pathogenesis of LPHS. A publicly available guinea pig lung proteome map was constructed to facilitate future pulmonary proteomics in this species.
Abstract:
BACKGROUND Sutureless aortic valve replacement (SU-AVR) has emerged as an innovative alternative for treatment of aortic stenosis. By avoiding the placement of sutures, this approach aims to reduce cross-clamp and cardiopulmonary bypass (CPB) duration and thereby improve surgical outcomes and facilitate a minimally invasive approach suitable for higher-risk patients. The present systematic review and meta-analysis aims to assess the safety and efficacy of the SU-AVR approach in the current literature. METHODS Electronic searches were performed using six databases from their inception to January 2014. Relevant studies utilizing sutureless valves for aortic valve implantation were identified. Data were extracted and analyzed according to predefined clinical endpoints. RESULTS Twelve studies were identified for inclusion in the qualitative and quantitative analyses, all of which were observational reports. The minimally invasive approach was used in 40.4% of included patients, while 22.8% underwent concomitant coronary bypass surgery. Pooled cross-clamp and CPB durations for isolated AVR were 56.7 and 46.5 minutes, respectively. Pooled 30-day and 1-year mortality rates were 2.1% and 4.9%, respectively, while the incidences of stroke (1.5%), valve degeneration (0.4%) and paravalvular leak (PVL) (3.0%) were acceptable. CONCLUSIONS The evaluation of current observational evidence suggests that sutureless aortic valve implantation is a safe procedure associated with shorter cross-clamp and CPB duration, and comparable complication rates to the conventional approach in the short term.
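A minimal sketch of how event proportions such as the pooled 30-day mortality above can be combined across studies: a fixed-effect inverse-variance pool on the logit scale with a continuity correction. The per-study counts below are invented and do not correspond to the included reports.

```python
# Minimal sketch of inverse-variance pooling of 30-day mortality proportions
# on the logit scale (fixed-effect); the per-study counts are hypothetical.
import numpy as np

events = np.array([3, 1, 5, 2, 4])        # 30-day deaths per study (invented)
totals = np.array([120, 60, 250, 90, 200])

# Logit transform with a 0.5 continuity correction for sparse counts.
p = (events + 0.5) / (totals + 1.0)
logit = np.log(p / (1 - p))
var = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)

w = 1.0 / var
pooled_logit = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = pooled_logit + np.array([-1.96, 1.96]) * se

to_prop = lambda x: 1 / (1 + np.exp(-x))   # back-transform to a proportion
print(f"Pooled 30-day mortality: {to_prop(pooled_logit):.1%} "
      f"(95% CI {to_prop(ci[0]):.1%} to {to_prop(ci[1]):.1%})")
```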
Abstract:
INTRODUCTION The incidence of cancer increases with age, and owing to the changing demographics we are increasingly confronted with treating bladder cancer in old patients. We report our results in patients >75 years of age who underwent open radical cystectomy (RC) and urinary diversion. MATERIAL AND METHODS From January 2000 to March 2013, a consecutive series of 224 old patients with complete follow-up who underwent RC and urinary diversion (ileal orthotopic bladder substitute [OBS], ileal conduit [IC], and ureterocutaneostomy [UCST]) were included in this retrospective single-center study. End points were the 90-day complication rates (Clavien-Dindo classification), 90-day mortality rates, overall and cancer-specific survival rates, and continence rates (OBS). RESULTS Median age was 79.2 years (range: 75.1-91.6); 35 of the 224 patients (17%) received an OBS, 178 of the 224 patients (78%) an IC, and 11 of the 224 patients (5%) a UCST. The 90-day complication rate was 54.3% in the OBS group (major: Clavien grade 3-5: 22.9%, minor: Clavien grade 1-2: 31.4%), 56.7% in the IC group (major: 27%, minor: 29.8%), and 63.6% in the UCST group (major: 36.4%, minor: 27.3%); P = 0.001. The 90-day mortality was 0% in the OBS group, 13% in the IC group, and 10% in the UCST group (P = 0.077). The Glasgow prognostic score was an independent predictor of all survival parameters assessed, including 90-day mortality. Median follow-up was 22 months. Overall and cancer-specific survivals were 90 and 98, 47 and 91, and 11 and 12 months for OBS, IC, and UCST, respectively. In OBS patients, daytime continence was considered dry in 66% and humid in 20% of patients; nighttime continence was dry in 46% and humid in 26% of patients. CONCLUSION With careful patient selection, oncological and functional outcomes after RC can be good in old patients. Old age as the sole criterion should not preclude the indication for RC or the option of OBS. In old patients undergoing OBS, satisfactory continence results can be achieved.