942 results for Modal interval analysis
Abstract:
BACKGROUND Ultrathin strut biodegradable polymer sirolimus-eluting stents (BP-SES) proved noninferior to durable polymer everolimus-eluting stents (DP-EES) for a composite clinical end point in a population with minimal exclusion criteria. We performed a prespecified subgroup analysis of the Ultrathin Strut Biodegradable Polymer Sirolimus-Eluting Stent Versus Durable Polymer Everolimus-Eluting Stent for Percutaneous Coronary Revascularisation (BIOSCIENCE) trial to compare the performance of BP-SES and DP-EES in patients with diabetes mellitus. METHODS AND RESULTS The BIOSCIENCE trial was an investigator-initiated, single-blind, multicentre, randomized, noninferiority trial comparing BP-SES versus DP-EES. The primary end point, target lesion failure, was a composite of cardiac death, target-vessel myocardial infarction, and clinically indicated target lesion revascularization within 12 months. Among a total of 2119 patients enrolled between February 2012 and May 2013, 486 (22.9%) had diabetes mellitus. Overall, diabetic patients experienced a significantly higher risk of target lesion failure compared with patients without diabetes mellitus (10.1% versus 5.7%; hazard ratio [HR], 1.80; 95% confidence interval [CI], 1.27-2.56; P=0.001). At 1 year, there were no differences between BP-SES and DP-EES in terms of the primary end point in either diabetic (10.9% versus 9.3%; HR, 1.19; 95% CI, 0.67-2.10; P=0.56) or nondiabetic patients (5.3% versus 6.0%; HR, 0.88; 95% CI, 0.58-1.33; P=0.55). Similarly, no significant differences in the risk of definite or probable stent thrombosis were recorded according to treatment arm in either study group (4.0% versus 3.1%; HR, 1.30; 95% CI, 0.49-3.41; P=0.60 for diabetic patients and 2.4% versus 3.4%; HR, 0.70; 95% CI, 0.39-1.25; P=0.23 for nondiabetic patients). CONCLUSIONS In this prespecified subgroup analysis of the BIOSCIENCE trial, clinical outcomes among diabetic patients treated with BP-SES or DP-EES were comparable at 1 year.
CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT01443104.
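Reported hazard ratios can be checked for internal consistency: on the log scale, a point estimate and its 95% CI imply a standard error, z statistic, and two-sided p-value. A minimal sketch in Python (a normal approximation, not the trial's actual analysis code; the function name is mine):

```python
import math

def hr_stats(hr, lo, hi):
    """Recover the standard error, z statistic, and two-sided p-value
    implied by a reported hazard ratio and its 95% CI, assuming a
    normal approximation on the log scale."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.959964)
    z = math.log(hr) / se
    # two-sided p-value from the standard normal distribution
    p = math.erfc(abs(z) / math.sqrt(2))
    return se, z, p

# Diabetic vs nondiabetic target lesion failure: HR 1.80, 95% CI 1.27-2.56
se, z, p = hr_stats(1.80, 1.27, 2.56)
print(round(se, 3), round(z, 2), round(p, 4))
```

Running this reproduces a p-value of about 0.001, matching the abstract's reported P=0.001.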
Abstract:
INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P <0.001) and renal function (total daily urine output: HR =1.02; 95% CI, 1.01 to 1.03; P <0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and renal component of the SOFA score), GCS component of the SOFA score, total SOFA score and worsening thrombocytopaenia were also independently associated with secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5. 
Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements, were not retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.
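Per-unit hazard ratios such as the HR = 1.02 reported for the platelet and urine-output trends compound multiplicatively under the Cox model's log-linearity, so a modest per-unit ratio implies a substantial gradient across a larger trend. A sketch (the 10-unit difference is illustrative, not taken from the study data):

```python
import math

def cumulative_hr(per_unit_hr, units):
    """Hazard ratio for a `units`-unit difference in a covariate whose
    per-unit hazard ratio is `per_unit_hr` (Cox model log-linearity)."""
    return math.exp(units * math.log(per_unit_hr))

# an HR of 1.02 per unit compounds to ~1.22 across a 10-unit worsening trend
print(round(cumulative_hr(1.02, 10), 3))
```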
Abstract:
HIV infection is an important risk factor for developing Kaposi sarcoma (KS), but it is unclear whether HIV-positive persons are also at increased risk of co-infection with human herpesvirus 8 (HHV-8), the infectious cause of KS. We systematically searched literature up to December 2012 and included studies reporting HHV-8 seroprevalence for HIV-positive and HIV-negative persons. We used random-effects meta-analysis to combine odds ratios (ORs) of the association between HIV and HHV-8 seropositivity and conducted random-effects meta-regression to identify sources of heterogeneity. We included 93 studies with 58,357 participants from 32 countries in sub-Saharan Africa, North and South America, Europe, Asia, and Australia. Overall, HIV-positive persons were more likely to be HHV-8 seropositive than HIV-negative persons (OR 1.99, 95% confidence interval [CI] 1.70-2.34) with considerable heterogeneity among studies (I² = 84%). The association was strongest in men who have sex with men (MSM, OR 3.95, 95% CI 2.92-5.35), patients with hemophilia (OR 3.11, 95% CI 1.19-8.11), and children (OR 2.45, 95% CI 1.58-3.81), but weaker in heterosexuals who engage in low-risk (OR 1.42, 95% CI 1.16-1.74) or high-risk sexual behavior (OR 1.66, 95% CI 1.27-2.17), persons who inject drugs (OR 1.66, 95% CI 1.28-2.14), and pregnant women (OR 1.68, 95% CI 1.15-2.47), p value for interaction <0.001. In conclusion, HIV infection was associated with an increased HHV-8 seroprevalence in all population groups examined. A better understanding of HHV-8 transmission in different age and behavioral groups is needed to develop strategies to prevent HHV-8 transmission.
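Random-effects pooling of odds ratios with an I² heterogeneity estimate, as used in this meta-analysis, can be sketched with the DerSimonian-Laird method. A minimal pure-Python version (the three input studies are hypothetical, and the function name is mine; the actual analysis may differ in estimator details):

```python
import math

def pool_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    Study standard errors are recovered from the reported 95% CIs
    on the log scale."""
    y = [math.log(o) for o in ors]
    se = [(math.log(h) - math.log(l)) / (2 * 1.959964)
          for l, h in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]                      # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0   # heterogeneity I^2
    wr = [1 / (s**2 + tau2) for s in se]            # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_mu = math.sqrt(1 / sum(wr))
    return (math.exp(mu),
            math.exp(mu - 1.959964 * se_mu),
            math.exp(mu + 1.959964 * se_mu),
            i2)

# three hypothetical studies: OR with 95% CI bounds
pooled, lo, hi, i2 = pool_random_effects([2.1, 1.6, 3.0],
                                         [1.5, 1.1, 1.8],
                                         [2.9, 2.3, 5.0])
print(round(pooled, 2), round(lo, 2), round(hi, 2), round(i2, 2))
```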
Abstract:
BACKGROUND Chronic haemodialysis patients are a high-risk population for meticillin-resistant Staphylococcus aureus (MRSA) colonization, which is a precursor of infection. AIM To summarize the effect of nasal (± whole-body wash) MRSA decolonization in haemodialysis patients by means of a systematic review and meta-analysis. METHODS We identified eligible studies using Medline, Embase, the Cochrane database, clinicaltrials.org, and conference abstracts investigating the success of MRSA decolonization in haemodialysis patients. For the statistical analysis, we used Stata 13 to express study-specific proportions with 95% confidence intervals. A likelihood ratio test was used to assess inter-study heterogeneity. FINDINGS Six published prospective cohort studies and one study described in a conference abstract met our inclusion criteria. From 1150 haemodialysis patients enrolled in these studies, MRSA was isolated from nasal swabs of 147 (12.8%) patients. Six of the trials used mupirocin nasal ointment and combined it with chlorhexidine body washes for decolonization. The most widely used protocol was a five-day course of mupirocin nasal ointment application three times a day, and chlorhexidine body wash once daily. The pooled success rate of decolonization was 0.88 (95% confidence interval: 0.75-0.95). A likelihood ratio test of the fixed versus the random-effects model showed significant inter-study heterogeneity (P = 0.047). Four of seven studies determined subsequent MRSA infections in 94 carriers overall, of whom two (2%) experienced infection. CONCLUSION The use of mupirocin together with whole-body decolonization is highly effective in eradicating MRSA carriage in haemodialysis patients. The current literature, however, is characterized by a lack of comparative effectiveness studies for this intervention.
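A study-specific proportion with a 95% CI, as used in the methods above, can be computed for a single study with a Wilson score interval. A minimal sketch (the Wilson interval is my choice of method for illustration; the review's Stata analysis may have used a different transform):

```python
import math

def wilson_ci(successes, n, z=1.959964):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half

# MRSA nasal carriage: 147 of 1150 haemodialysis patients
p, lo, hi = wilson_ci(147, 1150)
print(round(p, 3), round(lo, 3), round(hi, 3))
```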
Abstract:
An in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of observations to obtain the most accurate orbit propagation. The accuracy of the results of an orbit determination/improvement process depends on: tracklet length, number of observations, type of orbit, astrometric error, time interval between tracklets and observation geometry. The latter depends on the position of the object along its orbit and the location of the observing station. This covariance analysis aims to optimize the observation strategy taking into account the influence of the orbit shape, of the relative object-observer geometry and the interval between observations.
Abstract:
The Astronomical Institute of the University of Bern (AIUB) is conducting several search campaigns for space debris using optical sensors. The debris objects are discovered during systematic survey observations. In general, the result of a discovery consists of only a short observation arc, or tracklet, which is used to perform a first orbit determination in order to be able to observe the object again in subsequent follow-up observations. The additional observations are used in the orbit improvement process to obtain accurate orbits to be included in a catalogue. In order to obtain the most accurate orbit within the time available it is necessary to optimize the follow-up observation strategy. In this paper an in-depth study, using simulations and covariance analysis, is performed to identify the optimal sequence of follow-up observations to obtain the most accurate orbit propagation to be used for space debris catalogue maintenance. The main factors that determine the accuracy of the results of an orbit determination/improvement process are: tracklet length, number of observations, type of orbit, astrometric error of the measurements, time interval between tracklets, and the relative position of the object along its orbit with respect to the observing station. The main aim of the covariance analysis is to optimize the follow-up strategy as a function of the object-observer geometry, the interval between follow-up observations and the shape of the orbit. This analysis can be applied to every orbital regime, but particular attention was dedicated to geostationary, Molniya, and geostationary transfer orbits. Finally, the case with more than two follow-up observations and the influence of a second observing station are also analyzed.
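The effect of the interval between tracklets on propagated accuracy can be illustrated with a drastically simplified stand-in for the full orbit model: a one-dimensional linear-motion fit, where the least-squares covariance P = sigma² (HᵀH)⁻¹ is propagated forward in time. This is a sketch under that toy assumption (function names and numbers are mine), not the paper's covariance analysis:

```python
def lsq_covariance(times, sigma):
    """Covariance of a least-squares fit of (position, velocity) from
    position-only measurements at `times`, each with noise `sigma`.
    Rows of the design matrix H are [1, t]; P = sigma^2 * (H^T H)^-1."""
    n = len(times)
    s1, st, stt = n, sum(times), sum(t * t for t in times)
    det = s1 * stt - st * st
    # closed-form inverse of the 2x2 normal matrix
    return [[stt / det * sigma**2, -st / det * sigma**2],
            [-st / det * sigma**2,  s1 / det * sigma**2]]

def propagated_position_var(P, dt):
    """Variance of the predicted position dt after the epoch:
    var = P00 + 2*dt*P01 + dt^2 * P11 (linear motion model)."""
    return P[0][0] + 2 * dt * P[0][1] + dt * dt * P[1][1]

# two observations separated by a short vs a longer interval (arbitrary units)
short = lsq_covariance([0.0, 0.1], 1.0)
long_ = lsq_covariance([0.0, 1.0], 1.0)
# predicted accuracy 10 time units ahead improves with the longer baseline
print(propagated_position_var(short, 10.0), propagated_position_var(long_, 10.0))
```

Even in this toy model, the longer observation baseline shrinks the propagated position variance by two orders of magnitude, mirroring the role the abstract assigns to the time interval between tracklets.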
Abstract:
BACKGROUND: In equine laminitis, the deep digital flexor muscle (DDFM) appears to have increased muscle force, but evidence-based confirmation is lacking. OBJECTIVES: The purpose of this study was to test whether the DDFM of laminitic equines has an increased muscle force detectable by needle electromyography interference pattern analysis (IPA). ANIMALS AND METHODS: The control group included six Royal Dutch Sport horses, three Shetland ponies and one Welsh pony [10 healthy, sound adults weighing 411 ± 217 kg (mean ± SD) and aged 10 ± 5 years]. The laminitic group included three Royal Dutch Sport horses, one Friesian, one Haflinger, one Icelandic horse, one Welsh pony, one miniature Appaloosa and six Shetland ponies (14 adults, weight 310 ± 178 kg, aged 13 ± 6 years) with acute/chronic laminitis. The electromyography IPA measurements included firing rate, turns/second (T), amplitude/turn (M) and M/T ratio. Statistical analysis used a general linear model with outcomes transformed to geometric means. RESULTS: The firing rate of the total laminitic group was higher than that of the total control group. This difference was smaller for the ponies compared to the horses; in the horses, the geometric mean difference of the laminitic group was 1.73 [geometric 95% confidence interval (CI) 1.29-2.32], and in the ponies this value was 1.09 (geometric 95% CI 0.82-1.45). CONCLUSION AND CLINICAL RELEVANCE: In human medicine, an increased firing rate is characteristic of increased muscle force. Thus, the increased firing rate of the DDFM in the context of laminitis suggests an elevated muscle force. However, this seems to be only a partial effect, as in this study the unchanged turns/second and amplitude/turn failed to prove the recruitment of larger motor units with larger amplitude motor unit potentials in laminitic equids.
Abstract:
AIMS The preferred antithrombotic strategy for secondary prevention in patients with cryptogenic stroke (CS) and patent foramen ovale (PFO) is unknown. We pooled multiple observational studies and used propensity score-based methods to estimate the comparative effectiveness of oral anticoagulation (OAC) compared with antiplatelet therapy (APT). METHODS AND RESULTS Individual participant data from 12 databases of medically treated patients with CS and PFO were analysed with Cox regression models, to estimate database-specific hazard ratios (HRs) comparing OAC with APT, for both the primary composite outcome [recurrent stroke, transient ischaemic attack (TIA), or death] and stroke alone. Propensity scores were applied via inverse probability of treatment weighting to control for confounding. We synthesized database-specific HRs using random-effects meta-analysis models. This analysis included 2385 (OAC = 804 and APT = 1581) patients with 227 composite endpoints (stroke/TIA/death). The difference between OAC and APT was not statistically significant for the primary composite outcome [adjusted HR = 0.76, 95% confidence interval (CI) 0.52-1.12] or for the secondary outcome of stroke alone (adjusted HR = 0.75, 95% CI 0.44-1.27). Results were consistent in analyses applying alternative weighting schemes, with the exception that OAC had a statistically significant beneficial effect on the composite outcome in analyses standardized to the patient population who actually received APT (adjusted HR = 0.64, 95% CI 0.42-0.99). Subgroup analyses did not detect statistically significant heterogeneity of treatment effects across clinically important patient groups. CONCLUSION We did not find a statistically significant difference comparing OAC with APT; our results justify randomized trials comparing different antithrombotic approaches in these patients.
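Inverse probability of treatment weighting, as applied above to control confounding, reweights each subject by the inverse of the probability of the treatment actually received. A toy sketch (the cohort data and function names are hypothetical; the real analysis fits propensity models per database):

```python
def iptw_weights(treated, propensity):
    """Inverse-probability-of-treatment weights: 1/p for treated
    subjects, 1/(1-p) for controls."""
    return [1 / p if t else 1 / (1 - p)
            for t, p in zip(treated, propensity)]

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# toy cohort: treatment flag, propensity score, binary outcome indicator
treated =    [1,   1,   0,   0,   0]
propensity = [0.8, 0.6, 0.4, 0.3, 0.2]
outcome =    [0,   1,   1,   0,   0]

w = iptw_weights(treated, propensity)
# weighted outcome rate in each arm estimates the marginal effect of treatment
rate_t = weighted_mean([y for y, t in zip(outcome, treated) if t],
                       [wi for wi, t in zip(w, treated) if t])
rate_c = weighted_mean([y for y, t in zip(outcome, treated) if not t],
                       [wi for wi, t in zip(w, treated) if not t])
print(round(rate_t, 3), round(rate_c, 3))
```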
Abstract:
Although recent guidelines recommend the combination of calcium channel blockers (CCBs) and thiazide (-like) diuretics, this combination is not widely used in clinical practice. The aim of this meta-analysis was to assess the efficacy and safety of this combination regarding the following endpoints: all-cause and cardiovascular mortality, myocardial infarction, and stroke. Four studies with a total of 30,791 patients met the inclusion criteria. The combination CCB/thiazide (-like) diuretic was associated with a significant risk reduction for myocardial infarction (risk ratio [RR], 0.83; 95% confidence interval [CI], 0.73-0.95) and stroke (RR, 0.77; CI, 0.64-0.92) compared with other combinations, whereas it was similarly effective compared with other combinations in reducing the risk of all-cause (RR, 0.89; CI, 0.75-1.06) and cardiovascular (RR, 0.89; CI 0.71-1.10) mortality. Elderly patients with isolated systolic hypertension may particularly benefit from such a combination, since both drug classes have been shown to confer cerebrovascular protection.
Abstract:
Importance In treatment-resistant schizophrenia, clozapine is considered the standard treatment. However, clozapine use has restrictions owing to its many adverse effects. Moreover, an increasing number of randomized clinical trials (RCTs) of other antipsychotics have been published. Objective To integrate all the randomized evidence from the available antipsychotics used for treatment-resistant schizophrenia by performing a network meta-analysis. Data Sources MEDLINE, EMBASE, Biosis, PsycINFO, PubMed, Cochrane Central Register of Controlled Trials, World Health Organization International Trial Registry, and clinicaltrials.gov were searched up to June 30, 2014. Study Selection At least 2 independent reviewers selected published and unpublished single- and double-blind RCTs in treatment-resistant schizophrenia (any study-defined criterion) that compared any antipsychotic (at any dose and in any form of administration) with another antipsychotic or placebo. Data Extraction and Synthesis At least 2 independent reviewers extracted all data into standard forms and assessed the quality of all included trials with the Cochrane Collaboration's risk-of-bias tool. Data were pooled using a random-effects model in a Bayesian setting. Main Outcomes and Measures The primary outcome was efficacy as measured by overall change in symptoms of schizophrenia. Secondary outcomes included change in positive and negative symptoms of schizophrenia, categorical response to treatment, dropouts for any reason and for inefficacy of treatment, and important adverse events. Results Forty blinded RCTs with 5172 unique participants (71.5% men; mean [SD] age, 38.8 [3.7] years) were included in the analysis. Few significant differences were found in all outcomes. In the primary outcome (reported as standardized mean difference; 95% credible interval), olanzapine was more effective than quetiapine (-0.29; -0.56 to -0.02), haloperidol (-0.29; -0.44 to -0.13), and sertindole (-0.46; -0.80 to -0.06); clozapine was more effective than haloperidol (-0.22; -0.38 to -0.07) and sertindole (-0.40; -0.74 to -0.04); and risperidone was more effective than sertindole (-0.32; -0.63 to -0.01). A pattern of superiority for olanzapine, clozapine, and risperidone was seen in other efficacy outcomes, but results were not consistent and effect sizes were usually small. In addition, relatively few RCTs were available for antipsychotics other than clozapine, haloperidol, olanzapine, and risperidone. The most surprising finding was that clozapine was not significantly better than most other drugs. Conclusions and Relevance Insufficient evidence exists on which antipsychotic is more efficacious for patients with treatment-resistant schizophrenia, and blinded RCTs, in contrast to unblinded randomized effectiveness studies, provide little evidence of the superiority of clozapine compared with other second-generation antipsychotics. Future clozapine studies with high doses and patients with extremely treatment-refractory schizophrenia might be most promising to change the current evidence.
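The simplest building block of a network meta-analysis is the Bucher adjusted indirect comparison, in which two treatments are contrasted via a common comparator. A sketch using the reported olanzapine-vs-haloperidol and clozapine-vs-haloperidol SMDs (treating the Bayesian credible intervals as approximate normal 95% CIs, which is a simplification; the published analysis pooled the full network):

```python
import math

def se_from_ci(lo, hi):
    """Approximate standard error from a 95% interval (normal scale)."""
    return (hi - lo) / (2 * 1.959964)

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Indirect A-vs-C effect via common comparator B:
    d_AC = d_AB - d_CB, with the variances adding."""
    d = d_ab - d_cb
    se = math.sqrt(se_ab**2 + se_cb**2)
    return d, d - 1.959964 * se, d + 1.959964 * se

# olanzapine vs haloperidol: -0.29 (-0.44 to -0.13)
# clozapine  vs haloperidol: -0.22 (-0.38 to -0.07)
d, lo, hi = bucher_indirect(-0.29, se_from_ci(-0.44, -0.13),
                            -0.22, se_from_ci(-0.38, -0.07))
print(round(d, 2), round(lo, 2), round(hi, 2))
```

The indirect olanzapine-vs-clozapine interval crosses zero, consistent with the abstract's finding that clozapine was not significantly better than most comparators.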
Abstract:
OBJECTIVE The objective was to determine the risk of stroke associated with subclinical hypothyroidism. DATA SOURCES AND STUDY SELECTION Published prospective cohort studies were identified through a systematic search through November 2013 without restrictions in several databases. Unpublished studies were identified through the Thyroid Studies Collaboration. We collected individual participant data on thyroid function and stroke outcome. Euthyroidism was defined as TSH levels of 0.45-4.49 mIU/L, and subclinical hypothyroidism was defined as TSH levels of 4.5-19.9 mIU/L with normal T4 levels. DATA EXTRACTION AND SYNTHESIS We collected individual participant data on 47,573 adults (3451 with subclinical hypothyroidism) from 17 cohorts followed up from 1972-2014 (489,192 person-years). Age- and sex-adjusted pooled hazard ratios (HRs) for participants with subclinical hypothyroidism compared to euthyroidism were 1.05 (95% confidence interval [CI], 0.91-1.21) for stroke events (combined fatal and nonfatal stroke) and 1.07 (95% CI, 0.80-1.42) for fatal stroke. Stratified by age, the HR for stroke events was 3.32 (95% CI, 1.25-8.80) for individuals aged 18-49 years. There was an increased risk of fatal stroke in the age groups 18-49 and 50-64 years, with HRs of 4.22 (95% CI, 1.08-16.55) and 2.86 (95% CI, 1.31-6.26), respectively (P for trend = 0.04). We found no increased risk for those 65-79 years old (HR, 1.00; 95% CI, 0.86-1.18) or ≥ 80 years old (HR, 1.31; 95% CI, 0.79-2.18). There was a pattern of increased risk of fatal stroke with higher TSH concentrations. CONCLUSIONS Although no overall effect of subclinical hypothyroidism on stroke could be demonstrated, an increased risk in subjects younger than 65 years and in those with higher TSH concentrations was observed.
Abstract:
OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the time period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stent (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES is unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl) <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of 4,217 women included in the pooled cohort treated with DES and for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in risk for MACE was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). Following multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with older-generation DES, the use of newer-generation DES was associated with a reduction in the risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, without evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk for MACE that is durable over 3 years. The benefits of newer-generation DES are uniform in women with or without CKD.
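Creatinine clearance strata like those above are commonly derived with the Cockcroft-Gault estimate; the abstract does not state which estimating equation the trials used, so this is a sketch under that assumption (patient values are hypothetical):

```python
def cockcroft_gault(age, weight_kg, serum_cr_mg_dl, female=True):
    """Estimated creatinine clearance (ml/min) by Cockcroft-Gault;
    the 0.85 factor applies to women."""
    crcl = (140 - age) * weight_kg / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

def crcl_stratum(crcl):
    """The three strata used in the pooled analysis above."""
    if crcl < 45:
        return "<45 ml/min"
    if crcl < 60:
        return "45-59 ml/min"
    return ">=60 ml/min"

# hypothetical patient: 74-year-old woman, 60 kg, serum creatinine 1.4 mg/dl
crcl = cockcroft_gault(age=74, weight_kg=60, serum_cr_mg_dl=1.4, female=True)
print(round(crcl, 1), crcl_stratum(crcl))
```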
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, which was defined as having a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR risk women, the use of new-generation DES was associated with significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). At landmark analysis, in high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years, stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES even in women at high ATR is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary interventions, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stents outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis from 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score ≤11 or >11. The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetics (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. 
CONCLUSIONS In this population treated with predominantly new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target-lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the respective effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.