31 results for drug urine level


Relevance: 40.00%

Abstract:

BACKGROUND The safety and efficacy of drug-eluting stents (DES) in the treatment of coronary artery disease have been assessed in several randomised trials. However, none of these trials were powered to assess the safety and efficacy of DES in women because only a small proportion of recruited participants were women. We therefore investigated the safety and efficacy of DES in female patients during long-term follow-up. METHODS We pooled patient-level data for female participants from 26 randomised trials of DES and analysed outcomes according to stent type (bare-metal stents, early-generation DES, and newer-generation DES). The primary safety endpoint was a composite of death or myocardial infarction. The secondary safety endpoint was definite or probable stent thrombosis. The primary efficacy endpoint was target-lesion revascularisation. Analysis was by intention to treat. FINDINGS Of 43,904 patients recruited in 26 trials of DES, 11,557 (26·3%) were women (mean age 67·1 years [SD 10·6]). 1108 (9·6%) women received bare-metal stents, 4171 (36·1%) early-generation DES, and 6278 (54·3%) newer-generation DES. At 3 years, estimated cumulative incidence of the composite of death or myocardial infarction occurred in 132 (12·8%) women in the bare-metal stent group, 421 (10·9%) in the early-generation DES group, and 496 (9·2%) in the newer-generation DES group (p=0·001). Definite or probable stent thrombosis occurred in 13 (1·3%), 79 (2·1%), and 66 (1·1%) women in the bare-metal stent, early-generation DES, and newer-generation DES groups, respectively (p=0·01). The use of DES was associated with a significant reduction in the 3 year rates of target-lesion revascularisation (197 [18·6%] women in the bare-metal stent group, 294 [7·8%] in the early-generation DES group, and 330 [6·3%] in the newer-generation DES group, p<0·0001). Results did not change after adjustment for baseline characteristics in the multivariable analysis. INTERPRETATION The use of DES in women is more effective and safe than is use of bare-metal stents during long-term follow-up. Newer-generation DES are associated with an improved safety profile compared with early-generation DES, and should therefore be thought of as the standard of care for percutaneous coronary revascularisation in women. FUNDING Women in Innovation Initiative of the Society of Cardiovascular Angiography and Interventions.
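
For orientation, a minimal Python sketch (with hypothetical follow-up data, not the pooled trial data) of how a 3-year cumulative incidence such as those quoted above is typically obtained from a Kaplan-Meier estimate: the survival curve steps down at each event, and the cumulative incidence is one minus the survival at 3 years.

    # Kaplan-Meier sketch: 3-year cumulative incidence of death or myocardial infarction
    # in one stent group. Times and events below are hypothetical, not study data.
    def km_cumulative_incidence(times, events, horizon):
        """times: follow-up in years; events: 1 = death/MI, 0 = censored."""
        pairs = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))  # events before tied censorings
        at_risk, surv = len(pairs), 1.0
        for t, e in pairs:
            if t > horizon:
                break
            if e == 1:
                surv *= (at_risk - 1) / at_risk   # survival steps down at each event
            at_risk -= 1                          # event or censoring removes the patient from the risk set
        return 1.0 - surv

    times  = [0.4, 1.2, 2.7, 3.0, 3.0, 0.9, 2.1, 3.0]   # years of follow-up (hypothetical)
    events = [1,   0,   1,   0,   0,   1,   0,   0]      # 1 = death/MI, 0 = censored
    print(f"3-year cumulative incidence: {km_cumulative_incidence(times, events, 3.0):.1%}")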

Relevance: 40.00%

Abstract:

OBJECTIVES The purpose of this study was to compare the 2-year safety and effectiveness of new- versus early-generation drug-eluting stents (DES) according to the severity of coronary artery disease (CAD) as assessed by the SYNTAX (Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery) score. BACKGROUND New-generation DES are considered the standard-of-care in patients with CAD undergoing percutaneous coronary intervention. However, there are few data investigating the effects of new- over early-generation DES according to the anatomic complexity of CAD. METHODS Patient-level data from 4 contemporary, all-comers trials were pooled. The primary device-oriented clinical endpoint was the composite of cardiac death, myocardial infarction, or ischemia-driven target-lesion revascularization (TLR). The principal effectiveness and safety endpoints were TLR and definite stent thrombosis (ST), respectively. Adjusted hazard ratios (HRs) with 95% confidence intervals (CIs) were calculated at 2 years for overall comparisons, as well as stratified for patients with lower (SYNTAX score ≤11) and higher complexity (SYNTAX score >11). RESULTS A total of 6,081 patients were included in the study. New-generation DES (n = 4,554) compared with early-generation DES (n = 1,527) reduced the primary endpoint (HR: 0.75 [95% CI: 0.63 to 0.89]; p = 0.001) without interaction (p = 0.219) between patients with lower (HR: 0.86 [95% CI: 0.64 to 1.16]; p = 0.322) versus higher CAD complexity (HR: 0.68 [95% CI: 0.54 to 0.85]; p = 0.001). In patients with SYNTAX score >11, new-generation DES significantly reduced TLR (HR: 0.36 [95% CI: 0.26 to 0.51]; p < 0.001) and definite ST (HR: 0.28 [95% CI: 0.15 to 0.55]; p < 0.001) to a greater extent than in the low-complexity group (TLR p_interaction = 0.059; ST p_interaction = 0.013). New-generation DES decreased the risk of cardiac mortality in patients with SYNTAX score >11 (HR: 0.45 [95% CI: 0.27 to 0.76]; p = 0.003) but not in patients with SYNTAX score ≤11 (p_interaction = 0.042). CONCLUSIONS New-generation DES improve clinical outcomes compared with early-generation DES, with greater safety and effectiveness in patients with SYNTAX score >11.
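
A rough illustration of the kind of model behind the stratified hazard ratios and interaction p-values reported above: a Cox proportional-hazards model with a treatment-by-complexity interaction term, fitted here with the lifelines package on simulated, entirely hypothetical data (the coefficients below have no relation to the study's results).

    # Cox sketch with a new-DES x SYNTAX-complexity interaction, on simulated data.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 400
    new_gen_des = rng.integers(0, 2, n)
    high_syntax = rng.integers(0, 2, n)                        # SYNTAX score > 11
    hazard = 0.10 * np.exp(0.6 * high_syntax - 0.3 * new_gen_des
                           - 0.2 * new_gen_des * high_syntax)  # assumed data-generating model
    time = rng.exponential(1.0 / hazard)
    df = pd.DataFrame({
        "time_years": np.minimum(time, 2.0),                   # administrative censoring at 2 years
        "event": (time <= 2.0).astype(int),
        "new_gen_des": new_gen_des,
        "high_syntax": high_syntax,
    })
    df["des_x_syntax"] = df["new_gen_des"] * df["high_syntax"]  # interaction term

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_years", event_col="event")
    print(cph.summary[["exp(coef)", "p"]])  # HRs; the des_x_syntax row is the interaction test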

Relevance: 40.00%

Abstract:

OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stent (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES is unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl) <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of 4,217 women included in the pooled cohort treated with DES and for whom serum creatinine was available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in risk for MACE was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). Following multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with older-generation DES, the use of newer-generation DES was associated with a reduction in the risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform, between women with or without CKD, without evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk for MACE that is durable over 3 years. The benefits of newer-generation DES are uniform in women with or without CKD.
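
The abstract does not state how creatinine clearance (CrCl) was derived; the Cockcroft-Gault equation is a common choice and is used here purely for illustration, with the study's <45 / 45-59 / ≥60 ml/min categories applied. Patient values below are hypothetical.

    # Illustrative only: Cockcroft-Gault CrCl and the study's renal-function categories.
    def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female=True):
        crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
        return crcl * 0.85 if female else crcl

    def ckd_category(crcl_ml_min):
        if crcl_ml_min < 45:
            return "CrCl <45 ml/min"
        if crcl_ml_min < 60:
            return "CrCl 45-59 ml/min"
        return "CrCl >=60 ml/min"

    crcl = cockcroft_gault_crcl(age_years=72, weight_kg=65, serum_creatinine_mg_dl=1.3)
    print(f"{crcl:.0f} ml/min -> {ckd_category(crcl)}")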

Relevance: 40.00%

Abstract:

BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors are unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, which was defined as having a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Out of 10,449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, the use of new-generation DES was associated with significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P_interaction=0.14). At landmark analysis, in high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years, stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES even in women at high ATR is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
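
The landmark approach mentioned above can be sketched as follows (records are hypothetical): events in the first year are analysed with follow-up censored at 1 year, and the 1-to-3-year comparison is restricted to patients who were still event-free and under follow-up at the landmark, with the clock reset at 1 year.

    # 1-year landmark split for stent thrombosis, on hypothetical (time, event) records.
    def landmark_split(records, landmark=1.0, horizon=3.0):
        early, late = [], []
        for time, event in records:                 # time in years; event: 1 = stent thrombosis
            early.append((min(time, landmark), int(event and time <= landmark)))
            if time > landmark:                     # event-free at the landmark -> eligible for the late window
                late.append((min(time, horizon) - landmark, int(event and time <= horizon)))
        return early, late

    records = [(0.3, 1), (2.4, 1), (3.0, 0), (1.7, 0), (3.0, 0)]
    early, late = landmark_split(records)
    print("0-1 year risk set:", early)
    print("1-3 year risk set:", late)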

Relevance: 40.00%

Abstract:

BACKGROUND Diabetes mellitus and angiographic coronary artery disease complexity are intertwined and unfavorably affect prognosis after percutaneous coronary interventions, but their relative impact on long-term outcomes after percutaneous coronary intervention with drug-eluting stents remains controversial. This study determined drug-eluting stents outcomes in relation to diabetic status and coronary artery disease complexity as assessed by the Synergy Between PCI With Taxus and Cardiac Surgery (SYNTAX) score. METHODS AND RESULTS In a patient-level pooled analysis from 4 all-comers trials, 6081 patients were stratified according to diabetic status and according to the median SYNTAX score ≤11 or >11. The primary end point was major adverse cardiac events, a composite of cardiac death, myocardial infarction, and clinically indicated target lesion revascularization within 2 years. Diabetes mellitus was present in 1310 patients (22%), and new-generation drug-eluting stents were used in 4554 patients (75%). Major adverse cardiac events occurred in 173 diabetics (14.5%) and 436 nondiabetic patients (9.9%; P<0.001). In adjusted Cox regression analyses, SYNTAX score and diabetes mellitus were both associated with the primary end point (P<0.001 and P=0.028, respectively; P for interaction, 0.07). In multivariable analyses, diabetic versus nondiabetic patients had higher risks of major adverse cardiac events (hazard ratio, 1.25; 95% confidence interval, 1.03-1.53; P=0.026) and target lesion revascularization (hazard ratio, 1.54; 95% confidence interval, 1.18-2.01; P=0.002) but similar risks of cardiac death (hazard ratio, 1.41; 95% confidence interval, 0.96-2.07; P=0.08) and myocardial infarction (hazard ratio, 0.89; 95% confidence interval, 0.64-1.22; P=0.45), without significant interaction with SYNTAX score ≤11 or >11 for any of the end points. CONCLUSIONS In this population treated with predominantly new-generation drug-eluting stents, diabetic patients were at increased risk for repeat target-lesion revascularization consistently across the spectrum of disease complexity. The SYNTAX score was an independent predictor of 2-year outcomes but did not modify the respective effect of diabetes mellitus. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifiers: NCT00297661, NCT00389220, NCT00617084, and NCT01443104.
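
A small sketch of how a composite end point such as the one above is typically assembled from patient-level data: the event time is the first occurrence of any component, and patients with no component event are censored at 2 years or at the end of follow-up. Field names and times below are hypothetical.

    # Composite MACE endpoint (cardiac death, MI, or clinically indicated TLR) as
    # time to the first component event, censored at 2 years. Hypothetical data.
    def mace_endpoint(patient, horizon_years=2.0):
        component_times = [t for t in (patient.get("cardiac_death"),
                                       patient.get("mi"),
                                       patient.get("tlr")) if t is not None]
        first = min(component_times) if component_times else None
        if first is not None and first <= horizon_years:
            return first, 1                                    # event within 2 years
        return min(patient["follow_up"], horizon_years), 0     # censored

    print(mace_endpoint({"cardiac_death": None, "mi": 1.4, "tlr": 0.9, "follow_up": 2.0}))
    # -> (0.9, 1): the TLR at 0.9 years defines the composite event time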

Relevance: 30.00%

Abstract:

Differentiation between external contamination and incorporation of drugs or their metabolites from inside the body via blood, sweat or sebum is a general issue in hair analysis and of high concern when interpreting analytical results. In hair analysis for cannabinoids the most common target is Delta9-tetrahydrocannabinol (THC); cannabidiol (CBD) and cannabinol (CBN) are sometimes determined in addition. After repeated external contamination by cannabis smoke these analytes are known to be found in hair even after performing multiple washing steps. A widely accepted strategy to unequivocally prove active cannabis consumption is the analysis of hair extracts for the oxidative metabolite 11-nor-9-carboxy-THC (THC-COOH). Although the acidic nature of this metabolite suggests a lower rate of incorporation into the hair matrix compared to THC, it is not yet fully understood why hair concentrations of THC-COOH are generally found to be much lower (mostly <10 pg/mg) than the corresponding THC concentrations. Delta9-Tetrahydrocannabinolic acid A (THCA A) is the preliminary end product of THC biosynthesis in the cannabis plant. Unlike THC it is non-psychoactive and can be regarded as a 'precursor' of THC, being largely decarboxylated when heated or smoked. The presented work shows for the first time that THCA A is not only detectable in blood and urine of cannabis consumers but also in THC-positive hair samples. A pilot experiment performed within this study showed that after oral intake of THCA A on a regular basis no relevant incorporation into hair occurred. It can be concluded that THCA A in hair almost exclusively derives from external contamination, e.g. by sidestream smoke. Elevated temperatures during the analytical procedure, particularly under alkaline conditions, can lead to decarboxylation of THCA A and accordingly increase THC concentrations in hair. Additionally, it has to be kept in mind that in hair samples tested positive for THCA A at least a part of the 'non-artefact' THC probably derives from external contamination as well, because in condensate of cannabis smoke both THC and THCA A are present in relevant amounts. External contamination by sidestream smoke could therefore explain the great differences in THC and THC-COOH hair concentrations commonly found in cannabis users.
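
The decarboxylation argument can be put in rough numbers. Assuming standard molar masses (THCA A ≈ 358.5 g/mol, THC ≈ 314.5 g/mol, i.e. THCA A minus CO2), the maximum THC that work-up-induced decarboxylation could add to a measured concentration scales with the THCA A present; the concentration below is hypothetical.

    # Back-of-the-envelope sketch: upper bound on artefactual THC if all THCA A in a
    # hair sample were decarboxylated during analysis. Assumed molar masses; the
    # measured THCA A concentration is hypothetical.
    M_THCA_A = 358.48   # g/mol
    M_THC    = 314.47   # g/mol (THCA A minus CO2, 44.01 g/mol)

    def max_artefactual_thc(thca_a_pg_per_mg):
        return thca_a_pg_per_mg * (M_THC / M_THCA_A)

    measured_thca_a = 120.0   # pg/mg, hypothetical
    print(f"Up to {max_artefactual_thc(measured_thca_a):.0f} pg/mg THC could stem from "
          "decarboxylation of the THCA A in the sample")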

Relevance: 30.00%

Abstract:

Bacterial factors may contribute to the global emergence and spread of drug-resistant tuberculosis (TB). Only a few studies have reported on the interactions between different bacterial factors. We studied drug-resistant Mycobacterium tuberculosis isolates from a nationwide study conducted from 2000 to 2008 in Switzerland. We determined quantitative drug resistance levels of first-line drugs by using Bactec MGIT-960 and drug resistance genotypes by sequencing the hot-spot regions of the relevant genes. We determined recent transmission by molecular methods and collected clinical data. Overall, we analyzed 158 isolates that were resistant to isoniazid, rifampin, or ethambutol, 48 (30.4%) of which were multidrug resistant. Among 154 isoniazid-resistant strains, katG mutations were associated with high-level and inhA promoter mutations with low-level drug resistance. Only katG(S315T) (65.6% of all isoniazid-resistant strains) and inhA promoter -15C/T (22.7%) were found in molecular clusters. M. tuberculosis lineage 2 (includes Beijing genotype) was associated with any drug resistance (adjusted odds ratio [OR], 3.0; 95% confidence interval [CI], 1.7 to 5.6; P < 0.0001). Lineage 1 was associated with inhA promoter -15C/T mutations (OR, 6.4; 95% CI, 2.0 to 20.7; P = 0.002). We found that the genetic strain background influences the level of isoniazid resistance conveyed by particular mutations (interaction tests of drug resistance mutations across all lineages; P < 0.0001). In conclusion, M. tuberculosis drug resistance mutations were associated with various levels of drug resistance and transmission, and M. tuberculosis lineages were associated with particular drug resistance-conferring mutations and phenotypic drug resistance. Our study also supports a role for epistatic interactions between different drug resistance mutations and strain genetic backgrounds in M. tuberculosis drug resistance.
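
For orientation, a crude (unadjusted) odds ratio with a Wald 95% confidence interval can be computed from a 2x2 table as below; the odds ratios in the study are adjusted for covariates, which requires a regression model rather than this simple calculation. Counts are hypothetical.

    # Crude odds ratio with Wald 95% CI from a 2x2 table (hypothetical counts).
    import math

    def odds_ratio(a, b, c, d):
        """a/b: exposed with/without outcome; c/d: unexposed with/without outcome."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - 1.96 * se)
        hi = math.exp(math.log(or_) + 1.96 * se)
        return or_, (lo, hi)

    # hypothetical: lineage 2 vs other lineages, any drug resistance vs none
    print(odds_ratio(a=40, b=60, c=118, d=500))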

Relevance: 30.00%

Abstract:

Background There is a lack of international research on suicide by drug overdose as a preventable suicide method. Sex- and age-specific rates of suicide by drug self-poisoning (ICD-10, X60-64) and the distribution of drug types used in 16 European countries were studied, and compared with other self-poisoning methods (X65-69) and intentional self-injury (X70-84). Methods Data for 2000-04/05 were collected from national statistical offices. Age-adjusted suicide rates, and age and sex distributions, were calculated. Results No pronounced sex differences in drug self-poisoning rates were found, either in the aggregate data (males 1.6 and females 1.5 per 100,000) or within individual countries. Among the 16 countries, the range (from some 0.3 in Portugal to 5.0 in Finland) was wide. 'Other and unspecified drugs' (X64) were recorded most frequently, with a range of 0.2-1.9, and accounted for more than 70% of deaths by drug overdose in France, Luxembourg, Portugal and Spain. Psychotropic drugs (X61) ranked second. The X63 category ('other drugs acting on the autonomic nervous system') was least frequently used. Finland showed low X64 and high X61 figures, Scotland had high levels of X62 ('narcotics and hallucinogens, not elsewhere classified') for both sexes, while England exceeded other countries in category X60. Risk was highest among the middle-aged everywhere except in Switzerland, where the elderly were most at risk. Conclusions Suicide by drug overdose is preventable. Intentional self-poisoning with drugs kills as many males as females. The considerable differences in patterns of self-poisoning found in the various European countries are relevant to national efforts to improve diagnostics of suicide and appropriate specific prevention. The fact that the vast majority of drug-overdose suicides fell under category X64 indicates that a more detailed ICD coding system for overdose suicides is needed to permit better design of suicide-prevention strategies at the national level.
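
A minimal sketch of the direct age standardisation behind "age-adjusted suicide rates": age-band-specific rates are averaged with weights taken from a standard population. The weights and rates below are hypothetical, and the paper's choice of standard population is not stated here.

    # Direct age standardisation of a drug self-poisoning suicide rate (hypothetical inputs).
    def age_adjusted_rate(age_specific_rates, standard_weights):
        """rates per 100,000 by age band; weights are standard-population shares summing to 1."""
        return sum(r * w for r, w in zip(age_specific_rates, standard_weights))

    rates   = [0.4, 1.8, 2.6, 2.1]      # per 100,000 in four age bands (hypothetical)
    weights = [0.25, 0.30, 0.30, 0.15]  # standard population shares (hypothetical)
    print(f"age-adjusted rate: {age_adjusted_rate(rates, weights):.2f} per 100,000")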

Relevance: 30.00%

Abstract:

The pre-treatment of tumour neovessels by low-level photodynamic therapy (PDT) improves the distribution of concomitantly administered systemic chemotherapy. The mechanism by which PDT permeabilizes the tumour vessel wall is only partially known. We have recently shown that leukocyte-endothelial cell interaction is essential for photodynamic drug delivery to normal tissue. The present study investigates whether PDT enhances drug delivery in malignant mesothelioma and whether it involves comparable mechanisms of actions.

Relevance: 30.00%

Abstract:

Ketamine is widely used as an anesthetic in a variety of drug combinations in human and veterinary medicine. Recently, it gained new interest for use in long-term pain therapy administered in sub-anesthetic doses in humans and animals. The purpose of this study was to develop a physiologically based pharmacokinetic (PBPk) model for ketamine in ponies and to investigate the effect of low-dose ketamine infusion on the amplitude and the duration of the nociceptive withdrawal reflex (NWR). A target-controlled infusion (TCI) of ketamine with a target plasma level of 1 microg/ml S-ketamine over 120 min under isoflurane anesthesia was performed in Shetland ponies. A quantitative electromyographic assessment of the NWR was done before, during and after the TCI. Plasma levels of R-/S-ketamine and R-/S-norketamine were determined by enantioselective capillary electrophoresis. These data and two additional data sets from bolus studies were used to build a PBPk model for ketamine in ponies. The peak-to-peak amplitude and the duration of the NWR decreased significantly during TCI and returned slowly toward baseline values after the end of TCI. The PBPk model provides reliable prediction of plasma and tissue levels of R- and S-ketamine and R- and S-norketamine. Furthermore, biotransformation of ketamine takes place in the liver and in the lung via first-pass metabolism. Plasma concentrations of S-norketamine were higher compared to R-norketamine during TCI at all time points. Analysis of the data suggested identical biotransformation rates from the parent compounds to the principal metabolites (R- and S-norketamine) but different downstream metabolism to further metabolites. The PBPk model can provide predictions of R- and S-ketamine and norketamine concentrations in other clinical settings (e.g. horses).
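
For orientation only, a one-compartment, constant-rate-infusion sketch of how a target plasma level is approached over time; the study's PBPk model is multi-compartment and enantiomer-specific, and the clearance and volume values below are hypothetical rather than the ponies' actual pharmacokinetic parameters.

    # One-compartment infusion sketch targeting a steady-state plasma concentration.
    # Hypothetical clearance and volume of distribution; not the study's PBPk model.
    import math

    def plasma_conc(t_min, clearance_ml_min, volume_ml, target_ug_ml):
        rate_ug_min = target_ug_ml * clearance_ml_min   # infusion rate needed to hold the target at steady state
        k = clearance_ml_min / volume_ml                # elimination rate constant (1/min)
        return (rate_ug_min / clearance_ml_min) * (1 - math.exp(-k * t_min))

    for t in (5, 30, 120):
        c = plasma_conc(t, clearance_ml_min=4000, volume_ml=150000, target_ug_ml=1.0)
        print(f"t = {t:3d} min: ~{c:.2f} ug/ml")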

Relevance: 30.00%

Abstract:

The human aurora family of serine-threonine kinases comprises three members, which act in concert with many other proteins to control chromosome assembly and segregation during mitosis. Aurora dysfunction can cause aneuploidy, mitotic arrest, and cell death. Aurora kinases are strongly expressed in a broad range of cancer types. Aurora A expression in tumors is often associated with gene amplification, genetic instability, poor histologic differentiation, and poor prognosis. Aurora B is frequently expressed at high levels in a variety of tumors, often coincidently with aurora A, and expression level has also been associated with increased genetic instability and clinical outcome. Further, aurora kinase gene polymorphisms are associated with increased risk or early onset of cancer. The expression of aurora C in cancer is less well studied. In recent years, several small-molecule aurora kinase inhibitors have been developed that exhibit preclinical activity against a wide range of solid tumors. Preliminary clinical data from phase I trials have largely been consistent with cytostatic effects, with disease stabilization as the best response achieved in solid tumors. Objective responses have been noted in leukemia patients, although this might conceivably be due to inhibition of the Abl kinase. Current challenges include the optimization of drug administration, the identification of potential biomarkers of tumor sensitivity, and combination studies with cytotoxic drugs. Here, we summarize the most recent preclinical and clinical data and discuss new directions in the development of aurora kinase inhibitors as antineoplastic agents.

Relevance: 30.00%

Abstract:

OX7 monoclonal antibody F(ab')2 fragments directed against Thy1.1 antigen can be used for drug targeting by coupling to the surface of drug-loaded liposomes. Such OX7-conjugated immunoliposomes (OX7-IL) were used recently for drug delivery to rat glomerular mesangial cells, which are characterized by a high level of Thy1.1 antigen expression. In the present study, the relationship between OX7-IL tissue distribution and target Thy1.1 antigen localization in different organs in rat was investigated. Western blot and immunohistofluorescence analysis revealed a very high Thy1.1 expression in brain cortex and striatum, thymus and renal glomeruli. Moderate Thy1.1 levels were observed in the collecting ducts of kidney, lung tissue and spleen. Thy1.1 was not detected in liver and heart. There was a poor correlation between Thy1.1 expression levels and organ distribution of fluorescence- or (14)C-labeled OX7-IL. The highest overall organ density of OX7-IL was observed in the spleen, followed by lung, liver and kidney. Heart and brain remained negative. With respect to intra-organ distribution, a localized and distinct signal was observed in renal glomerular mesangial cells only. As a consequence, acute pharmacological (i.e. toxic) effects of doxorubicin-loaded OX7-IL were limited to renal glomeruli. The competition with unbound OX7 monoclonal antibody F(ab')2 fragments demonstrated that the observed tissue distribution and acute pharmacological effects of OX7-IL were mediated specifically by the conjugated OX7 antibody. It is concluded that both the high target antigen density and the absence of endothelial barriers are needed to allow for tissue-specific accumulation and pharmacological effects of OX7-IL. The liposomal drug delivery strategy used is therefore specific toward renal glomeruli and can be expected to reduce the risk of unwanted side effects in other tissues.

Relevance: 30.00%

Abstract:

The toxicity of long-term immunosuppressive therapy has become a major concern in long-term follow-up of heart transplant recipients. In this respect the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 micrograms/L in many centers without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 micrograms/L) and low (150 to 250 micrograms/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; aged, 44 +/- 12 years; mean follow-up, 1,122 +/- 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 micrograms/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 µmol/L, n = 234; group II, ≥ 130 µmol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98) with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 +/- 2.6 rejection/patient/year; group II, 3.6 +/- 2.7 rejection/patient/year; p = not significant). (ABSTRACT TRUNCATED AT 250 WORDS)

Relevance: 30.00%

Abstract:

BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had ≥ 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
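
The two estimates described can be sketched as follows: the lower bound counts only patients with documented virologic failure or resistance, while the upper bound is a frequency-weighted average of assumed risk-group-specific probabilities of harbouring resistance mutations. Group sizes and probabilities below are hypothetical.

    # Lower- and upper-bound prevalence estimates from risk groups (hypothetical numbers).
    groups = [
        # (n patients, documented failure/resistance, assumed P(resistance) for the group)
        (1500, 1050, 0.75),   # e.g. prior mono- or dual-NRTI exposure
        (4000,  900, 0.30),
        (2500,  200, 0.10),   # e.g. first-line combination ART with boosted PI / NNRTI
    ]

    total = sum(n for n, _, _ in groups)
    lower = sum(doc for _, doc, _ in groups) / total          # proportion with documented resistance
    upper = sum(n * p for n, _, p in groups) / total          # frequency-weighted average of probabilities
    print(f"lower estimate: {lower:.0%}, upper estimate: {upper:.0%}")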

Relevance: 30.00%

Abstract:

BACKGROUND There is ongoing debate on the optimal drug-eluting stent (DES) in diabetic patients with coronary artery disease. Biodegradable polymer drug-eluting stents (BP-DES) may potentially improve clinical outcomes in these high-risk patients. We sought to compare long-term outcomes in patients with diabetes treated with biodegradable polymer DES vs. durable polymer sirolimus-eluting stents (SES). METHODS We pooled individual patient-level data from 3 randomized clinical trials (ISAR-TEST 3, ISAR-TEST 4 and LEADERS) comparing biodegradable polymer DES with durable polymer SES. Clinical outcomes out to 4 years were assessed. The primary end point was the composite of cardiac death, myocardial infarction and target-lesion revascularization. Secondary end points were target lesion revascularization and definite or probable stent thrombosis. RESULTS Of 1094 patients with diabetes included in the present analysis, 657 received biodegradable polymer DES and 437 durable polymer SES. At 4 years, the incidence of the primary end point was similar with BP-DES versus SES (hazard ratio=0.95, 95% CI=0.74-1.21, P=0.67). Target lesion revascularization was also comparable between the groups (hazard ratio=0.89, 95% CI=0.65-1.22, P=0.47). Definite or probable stent thrombosis was significantly reduced among patients treated with BP-DES (hazard ratio=0.52, 95% CI=0.28-0.96, P=0.04), a difference driven by significantly lower stent thrombosis rates with BP-DES between 1 and 4 years (hazard ratio=0.15, 95% CI=0.03-0.70, P=0.02). CONCLUSIONS In patients with diabetes, biodegradable polymer DES, compared to durable polymer SES, were associated with comparable overall clinical outcomes during follow-up to 4 years. Rates of stent thrombosis were significantly lower with BP-DES.