82 results for Cardiac Events


Relevance:

30.00%

Abstract:

BACKGROUND Antifibrinolytics have been used for 2 decades to reduce bleeding in cardiac surgery. MDCO-2010 is a novel, synthetic serine protease inhibitor. We describe the first experience with this drug in patients. METHODS In this phase II, double-blind, placebo-controlled study, 32 patients undergoing isolated primary coronary artery bypass grafting with cardiopulmonary bypass were randomly assigned to 1 of 5 increasing dosage groups of MDCO-2010. The primary aim was to evaluate pharmacokinetics (PK), by assessment of plasma concentrations of the drug, and the short-term safety and tolerance of MDCO-2010. Secondary end points were influence on coagulation, chest tube drainage, and transfusion requirements. RESULTS PK analysis showed a linear, dosage-proportional correlation between MDCO-2010 infusion rate and PK parameters. Blood loss was significantly reduced in the 3 highest dosage groups compared with control (P = 0.002, 0.004, and 0.011, respectively). The incidence of allogeneic blood product transfusions was lower with MDCO-2010 (4/24, 17%) than in the control group (4/8, 50%). MDCO-2010 exhibited dosage-dependent antifibrinolytic effects, through suppression of D-dimer generation and inhibition of tissue plasminogen activator-induced lysis in ROTEM analysis, as well as anticoagulant effects, demonstrated by prolongation of activated clotting time and activated partial thromboplastin time. No systematic differences in markers of end-organ function were observed among treatment groups. Three patients in the MDCO-2010 groups experienced serious adverse events; one experienced intraoperative thrombosis of venous grafts considered possibly related to the study drug. No reexploration for mediastinal bleeding was required, and there were no deaths. CONCLUSIONS This first-in-patient study demonstrated dosage-proportional PK for MDCO-2010 and a reduction in chest tube drainage and transfusions in patients undergoing primary coronary artery bypass grafting. Antifibrinolytic and anticoagulant effects were demonstrated using various markers of coagulation. MDCO-2010 was well tolerated and showed an acceptable initial safety profile. Larger multi-institutional studies are warranted to further investigate the safety and efficacy of this compound.
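The dosage-proportional PK reported above can be illustrated with a simple exposure/dose ratio check: under linear PK, exposure (e.g. AUC) scales in direct proportion to the infusion rate, so the ratio is constant across dose levels. The rates and exposure values below are illustrative placeholders, not trial data.

```python
# Dose-proportionality check: under linear PK the AUC/dose ratio is constant.
# All numbers below are hypothetical, for illustration only.
doses = [1.0, 2.0, 4.0, 8.0, 16.0]          # relative infusion rates
aucs = [12.1, 24.0, 47.8, 96.5, 191.0]      # hypothetical exposure values

ratios = [a / d for a, d in zip(aucs, doses)]
mean_ratio = sum(ratios) / len(ratios)
# Proportionality holds if every ratio stays close to the mean (here within 5%)
proportional = all(abs(r - mean_ratio) / mean_ratio < 0.05 for r in ratios)
print(proportional)
```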

Relevance:

30.00%

Abstract:

BACKGROUND Data on pharmacological management during pregnancy are scarce. The aim of this study was to describe the type and frequency of cardiac medication used in pregnancy by patients with cardiovascular disease and to assess the relationship between medication use and fetal outcome. METHODS AND RESULTS Between 2007 and 2011, sixty hospitals in 28 countries enrolled 1321 pregnant women, all with structural heart disease (congenital 66%, valvular 25%, cardiomyopathy 7%, ischemic 2%). Medication was used by 424 patients (32%) at some time during pregnancy: 22% used beta-blockers, 8% antiplatelet agents, 7% diuretics, 2.8% ACE inhibitors, and 0.5% statins. Compared with those who did not take medication, patients taking medication were older, more likely to be parous, more likely to have valvular heart disease, and less often in sinus rhythm. The odds ratio of fetal adverse events in users versus non-users of medication was 2.6 (95% CI 2.0-3.4) and, after adjustment for cardiac and obstetric parameters, 2.0 (95% CI 1.4-2.7). Babies of patients treated with beta-blockers had a significantly lower adjusted birth weight (3140 versus 3240 g, p = 0.002). The highest rate of fetal malformation was found in patients taking ACE inhibitors (8%). CONCLUSION One third of pregnant women with heart disease used cardiac medication during their pregnancy, and this use was associated with an increased rate of adverse fetal events. Birth weight was significantly lower in children of patients taking beta-blockers. A randomized trial is needed to distinguish the effects of the medication from the effects of the underlying maternal cardiac condition.
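An unadjusted odds ratio like the one above is computed from a 2x2 table of users/non-users versus events/no events, with a confidence interval from the standard log-odds method. The counts below are hypothetical, chosen only so the sketch lands near the published point estimate; they are not the study's data.

```python
import math

# Odds ratio for fetal adverse events, users vs non-users, from a 2x2 table.
# Counts are illustrative, not the trial's actual event tallies.
a, b = 100, 324   # medication users: events, no events
c, d = 95, 802    # non-users: events, no events

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)  # 95% CI lower bound
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)  # 95% CI upper bound
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```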

Relevance:

30.00%

Abstract:

PRINCIPLES Prediction of arrhythmic events (AEs) has gained importance with the availability of implantable cardioverter-defibrillators (ICDs), but remains imprecise. This study evaluated the novel Wedensky modulation index (WMI) as a predictor of AEs. METHODS In this prospective cohort, 179 patients with coronary artery disease (CAD) referred for AE risk assessment underwent a baseline evaluation including measurement of the R-/T-wave WMI (WMI(RT)) and left ventricular ejection fraction (LVEF). Two endpoints were assessed 3 years after the baseline evaluation: sudden cardiac death or appropriate ICD event (EP1), and any cardiac death or appropriate ICD event (EP2). Associations between baseline predictors (WMI(RT) and LVEF) and endpoints were evaluated in regression models. RESULTS Only three patients were lost to follow-up. EP1 and EP2 occurred in 24 and 27 patients, respectively. WMI(RT) (odds ratio [OR] per 1-point increase for EP1 20.1, 95% confidence interval [CI] 1.8-221.4, p = 0.014, and for EP2 73.3, 95% CI 6.6-817.7, p < 0.001) and LVEF (OR per 1% increase for EP1 0.94, 95% CI 0.90-0.99, p = 0.013, and for EP2 0.93, 95% CI 0.89-0.97, p = 0.002) were significantly associated with both endpoints. In bivariable regression controlling for LVEF, WMI(RT) remained independently associated with EP1 (p = 0.047) and EP2 (p = 0.007). The combination of WMI(RT) ≥0.60 and LVEF ≤30% yielded a positive predictive value of 36% for EP1 and 50% for EP2. CONCLUSIONS WMI(RT) is a significant predictor of AEs independent of LVEF and has the potential to improve AE risk prediction in CAD patients. However, WMI(RT) should be evaluated in larger, independent samples before recommendations for clinical routine can be made.
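The positive predictive values quoted for the combined criterion follow from the definition PPV = true positives / all flagged positives. A minimal sketch with made-up patient tuples (the thresholds come from the abstract; every patient record below is invented):

```python
# PPV of a combined risk marker: among patients flagged positive
# (WMI_RT >= 0.60 and LVEF <= 30%), the fraction who actually had the
# endpoint. Patient tuples are illustrative, not study data.
patients = [
    # (wmi_rt, lvef_percent, had_event)
    (0.72, 25, True), (0.65, 28, False), (0.61, 30, True),
    (0.80, 22, False), (0.40, 45, False), (0.55, 35, True),
    (0.70, 29, False), (0.62, 27, True), (0.35, 50, False),
]

flagged = [p for p in patients if p[0] >= 0.60 and p[1] <= 30]
ppv = sum(1 for p in flagged if p[2]) / len(flagged)
print(f"PPV = {ppv:.0%}")
```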

Relevance:

30.00%

Abstract:

INTRODUCTION Left ventricular thrombus (LVT) formation may worsen the post-infarct outcome as a result of thromboembolic events. It also complicates the use of modern antiplatelet regimens, which are not compatible with long-term oral anticoagulation. Knowledge of the incidence of LVT may therefore help guide antiplatelet and antithrombotic therapy after acute myocardial infarction (AMI). METHODS In 177 patients with large, mainly anterior AMI, standard cardiac magnetic resonance imaging (CMR), including cine and late gadolinium enhancement (LGE) imaging, was performed shortly after AMI as per protocol. CMR images were analysed at an independent core laboratory blinded to the clinical data. Transthoracic echocardiography (TTE) was not mandatory for the trial, but was performed in 64% of the cases following standard of care. Three out of 61 candidate parameters were retained in a multivariable logistic model to predict LVT. RESULTS LVT was detected by CMR in 6.2% of patients (95% confidence interval [CI] 3.1%-10.8%). LGE sequences were best suited to detect LVT, which may be missed on cine sequences. We identified body mass index (odds ratio 1.18, p = 0.01), baseline platelet count (odds ratio 1.01, p = 0.01), and infarct size as assessed by CMR (odds ratio 1.03, p = 0.02) as the best predictors of LVT. The agreement between TTE and CMR for the detection of LVT was substantial (kappa = 0.70). DISCUSSION In the current analysis, the incidence of LVT shortly after AMI was relatively low, even in a patient population at high risk. The optimal modality for LVT detection is LGE-CMR, but TTE has acceptable accuracy when LGE-CMR is not available.
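The TTE-CMR agreement statistic above is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A sketch from an illustrative 2x2 agreement table (the counts are invented, not the study's):

```python
# Cohen's kappa for agreement between two detection modalities (e.g. TTE vs
# CMR for LV thrombus), from a 2x2 agreement table. Counts are illustrative.
both_pos, cmr_only = 5, 3     # CMR positive: TTE agrees / TTE disagrees
tte_only, both_neg = 1, 104   # CMR negative: TTE disagrees / TTE agrees
n = both_pos + cmr_only + tte_only + both_neg

p_observed = (both_pos + both_neg) / n
# Expected chance agreement, from the marginal positive/negative rates
p_cmr_pos = (both_pos + cmr_only) / n
p_tte_pos = (both_pos + tte_only) / n
p_expected = p_cmr_pos * p_tte_pos + (1 - p_cmr_pos) * (1 - p_tte_pos)
kappa = (p_observed - p_expected) / (1 - p_expected)
print(round(kappa, 2))
```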

Relevance:

30.00%

Abstract:

BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®), which yields a score from 0 to 40, is often used to determine whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may independently predict future clinical events; however, further studies were recommended. Here we analysed an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability in predicting future events was evaluated by the area under the receiver operating characteristic curve (AUC ROC). The negative predictive values (NPV) and positive predictive values (PPV), including 95% confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17%. Events occurred at a median of 391 (inter-quartile range 376) days after the final GEP test. The GEP variability AUC ROC for the prediction of a composite event was 0.72 (95% CI 0.6-0.8). The NPV for a GEP score variability of 0.6 was 97% (95% CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4% (95% CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
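The study's predictor, GEP score variability, is simply the standard deviation of four consecutive scores. A minimal sketch (the scores are invented; the abstract does not say whether the sample or population SD was used, so the sample SD is assumed here):

```python
import statistics

# Within-patient GEP score variability: the standard deviation of four
# consecutive GEP scores (each on the 0-40 scale). Scores are illustrative.
scores = [30, 32, 31, 33]
variability = statistics.stdev(scores)   # sample SD assumed, not confirmed

# Compare against a decision threshold such as the 1.5 used for the PPV above
high_variability = variability >= 1.5
print(round(variability, 2), high_variability)
```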

Relevance:

30.00%

Abstract:

AIM To evaluate the prognostic value of electrophysiological stimulation (EPS) in risk stratification for tachyarrhythmic events and sudden cardiac death (SCD). METHODS We conducted a prospective cohort study and analyzed the long-term follow-up of 265 consecutive patients who underwent programmed ventricular stimulation at the Luzerner Kantonsspital (Lucerne, Switzerland) between October 2003 and April 2012. Patients underwent EPS for SCD risk evaluation because of structural or functional heart disease and/or an electrical conduction abnormality and/or after syncope/cardiac arrest. EPS was considered abnormal if a sustained ventricular tachycardia (VT) was inducible. The primary endpoint of the study was SCD or, in implanted patients, appropriate ICD activation. RESULTS During EPS, sustained VT was induced in 125 patients (47.2%) and non-sustained VT in 60 patients (22.6%); in 80 patients (30.2%) no arrhythmia could be induced. In our cohort, 153 patients (57.7%) underwent ICD implantation after the EPS. During follow-up (mean duration 4.8 ± 2.3 years), a primary endpoint event occurred in 49 patients (18.5%). The area under the receiver operating characteristic curve (AUROC) was 0.593 (95% CI: 0.515-0.670) for a left ventricular ejection fraction (LVEF) < 35% and 0.636 (95% CI: 0.563-0.709) for inducible sustained VT during EPS. The AUROC of EPS was higher in the subgroup of patients with LVEF ≥ 35% (0.681, 95% CI: 0.578-0.785). Cox regression analysis showed that both sustained VT during EPS (HR: 2.26, 95% CI: 1.22-4.19, P = 0.009) and LVEF < 35% (HR: 2.00, 95% CI: 1.13-3.54, P = 0.018) were independent predictors of primary endpoint events. CONCLUSION EPS provides a benefit in risk stratification for future tachyarrhythmic events and SCD, and should especially be considered in patients with LVEF ≥ 35%.
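AUROC values like those above can be computed without fitting any curve, via the Mann-Whitney formulation: the probability that a randomly chosen endpoint patient has a higher predictor value than a randomly chosen non-endpoint patient, counting ties as half. A sketch with invented binary EPS results:

```python
# Rank-based AUROC (Mann-Whitney formulation). For a binary predictor this
# equals (sensitivity + specificity) / 2. All values below are illustrative.
def auroc(scores_events, scores_controls):
    wins = sum(
        1.0 if e > c else 0.5 if e == c else 0.0
        for e in scores_events
        for c in scores_controls
    )
    return wins / (len(scores_events) * len(scores_controls))

# e.g. 1 = inducible sustained VT at EPS, 0 = not inducible
events = [1, 1, 1, 0, 1]     # patients who reached the endpoint
controls = [0, 1, 0, 0, 0]   # patients who did not
print(auroc(events, controls))
```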

Relevance:

30.00%

Abstract:

Obesity and diets rich in uric acid-raising components appear to account for the increased prevalence of hyperuricemia in Westernized populations. Prevalence rates of hypertension, diabetes mellitus, CKD, and cardiovascular disease are also increasing. We used Mendelian randomization to examine whether uric acid is an independent and causal cardiovascular risk factor. Serum uric acid was measured in 3315 patients of the Ludwigshafen Risk and Cardiovascular Health Study. We calculated a weighted genetic risk score (GRS) for uric acid concentration based on eight uric acid-regulating single nucleotide polymorphisms. Causal odds ratios and causal hazard ratios (HRs) were calculated using a two-stage regression estimate with the GRS as the instrumental variable to examine associations with cardiometabolic phenotypes (cross-sectionally) and mortality (prospectively) by logistic regression and Cox regression, respectively. Our GRS was not consistently associated with any biochemical marker except for uric acid, arguing against pleiotropy. Uric acid was associated with a range of prevalent diseases, including coronary artery disease. Uric acid and the GRS were both associated with cardiovascular death and sudden cardiac death. In a multivariate model adjusted for factors including medication, causal HRs corresponding to each 1-mg/dl increase in genetically predicted uric acid concentration were significant for cardiovascular death (HR, 1.77; 95% confidence interval, 1.12 to 2.81) and sudden cardiac death (HR, 2.41; 95% confidence interval, 1.16 to 5.00). These results suggest that high uric acid is causally related to adverse cardiovascular outcomes, especially sudden cardiac death.
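A weighted genetic risk score of the kind used as the instrumental variable above is, in the usual construction, a weighted sum of risk-allele counts. The SNP identifiers and per-allele weights below are placeholders, not the study's eight variants.

```python
# Weighted genetic risk score (GRS): each SNP contributes its risk-allele
# count (0, 1, or 2) times its per-allele effect size on serum uric acid.
# SNP names and weights are hypothetical placeholders.
effect_sizes = {"snpA": 0.30, "snpB": 0.15, "snpC": 0.05}  # mg/dl per allele
allele_counts = {"snpA": 2, "snpB": 1, "snpC": 0}          # one individual

grs = sum(allele_counts[snp] * w for snp, w in effect_sizes.items())
print(round(grs, 2))
```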