144 results for Cardiac Events
Abstract:
Drosophila antonietae is a cactophilic species that is found in the mesophilic forest of the Paraná-Paraguay river basin and in the dunes of the South Atlantic coast of Brazil. Although the genetic structure of the Paraná-Paraguay river basin populations has already been established, the relationship between these populations and those on the Atlantic coast is controversial. In this study, we compared 33 repetitive units of pBuM-2 satellite DNA isolated from individuals from 8 populations of D. antonietae in these geographic regions, including some populations found within a contact zone with the closely related D. serido. The pBuM-2 sequences showed low interpopulational variability. This result was interpreted as a consequence of both gene flow among the populations and unequal crossing over promoting homogenization of the tandem arrays. The results presented here, together with those of previous studies, highlight the usefulness of pBuM-2 for resolving taxonomic conflicts within the D. buzzatii species cluster.
Abstract:
Aim The aim of this study was to assess the causal mechanisms underlying population subdivision in Drosophila gouveai, a cactophilic species associated with xeric vegetation enclaves in eastern Brazil. A secondary aim was to investigate the genetic effects of Pleistocene climatic fluctuations on these environments. Location Dry vegetation enclaves within the limits of the Cerrado domain in eastern Brazil. Methods We determined the mitochondrial DNA haplotypes of 55 individuals (representing 12 populations) based on sequence data of a 483-bp fragment from the cytochrome c oxidase subunit II (COII) gene. Phylogenetic and coalescent analyses were used to test for the occurrence of demographic events and to infer the time of divergence amongst genetically independent groups. Results Our analyses revealed the existence of two divergent subclades (G1 and G2) plus an introgressed clade restricted to the southernmost range of D. gouveai. Subclades G1 and G2 displayed genetic footprints of range expansion and segregated geographical distributions in south-eastern and some central highland regions, east and west of the Paraná River valley. Molecular dating indicated that the main demographic and diversification events occurred in the late to middle Pleistocene. Main conclusions The phylogeographical and genetic patterns observed for D. gouveai in this study are consistent with changes in the distribution of dry vegetation in eastern Brazil. All of the estimates obtained by molecular dating indicate that range expansion and isolation pre-dated the Last Glacial Maximum, occurring during the late to middle Pleistocene, and were probably triggered by climatic changes during the Pleistocene. The current patchy geographical distribution and population subdivision in D. gouveai is apparently closely linked to these past events.
Abstract:
MeCP2 plays a critical role in interpreting epigenetic signatures that command chromatin conformation and regulation of gene transcription. In spite of MeCP2's ubiquitous expression, its functions have always been considered in the context of brain physiology. In this study, we demonstrate that alterations of the normal pattern of expression of MeCP2 in cardiac and skeletal tissues are detrimental for normal development. Overexpression of MeCP2 in the mouse heart leads to embryonic lethality with cardiac septum hypertrophy, and dysregulated expression of MeCP2 in skeletal tissue produces severe malformations. We also show that MeCP2's expression in the heart is developmentally regulated, further suggesting that it plays a key role in regulating transcriptional programs in non-neural tissues.
Abstract:
Objectives: We compared 12-month outcomes regarding ischemic events, repeat intervention, and stent thrombosis (ST) between diabetic and nondiabetic patients treated with the Genous™ EPC capturing R stent™ during routine nonurgent percutaneous coronary intervention (PCI), using data from the multicenter, prospective worldwide e-HEALING registry. Background: Diabetic patients have an increased risk for restenosis and ST. Methods: In the 4,996-patient e-HEALING registry, 273 were insulin-requiring diabetics (IRD), 963 were non-IRD (NIRD), and 3,703 were nondiabetics. The 12-month primary outcome was target vessel failure (TVF), defined as target vessel-related cardiac death or myocardial infarction (MI) and target vessel revascularization. Secondary outcomes were the composite of cardiac death, MI, or target lesion revascularization (TLR), and individual outcomes including ST. Cumulative event rates were estimated with the Kaplan-Meier method and compared with a log-rank test. Results: TVF rates were 13.4% in IRD, 9.0% in NIRD, and 7.9% in nondiabetics (P < 0.01), mainly driven by a higher mortality hazard in IRD (P < 0.001) and NIRD (P = 0.07) compared with nondiabetics. TLR rates were comparable in NIRD and nondiabetics, but significantly higher in IRD (P = 0.04). No difference was observed in ST. Conclusion: The 1-year results of the Genous stent in a real-world population show higher TVF rates in diabetics than in nondiabetics, mainly driven by a higher mortality hazard. IRD was associated with a significantly higher TLR hazard. Definite or probable ST in diabetic patients was comparable with that in nondiabetics. (J Interven Cardiol 2011;24:285-294)
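The Kaplan-Meier estimates and log-rank comparison used for the registry end points above can be illustrated with a minimal, self-contained sketch. This is not registry code: it assumes the Python lifelines library, and every follow-up time and event flag below is an invented placeholder.

```python
# Minimal sketch of the Kaplan-Meier / log-rank approach described above.
# Requires: pip install lifelines. All data are invented placeholders, not
# values from the e-HEALING registry.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Follow-up in days and event flags (1 = target vessel failure) for two
# hypothetical groups (e.g., diabetics vs. nondiabetics).
t_a = [30, 120, 200, 365, 365]
e_a = [1, 1, 1, 0, 0]
t_b = [90, 365, 365, 365, 365]
e_b = [1, 0, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(t_a, event_observed=e_a, label="group A")
print(kmf.survival_function_)  # cumulative event-free survival estimate

# Log-rank test comparing the two survival curves.
result = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print(result.p_value)
```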
Abstract:
Background: We tested the hypothesis that the universal application of myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification in renal transplant candidates (RTCs) compared with SPECT restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n = 363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with an increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modestly increased risk of events (log-rank P = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [95% CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank P = 0.008; HR = 3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank P = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank P = 0.02; HR = 3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting death from any cause in RTCs and for selecting patients for invasive coronary testing.
Abstract:
Background: The presence of coronary artery calcium (CAC) is an independent marker of increased risk of cardiovascular disease (CVD) events and mortality. However, the predictive value of thoracic aorta calcification (TAC), which can be identified without additional scanning during assessment of CAC, is unknown. Methods: We followed a cohort of 8401 asymptomatic individuals (mean age 53 ± 10 years, 69% men) undergoing cardiac risk factor evaluation and TAC and CAC testing with electron beam computed tomography. Multivariable Cox proportional hazards models were developed to predict all-cause mortality based on the presence of TAC. Results: During a median follow-up period of 5 years, 124 (1.5%) deaths were observed. Overall survival was 96.9% and 98.9% for those with and without detectable TAC, respectively (p < 0.0001). Compared with those with no TAC, the hazard ratio for mortality in the presence of TAC was 3.25 (95% CI: 2.28-4.65, p < 0.0001) in unadjusted analysis. After adjusting for age, gender, hypertension, dyslipidemia, diabetes mellitus, smoking, family history of premature coronary artery disease, and presence of CAC, the relationship remained robust (HR 1.61, 95% CI: 1.10-2.27, p = 0.015). Likelihood ratio χ² statistics demonstrated that the addition of TAC contributed significantly to the prediction of mortality beyond traditional risk factors alone (χ² = 13.62, p = 0.002) as well as beyond the risk factors + CAC model (χ² = 5.84, p = 0.02). Conclusion: The presence of TAC was associated with all-cause mortality in our study; this relationship was independent of conventional CVD risk factors as well as the presence of CAC.
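The likelihood ratio χ² comparison above (risk factors alone vs. risk factors + TAC) corresponds to a test between nested Cox models. A minimal sketch of that idea, assuming the lifelines and scipy libraries; the data frame, its column names, and all values are invented toys, not study data:

```python
# Nested-model likelihood-ratio test: does adding TAC improve a Cox model
# that already contains a conventional risk factor? Toy data only.
# Requires: pip install lifelines pandas scipy
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

df = pd.DataFrame({
    "time":  [5.1, 4.2, 3.3, 5.0, 2.1, 4.8, 6.0, 3.9],  # years of follow-up
    "death": [0, 1, 0, 0, 1, 1, 0, 0],
    "age":   [50, 63, 47, 55, 70, 58, 60, 66],
    "tac":   [0, 1, 0, 1, 1, 0, 0, 1],  # thoracic aorta calcification present
})

base = CoxPHFitter().fit(df.drop(columns="tac"), duration_col="time", event_col="death")
full = CoxPHFitter().fit(df, duration_col="time", event_col="death")

# LR statistic: twice the log-likelihood gain, 1 df for the added term.
lr = 2 * (full.log_likelihood_ - base.log_likelihood_)
print("chi2 =", lr, "p =", chi2.sf(lr, df=1))
```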
Abstract:
Objectives: The aim of this study was to determine the correlation between ductus venosus (DV) Doppler velocimetry and fetal cardiac troponin T (cTnT). Study design: Between March 2007 and March 2008, 89 high-risk pregnancies were prospectively studied. All patients were delivered by cesarean section, and the Doppler exams were performed on the same day. Multiple regression included the following variables: maternal age, parity, hypertension, diabetes, gestational age at delivery, umbilical artery (UA) S/D ratio, diagnosis of absent or reversed end-diastolic flow velocity (AREDV) in the UA, middle cerebral artery (MCA) pulsatility index (PI), and DV pulsatility index for veins (PIV). Immediately after delivery, UA blood samples were obtained for the measurement of pH and cTnT levels. Statistical analysis included the Kruskal-Wallis test and multiple regression. Results: A cTnT concentration at birth >0.05 ng/ml was found in nine (81.8%) of the AREDV cases, a proportion significantly higher than that observed with a normal UA S/D ratio or a UA S/D ratio >p95 with positive diastolic blood flow (7.7% and 23.1%, respectively; p < 0.001). A positive correlation was found between abnormal DV-PIV and elevated cTnT levels in the UA. Multiple regression identified DV-PIV and a diagnosis of AREDV as independent factors associated with abnormal fetal cTnT levels (p < 0.0001, F(2,86) = 63.5, R = 0.7722). Conclusion: DV-PIV was significantly correlated with fetal cTnT concentrations at delivery. AREDV and abnormal DV flow represent severe cardiac compromise, with increased systemic venous pressure and a rise in right ventricular afterload, as demonstrated by myocardial damage and elevated fetal cTnT.
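The F(2,86) statistic reported above is what an ordinary least-squares multiple regression with two predictors yields on 89 subjects (89 - 2 - 1 = 86 residual degrees of freedom). A sketch of that setup, assuming the statsmodels and numpy libraries; the simulated values and coefficients are invented for illustration:

```python
# Sketch of a two-predictor multiple regression (DV-PIV and an AREDV
# indicator against fetal cTnT). Simulated data, not the study's.
# Requires: pip install statsmodels numpy
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 89                            # study size from the abstract
dv_piv = rng.normal(0.7, 0.3, n)  # DV pulsatility index for veins
aredv = rng.integers(0, 2, n)     # 1 = absent/reversed end-diastolic flow
ctnt = 0.02 + 0.05 * dv_piv + 0.04 * aredv + rng.normal(0, 0.02, n)

X = sm.add_constant(np.column_stack([dv_piv, aredv]))
fit = sm.OLS(ctnt, X).fit()
# With n = 89 and 2 predictors, the F test has (2, 86) degrees of freedom,
# matching the F(2,86) reported in the abstract.
print(fit.fvalue, fit.df_model, fit.df_resid, np.sqrt(fit.rsquared))
```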
Abstract:
OBJECTIVE. The purposes of this study were to use the myocardial delayed enhancement technique of cardiac MRI to investigate the frequency of unrecognized myocardial infarction (MI) in patients with end-stage renal disease, to compare the findings with those of ECG and SPECT, and to examine factors that may influence the utility of these methods in the detection of MI. SUBJECTS AND METHODS. We prospectively performed cardiac MRI, ECG, and SPECT to detect unrecognized MI in 72 patients with end-stage renal disease at high risk of coronary artery disease but without a clinical history of MI. RESULTS. Fifty-six patients (78%) were men (mean age, 56.2 ± 9.4 years) and 16 (22%) were women (mean age, 55.8 ± 11.4 years). The mean left ventricular mass index was 103.4 ± 27.3 g/m², and the mean ejection fraction was 60.6% ± 15.5%. Myocardial delayed enhancement imaging depicted unrecognized MI in 18 patients (25%). ECG findings were abnormal in five patients (7%), and SPECT findings were abnormal in 19 patients (26%). ECG findings were false-negative in 14 cases and false-positive in one case. The accuracy, sensitivity, and specificity of ECG were 79.2%, 22.2%, and 98.1% (p = 0.002). SPECT findings were false-negative in six cases and false-positive in seven cases. The accuracy, sensitivity, and specificity of SPECT were 81.9%, 66.7%, and 87.0% (not significant). During a follow-up period of 4.9-77.9 months, 19 cardiac deaths were documented, but no statistically significant difference was found in survival analysis. CONCLUSION. Cardiac MRI with myocardial delayed enhancement can depict unrecognized MI in patients with end-stage renal disease. ECG and SPECT had low sensitivity in the detection of MI. Infarct size and left ventricular mass can influence the utility of these methods in the detection of MI.
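The ECG and SPECT performance figures quoted above can be re-derived from the false-negative and false-positive counts in the abstract, with delayed enhancement MRI (18 of 72 patients positive) as the reference standard. A small Python check:

```python
# Reproduce the diagnostic metrics from the counts in the abstract:
# 72 patients, 18 with MI on delayed enhancement MRI (reference standard).
def diagnostic_metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "accuracy":    (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# ECG: 14 false negatives, 1 false positive (18 MI / 54 non-MI).
print(diagnostic_metrics(tp=18 - 14, fp=1, fn=14, tn=54 - 1))
# -> accuracy 0.792, sensitivity 0.222, specificity 0.981

# SPECT: 6 false negatives, 7 false positives.
print(diagnostic_metrics(tp=18 - 6, fp=7, fn=6, tn=54 - 7))
# -> accuracy 0.819, sensitivity 0.667, specificity 0.870
```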
Abstract:
99mTc-MIBI gated myocardial scintigraphy (GMS) evaluates myocyte integrity and perfusion, left ventricular (LV) dyssynchrony, and function. Cardiac resynchronization therapy (CRT) may improve the clinical symptoms of heart failure (HF), but its benefits for LV function are less pronounced. We assessed whether changes in myocardial 99mTc-MIBI uptake after CRT are related to improvement in clinical symptoms, LV synchrony and performance, and whether GMS adds information for patient selection for CRT. A group of 30 patients with severe HF were prospectively studied before and 3 months after CRT. Variables analysed were HF functional class, QRS duration, LV ejection fraction (LVEF) by echocardiography, myocardial 99mTc-MIBI uptake, LV end-diastolic volume (EDV) and end-systolic volume (ESV), phase-analysis LV dyssynchrony indices, and regional motion by GMS. After CRT, patients were divided into two groups according to improvement in LVEF: group 1 (12 patients) with an increase in LVEF of 5 or more points, and group 2 (18 patients) without a significant increase. After CRT, both groups showed a significant improvement in HF functional class, reduced QRS width, and increased septal wall 99mTc-MIBI uptake. Only group 1 showed favourable changes in EDV, ESV, LV dyssynchrony indices, and regional motion. Before CRT, EDV and ESV were lower in group 1 than in group 2. Anterior and inferior wall 99mTc-MIBI uptakes were higher in group 1 than in group 2 (p < 0.05). EDV was the only independent predictor of an increase in LVEF (p = 0.01). The optimal EDV cut-off point was 315 ml (sensitivity 89%, specificity 94%). The evaluation of EDV by GMS added information for patient selection for CRT. After CRT, LVEF increase occurred in hearts that were less dilated and had more normal 99mTc-MIBI uptake.
Abstract:
Although a new protocol of dobutamine stress echocardiography with early injection of atropine (EA-DSE) has been shown to reduce adverse effects, increase the number of effective tests, and detect coronary artery disease (CAD) with accuracy similar to that of conventional protocols, no data exist regarding its ability to predict long-term events. The aim of this study was to determine the prognostic value of EA-DSE and the effects of the long-term use of beta blockers on it. A retrospective evaluation of 844 patients who underwent EA-DSE for known or suspected CAD was performed; 309 (37%) were receiving beta blockers. During a median follow-up period of 24 months, 102 events (12%) occurred. On univariate analysis, predictors of events were the ejection fraction (p <0.001), male gender (p <0.001), previous myocardial infarction (p <0.001), angiotensin-converting enzyme inhibitor therapy (p = 0.021), calcium channel blocker therapy (p = 0.034), and abnormal results on EA-DSE (p <0.001). On multivariate analysis, the independent predictors of events were male gender (relative risk [RR] 1.78, 95% confidence interval [CI] 1.13 to 2.81, p = 0.013) and abnormal results on EA-DSE (RR 4.45, 95% CI 2.84 to 7.01, p <0.0001). Normal results on EA-DSE with beta blockers were associated with a nonsignificantly higher incidence of events than normal results on EA-DSE without beta blockers (RR 1.29, 95% CI 0.58 to 2.87, p = 0.54). Abnormal results on EA-DSE with beta blockers had an RR of 4.97 (95% CI 2.79 to 8.87, p <0.001) compared with normal results, while abnormal results on EA-DSE without beta blockers had an RR of 5.96 (95% CI 3.41 to 10.44, p <0.001) for events, with no difference between groups (p = 0.36). In conclusion, the detection of fixed or inducible wall motion abnormalities during EA-DSE was an independent predictor of long-term events in patients with known or suspected CAD. The prognostic value of EA-DSE was not affected by the long-term use of beta blockers. (Am J Cardiol 2008;102:1291-1295)
Abstract:
Background: Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases the statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results: We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite of death, stroke, or myocardial infarction. The secondary end points were the occurrence of major adverse cardiac and cerebrovascular events, death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneity in treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than after CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P < 0.001). Major adverse cardiac and cerebrovascular events were significantly more frequent in the PCI group than in the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P < 0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions: In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG. However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring, without routine virological monitoring, for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value, in the presence of immunological and clinical data, for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for ≥6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for the identification of children receiving HAART who are at risk for significant HIV-related illness. To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
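Treating repeated laboratory values as time-varying predictors, as in the Cox modeling above, requires a long-format data set with one row per follow-up interval. A minimal sketch, assuming the lifelines library; the subjects, intervals, and column names below are invented:

```python
# Time-varying Cox model: repeated viral load measurements enter as a
# covariate that can change between follow-up intervals. Toy data only.
# Requires: pip install lifelines pandas
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":         [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "start":      [0, 6, 0, 6, 0, 6, 0, 6, 0, 6],    # months on HAART
    "stop":       [6, 12, 6, 12, 6, 9, 6, 12, 6, 12],
    "vl_gt_5000": [0, 1, 0, 0, 1, 1, 0, 0, 1, 1],    # viral load > 5000 copies/mL
    "event":      [0, 1, 0, 1, 0, 1, 0, 0, 0, 0],    # new WHO stage 3/4 event
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for the time-varying viral load flag
```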
Abstract:
Recently, stress myocardial computed tomographic perfusion (CTP) was shown to detect myocardial ischemia. Our main objective was to evaluate the feasibility of dipyridamole stress CTP and compare it to single-photon emission computed tomography (SPECT) for the detection of significant coronary stenosis, using invasive conventional coronary angiography (CCA; stenosis >70%) as the reference method. Thirty-six patients (62 ± 8 years old, 20 men) with previous positive results on SPECT (<2 months) as the primary inclusion criterion and suspected coronary artery disease underwent a customized multidetector-row CT protocol with myocardial perfusion evaluation at rest and during stress and coronary CT angiography (CTA). Multidetector-row computed tomography was performed in a 64-slice scanner with dipyridamole stress perfusion acquisition before a second perfusion/CT angiographic acquisition at rest. Independent blinded observers performed the analysis of images from CTP, CTA, and CCA. All 36 patients completed the CT protocol with no adverse events (mean radiation dose 14.7 ± 3.0 mSv) and with interpretable scans. CTP results were positive in 27 of 36 patients (75%). Of the 9 (25%) disagreements, 6 patients had normal coronary arteries and 2 had no significant stenosis (8 false-positive results with SPECT, 22%); the remaining patient had an occluded artery with collateral flow confirmed by conventional coronary angiogram. Good agreement was demonstrated between CTP and SPECT on a per-patient analysis (kappa 0.53). In the 26 patients with CCA as reference, sensitivity, specificity, and positive and negative predictive values were 88.0%, 79.3%, 66.7%, and 93.3% for CTP and 68.8%, 76.1%, 66.7%, and 77.8% for SPECT, respectively (p = NS). In conclusion, dipyridamole CT myocardial perfusion at rest and during stress is feasible, and its results are similar to those of single-photon emission CT scintigraphy. The anatomical-perfusion information provided by this combined CT protocol may allow identification of false-positive SPECT results. (Am J Cardiol 2010;106:310-315)
Abstract:
BACKGROUND: The arterial pulse pressure variation induced by mechanical ventilation (Delta PP) has been shown to be a predictor of fluid responsiveness. Until now, Delta PP has had to be calculated offline (from a computer recording or a paper printout of the arterial pressure curve) or derived from specific cardiac output monitors, limiting the widespread use of this parameter. Recently, a method has been developed for the automatic calculation and real-time monitoring of Delta PP using standard bedside monitors. Whether this method is a reliable predictor of fluid responsiveness remains to be determined. METHODS: We conducted a prospective clinical study in 59 mechanically ventilated patients in the postoperative period of cardiac surgery. The patients studied were considered at low risk for complications related to fluid administration (pulmonary artery occlusion pressure <20 mm Hg, left ventricular ejection fraction ≥40%). All patients were instrumented with an arterial line and a pulmonary artery catheter. Cardiac filling pressures and cardiac output were measured before and after intravascular fluid administration (20 mL/kg of lactated Ringer's solution over 20 min), whereas Delta PP was automatically calculated and continuously monitored. RESULTS: Fluid administration increased cardiac output by at least 15% in 39 patients (66%; responders). Before fluid administration, responders and nonresponders were comparable with regard to right atrial and pulmonary artery occlusion pressures. In contrast, Delta PP was significantly greater in responders than in nonresponders (17% ± 3% vs 9% ± 2%, P < 0.001). A Delta PP cut-off value of 12% allowed identification of responders with a sensitivity of 97% and a specificity of 95%. CONCLUSION: Automatic real-time monitoring of Delta PP is possible using a standard bedside monitor and was found to be a reliable method to predict fluid responsiveness after cardiac surgery. Additional studies are needed to determine whether this technique can be used to avoid the complications of fluid administration in high-risk patients.
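The abstract does not spell out how Delta PP is computed; the standard definition in the fluid-responsiveness literature is Delta PP (%) = 100 * (PPmax - PPmin) / [(PPmax + PPmin) / 2], evaluated over one mechanical breath. A sketch under that assumption, with invented beat-to-beat values:

```python
# Delta PP from beat-to-beat pulse pressures over one respiratory cycle,
# using the standard definition (an assumption; the study's automated
# method is not described in the abstract). Values are invented.
def delta_pp(pulse_pressures):
    """pulse_pressures: systolic-minus-diastolic values (mmHg) for each
    beat within a single mechanical breath."""
    pp_max, pp_min = max(pulse_pressures), min(pulse_pressures)
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

beats = [44.0, 48.0, 45.0, 40.0, 38.0]       # mmHg, one breath
print(f"Delta PP = {delta_pp(beats):.1f}%")  # ~23%, above the 12% cut-off
```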
Abstract:
Context: Perioperative red blood cell transfusion is commonly used to address anemia, an independent risk factor for morbidity and mortality after cardiac operations; however, evidence regarding optimal blood transfusion practice in patients undergoing cardiac surgery is lacking. Objective: To define whether a restrictive perioperative red blood cell transfusion strategy is as safe as a liberal strategy in patients undergoing elective cardiac surgery. Design, Setting, and Patients: The Transfusion Requirements After Cardiac Surgery (TRACS) study, a prospective, randomized, controlled clinical noninferiority trial conducted between February 2009 and February 2010 in an intensive care unit at a university hospital cardiac surgery referral center in Brazil. Consecutive adult patients (n = 502) who underwent cardiac surgery with cardiopulmonary bypass were eligible; analysis was by intention-to-treat. Intervention: Patients were randomly assigned to a liberal strategy of blood transfusion (to maintain a hematocrit ≥30%) or to a restrictive strategy (hematocrit ≥24%). Main Outcome Measure: Composite end point of 30-day all-cause mortality and severe morbidity (cardiogenic shock, acute respiratory distress syndrome, or acute renal injury requiring dialysis or hemofiltration) occurring during the hospital stay. The noninferiority margin was predefined at -8% (ie, an 8% minimal clinically important increase in occurrence of the composite end point). Results: Hemoglobin concentrations were maintained at a mean of 10.5 g/dL (95% confidence interval [CI], 10.4-10.6) in the liberal-strategy group and 9.1 g/dL (95% CI, 9.0-9.2) in the restrictive-strategy group (P < .001). A total of 198 of 253 patients (78%) in the liberal-strategy group and 118 of 249 (47%) in the restrictive-strategy group received a blood transfusion (P < .001). Occurrence of the primary end point was similar between groups (10% liberal vs 11% restrictive; between-group difference, 1% [95% CI, -6% to 4%]; P = .85). Independent of transfusion strategy, the number of transfused red blood cell units was an independent risk factor for clinical complications or death at 30 days (hazard ratio for each additional unit transfused, 1.2 [95% CI, 1.1-1.4]; P = .002). Conclusion: Among patients undergoing cardiac surgery, the use of a restrictive perioperative transfusion strategy compared with a more liberal strategy resulted in noninferior rates of the combined outcome of 30-day all-cause mortality and severe morbidity. Trial Registration: clinicaltrials.gov identifier NCT01021631. (JAMA. 2010;304(14):1559-1567)
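The trial's noninferiority logic (declare the restrictive strategy noninferior if the confidence interval for the between-group difference stays above the -8% margin) can be sketched with a normal-approximation interval. The event counts below are rough back-calculations from the reported percentages, not trial data:

```python
# Noninferiority check for a difference in event proportions against a
# predefined margin. Counts are approximations from the abstract.
from math import sqrt

def noninferiority(events_new, n_new, events_ref, n_ref, margin=-0.08, z=1.96):
    p_new, p_ref = events_new / n_new, events_ref / n_ref
    diff = p_ref - p_new   # positive favours the new (restrictive) strategy
    se = sqrt(p_new * (1 - p_new) / n_new + p_ref * (1 - p_ref) / n_ref)
    lo, hi = diff - z * se, diff + z * se
    return diff, (lo, hi), lo > margin  # True -> noninferior

diff, ci, ok = noninferiority(events_new=27, n_new=249,  # restrictive, ~11%
                              events_ref=25, n_ref=253)  # liberal, ~10%
print(f"difference {diff:+.1%}, 95% CI ({ci[0]:+.1%}, {ci[1]:+.1%}), "
      f"noninferior: {ok}")
```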