192 results for Framingham heart study
Abstract:
OBJECTIVES: The treatment of recurrent rejection in heart transplant recipients has been a controversial issue for many years. The intent of this retrospective study was to perform a risk-benefit analysis between treatment strategies with bolus steroids only versus anti-thymocyte globulins (RATG; 1.5 mg/kg q 4 days). METHODS: Between 1986 and 1993, 69 of 425 patients (17 male, 52 female; mean age 44 +/- 11 years) who had more than one rejection per patient per month (rej/pt per mo) in the first 3 postoperative months were defined as recurrent rejectors. RESULTS: Repetitive methylprednisolone bolus therapy (70 mg/kg q 3 days) was given in 27 patients (group M; 1.4 +/- 0.2 rej/pt per mo), and RATG therapy was given for one of the rejection episodes in the remaining 42 patients (group A; 1.5 +/- 0.2 rej/pt per mo). The quality of triple-drug immunosuppression in the two study groups was comparable. The rejection-free interval (RFI) was 21.6 +/- 10 days following RATG treatment in group A and 22 +/- 11 days in group M. In group M, 3 of 27 patients (11%) had a rejection treatment-related infection (2 bacterial, 1 viral) versus 6 of the 42 patients of group A (14.2%; 1 bacterial, 5 viral). During postoperative months 3-24, 0.15 +/- 0.12 rej/pt per mo were observed in group M and 0.21 +/- 0.13 rej/pt per mo in group A (n.s.). In this 21-month period, cytolytic therapy for rejection was initiated in 8 of the remaining 21 patients of group M (38%) and 15 of the remaining 37 patients of group A (40.5%). Absolute survival and the individual causes of death were not affected by the type of initial treatment of recurrent rejection. Actuarial freedom from graft atherosclerosis was comparable in the two groups, with 78% of group A versus 79% of group M free of graft atherosclerosis at 3 years postoperatively. CONCLUSIONS: A comparison of cytolytic therapy versus repeated applications of bolus steroids for treatment of recurrent rejection reveals no significant difference in long-term patient outcome with respect to the incidence of future rejection episodes and survival.
Abstract:
The endomyocardial biopsy (EMB) in heart transplant recipients has been considered the "gold standard" for diagnosis of graft rejection (REJ). The purpose of this retrospective study was to develop long-term strategies (frequency and postoperative duration of EMB) for REJ monitoring. Between 1985 and 1992, 346 patients (mean age 44.5 years; 14% female) received 382 heart grafts. For graft surveillance, EMBs were performed according to a fixed schedule depending on the postoperative day and the results of previous biopsies. In the first year the average number of EMBs per patient was 20, with 19% positive for REJ in the first quarter, dropping to 7% REJ/EMB by the end of the first year. The percentage of REJ/EMB declined annually from 4.7% to 4.5%, 2.2%, and less than 1% after the fifth year. Individual biopsy results in the first 3 postoperative months had little predictive value. Patients with fewer than two REJ in the first 6 postoperative months (group 1) were significantly less likely than patients with two or more REJ (group 2) to reject in the second half of the first year (group 1: 0.29 +/- 0.6 REJ/patient; group 2: 0.83 +/- 1.3 REJ/patient; P < 0.001) and in the third postoperative year (group 1: 0.12 +/- 0.33 REJ/patient; group 2: 0.46 +/- 0.93 REJ/patient; P < 0.05). In conclusion, routine EMBs in the first 3 postoperative months have only limited predictive value; later, however, the number of routine EMBs can be drastically reduced depending on the intermediate postoperative REJ pattern.
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in the long-term follow-up of heart transplant recipients. In this respect, the quality of renal function is undoubtedly linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 micrograms/L in many centers without direct evidence for the necessity of such high levels while using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection as well as overall mortality between groups of patients with high (250 to 350 micrograms/L) and low (150 to 250 micrograms/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; aged 44 +/- 12 years; mean follow-up, 1,122 +/- 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 micrograms/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 µmol/L, n = 234; group II, ≥ 130 µmol/L, n = 98). The overall 5-year survival excluding the early 30-day mortality was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of the mortality due to chronic rejection. The rate of rejection for the entire follow-up period was similar in both groups (first year: group I, 3.2 +/- 2.6 rejections/patient/year; group II, 3.6 +/- 2.7 rejections/patient/year; p = not significant). (ABSTRACT TRUNCATED AT 250 WORDS)
Abstract:
Prolongation of the safe ischemic period of the heart is an effective way to help overcome donor organ shortage, as demonstrated in renal and hepatic transplantation. We present the results of a prospective, randomized study comparing preservation with University of Wisconsin solution (UWS) versus St. Thomas' Hospital solution (STS) in clinical heart transplantation. A total of 39 patients were enrolled in the study (n = 20 for UWS and n = 19 for STS). Hemodynamic, electron microscopic, and biochemical evaluation did not reveal any significant differences in postoperative myocardial performance. Only the number of intraoperative defibrillations (0.82 for UWS versus 1.7 for STS) and the rhythm stability after reperfusion (13/20 UWS hearts versus 6/19 STS hearts in sinus rhythm) were significantly different. Heart preservation with UWS and STS appears to be of comparable efficacy at mean ischemic times of less than 4 hours.
Abstract:
This study evaluates the clinical applicability of administering sodium nitroprusside with a closed-loop titration system compared with a manually adjusted system. The mean arterial pressure (MAP) was recorded every 10 sec in 20 patients (group 1: computer regulation) and every 30 sec in ten patients (group 2: manual regulation) during the first 150 min after open heart surgery. The results (16,343 and 2,912 data points in groups 1 and 2, respectively) were then analyzed in four time frames and five pressure ranges to indicate clinical efficacy. Sixty percent of the measured MAP values in both groups were within +/- 10% of the desired set-point during the first 10 min. Thereafter until the end of observation, the MAP was maintained within +/- 10% of the desired set-point 90% of the time in group 1 vs. 60% of the time in group 2. One percent and 11% of data points were more than +/- 20% from the set-point in groups 1 and 2, respectively (p < .05, chi-square test). The computer-assisted therapy provided better control of MAP, was safe to use, and helped to reduce nursing demands.
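The abstract does not describe the control law used by the closed-loop system, only its clinical performance. The sketch below is a hypothetical illustration of how such a titration loop might update the sodium nitroprusside infusion rate from repeated MAP readings; the proportional-integral rule, the gains, the dose limits, and the function name titrate_snp are assumptions for illustration, not details from the study.

```python
# Hypothetical sketch only: the study does not disclose its controller, so a simple
# proportional-integral update with dose clamping is assumed for illustration.

def titrate_snp(rate, map_measured, map_setpoint, integral_error,
                kp=0.05, ki=0.002, dt=10.0, min_rate=0.0, max_rate=8.0):
    """Return an updated sodium nitroprusside infusion rate (arbitrary dose units).

    rate           -- current infusion rate
    map_measured   -- latest mean arterial pressure reading (mmHg)
    map_setpoint   -- target MAP (mmHg)
    integral_error -- accumulated error carried between calls
    dt             -- sampling interval in seconds (MAP was sampled every 10 s here)
    """
    error = map_measured - map_setpoint              # positive error -> pressure too high
    integral_error += error * dt
    new_rate = rate + kp * error + ki * integral_error
    new_rate = max(min_rate, min(max_rate, new_rate))  # clamp to a safe dosing window
    return new_rate, integral_error


# Example: re-evaluating the dose at successive 10-s MAP samples (illustrative readings).
rate, ierr = 1.0, 0.0
for map_reading in [95.0, 92.0, 88.0, 86.0]:         # target MAP assumed to be 85 mmHg
    rate, ierr = titrate_snp(rate, map_reading, map_setpoint=85.0, integral_error=ierr)
```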
Abstract:
The chemokine eotaxin/CCL11 is expressed in different organs, including the heart, but its precise cellular origin in the heart is unknown. Eotaxin is associated with Th2-like responses and exerts its chemotactic effect through chemokine receptor-3 (CCR3), which is also expressed on mast cells. The aim of our study was to identify the cellular origin of eotaxin in the heart and to assess whether its expression changes during ongoing acute heart transplant rejection, which would indicate a correlation with the mast cell infiltration we observed in a previous study. In a model of ongoing acute heart transplant rejection in the rat, we found eotaxin mRNA expression within infiltrating macrophages, but not in mast cells, by in situ hybridization. A five-fold increase in eotaxin protein in rat heart transplants during ongoing acute rejection was measured on day 28 after transplantation, compared with native and isogeneic control hearts. Eotaxin concentrations in donor hearts on day 28 after transplantation were significantly higher than in recipient hearts, corroborating an origin of eotaxin from cells within the heart, and not from the blood. The quantitative comparison of eotaxin mRNA expression between native hearts, isografts, and allografts revealed no statistically significant difference after transplantation, probably due to an overall increase in the 18S rRNA housekeeping gene during rejection. Quantitative RT-PCR showed an increase in mRNA expression of CCR3, the receptor for eotaxin, during ongoing acute rejection of rat heart allografts. Although a correlation between increasing eotaxin expression by macrophages and mast cell infiltration is suggestive, functional studies will be needed to elucidate the role of eotaxin in the process of ongoing acute heart transplant rejection.
Abstract:
Heart failure is a serious condition, equivalent to malignant disease in terms of symptom burden and mortality. At present, only a comparatively small number of heart failure patients receive specialist palliative care. Heart failure patients may have generic palliative care needs, such as refractory multifaceted symptoms, communication and decision-making issues, and the requirement for family support. The Advanced Heart Failure Study Group of the Heart Failure Association of the European Society of Cardiology organized a workshop on palliative care in heart failure to increase awareness of the need for it. Additional objectives included improving the accessibility and quality of palliative care for heart failure patients and promoting the development of heart failure-orientated palliative care services across Europe. This document represents a synthesis of the presentations and discussion during the workshop and describes recommendations in the areas of delivery of quality care to patients and families, education, treatment coordination, research, and policy.
Abstract:
BACKGROUND: Exercise capacity after heart transplantation (HTx) remains limited despite normal left ventricular systolic function of the allograft. Various clinical and haemodynamic parameters are predictive of exercise capacity following HTx. However, the predictive significance of chronotropic competence has not been demonstrated unequivocally despite its immediate relevance for cardiac output. AIMS: This study assesses the predictive value of various clinical and haemodynamic parameters for exercise capacity in HTx recipients with complete chronotropic competence evolving within the first 6 postoperative months. METHODS: 51 patients were enrolled in this exercise study. Patients were included when more than 6 months after HTx and without negative chronotropic medication or factors limiting exercise capacity such as significant transplant vasculopathy or allograft rejection. Clinical parameters were obtained by chart review, haemodynamic parameters from current cardiac catheterisation, and exercise capacity was assessed by treadmill stress testing. A stepwise multiple regression model analysed the proportion of the variance explained by the predictive parameters. RESULTS: The mean age of the 51 HTx recipients at inclusion was 55.4 +/- 13.2 years, 42 patients were male, and the mean time interval after cardiac transplantation was 5.1 +/- 2.8 years. Five independent predictors explained 47.5% of the variance observed for peak exercise capacity (adjusted R2 = 0.475). In detail, heart rate response explained 31.6%, male gender 5.2%, age 4.1%, pulmonary vascular resistance 3.7%, and body mass index 2.9%. CONCLUSION: Heart rate response is one of the most important predictors of exercise capacity in HTx recipients with complete chronotropic competence and without relevant transplant vasculopathy or acute allograft rejection.
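The reported variance decomposition (heart rate response 31.6%, male gender 5.2%, age 4.1%, pulmonary vascular resistance 3.7%, body mass index 2.9%, together about 47.5%) comes from a stepwise multiple regression. A minimal sketch of how such incremental contributions to explained variance can be computed by adding one predictor at a time is given below; the synthetic data, the fixed predictor ordering, and the use of plain (unadjusted) R2 instead of the adjusted R2 quoted in the abstract are assumptions for illustration only.

```python
import numpy as np

def incremental_r2(X, y, order):
    """Return the gain in (unadjusted) R^2 as each predictor, by column index, is added in turn."""
    n = len(y)
    gains, prev_r2, cols = [], 0.0, []
    for j in order:
        cols.append(j)
        A = np.column_stack([np.ones(n), X[:, cols]])   # intercept plus selected predictors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)     # ordinary least squares fit
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        gains.append(r2 - prev_r2)
        prev_r2 = r2
    return gains

# Toy usage with synthetic data; a real analysis would use the 51 patients' measurements
# (heart rate response, gender, age, pulmonary vascular resistance, body mass index).
rng = np.random.default_rng(0)
X = rng.normal(size=(51, 5))
y = 0.6 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(size=51)
print(incremental_r2(X, y, order=[0, 1, 2, 3, 4]))
```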
Abstract:
BACKGROUND: Myocardial contrast echocardiography (MCE) is able to measure in vivo the relative blood volume (rBV, i.e., capillary density) and its exchange frequency b, the constituents of myocardial blood flow (MBF, ml min-1 g-1). This study aimed to assess, by MCE, whether left ventricular hypertrophy (LVH) in hypertrophic cardiomyopathy (HCM) can be differentiated from LVH in triathletes (athlete's heart, AH) or from LVH in hypertensive heart disease (HHD). METHODS: Sixty individuals, matched for age (33 +/- 10 years) and gender and subdivided into four groups (n = 15 each), were examined: HCM, AH, HHD, and a group of sedentary individuals without LVH (S). rBV (ml ml-1), b (min-1), and MBF, at rest and during adenosine-induced hyperaemia, were derived by MCE in the mid septal, lateral, and inferior regions. The ratio of MBF during hyperaemia to MBF at rest yielded the myocardial blood flow reserve (MBFR). RESULTS: Septal wall rBV at rest was lower in HCM (0.084 +/- 0.023 ml ml-1) than in AH (0.151 +/- 0.024 ml ml-1, p < 0.01) and in S (0.129 +/- 0.026 ml ml-1, p < 0.01), but was similar to HHD (0.097 +/- 0.016 ml ml-1). Conversely, MBFR was lowest in HCM (1.67 +/- 0.93), followed by HHD (2.8 +/- 0.93, p < 0.01), S (3.36 +/- 1.03, p < 0.001), and AH (4.74 +/- 1.46, p < 0.0001). At rest, rBV < 0.11 ml ml-1 accurately distinguished between HCM and AH (sensitivity 99%, specificity 99%); similarly, MBFR ≤ 1.8 helped to distinguish between HCM and HHD (sensitivity 100%, specificity 77%). CONCLUSIONS: rBV at rest most accurately distinguishes between pathological LVH due to HCM and physiological, endurance-exercise-induced LVH.
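The abstract describes rBV and its exchange frequency b as the constituents of MBF. A minimal sketch of how these MCE-derived quantities are commonly combined into MBF (ml min-1 g-1) and into the flow reserve MBFR is shown below; the tissue-density constant and the example numbers are assumptions for illustration and are not values from the study.

```python
TISSUE_DENSITY_G_PER_ML = 1.05   # assumed typical myocardial tissue density

def myocardial_blood_flow(rbv_ml_per_ml, b_per_min):
    """MBF in ml min^-1 g^-1 from MCE-derived relative blood volume rBV and exchange frequency b."""
    return rbv_ml_per_ml * b_per_min / TISSUE_DENSITY_G_PER_ML

def flow_reserve(mbf_hyperaemia, mbf_rest):
    """Myocardial blood flow reserve (MBFR): hyperaemic MBF divided by resting MBF."""
    return mbf_hyperaemia / mbf_rest

# Illustrative numbers only (not taken from the paper).
mbf_rest = myocardial_blood_flow(0.10, 10.0)        # 0.10 ml/ml refilled 10 times per minute
mbf_hyper = myocardial_blood_flow(0.12, 30.0)       # during adenosine-induced hyperaemia
print(round(flow_reserve(mbf_hyper, mbf_rest), 2))  # ~3.6
```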
Abstract:
CONTEXT: It is uncertain whether intensified heart failure therapy guided by N-terminal brain natriuretic peptide (BNP) is superior to symptom-guided therapy. OBJECTIVE: To compare 18-month outcomes of N-terminal BNP-guided vs symptom-guided heart failure therapy. DESIGN, SETTING, AND PATIENTS: Randomized controlled multicenter Trial of Intensified vs Standard Medical Therapy in Elderly Patients With Congestive Heart Failure (TIME-CHF) of 499 patients aged 60 years or older with systolic heart failure (ejection fraction ≤ 45%), New York Heart Association (NYHA) class of II or greater, prior hospitalization for heart failure within 1 year, and N-terminal BNP level of 2 or more times the upper limit of normal. The study had an 18-month follow-up and was conducted at 15 outpatient centers in Switzerland and Germany between January 2003 and June 2008. INTERVENTION: Uptitration of guideline-based treatments to reduce symptoms to NYHA class of II or less (symptom-guided therapy), or to reduce both the BNP level to 2 times or less the upper limit of normal and symptoms to NYHA class of II or less (BNP-guided therapy). MAIN OUTCOME MEASURES: Primary outcomes were 18-month survival free of all-cause hospitalizations and quality of life as assessed by structured validated questionnaires. RESULTS: Heart failure therapy guided by N-terminal BNP and symptom-guided therapy resulted in similar rates of survival free of all-cause hospitalizations (41% vs 40%, respectively; hazard ratio [HR], 0.91 [95% CI, 0.72-1.14]; P = .39). Patients' quality-of-life metrics improved over 18 months of follow-up, but these improvements were similar in both the N-terminal BNP-guided and symptom-guided strategies. Compared with the symptom-guided group, survival free of hospitalization for heart failure, a secondary end point, was higher among those in the N-terminal BNP-guided group (72% vs 62%, respectively; HR, 0.68 [95% CI, 0.50-0.92]; P = .01). Heart failure therapy guided by N-terminal BNP improved outcomes in patients aged 60 to 75 years but not in those aged 75 years or older (P < .02 for interaction). CONCLUSION: Heart failure therapy guided by N-terminal BNP did not improve overall clinical outcomes or quality of life compared with symptom-guided treatment. TRIAL REGISTRATION: isrctn.org Identifier: ISRCTN43596477.
Abstract:
BACKGROUND: Due to better early and long-term outcomes, the increasing population of grown-ups with congenital heart disease (GUCH) raises unexpected quality of life (QoL) issues. The cardiac lesion by itself is not always the major problem for these patients, since issues pertaining to QoL and psychosocial aspects often predominate. This study analyses the QoL of GUCH patients after cardiac surgery and the possible impact of medical and psychosocial complications. PATIENTS AND METHODS: A questionnaire package containing the SF-36 health survey (health-related QoL), the HADS test (anxiety/depression aspects), and an additional disease-specific questionnaire was sent to 345 patients (mean age 26 +/- 11 years) operated for isolated transposition of the great arteries (TGA), tetralogy of Fallot (TOF), or ventricular septal defect (VSD). The scores were compared with age- and gender-matched standard population data and in relation to the underlying congenital heart disease (CHD). RESULTS: In all SF-36 and HADS health dimensions the GUCH patients showed excellent scores (116 +/- 20), comparable to the standard population (100 +/- 15), regardless of the initial CHD (p=0.12). Eighty-two percent of the patients were found to be in NYHA class I, and 83% of patients declared that they do not consider their QoL to be limited by their malformation. Complications such as reoperations (p=0.21) and arrhythmias (p=0.10) did not show a significant impact on QoL. The additional questionnaire revealed that 76% of adult patients have a full-time job, 18% receive a full or partial disability pension, 21% reported problems with insurance, most of them regarding health insurance (67%), and 4.4% of adult patients declared that they had renounced the idea of having children because of their cardiac malformation. CONCLUSION: QoL in GUCH patients following surgical repair of isolated TOF, TGA, and VSD is excellent and comparable to that of the standard population, with no significant difference between the diagnosis groups. However, these patients are exposed to a high rate of complications and special psychosocial problems, which are not assessed by standardized questionnaires such as the SF-36 and HADS. These findings highlight the importance of multidisciplinary, specialized follow-up for adequate management of these complex patients.
Abstract:
BACKGROUND: Surfactant protein type B (SPB) is needed for alveolar gas exchange. SPB is increased in the plasma of patients with heart failure (HF), with higher concentrations in more severe HF. The aim of this study was to evaluate the relationship between plasma SPB and both alveolar-capillary diffusion at rest and ventilation versus carbon dioxide production during exercise. METHODS AND RESULTS: Eighty patients with chronic HF and 20 healthy controls were evaluated consecutively, but the required procedural quality was reached by only 71 patients with HF and 19 healthy controls. Each subject underwent pulmonary function measurements, including lung diffusion for carbon monoxide and membrane diffusion capacity, and a maximal cardiopulmonary exercise test. Plasma SPB was measured by immunoblotting. In patients with HF, SPB values were higher (4.5 [11.1] versus 1.6 [2.9]; median and 25th to 75th interquartile range; P=0.0006), whereas lung diffusion for carbon monoxide (19.7+/-4.5 versus 24.6+/-6.8 mL/mm Hg per min, P<0.0001, mean+/-SD) and membrane diffusion capacity (28.9+/-7.4 versus 38.7+/-14.8, P<0.0001) were lower. Peak oxygen consumption and ventilation/carbon dioxide production slope were 16.2+/-4.3 versus 26.8+/-6.2 mL/kg per min (P<0.0001) and 29.7+/-5.9 versus 24.5+/-3.2 (P<0.0001) in HF and controls, respectively. In the HF population, univariate analysis showed a significant relationship between plasma SPB and lung diffusion for carbon monoxide, membrane diffusion capacity, peak oxygen consumption, and ventilation/carbon dioxide production slope (P<0.0001 for all). On multivariable logistic regression analysis, membrane diffusion capacity (beta, -0.54; SE, 0.018; P<0.0001), peak oxygen consumption (beta, -0.53; SE, 0.036; P=0.004), and ventilation/carbon dioxide production slope (beta, 0.25; SE, 0.026; P=0.034) were independently associated with SPB. CONCLUSIONS: Circulating plasma SPB levels are related to alveolar gas diffusion, overall exercise performance, and efficiency of ventilation, showing a link between alveolar-capillary barrier damage, gas exchange abnormalities, and exercise performance in HF.
Abstract:
OBJECTIVES: Respiratory syncytial virus (RSV) infections are a leading cause of hospital admissions in small children. A substantial proportion of these patients require medical and nursing care that can only be provided in intermediate care (IMC) or intensive care units (ICU). This article reports on all children aged < 3 years who required admission to an IMC and/or ICU between October 1, 2001 and September 30, 2005 in Switzerland. PATIENTS AND METHODS: We prospectively collected data on all children aged < 3 years who were admitted to an IMC or ICU for an RSV-related illness. Using a detailed questionnaire, we collected information on risk factors, therapy requirements, length of stay in the IMC/ICU and hospital, and outcome. RESULTS: Of the 577 cases reported during the study period, 90 were excluded because the patients did not fulfill the inclusion criteria; data were incomplete in another 25 cases (5%). Therefore, a total of 462 verified cases were eligible for analysis. At the time of hospital admission, only 31 patients (11%) were older than 12 months. Since RSV infection was not the main reason for IMC/ICU admission in 52% of these patients, we chose to exclude this subgroup from further analyses. Among the 431 infants aged < 12 months, the majority (77%) were former near-term or full-term (NT/FT) infants with a gestational age ≥ 35 weeks and without additional risk factors, who were hospitalized at a median age of 1.5 months. Gestational age (GA) < 32 weeks, moderate to severe bronchopulmonary dysplasia (BPD), and congenital heart disease (CHD) were all associated with a significantly increased risk of IMC/ICU admission (relative risk 14, 56, and 10 for GA ≤ 32 weeks, BPD, and CHD, respectively). Compared with NT/FT infants, high-risk infants were hospitalized at an older age (except for infants with CHD), required more invasive and longer respiratory support, and had longer stays in the IMC/ICU and hospital. CONCLUSIONS: In Switzerland, RSV infections lead to the IMC/ICU admission of approximately 1%-2% of each annual birth cohort. Although prematurity, BPD, and CHD are significant risk factors, non-pharmacological preventive strategies should not be restricted to these high-risk patients but should also target young NT/FT infants, since they constitute 77% of infants requiring IMC/ICU admission.
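For readers unfamiliar with the statistic, the relative risks quoted above (14, 56, and 10) are ratios of the admission risk in an exposed group to the risk in the reference group. The sketch below illustrates the arithmetic with invented counts; it does not use the study's data.

```python
# Relative risk = (cases_exposed / n_exposed) / (cases_reference / n_reference).
# The counts below are made up purely to illustrate the arithmetic.

def relative_risk(cases_exposed, n_exposed, cases_reference, n_reference):
    """Ratio of admission risk in an exposed group to the risk in the reference group."""
    risk_exposed = cases_exposed / n_exposed
    risk_reference = cases_reference / n_reference
    return risk_exposed / risk_reference

# e.g. 7 admissions among 50 hypothetical infants with BPD vs 25 among 10,000 term infants
print(round(relative_risk(7, 50, 25, 10_000), 1))   # -> 56.0
```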
Abstract:
The purpose of this study was to assess bone mineral density (BMD) and parameters of bone metabolism in patients with end-stage heart disease awaiting heart transplantation, in order to determine whether these patients are at increased risk of bone disease.
Abstract:
Aim: Increased rates of hospitalization due to cardiovascular events have been reported during phases of World Soccer Championships (WSC). The purpose of this pilot study was to explore acute psychological and physiological effects of watching a live broadcast soccer game during the WSC 2006. Methods: Seven male supporters (mean age 24, SD 2.7 years) of the Swiss National Soccer Team watched one of their team's games in a controlled laboratory setting. Heart rate (HR), heart rate variability (HRV), salivary cortisol, alpha-amylase (sAA), and testosterone concentrations, as well as several mood ratings, were captured repeatedly before, during, and after the game. Results: Subjects reported feeling stressed, and HR and sAA activity showed an increase during the game. In contrast, HRV, cortisol, and testosterone were unaffected. Conclusion: Watching a sports competition seems to specifically affect the sympathetic nervous system, an effect that can be measured by sensitive electrocardiographic and salivary markers.