886 results for TRANSPLANT RECIPIENTS
Abstract:
To evaluate the metabolic consequences of pancreas transplantation with systemic venous drainage on beta-cell function, we examined insulin and C-peptide responses to glucose and arginine in type I (insulin-dependent) diabetic pancreas recipients (n = 30), nondiabetic kidney recipients (n = 8), and nondiabetic control subjects (n = 28). Basal insulin levels were 66 ± 5 pM in control subjects, 204 ± 18 pM in pancreas recipients (P < 0.0001 vs. control), and 77 ± 17 pM in kidney recipients. Acute insulin responses to glucose were 416 ± 44 pM in control subjects, 763 ± 91 pM in pancreas recipients (P < 0.01 vs. control), and 589 ± 113 pM in kidney recipients (NS vs. control). Basal and stimulated insulin levels in two pancreas recipients with portal venous drainage were normal. Integrated acute C-peptide responses were not statistically different (25.3 ± 4.3 nM/min in pancreas recipients, 34.2 ± 5.5 nM/min in kidney recipients, and 23.7 ± 2.1 nM/min in control subjects). Similar insulin and C-peptide results were obtained with arginine stimulation, and both basal and glucose-stimulated insulin-C-peptide ratios in pancreas recipients were significantly greater than in control subjects. We conclude that recipients of pancreas allografts with systemic venous drainage have elevated basal and stimulated insulin levels, and that these elevations are primarily due to altered first-pass hepatic insulin clearance, although insulin resistance secondary to immunosuppressive therapy (including prednisone) probably plays a contributing role. To avoid hyperinsulinemia and its possible long-term adverse consequences, transplantation of pancreas allografts into sites with portal rather than systemic venous drainage should be considered.
Abstract:
To ascertain the consequences of pancreas transplantation with systemic venous drainage on glucose homeostasis and insulin secretion, glucose and insulin responses to intravenous glucose were compared in 10 recipients and 15 normal control subjects. There were no differences in fasting glucose levels or intravenous glucose disappearance rates. However, basal insulin levels and acute insulin responses to glucose were threefold greater in the recipients. It is not clear whether the hyperinsulinemia in the recipients is due to the abnormal circulatory drainage, the lack of autonomic input, or concurrent immunosuppressive drug therapy.
Abstract:
BACKGROUND: Many studies confirm that noncompliance or poor compliance is one of the great problems in health care, as it results in a waste of resources and funds. METHODS: This overview includes literature on heart, liver, and kidney transplants, with emphasis on heart transplantation in adult and pediatric transplant patients, and addresses the following variables as potential predictors of postoperative compliance problems: demographic variables (age, marital status, gender); psychological variables (anxiety, denial); psychiatric disorders (major depression, anxiety, and personality disorders); poor social support; pretransplant noncompliance; obesity; substance abuse; and health-related variables (distance from transplant center, indication for transplantation, required pretransplant assist device). Relevant studies on these topics conducted up to 1999 are included and discussed in this overview. The most important results are presented in tables. RESULTS: Unfortunately, there has so far been no systematic and comprehensive review of the literature on predictors of noncompliance in organ transplant patients. In organ transplantation, noncompliance impairs both quality of life and life span, as it is a major risk factor for graft rejection episodes and is responsible for up to 25% of deaths after the initial recovery period. It might therefore be assumed that well-informed transplant patients are a highly motivated group whose compliance is correspondingly high. This is not the case: noncompliance occurs even when graft loss means loss of life, as in heart or liver transplantation. To best select potential organ recipients, it would be ideal if patients who are very likely to show noncompliant behavior could be identified before transplantation. CONCLUSION: The literature overview shows the necessity of preoperative psychosocial screening for predictors of posttransplant noncompliance.
Abstract:
PURPOSE: To report 2 cases of exogenous Candida glabrata endophthalmitis after penetrating keratoplasty in recipients of corneas from the same donor transplanted on the same day. METHODS: Case reports with ophthalmologic, electron microscopic, and microbiological findings, including fungal strain analysis. RESULTS: Two patients developed fungal keratitis and endophthalmitis caused by the same C. glabrata strain within 1 day after penetrating keratoplasty with corneas from the same donor on the same day. Donor-to-host transmission was postulated, as eye bank sterility checks were repeatedly negative. CONCLUSIONS: A short death-to-harvesting time, routine donor rim cultures, and observing a time interval before transplantation may provide additional safety in dealing with corneal tissue from high-risk donors.
Abstract:
Ischemia/reperfusion injury leads to activation of graft endothelial cells (EC), boosting antigraft immunity and impeding tolerance induction. We hypothesized that the complement inhibitor and EC-protectant dextran sulfate (DXS, MW 5000) facilitates long-term graft survival induced by non-depleting anti-CD4 mAb (RIB 5/2). Hearts from DA donor rats were heterotopically transplanted into Lewis recipients treated with RIB 5/2 (20 mg/kg, days -1, 0, 1, 2, 3; i.p.) with or without DXS (grafts perfused with 25 mg; recipients treated i.v. with 25 mg/kg on days 1 and 3 and with 12.5 mg/kg on days 5, 7, 9, 11, 13, and 15). Cold graft ischemia time was 20 min or 12 h. Median survival time (MST) was comparable between RIB 5/2- and RIB 5/2+DXS-treated recipients in the 20-min group, with >175-day graft survival. In the 12-h group, RIB 5/2 alone led to chronic rejection (MST = 49.5 days) with an elevated alloantibody response, whereas RIB 5/2+DXS induced long-term survival (MST >100 days, p < 0.05) with upregulation of genes related to transplantation tolerance. Analysis of the 12-h group treated with RIB 5/2+DXS at 1 day posttransplantation revealed reduced EC activation, complement deposition, and inflammatory cell infiltration. In summary, DXS attenuates ischemia/reperfusion-induced acute graft injury and facilitates long-term survival in this clinically relevant transplant model.
Abstract:
1. Habitat fragmentation and variation in habitat quality can both affect plant performance, but their effects have rarely been studied in combination. We thus examined plant performance in response to differences in habitat quality for a species subject to habitat fragmentation, the common but declining perennial herb Lychnis flos-cuculi. 2. We reciprocally transplanted plants between 15 fen grasslands in north-east Switzerland and recorded plant performance for 4 years. 3. Variation between the 15 target sites was the most important factor and affected all measures of plant performance in all years. This demonstrates the importance of plastic responses to habitat quality for plant performance. 4. Plants from smaller populations produced fewer rosettes than plants from larger populations in the first year of the replant-transplant experiment. 5. Plant performance decreased with increasing ecological difference between grassland of origin and target grassland, indicating adaptation to ecological conditions. In contrast, plant performance was not influenced by microsatellite distance and hardly at all by geographic distance between grassland of origin and target grassland. 6. Plants originating from larger populations were better able to cope with larger ecological differences between transplantation site and site of origin. 7. Synthesis: In addition to the direct effects of target grasslands, both habitat fragmentation, through reduced population size, and adaptation to habitats of different quality contributed to the performance of L. flos-cuculi. This underlines that habitat fragmentation also affects species that are still common. Moreover, it suggests that restoration projects involving L. flos-cuculi should use plant material from large populations living in habitats similar to the restoration site. Finally, our results bring into question whether plants in small habitat remnants will be able to cope with future environmental change.
Abstract:
BACKGROUND: Exercise capacity after heart transplantation (HTx) remains limited despite normal left ventricular systolic function of the allograft. Various clinical and haemodynamic parameters are predictive of exercise capacity following HTx. However, the predictive significance of chronotropic competence has not been demonstrated unequivocally, despite its immediate relevance for cardiac output. AIMS: This study assesses the predictive value of various clinical and haemodynamic parameters for exercise capacity in HTx recipients with complete chronotropic competence evolving within the first 6 postoperative months. METHODS: 51 patients were enrolled in this exercise study. Patients were included when they were at least 6 months after HTx and without negative chronotropic medication or factors limiting exercise capacity, such as significant transplant vasculopathy or allograft rejection. Clinical parameters were obtained by chart review, haemodynamic parameters from current cardiac catheterisation, and exercise capacity was assessed by treadmill stress testing. A stepwise multiple regression model analysed the proportion of the variance explained by the predictive parameters. RESULTS: The mean age of these 51 HTx recipients was 55.4 ± 13.2 years at inclusion, 42 patients were male, and the mean time interval after cardiac transplantation was 5.1 ± 2.8 years. Five independent predictors explained 47.5% of the variance observed for peak exercise capacity (adjusted R² = 0.475). In detail, heart rate response explained 31.6%, male gender 5.2%, age 4.1%, pulmonary vascular resistance 3.7%, and body-mass index 2.9%. CONCLUSION: Heart rate response is one of the most important predictors of exercise capacity in HTx recipients with complete chronotropic competence and without relevant transplant vasculopathy or acute allograft rejection.
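The stepwise variance decomposition reported above can be verified with one line of arithmetic; a minimal sketch (the percentages are those given in the abstract, the dictionary keys are merely labels):

```python
# Per-predictor contributions to explained variance in peak exercise
# capacity, as percentages reported in the abstract.
contributions = {
    "heart rate response": 31.6,
    "male gender": 5.2,
    "age": 4.1,
    "pulmonary vascular resistance": 3.7,
    "body-mass index": 2.9,
}

# The individual contributions sum to the model's total explained
# variance, i.e. the reported adjusted R² of 0.475 (47.5%).
total = sum(contributions.values())
print(f"total explained variance: {total:.1f}%")
```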
Abstract:
BACKGROUND: Peak oxygen uptake (peak VO2) is an established integrative measurement of maximal exercise capacity in cardiovascular disease. After heart transplantation (HTx), peak VO2 remains reduced despite normal systolic left ventricular function, which highlights the relevance of diastolic function. In this study we aim to characterize the predictive significance of cardiac allograft diastolic function for peak VO2. METHODS: Peak VO2 was measured using a ramp protocol on a bicycle ergometer. Left ventricular (LV) diastolic function was assessed with tissue Doppler imaging, sizing the velocity of the early (Ea) and late (Aa) apical movement of the mitral annulus, and with conventional Doppler, measuring early (E) and late (A) diastolic transmitral flow propagation. Correlation coefficients were calculated and linear regression models fitted. RESULTS: The post-transplant time interval of the 39 HTx recipients ranged from 0.4 to 20.1 years. The mean age of the recipients was 55 ± 14 years and body mass index (BMI) was 25.4 ± 3.9 kg/m². Mean LV ejection fraction was 62 ± 4%, mean LV mass index 108 ± 22 g/m², and mean peak VO2 20.1 ± 6.3 ml/kg/min. Peak VO2 was reduced in patients with more severe diastolic dysfunction (pseudonormal or restrictive transmitral inflow pattern), or when E/Ea was ≥10. Peak VO2 correlated with recipient age (r = -0.643, p < 0.001), peak heart rate (r = 0.616, p < 0.001), and BMI (r = -0.417, p = 0.008). Of all echocardiographic measurements, Ea (r = 0.561, p < 0.001) and Ea/Aa (r = 0.495, p = 0.002) correlated best. Multivariate analysis identified age, heart rate, BMI, and Ea/Aa as independent predictors of peak VO2. CONCLUSIONS: Diastolic dysfunction is relevant for the limitation of maximal exercise capacity after HTx.
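The univariate associations above are plain Pearson correlation coefficients. As an illustration of the computation only (the sample data below are made up for the sketch and are not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up illustrative data: peak VO2 declining with recipient age,
# matching the negative direction of the reported r = -0.643.
age = [35, 42, 50, 58, 66, 71]
peak_vo2 = [28.0, 25.5, 22.1, 19.8, 17.0, 15.2]
print(round(pearson_r(age, peak_vo2), 3))
```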
Abstract:
BACKGROUND: Renal resistance index, a predictor of kidney allograft function and patient survival, seems to depend on renal and peripheral vascular compliance and resistance. Asymmetric dimethylarginine (ADMA) is an endogenous inhibitor of nitric oxide synthase and therefore influences vascular resistance. STUDY DESIGN: We investigated the relationship between renal resistance index, ADMA, risk factors for cardiovascular disease, and kidney function in a cross-sectional study. SETTING & PARTICIPANTS: 200 stable renal allograft recipients (133 men and 67 women with a mean age of 52.8 years). PREDICTORS: Serum ADMA concentration, pulse pressure, estimated glomerular filtration rate, and recipient age. OUTCOME: Renal resistance index. MEASUREMENTS: Renal resistance index measured by color-coded duplex ultrasound, serum ADMA concentration measured by liquid chromatography-tandem mass spectrometry, estimated glomerular filtration rate (Nankivell equation), arterial stiffness measured by digital volume pulse, Framingham and other cardiovascular risk factors, and evaluation of concomitant antihypertensive and immunosuppressive medication. RESULTS: Mean serum ADMA concentration was 0.72 ± 0.21 (±SD) µmol/L and mean renal resistance index was 0.71 ± 0.07. Multiple stepwise regression analysis showed that recipient age (P < 0.001), pulse pressure (P < 0.001), diabetes (P < 0.01), and ADMA concentration (P < 0.01) were independently associated with resistance index. ADMA concentrations were correlated with estimated glomerular filtration rate (P < 0.01). LIMITATIONS: The cross-sectional nature of this study precludes cause-effect conclusions. CONCLUSIONS: In addition to established cardiovascular risk factors, ADMA appears to be a relevant determinant of renal resistance index and allograft function and deserves consideration in prospective outcome trials in renal transplantation.
Abstract:
OBJECTIVE: Nursing in 'live islands' and routine high-dose intravenous immunoglobulins after allogeneic hematopoietic stem cell transplantation were abandoned by many teams in view of limited evidence and high costs. METHODS: This retrospective single-center study examines the impact of the change from nursing in 'live islands' to care in single rooms (SR), and from high-dose to targeted intravenous immunoglobulins (IVIG), on the mortality and infection rate of adult patients receiving an allogeneic stem cell or bone marrow transplant, in two steps and three time cohorts (1993-1997, 1997-2000, 2000-2003). RESULTS: Two hundred forty-eight allogeneic hematopoietic stem cell transplantations were performed in 227 patients. Patient characteristics were comparable in the three cohorts for gender, median age, underlying disease, disease stage, prophylaxis for graft-versus-host disease (GvHD), and cytomegalovirus constellation. The incidence of infections (78.4%) and infection rates remained stable (rates per 1000 days of neutropenia: 17.61 for sepsis, 6.76 for pneumonia). Cumulative incidence of GvHD and transplant-related mortality did not change over time. CONCLUSIONS: The change from nursing in 'live islands' to SR and the reduction from high-dose to targeted IVIG did not result in increased infection rates or mortality despite an increase in patient age. These results support the current practice.
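The infection rates above are incidence rates normalized to 1000 days of neutropenia. A minimal sketch of the conversion (the event and day counts below are hypothetical, chosen only to illustrate the arithmetic, not the study's actual counts):

```python
def rate_per_1000_days(events: int, days_at_risk: int) -> float:
    """Incidence rate expressed per 1000 days at risk."""
    return events / days_at_risk * 1000

# Hypothetical counts: 44 sepsis episodes over 2499 neutropenia days
# yield a rate in the same range as the reported 17.61 per 1000 days.
print(round(rate_per_1000_days(44, 2499), 2))
```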
Abstract:
To compare the effects of deflazacort (DEFLA) vs. prednisone (PRED) on bone mineral density (BMD), body composition, and lipids, 24 patients with end-stage renal disease were randomized in a double-blind design and followed for 78 weeks after kidney transplantation. BMD and body composition were assessed using dual-energy x-ray absorptiometry. Seventeen patients completed the study. Glucocorticosteroid doses, cyclosporine levels, rejection episodes, and drop-out rates were similar in both groups. Lumbar BMD decreased more in PRED than in DEFLA (P < 0.05), the difference being particularly marked after 24 weeks (9.1 ± 1.8% vs. 3.0 ± 2.4%, respectively). Hip BMD decreased from baseline in both groups (P < 0.01), without intergroup differences. Whole-body BMD decreased from baseline in PRED (P < 0.001), but not in DEFLA. Lean body mass decreased by approximately 2.5 kg in both groups after 6-12 weeks (P < 0.001), then remained stable. Fat mass increased more (P < 0.01) in PRED than in DEFLA (7.1 ± 1.8 vs. 3.5 ± 1.4 kg). Larger increases in total cholesterol (P < 0.03), low-density lipoprotein cholesterol (P < 0.01), lipoprotein B2 (P < 0.03), and triglycerides (P = 0.054) were observed in PRED than in DEFLA. In conclusion, using DEFLA instead of PRED in kidney transplant patients is associated with decreased loss of total-skeleton and lumbar spine BMD, but does not alter bone loss at the upper femur. DEFLA also helps to prevent fat accumulation and worsening of the lipid profile.
Abstract:
Chronic heart transplant rejection, i.e., cardiac allograft vasculopathy (CAV), is a major adverse prognostic factor after heart transplantation (HTx). This study tested the hypothesis that the relative myocardial blood volume (rBV), as quantified by myocardial contrast echocardiography, accurately detects severe CAV as defined by coronary intravascular ultrasound (IVUS).
Abstract:
Accounting scandals and mismanagement have, in recent years, given rise to calls for better control mechanisms in corporate management. Audit committees are an important tool for ensuring such control and have since become an integral part of good "Corporate Governance" worldwide. Audit committees have established themselves in different cultural and legal environments. As this article shows, the worldwide rise in the importance of "Corporate Governance" has made the audit committee a showcase example of a "legal transplant".