982 results for Renal Transplant Recipients
Abstract:
Non-nephrotoxic immunosuppressive strategies that allow reduction of calcineurin-inhibitor exposure without compromising safety or efficacy remain a goal in kidney transplantation. Immunosuppression based on the mammalian-target-of-rapamycin inhibitor everolimus was assessed as a strategy for elimination of calcineurin-inhibitor exposure and optimisation of renal-graft function while maintaining efficacy.
Abstract:
A large prospective, open-label, randomized trial evaluated conversion from calcineurin inhibitor (CNI)- to sirolimus (SRL)-based immunosuppression for preservation of renal function in liver transplantation patients. Eligible patients received liver allografts 6-144 months previously and maintenance immunosuppression with CNI (cyclosporine or tacrolimus) since early posttransplantation. In total, 607 patients were randomized (2:1) to abrupt conversion (<24 h) from CNI to SRL (n = 393) or CNI continuation for up to 6 years (n = 214). Between-group changes in baseline-adjusted mean Cockcroft-Gault GFR at month 12 (primary efficacy end point) were not significant. The primary safety end point, noninferiority of the cumulative rate of graft loss or death at 12 months, was not met (6.6% vs. 5.6% in the SRL and CNI groups, respectively). Rates of death at 12 months were not significantly different, and no true graft losses (e.g. liver retransplantation) were observed during the 12-month period. At 52 weeks, SRL conversion was associated with higher rates of biopsy-confirmed acute rejection (p = 0.02) and discontinuations (p < 0.001), primarily for adverse events. Adverse events were consistent with known safety profiles. In conclusion, liver transplantation patients showed no demonstrable benefit 1 year after conversion from CNI- to SRL-based immunosuppression.
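The renal-function end point in the trial above, Cockcroft-Gault GFR, is a standard creatinine-clearance estimate. As an illustration only (the formula is well established but not reproduced in the abstract, and the function name here is our own), a minimal sketch:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Cockcroft-Gault estimated creatinine clearance in mL/min.

    Standard formula: ((140 - age) * weight) / (72 * SCr),
    multiplied by 0.85 for women.
    """
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl
```

For example, a 60-year-old, 72 kg male with a serum creatinine of 1.0 mg/dL has an estimated clearance of 80 mL/min.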
Abstract:
Chronic renal allograft rejection is characterized by alterations in the extracellular matrix compartment and in the proliferation of various cell types. These features are controlled, in part, by the metzincin superfamily of metallo-endopeptidases, including matrix metalloproteinases (MMPs), a disintegrin and metalloproteinase (ADAM) and meprin. Therefore, we investigated the regulation of metzincins in the established Fisher-to-Lewis rat kidney transplant model. Studies were performed using frozen homogenates and paraffin sections of rat kidneys at day 0 (healthy controls) and during periods of chronic rejection at day +60 and day +100 following transplantation. Messenger RNA (mRNA) expression was examined by Affymetrix Rat Expression Array 230A GeneChip and by real-time TaqMan polymerase chain reaction analyses. Protein expression was studied by zymography, Western blot analyses, and immunohistology. mRNA levels of MMPs (MMP-2/-11/-12/-14), of their inhibitors (tissue inhibitors of metalloproteinase (TIMP)-1/-2), ADAM-17 and transforming growth factor (TGF)-beta1 increased significantly during chronic renal allograft rejection. MMP-2 activity and immunohistological staining were augmented accordingly. The largest mRNA increase was observed for MMP-12. As expected, Western blot analyses also demonstrated increased production of MMP-12, MMP-14, and TIMP-2 (the latter two both as individual proteins and as complexes). In contrast, mRNA levels of MMP-9/-24 and meprin alpha/beta decreased. Accordingly, MMP-9 protein levels and meprin alpha/beta synthesis and activity were significantly downregulated. Members of the metzincin families (MMP, ADAM, and meprin) and of the TIMPs are thus differentially regulated in chronic renal allograft rejection. An altered pattern of metzincins may therefore represent novel diagnostic markers and may provide targets for future therapeutic interventions.
Abstract:
BACKGROUND: Outcome after lung transplantation (LTx) is affected by the onset of bronchiolitis obliterans syndrome (BOS) and lung function decline. Reduced health-related quality of life (HRQL) and physical mobility have been shown in patients developing BOS, but the impact on the capacity to walk is unknown. We aimed to compare long-term HRQL and 6-minute walk test (6MWT) results between lung recipients affected or not by BOS grade ≥2. METHODS: Fifty-eight patients were prospectively followed for 5.6 ± 2.9 years after LTx. Assessments included the St George's Respiratory Questionnaire (SGRQ) and the 6MWT, which were performed yearly. Moreover, clinical complications were recorded to estimate the proportion of the follow-up time lived without clinical intercurrences after transplant. Analyses were performed using adjusted linear regression and repeated-measures analysis of variance. RESULTS: BOS was a significant predictor of lower SGRQ scores (p < 0.01) and reduced time free of clinical complications (p = 0.001), but not of 6MWT distance (p = 0.12). At 7 years post-transplant, results were: 69.0 ± 21.8% vs 86.9 ± 5.6%, p < 0.05 (SGRQ); 58.5 ± 21.6% vs 88.7 ± 11.4%, p < 0.01 (proportion of time lived without clinical complications); and 82.2 ± 10.9% vs 91.9 ± 14.2%, p = 0.27 (percent of predicted 6MWT), respectively, for patients with and without BOS. CONCLUSIONS: Despite significantly less time lived without clinical complications and a progressive decline in self-reported health status, the walking capacity of patients affected by BOS remained relatively stable over time. These findings may indicate that the development of moderate to severe BOS does not prevent lung recipients from walking independently and pursuing an autonomous life.
Abstract:
BACKGROUND: Reduced bone mineral density (BMD) is common in adults infected with human immunodeficiency virus (HIV). The role of proximal renal tubular dysfunction (PRTD) and alterations in bone metabolism in HIV-related low BMD are incompletely understood. METHODS: We quantified BMD (dual-energy x-ray absorptiometry), blood and urinary markers of bone metabolism and renal function, and risk factors for low BMD (hip or spine T score of -1 or less) in an ambulatory care setting. We determined factors associated with low BMD and calculated 10-year fracture risks using the World Health Organization FRAX equation. RESULTS: We studied 153 adults (98% men; median age, 48 years; median body mass index, 24.5; 67 [44%] were receiving tenofovir, 81 [53%] were receiving a boosted protease inhibitor [PI]). Sixty-five participants (42%) had low BMD, and 11 (7%) had PRTD. PI therapy was associated with low BMD in multivariable analysis (odds ratio, 2.69; 95% confidence interval, 1.09-6.63). Tenofovir use was associated with increased osteoblast and osteoclast activity (P ≤ .002). The mean estimated 10-year risks were 1.2% for hip fracture and 5.4% for any major osteoporotic fracture. CONCLUSIONS: In this mostly male population, low BMD was significantly associated with PI therapy. Tenofovir recipients showed evidence of increased bone turnover. Measurement of BMD and estimation of fracture risk may be warranted in treated HIV-infected adults.
Abstract:
To compare the effects of deflazacort (DEFLA) vs. prednisone (PRED) on bone mineral density (BMD), body composition, and lipids, 24 patients with end-stage renal disease were randomized in a double-blind design and followed for 78 weeks after kidney transplantation. BMD and body composition were assessed using dual-energy x-ray absorptiometry. Seventeen patients completed the study. Glucocorticosteroid doses, cyclosporine levels, rejection episodes, and drop-out rates were similar in both groups. Lumbar BMD decreased more in PRED than in DEFLA (P < 0.05), the difference being particularly marked after 24 weeks (9.1 ± 1.8% vs. 3.0 ± 2.4%, respectively). Hip BMD decreased from baseline in both groups (P < 0.01), without intergroup differences. Whole-body BMD decreased from baseline in PRED (P < 0.001), but not in DEFLA. Lean body mass decreased by approximately 2.5 kg in both groups after 6-12 weeks (P < 0.001), then remained stable. Fat mass increased more (P < 0.01) in PRED than in DEFLA (7.1 ± 1.8 vs. 3.5 ± 1.4 kg). Larger increases in total cholesterol (P < 0.03), low-density lipoprotein cholesterol (P < 0.01), lipoprotein B2 (P < 0.03), and triglycerides (P = 0.054) were observed in PRED than in DEFLA. In conclusion, using DEFLA instead of PRED in kidney transplant patients is associated with reduced loss of total-skeleton and lumbar-spine BMD, but does not alter bone loss at the upper femur. DEFLA also helps to prevent fat accumulation and worsening of the lipid profile.
Abstract:
Purpose: To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes caused by unilateral nephrectomy in the nontransplanted remaining kidney of the donor, and of changes in the transplanted kidney before and after transplantation in donor and recipient, respectively, and whether DW MR parameters are correlated in the same kidney before and after transplantation. Materials and Methods: The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values) in donors before donation and in donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes of diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons). Correlations were tested (linear regression). Results: One week after donation, ADCT values in the nontransplanted kidney of donors increased from a preexplantation value of (188 ± 9 [standard deviation]) to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex (P < .004). Medullary, but not cortical, ADCT values remained increased up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes.
Conclusion: DW MR imaging depicts early adaptations in the remaining nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of its clinical relevance is needed. © RSNA, 2013. Online supplemental material is available for this article.
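The ADCT values in the study above come from fitting the DW signal decay across multiple b values. As a hedged sketch only (the study quantified a perfusion fraction as well, which requires a more elaborate bi-exponential IVIM-type model; the function name and the mono-exponential assumption S(b) = S0·exp(-b·ADC) here are illustrative), a simple log-linear least-squares fit looks like:

```python
import math

def fit_adc(b_values, signals):
    """Fit S(b) = S0 * exp(-b * ADC) by least squares on log(S).

    Returns (ADC, S0); ADC is in the reciprocal units of b
    (e.g. mm^2/sec when b is given in sec/mm^2).
    """
    n = len(b_values)
    ys = [math.log(s) for s in signals]
    mean_b = sum(b_values) / n
    mean_y = sum(ys) / n
    # Ordinary least squares: slope = cov(b, log S) / var(b).
    cov = sum((b - mean_b) * (y - mean_y) for b, y in zip(b_values, ys))
    var = sum((b - mean_b) ** 2 for b in b_values)
    slope = cov / var
    intercept = mean_y - slope * mean_b
    return -slope, math.exp(intercept)
```

On noiseless synthetic data the fit recovers the input parameters exactly; with measured signals, more b values and a noise-robust fit would be used in practice.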
Abstract:
Progressive interstitial fibrosis and tubular atrophy (IF/TA) is a leading cause of chronic allograft dysfunction. Increased extracellular matrix remodeling regulated by matrix metalloproteases (MMPs) and their inhibitors (TIMPs) has been implicated in the development of IF/TA. The aim of this study was to investigate whether urinary/serum MMPs/TIMPs correlate with subclinical IF/TA detected in surveillance biopsies within the first 6 months post-transplant. We measured eight different MMPs/TIMPs simultaneously in urine and serum samples from patients classified as normal histology (n=15), IF/TA 1 (n=15) and IF/TA 2-3 (n=10). There was no difference in urinary MMPs/TIMPs among the three groups, and only 1/8 serum MMPs/TIMPs (i.e. MMP-1) was significantly elevated in biopsies with IF/TA 2-3 (p=0.01). In addition, urinary/serum MMPs/TIMPs did not differ between surveillance biopsies demonstrating early development of IF/TA (i.e. delta IF/TA≥1 compared to a previous biopsy obtained three months before; n=11) and a stable grade of IF/TA (i.e. delta IF/TA=0; n=20). Next, we investigated whether urinary/serum MMP/TIMP levels are elevated during acute subclinical tubulitis in surveillance biopsies obtained within the first 6 months post-transplant (n=25). Compared to biopsies with normal histology, serum MMPs/TIMPs were not different; however, all urinary MMP/TIMP levels were numerically higher during subclinical tubulitis (MMP-1, MMP-7, TIMP-1 with p≤0.04). We conclude that urinary/serum MMPs/TIMPs correlate poorly with existing or early-developing IF/TA in surveillance biopsies obtained within the first 6 months post-transplant. This could be explained by the dynamic process of extracellular matrix remodeling, which appears to be active during acute tubulo-interstitial injury/inflammation but not in quiescent IF/TA.
Abstract:
BACKGROUND: To cover the shortage of cadaveric organs, new approaches to expand the donor pool are needed. Here we report on a case of domino liver transplantation (DLT) using an organ harvested from a compound heterozygous patient with primary hyperoxaluria (PHO), who underwent combined liver and kidney transplantation. The DLT recipient developed early renal failure with oxaluria. The time to progression to oxalosis with renal failure in such situations is unknown, but, based on animal data, we hypothesize that calcineurin inhibitors may play a detrimental role. METHODS: A cadaveric liver and kidney transplantation was performed in a 52-year-old male with PHO. His liver was used for a 64-year-old patient with a non-resectable, but limited, cholangiocarcinoma. RESULTS: While the course of the PHO donor was uneventful, the DLT recipient developed early post-operative, dialysis-dependent renal failure with hyperoxaluria. Histology of a kidney biopsy revealed massive calcium oxalate crystal deposition as the leading aetiological finding. CONCLUSIONS: DLT using PHO organs for marginal recipients represents a possible therapeutic approach with regard to liver graft function. However, it may negatively alter the renal outcome of the recipient in an unpredictable manner, especially with concomitant use of cyclosporin. Therefore, we suggest that, although DLT should be promoted, PHO organs are better excluded from such procedures.
Influence of CYP3A5 genetic variation on everolimus maintenance dosing after cardiac transplantation
Abstract:
BACKGROUND: Everolimus (ERL) has become an alternative to calcineurin inhibitors (CNIs) due to its renal-sparing properties, especially in heart transplant (HTx) recipients with kidney dysfunction. However, ERL dosing is challenging due to its narrow therapeutic window combined with high inter-individual pharmacokinetic variability. Our aim was to evaluate the effect of clinical and genetic factors on ERL dosing in a pilot cohort of 37 HTx recipients. METHODS: Variants in CYP3A5, CYP3A4, CYP2C8, POR, NR1I2, and ABCB1 were genotyped, and clinical data were retrieved from patient charts. RESULTS: While the ERL trough concentration (C0) was within the targeted range for most patients, over 30-fold variability in the dose-adjusted ERL C0 was observed. Regression analysis revealed a significant effect of the non-functional CYP3A5*3 variant on the dose-adjusted ERL C0 (P = 0.031). The ERL dose requirement was 0.02 mg/kg/day higher in patients with the CYP3A5*1/*3 genotype compared to patients with CYP3A5*3/*3 to reach the targeted C0 (P = 0.041). ERL therapy substantially improved the estimated glomerular filtration rate (28.6 ± 6.6 mL/min/1.73 m²) in patients with baseline kidney dysfunction. CONCLUSION: ERL pharmacokinetics in HTx recipients is highly variable. Our preliminary data on patients on a CNI-free therapy regimen suggest that CYP3A5 genetic variation may contribute to this variability.
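The dose-adjusted trough concentration in which the >30-fold variability above was reported is simply the measured trough normalized by the dose. A trivial illustrative helper (the function name and the units are assumptions, not taken from the study):

```python
def dose_adjusted_c0(c0_ng_per_ml, daily_dose_mg):
    """Dose-adjusted trough concentration: C0 divided by daily dose.

    Units assumed here: ng/mL per mg/day. The point of the measure is
    that two patients on the same dose can show very different exposure.
    """
    return c0_ng_per_ml / daily_dose_mg
```

For example, troughs of 6.0 and 3.0 ng/mL on the same 3 mg/day dose give dose-adjusted values of 2.0 and 1.0, a 2-fold difference in exposure per unit dose.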
Abstract:
OBJECTIVES: The aetiology of hyposalivation in haematopoietic stem cell transplantation (HSCT) recipients is not fully understood. This study examined the effects of treatment-related aetiological factors, particularly medications, on stimulated salivary flow in HSCT recipients. SUBJECTS AND METHODS: Adult HSCT recipients (N = 118, 66 males, 27 autologous and 91 allogeneic transplants) were examined. Stimulated whole salivary flow rates (SWSFR) were measured before HSCT and at 6 and 12 months post-HSCT. Linear regression models were used to analyse the associations of medications and transplant-related factors with salivary flow rates, which were compared to the salivary flow rates of generally healthy controls (N = 247). RESULTS: The SWSFR of recipients were lower pre-HSCT (mean ± standard deviation, 0.88 ± 0.56 ml/min; P < 0.001), 6 months post-HSCT (0.84 ± 0.61; P < 0.001) and 12 months post-HSCT (1.08 ± 0.67; P = 0.005) than the SWSFR of controls (1.31 ± 0.65). In addition, hyposalivation (<0.7 ml/min) was more frequent among HSCT recipients pre-HSCT (P < 0.001), 6 months post-HSCT (P < 0.001) and 12 months post-HSCT (P = 0.01) than among controls. The SWSFR improved over time, being significantly higher 12 months post-HSCT than pre-HSCT (P < 0.001). The observed decrease in salivary flow could not be explained by the examined transplant-related factors and medications. CONCLUSIONS: Decreased stimulated salivary flow rates could not be explained by the examined factors alone; these findings indicate that hyposalivation in HSCT recipients has a multifactorial aetiology. CLINICAL RELEVANCE: All HSCT recipients should be considered at high risk of hyposalivation and consequent oral diseases, and they should be treated accordingly.
Abstract:
Purpose: To determine renal oxygenation changes associated with uninephrectomy and transplantation in both native donor kidneys and transplanted kidneys by using blood oxygenation level-dependent (BOLD) MR imaging. Materials and Methods: The study protocol was approved by the local ethics committee. Thirteen healthy kidney donors and their corresponding recipients underwent kidney BOLD MR imaging with a 3-T imager. Written informed consent was obtained from each subject. BOLD MR imaging was performed in donors before uninephrectomy and in donors and recipients 8 days, 3 months, and 12 months after transplantation. R2* values, which are inversely related to tissue partial pressure of oxygen, were determined in the cortex and medulla. Longitudinal R2* changes were statistically analyzed by using repeated-measures one-way analysis of variance with post hoc pair-wise comparisons. Results: R2* values in the remaining kidneys significantly decreased early after uninephrectomy in both the medulla and cortex (P < .003), from 28.9 sec⁻¹ ± 2.3 to 26.4 sec⁻¹ ± 2.5 in the medulla and from 18.3 sec⁻¹ ± 1.5 to 16.3 sec⁻¹ ± 1.0 in the cortex, indicating increased oxygen content. In donors, R2* remained significantly decreased in both the medulla and cortex at 3 (P < .01) and 12 (P < .01) months. In transplanted kidneys, R2* remained stable during the first year after transplantation, with no significant change. Among donors, cortical R2* was found to be negatively correlated with estimated glomerular filtration rate (R = -0.47, P < .001). Conclusion: The results suggest that BOLD MR imaging may potentially be used to monitor renal functional changes in both remaining and corresponding transplanted kidneys. © RSNA, 2016.
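The R2* values reported above are the rate constants of the gradient-echo signal decay, S(TE) = S0·exp(-TE·R2*). In practice R2* is fitted over several echo times, but with two echoes it reduces to a closed form. A minimal sketch under that two-echo assumption (function name illustrative):

```python
import math

def r2_star_two_echo(s1, te1, s2, te2):
    """R2* (in sec^-1) from two echo signals, assuming
    S(TE) = S0 * exp(-TE * R2*), with TE in seconds.
    """
    # Taking the log of the signal ratio cancels S0.
    return math.log(s1 / s2) / (te2 - te1)
```

For instance, with a true R2* of 28.9 sec⁻¹ (the medullary baseline above), noiseless signals simulated at TE = 5 ms and 20 ms recover the input rate exactly.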
Abstract:
Chronic rejection, the most important cause of long-term graft failure, is thought to result from both alloantigen-dependent and -independent factors. To examine these influences, cytokine dynamics were assessed by semiquantitative competitive reverse transcriptase-PCR and by immunohistology in an established rat model of chronic rejection of renal allografts. Isograft controls develop morphologic and immunohistologic changes that are similar to renal allograft changes, although quantitatively less intense and delayed; these are thought to occur secondary to antigen-independent events. Sequential cytokine expression was determined throughout the process. During an early reversible allograft rejection episode, both T-cell-associated [interleukin (IL) 2, IL-2 receptor, IL-4, and interferon gamma] and macrophage (IL-1 alpha, tumor necrosis factor alpha, and IL-6) products were up-regulated despite transient immunosuppression. RANTES (regulated upon activation, normal T-cell expressed and secreted) peaked at 2 weeks; intercellular adhesion molecule 1 (ICAM-1) was maximally expressed at 6 weeks. Macrophage products such as monocyte chemoattractant protein 1 (MCP-1) increased dramatically (up to 10-fold), presaging intense peak macrophage infiltration at 16 weeks. In contrast, in isografts, ICAM-1 peaked at 24 weeks. MCP-1 was maximally expressed at 52 weeks, commensurate with a progressive increase in infiltrating macrophages. Cytokine expression in the spleens of allograft and isograft recipients was insignificant. We conclude that chronic rejection of kidney allografts in rats is predominantly a local, macrophage-dependent event with intense up-regulation of macrophage products such as MCP-1, IL-6, and inducible nitric oxide synthase. The cytokine expression in isografts emphasizes the contribution of antigen-independent events.
The dynamics of RANTES expression between early and late phases of chronic rejection suggest a key role in mediating the events of the chronic process.