93 results for Renal Transplant Recipients
Abstract:
OBJECTIVE The aim of this study was to investigate the performance of the arterial enhancement fraction (AEF) in multiphasic computed tomography (CT) acquisitions for detecting hepatocellular carcinoma (HCC) in liver transplant recipients, in correlation with the pathologic analysis of the corresponding liver explants. MATERIALS AND METHODS Fifty-five transplant recipients were analyzed: 35 patients with 108 histologically proven HCC lesions and 20 patients with end-stage liver disease without HCC. Six radiologists read the triphasic CT acquisitions with the AEF maps in a first readout. For the second readout without the AEF maps, 3 radiologists analyzed triphasic CT acquisitions (group 1), whereas the other 3 readers had 4 contrast acquisitions available (group 2). A jackknife free-response receiver operating characteristic analysis was used to compare the readout performance of the readers. Receiver operating characteristic analysis was used to determine the optimal cutoff value of the AEF. RESULTS The figure of merit (θ = 0.6935) for the conventional triphasic readout was significantly inferior to the triphasic readout with additional use of the AEF (θ = 0.7478, P < 0.0001) in group 1. There was no significant difference between the four-phase conventional readout (θ = 0.7569) and the triphasic readout with the AEF (θ = 0.7615, P = 0.7541) in group 2. Without the AEF, HCC lesions were detected with a sensitivity of 30.7% (95% confidence interval [CI], 25.5%-36.4%) and a specificity of 97.1% (96.0%-98.0%) by group 1 reading 3 CT acquisition phases, and with a sensitivity of 42.1% (36.2%-48.1%) and a specificity of 97.5% (96.4%-98.3%) by group 2 reading 4 CT acquisition phases. With the AEF maps, both groups reading the same 3 acquisition phases, sensitivity was 47.7% (95% CI, 41.9%-53.5%) with a specificity of 97.4% (96.4%-98.3%) in group 1, and 49.8% (95% CI, 43.9%-55.8%) with a specificity of 97.6% (96.6%-98.4%) in group 2. The optimal cutoff for the AEF was 50%. CONCLUSION The AEF is a helpful tool to screen for HCC with CT. Use of the AEF maps may significantly improve HCC detection and allows the fourth CT acquisition phase to be omitted, making a 25% reduction in radiation dose possible.
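The abstract does not spell out how the AEF is derived from the acquisition phases; a definition commonly used in the CT literature (an assumption here, not taken from the study) expresses arterial-phase enhancement as a fraction of the total enhancement at the portal-venous phase, both measured relative to the unenhanced scan:

$$\mathrm{AEF} = \frac{HU_{\text{arterial}} - HU_{\text{unenhanced}}}{HU_{\text{portal venous}} - HU_{\text{unenhanced}}} \times 100\%$$

Under such a definition, the reported 50% cutoff flags tissue whose enhancement is predominantly arterial, as expected for hypervascular HCC, and dropping one of four otherwise comparable acquisition phases accounts for the quoted 25% reduction in radiation dose.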
Abstract:
Regular physical activity beneficially impacts the risk of onset and progression of several chronic diseases. However, research regarding the effects of exercise on chronic liver diseases is relatively recent. Most authors have focused on non-alcoholic fatty liver disease (NAFLD), in which increasing clinical and experimental data indicate that skeletal muscle cross-talk with the adipose tissue and the liver regulates intrahepatic fat storage. In this setting, physical activity combined with calorie restriction is considered necessary to achieve an effective decrease of the intrahepatic lipid component, and although the evidence is not conclusive, some studies suggest that vigorous activity might be more beneficial than moderate activity in improving NAFLD/NASH. Evidence regarding the effects of exercise on the risk of hepatocellular carcinoma is scarce; some epidemiological studies indicate a lower risk in patients who exercise regularly and vigorously. In compensated cirrhosis, exercise acutely increases portal pressure, but in the longer term it has proved safe and probably beneficial. Decreased aerobic capacity (VO2) correlates with mortality in patients with decompensated cirrhosis, who are almost invariably sarcopenic. In these patients VO2 is improved by physical activity, which might also reduce the risk of hepatic encephalopathy through an increase in skeletal muscle mass. In solid organ transplant recipients, exercise improves lean mass, muscle strength and, as a consequence, aerobic capacity. Few data exist in liver transplant recipients, in whom exercise should be the object of future studies given its high potential for providing long-term beneficial effects. Although the evidence is far from complete, physical activity should be seen as an important part of the management of patients with liver disease in order to improve their clinical outcome.
Abstract:
Cardiac allograft vasculopathy (CAV) is a form of accelerated atherosclerosis, which represents the leading cause of late morbidity and mortality after heart transplantation. The recent bioresorbable vascular scaffold (BVS) technology represents a potential novel therapeutic tool in the context of CAV, allowing transient scaffolding and concomitant vessel healing. Eligible subjects will be treated using the Absorb Everolimus-Eluting BVS (Abbott Vascular, Santa Clara, CA, USA) and evaluated at pre-determined time points, up to 3 years after the index procedure. Both clinical and imaging data will be collected in dedicated case report forms (CRFs). All imaging data will be analyzed in an independent core laboratory. The primary aim of the study is to evaluate the angiographic performance at 1 year of the second-generation Absorb BVS in heart transplant recipients affected by CAV.
Abstract:
Cytomegalovirus (CMV) is a highly complex pathogen which, despite modern prophylactic regimens, continues to affect a high proportion of thoracic organ transplant recipients. The symptomatic manifestations of CMV infection are compounded by adverse indirect effects induced by the multiple immunomodulatory actions of CMV. These include a higher risk of acute rejection, cardiac allograft vasculopathy after heart transplantation, and potentially bronchiolitis obliterans syndrome in lung transplant recipients, with a greater propensity for opportunistic secondary infections. Prophylaxis for CMV using antiviral agents (typically oral valganciclovir or intravenous ganciclovir) is now almost universal, at least in high-risk transplants (D+/R-). Even with extended prophylactic regimens, however, challenges remain. CMV events can still occur despite antiviral prophylaxis, including late-onset infection or recurrent disease, and patients with ganciclovir-resistant CMV infection or who are intolerant to antiviral therapy require alternative strategies. CMV immunoglobulin (CMVIG) and antiviral agents have complementary modes of action. High-titer CMVIG preparations provide passive CMV-specific immunity but also exert complex immunomodulatory properties which augment the antiviral effect of antiviral agents and offer the potential to suppress the indirect effects of CMV infection. This supplement discusses the available data concerning the immunological and clinical effects of CMVIG after heart or lung transplantation.
Abstract:
BACKGROUND Racial disparities in kidney transplantation in children have been found in the United States, but have not been studied before in Europe. STUDY DESIGN Cohort study. SETTING & PARTICIPANTS Data were derived from the ESPN/ERA-EDTA Registry, an international pediatric renal registry collecting data from 36 European countries. This analysis included 1,134 young patients (aged ≤19 years) from 8 medium- to high-income countries who initiated renal replacement therapy (RRT) in 2006 to 2012. FACTOR Racial background. OUTCOMES & MEASUREMENTS Differences between racial groups in access to kidney transplantation, transplant survival, and overall survival on RRT were examined using Cox regression analysis while adjusting for age at RRT initiation, sex, and country of residence. RESULTS 868 (76.5%) patients were white; 59 (5.2%), black; 116 (10.2%), Asian; and 91 (8.0%), from other racial groups. After a median follow-up of 2.8 (range, 0.1-3.0) years, we found that black (HR, 0.49; 95% CI, 0.34-0.72) and Asian (HR, 0.54; 95% CI, 0.41-0.71) patients were less likely to receive a kidney transplant than white patients. These disparities persisted after adjustment for primary renal disease. Transplant survival rates were similar across racial groups. Asian patients had higher overall mortality risk on RRT compared with white patients (HR, 2.50; 95% CI, 1.14-5.49). Adjustment for primary kidney disease reduced the effect of Asian background, suggesting that part of the association may be explained by differences in the underlying kidney disease between racial groups. LIMITATIONS No data for socioeconomic status, blood group, and HLA profile. CONCLUSIONS We believe this is the first study examining racial differences in access to and outcomes of kidney transplantation in a large European population. We found important differences with less favorable outcomes for black and Asian patients. Further research is required to address the barriers to optimal treatment among racial minority groups.
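The adjusted Cox regression described above can be sketched in a few lines of code; the snippet below is purely illustrative, with hypothetical column names (time_to_tx, transplanted, race, age_at_rrt, sex, country) rather than the ESPN/ERA-EDTA Registry's actual variables or analysis code.

```python
# Illustrative Cox model for access to transplantation, adjusted for age at RRT
# initiation, sex, and country of residence (hypothetical data layout).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pediatric_rrt_cohort.csv")  # hypothetical patient-level extract

# One row per patient: years from RRT start to transplantation or censoring,
# an event indicator, and the covariates used for adjustment in the abstract.
model_df = pd.get_dummies(
    df[["time_to_tx", "transplanted", "race", "age_at_rrt", "sex", "country"]],
    columns=["race", "sex", "country"],
    drop_first=True,  # white / male / reference country serve as baselines
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time_to_tx", event_col="transplanted")
cph.print_summary()  # hazard ratios < 1 for a racial group indicate lower access
```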
Abstract:
OBJECTIVES: To determine the prevalence and characteristics of end-stage renal disease (ESRD) [dialysis and renal transplantation (RT)] among European HIV-infected patients. METHODS: Cross-sectional multicenter survey of EuroSIDA clinics during 2008. RESULTS: The prevalence of ESRD was 0.5%. Of 122 patients with ESRD, 96 were on dialysis and 26 had received a RT. Median age was 47 years, 73% were males and 43% were black. Median duration of HIV infection was 11 years. Thirty-three percent had prior AIDS; 91% were receiving antiretrovirals; and 88% had undetectable viral load. Median CD4 T-cell count was 341 cells per cubic millimetre; 20.5% had hepatitis C coinfection. The most frequent causes of ESRD were HIV-associated nephropathy (46%) and other glomerulonephritis (28%). Hemodialysis (93%) was the most common dialysis modality; 34% of patients were on the RT waiting list. Poor HIV control was the reason for exclusion from the RT waiting list in 22.4% of cases. All RT recipients were alive at the time of the survey. Acute rejection was reported in 8 patients (30%). A functioning graft was present in 21 (80%). CONCLUSIONS: This is the first multinational cross-sectional study of ESRD among the European HIV population. A low prevalence of ESRD was found. Two-thirds of patients were excluded from RT for non-HIV/AIDS-related pathologies. Most patients had a functioning graft despite a high acute rejection rate.
Abstract:
Non-nephrotoxic immunosuppressive strategies that allow reduction of calcineurin-inhibitor exposure without compromising safety or efficacy remain a goal in kidney transplantation. Immunosuppression based on the mammalian-target-of-rapamycin inhibitor everolimus was assessed as a strategy for elimination of calcineurin-inhibitor exposure and optimisation of renal-graft function while maintaining efficacy.
Abstract:
A large prospective, open-label, randomized trial evaluated conversion from calcineurin inhibitor (CNI)- to sirolimus (SRL)-based immunosuppression for preservation of renal function in liver transplantation patients. Eligible patients had received liver allografts 6-144 months previously and maintenance immunosuppression with CNI (cyclosporine or tacrolimus) since early posttransplantation. In total, 607 patients were randomized (2:1) to abrupt conversion (<24 h) from CNI to SRL (n = 393) or CNI continuation for up to 6 years (n = 214). Between-group changes in baseline-adjusted mean Cockcroft-Gault GFR at month 12 (primary efficacy end point) were not significant. The primary safety end point, noninferiority of the cumulative rate of graft loss or death at 12 months, was not met (6.6% vs. 5.6% in the SRL and CNI groups, respectively). Rates of death at 12 months were not significantly different, and no true graft losses (e.g. retransplantation) were observed during the 12-month period. At 52 weeks, SRL conversion was associated with higher rates of biopsy-confirmed acute rejection (p = 0.02) and discontinuations (p < 0.001), primarily for adverse events. Adverse events were consistent with known safety profiles. In conclusion, liver transplantation patients showed no demonstrable benefit 1 year after conversion from CNI- to SRL-based immunosuppression.
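The primary efficacy end point relies on the Cockcroft-Gault estimate of GFR; for orientation, the standard equation is sketched below (the trial's exact implementation, e.g. which body weight was used, is not stated in the abstract).

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault equation."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Example: a 55-year-old, 80 kg man with a serum creatinine of 1.4 mg/dL
print(round(cockcroft_gault_crcl(55, 80, 1.4, female=False), 1))  # ~67.5 mL/min
```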
Abstract:
Chronic renal allograft rejection is characterized by alterations in the extracellular matrix compartment and in the proliferation of various cell types. These features are controlled, in part, by the metzincin superfamily of metallo-endopeptidases, including matrix metalloproteinases (MMPs), a disintegrin and metalloproteinase (ADAM), and meprin. We therefore investigated the regulation of metzincins in the established Fischer-to-Lewis rat kidney transplant model. Studies were performed using frozen homogenates and paraffin sections of rat kidneys at day 0 (healthy controls) and during periods of chronic rejection at day +60 and day +100 following transplantation. Messenger RNA (mRNA) expression was examined by Affymetrix Rat Expression Array 230A GeneChip and by real-time TaqMan polymerase chain reaction analyses. Protein expression was studied by zymography, Western blot analyses, and immunohistology. mRNA levels of MMPs (MMP-2/-11/-12/-14), their inhibitors (tissue inhibitors of metalloproteinases (TIMP)-1/-2), ADAM-17, and transforming growth factor (TGF)-beta1 increased significantly during chronic renal allograft rejection. MMP-2 activity and immunohistological staining were augmented accordingly. The largest mRNA elevation was observed for MMP-12. As expected, Western blot analyses also demonstrated increased production of MMP-12, MMP-14, and TIMP-2 (in the latter two cases as individual proteins and as complexes). In contrast, mRNA levels of MMP-9/-24 and meprin alpha/beta decreased. Accordingly, MMP-9 protein levels and meprin alpha/beta synthesis and activity were significantly downregulated. Members of the metzincin families (MMP, ADAM, and meprin) and of TIMPs are differentially regulated in chronic renal allograft rejection. An altered pattern of metzincins may thus represent novel diagnostic markers and possibly provide novel targets for future therapeutic interventions.
Abstract:
BACKGROUND: Outcome after lung transplantation (LTx) is affected by the onset of bronchiolitis obliterans syndrome (BOS) and lung function decline. Reduced health-related quality of life (HRQL) and physical mobility have been shown in patients developing BOS, but the impact on the capacity to walk is unknown. We aimed to compare the long-term HRQL and 6-minute walk test (6MWT) between lung recipients affected or not by BOS grade ≥2. METHODS: Fifty-eight patients were prospectively followed for 5.6 ± 2.9 years after LTx. Assessments included the St George's Respiratory Questionnaire (SGRQ) and the 6MWT, which were performed yearly. Moreover, clinical complications were recorded to estimate the proportion of the follow-up time lived without clinical intercurrences after transplant. Analyses were performed using adjusted linear regression and repeated-measures analysis of variance. RESULTS: BOS was a significant predictor of lower SGRQ scores (p < 0.01) and reduced time free of clinical complications (p = 0.001), but not of 6MWT distance (p = 0.12). At 7 years post-transplant, results were: 69.0 ± 21.8% vs 86.9 ± 5.6%, p < 0.05 (SGRQ); 58.5 ± 21.6% vs 88.7 ± 11.4%, p < 0.01 (proportion of time lived without clinical complications); and 82.2 ± 10.9% vs 91.9 ± 14.2%, p = 0.27 (percent of predicted 6MWT), respectively, for patients with BOS and without BOS. CONCLUSIONS: Despite significantly less time lived without clinical complications and progressive decline of self-reported health status, the capacity to walk of patients affected by BOS remained relatively stable over time. These findings may indicate that the development of moderate to severe BOS does not prevent lung recipients from walking independently and pursuing an autonomous life.
Abstract:
BACKGROUND: Reduced bone mineral density (BMD) is common in adults infected with human immunodeficiency virus (HIV). The roles of proximal renal tubular dysfunction (PRTD) and alterations in bone metabolism in HIV-related low BMD are incompletely understood. METHODS: We quantified BMD (dual-energy x-ray absorptiometry), blood and urinary markers of bone metabolism and renal function, and risk factors for low BMD (hip or spine T score of -1 or less) in an ambulatory care setting. We determined factors associated with low BMD and calculated 10-year fracture risks using the World Health Organization FRAX equation. RESULTS: We studied 153 adults (98% men; median age, 48 years; median body mass index, 24.5; 67 [44%] were receiving tenofovir, 81 [53%] were receiving a boosted protease inhibitor [PI]). Sixty-five participants (42%) had low BMD, and 11 (7%) had PRTD. PI therapy was associated with low BMD in multivariable analysis (odds ratio, 2.69; 95% confidence interval, 1.09-6.63). Tenofovir use was associated with increased osteoblast and osteoclast activity (P ≤ .002). The mean estimated 10-year risks were 1.2% for hip fracture and 5.4% for any major osteoporotic fracture. CONCLUSIONS: In this mostly male population, low BMD was significantly associated with PI therapy. Tenofovir recipients showed evidence of increased bone turnover. Measurement of BMD and estimation of fracture risk may be warranted in treated HIV-infected adults.
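Low BMD is defined here by a DXA T-score of -1 or less at the hip or spine; the sketch below states that criterion explicitly, using made-up reference values (in practice the young-adult reference mean and SD are scanner- and site-specific).

```python
def t_score(bmd_g_cm2: float, young_adult_mean: float, young_adult_sd: float) -> float:
    """DXA T-score: patient BMD expressed in SD units relative to a young-adult reference."""
    return (bmd_g_cm2 - young_adult_mean) / young_adult_sd

def low_bmd(hip_t: float, spine_t: float) -> bool:
    """Study criterion for low BMD: hip or spine T-score of -1 or less."""
    return min(hip_t, spine_t) <= -1.0

# Hypothetical measurements and reference values
hip = t_score(0.88, young_adult_mean=1.00, young_adult_sd=0.12)    # -1.0
spine = t_score(1.02, young_adult_mean=1.05, young_adult_sd=0.11)  # about -0.27
print(low_bmd(hip, spine))  # True
```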
Abstract:
To compare the effects of deflazacort (DEFLA) vs. prednisone (PRED) on bone mineral density (BMD), body composition, and lipids, 24 patients with end-stage renal disease were randomized in a double-blind design and followed for 78 weeks after kidney transplantation. BMD and body composition were assessed using dual-energy x-ray absorptiometry. Seventeen patients completed the study. Glucocorticosteroid doses, cyclosporine levels, rejection episodes, and drop-out rates were similar in both groups. Lumbar BMD decreased more in PRED than in DEFLA (P < 0.05), the difference being particularly marked after 24 weeks (9.1 ± 1.8% vs. 3.0 ± 2.4%, respectively). Hip BMD decreased from baseline in both groups (P < 0.01), without intergroup differences. Whole-body BMD decreased from baseline in PRED (P < 0.001), but not in DEFLA. Lean body mass decreased by approximately 2.5 kg in both groups after 6-12 weeks (P < 0.001), then remained stable. Fat mass increased more (P < 0.01) in PRED than in DEFLA (7.1 ± 1.8 vs. 3.5 ± 1.4 kg). Larger increases in total cholesterol (P < 0.03), low-density lipoprotein cholesterol (P < 0.01), lipoprotein B2 (P < 0.03), and triglycerides (P = 0.054) were observed in PRED than in DEFLA. In conclusion, using DEFLA instead of PRED in kidney transplant patients is associated with decreased loss of total skeleton and lumbar spine BMD, but does not alter bone loss at the upper femur. DEFLA also helps to prevent fat accumulation and worsening of the lipid profile.
Abstract:
Purpose To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes in the remaining nontransplanted kidney of the donor caused by unilateral nephrectomy and of changes in the transplanted kidney before and after transplantation in donor and recipient, respectively, and whether DW MR parameters are correlated in the same kidney before and after transplantation. Materials and Methods The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values), performed in donors before donation and in donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes in diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons). Correlations were tested (linear regression). Results ADCT values in the nontransplanted kidney of donors increased from a preexplantation value of (188 ± 9 [standard deviation]) × 10⁻⁵ mm²/sec to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex 1 week after donation (P < .004). Medullary, but not cortical, ADCT values remained increased up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes. Conclusion DW MR imaging depicts early adaptations in the remaining nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of clinical relevance is needed.
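The abstract reports a total ADC (ADCT) and a perfusion fraction (FP) from multi-b-value DW imaging without giving the fitting details; the sketch below shows one common approach (mono-exponential ADC over all b values plus a segmented IVIM estimate of FP) on synthetic signal values, and is not the study's actual pipeline.

```python
# Mono-exponential ADC and a segmented IVIM perfusion-fraction estimate
# from multi-b-value DW signal (synthetic ROI values; b in s/mm^2).
import numpy as np
from scipy.optimize import curve_fit

b = np.array([0, 50, 100, 200, 400, 600, 800], dtype=float)
S = np.array([1000, 851, 750, 605, 404, 271, 182], dtype=float)

def monoexp(b, s0, adc):
    return s0 * np.exp(-b * adc)

# Total ADC (ADCT): single-exponential fit over all b values
(_, adc_t), _ = curve_fit(monoexp, b, S, p0=(S[0], 1e-3))

# Segmented IVIM: tissue diffusion from high b values (perfusion largely suppressed),
# perfusion fraction FP from the extrapolated zero-b intercept.
hi = b >= 200
(s0_hi, d_tissue), _ = curve_fit(monoexp, b[hi], S[hi], p0=(S[0], 1e-3))
fp = 1.0 - s0_hi / S[0]

print(f"ADCT = {adc_t * 1e5:.0f} x 10^-5 mm^2/sec, FP = {fp:.2f}")
```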
Abstract:
Progressive interstitial fibrosis and tubular atrophy (IF/TA) is a leading cause of chronic allograft dysfunction. Increased extracellular matrix remodeling regulated by matrix metalloproteases (MMPs) and their inhibitors (TIMPs) has been implicated in the development of IF/TA. The aim of this study was to investigate whether urinary/serum MMPs/TIMPs correlate with subclinical IF/TA detected in surveillance biopsies within the first 6 months post-transplant. We measured eight different MMPs/TIMPs simultaneously in urine and serum samples from patients classified as normal histology (n=15), IF/TA 1 (n=15) and IF/TA 2-3 (n=10). There was no difference in urinary MMPs/TIMPs among the three groups, and only 1/8 serum MMPs/TIMPs (i.e. MMP-1) was significantly elevated in biopsies with IF/TA 2-3 (p=0.01). In addition, urinary/serum MMPs/TIMPs did not differ between surveillance biopsies demonstrating early development of IF/TA (i.e. delta IF/TA ≥1 compared to a previous biopsy obtained three months earlier; n=11) and a stable grade of IF/TA (i.e. delta IF/TA=0; n=20). Next, we investigated whether urinary/serum MMP/TIMP levels are elevated during acute subclinical tubulitis in surveillance biopsies obtained within the first 6 months post-transplant (n=25). Compared to biopsies with normal histology, serum MMPs/TIMPs were not different; however, all urinary MMP/TIMP levels were numerically higher during subclinical tubulitis (MMP-1, MMP-7, TIMP-1 with p≤0.04). We conclude that urinary/serum MMPs/TIMPs hardly correlate with existing or early developing IF/TA in surveillance biopsies obtained within the first 6 months post-transplant. This could be explained by the dynamic process of extracellular matrix remodeling, which seems to be active during acute tubulo-interstitial injury/inflammation, but not in quiescent IF/TA.
Abstract:
BACKGROUND To cover the shortage of cadaveric organs, new approaches to expand the donor pool are needed. Here we report on a case of domino liver transplantation (DLT) using an organ harvested from a compound heterozygous patient with primary hyperoxaluria (PHO), who underwent combined liver and kidney transplantation. The DLT recipient developed early renal failure with oxaluria. The time to progression to oxalosis with renal failure in such situations is unknown, but, based on animal data, we hypothesize that calcineurin inhibitors may play a detrimental role. METHODS A cadaveric liver and kidney transplantation was performed in a 52-year-old man with PHO. His liver was used for a 64-year-old patient with a non-resectable but limited cholangiocarcinoma. RESULTS While the course of the PHO donor was uneventful, the DLT recipient developed early postoperative, dialysis-dependent renal failure with hyperoxaluria. Histology of a kidney biopsy revealed massive calcium oxalate crystal deposition as the leading aetiological cause. CONCLUSIONS DLT using PHO organs for marginal recipients represents a possible therapeutic approach with regard to liver graft function. However, it may negatively alter the renal outcome of the recipient in an unpredictable manner, especially with concomitant use of cyclosporin. Therefore, we suggest that, although DLT should be promoted, PHO organs are better excluded from such procedures.