982 results for Renal Transplant Recipients
Abstract:
Cytomegalovirus (CMV) is the single most important infectious agent affecting organ transplant recipients. To evaluate the incidence and clinical importance of CMV infection in renal transplantation in Brazil, 37 renal allograft recipients were tested periodically for cytomegalovirus DNA in urine by the polymerase chain reaction (PCR), and for IgM and IgG antibodies against CMV by enzyme-linked immunosorbent assay (ELISA) and indirect immunofluorescence (IIF). PCR-amplified products were detected by gel electrophoresis and confirmed by dot-blot hybridization with oligonucleotide probes. Thirty-two of the 37 patients (86.4%) were positive by at least one of the three methods. In six patients, PCR was the only test that detected the probable CMV infection. Ten patients had a positive PCR result before transplantation. In general, the diagnosis was achieved earlier by PCR than by serologic tests. Active infection occurred most frequently during the first four months after transplantation. Sixteen of the 32 patients (50%) with active CMV infection presented clinical symptoms consistent with CMV infection. Five patients without evidence of active CMV infection by any of the three tests had only minor clinical manifestations during follow-up. Our results indicate that PCR is a highly sensitive procedure for the early detection of CMV infection and that CMV infection is a frequent problem among renal transplant patients in Brazil.
Abstract:
FTY720 is a new and effective immunosuppressive agent that produces peripheral blood lymphopenia through a lymphocyte homing effect. We investigated the relationship between FTY720 dose or blood concentration (pharmacokinetics, PK) and peripheral lymphopenia (pharmacodynamics, PD) in 23 kidney transplant recipients randomized to receive FTY720 (0.25-2.5 mg/day) or mycophenolate mofetil (2 g/day) in combination with cyclosporine and steroids. FTY720 dose, blood concentrations, and lymphocyte counts were determined weekly before and 4 to 12 weeks after transplantation. The PD effect was calculated as the absolute lymphocyte count or its percent reduction. PK/PD modeling was used to find the best-fit model. Mean FTY720 concentrations were 0.36 ± 0.05 (0.25 mg), 0.73 ± 0.12 (0.5 mg), 3.26 ± 0.51 (1 mg), and 7.15 ± 1.41 ng/ml (2.5 mg) between 4 and 12 weeks after transplantation. FTY720 PK was linear with dose (r² = 0.98) and showed low inter- and intra-individual variability. FTY720 produced a dose-dependent increase in the mean percent reduction of peripheral lymphocyte counts (38 vs 42 vs 56 vs 77, P < 0.01, respectively). The simple Emax model [E = (Emax * C)/(C + EC50)] gave the best PK/PD fit for dose (Emax = 87.8 ± 5.3% and ED50 = 0.48 ± 0.08 mg, r² = 0.94) or concentration (Emax = 78.3 ± 2.9% and EC50 = 0.59 ± 0.09 ng/ml, r² = 0.89) versus effect (% reduction in peripheral lymphocytes). FTY720 PK/PD is thus dose-dependent and follows an Emax model (ED50 ≈ 0.5 mg; EC50 ≈ 0.6 ng/ml). Using lymphopenia as a PD surrogate marker for FTY720, high percent reductions (~80%) in peripheral lymphocytes are required to achieve the best efficacy in preventing acute allograft rejection.
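As a minimal sketch of the concentration-effect relationship reported above, the snippet below evaluates the Emax model at the mean concentration of each dose group using only the published parameter estimates (Emax = 78.3%, EC50 = 0.59 ng/ml); the helper function name is our own and the output is illustrative, not the authors' analysis:

```python
def emax_effect(c, emax=78.3, ec50=0.59):
    """Simple Emax model: E = (Emax * C) / (C + EC50)."""
    return emax * c / (c + ec50)

# Mean FTY720 concentrations (ng/ml) observed at 0.25, 0.5, 1, and 2.5 mg/day
for c in (0.36, 0.73, 3.26, 7.15):
    print(f"{c:5.2f} ng/ml -> predicted lymphocyte reduction {emax_effect(c):.0f}%")
```

Note how the model saturates: beyond a few multiples of EC50, further concentration increases add little to the predicted lymphocyte reduction, consistent with the reported ~80% ceiling.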
Abstract:
Human cytomegalovirus (CMV) infection is common but nearly always asymptomatic in immunocompetent individuals. After primary infection, the virus persists throughout life in a latent form in a variety of tissues, particularly in precursor cells of the monocytic lineage. CMV reinfection and the occurrence of disease are associated with immunosuppressive conditions. Solid organ and bone marrow transplant patients are at high risk for CMV disease because they undergo immunosuppression. Antiviral treatment is effective in controlling viremia, but 10-15% of infected patients can experience CMV disease by the time the drug is withdrawn. In addition, long-term antiviral treatment leads to bone marrow toxicity and renal toxicity. Furthermore, control of chronic CMV infection in transplant recipients appears to depend on proper recovery of cellular immunity. Recent advances in the characterization of T-cell functions and the identification of distinct functional signatures of T-cell antiviral responses have opened new perspectives for monitoring transplant recipients at risk of developing CMV disease.
Abstract:
INTRODUCTION: C4d is a marker of antibody-mediated rejection (ABMR) in kidney allografts, although cellular rejection may also show C4d deposits. OBJECTIVE: To correlate C4d expression with clinicopathological parameters and graft outcomes at three years. METHODS: One hundred forty-six renal transplant recipients with indication graft biopsies were included. C4d staining was performed by immunohistochemistry on paraffin sections. Graft function and survival were measured, and predictive variables for the outcome were determined by multivariate Cox regression. RESULTS: C4d staining was detected in 48 (31%) biopsies, of which 23 (14.7%) had a diffuse and 25 (16%) a focal distribution. Pre-transplantation panel reactive antibody (%PRA) levels for class I and II were significantly higher in C4d-positive patients than in C4d-negative patients. Both glomerulitis and peritubular capillaritis were associated with C4d (p = 0.002 and p < 0.001, respectively). The presence of C4d in biopsies diagnosed as no rejection (NR), acute cellular rejection (ACR), or interstitial fibrosis/tubular atrophy (IF/TA) did not impact graft function or survival. Compared with C4d-negative NR, ACR, and IF/TA, patients with C4d-positive ABMR had the worst graft survival over 3 years (p = 0.034), but there was no difference between ABMR and C4d-positive NR, ACR, and IF/TA (p = 0.10). In the Cox regression, graft function at biopsy and high %PRA levels were predictors of graft loss. CONCLUSIONS: This study confirmed that C4d staining in kidney graft biopsies is a clinically useful marker of ABMR, with well-defined clinical and pathological correlations. The impact of C4d deposition in other histologic diagnoses deserves further investigation.
Abstract:
To prevent rejection of kidney transplants, patients must remain on long-term immunosuppressive therapy, which includes drugs such as cyclosporine, azathioprine, cyclophosphamide, and prednisone. These drugs reduce the general immune response of transplant patients and thus increase their susceptibility to infections; they also increase the risk of developing oral lesions. Oral hygiene in kidney transplant recipients therefore contributes to maintenance of the transplanted organ and its function, and surveillance of oral lesions in this population is warranted. The aim of this study was to investigate oral lesions in a group of 21 kidney transplant patients under immunosuppressive therapy seen over a 1-year period in the Nephrology Department of the Federal University of Sergipe, Brazil. Data on sex, age, etiology of renal disease, type of renal transplant, time elapsed after transplantation, immunosuppressive treatment, use of concomitant agents, and presence of oral lesions were obtained. All patients had received a kidney transplant from a living donor, and the mean posttransplantation follow-up time was 31.6 months; 71.5% used triple immunosuppressive therapy with cyclosporine A, azathioprine, and prednisone. Ten patients were also treated with calcium-channel blockers. Of the 21 transplant patients, 17 (81%) presented oral lesions. Gingival overgrowth was the most common alteration, followed by candidiasis and superficial ulcers. One case of spindle cell carcinoma of the lower lip was observed. The oral cavity can harbor a variety of manifestations related to renal transplantation under immunosuppressive therapy.
Abstract:
BACKGROUND: Urine is a potentially rich source of biomarkers for monitoring kidney dysfunction. In this study, we investigated the potential of soluble human leukocyte antigen (sHLA)-DR in urine for noninvasive monitoring of renal transplant patients. METHODS: Urinary sHLA-DR levels were measured by sandwich enzyme-linked immunosorbent assay in 103 patients with renal diseases or after renal transplantation. sHLA-DR in urine was characterized by Western blotting and mass spectrometry. RESULTS: Acute graft rejection was associated with a significantly elevated level of urinary sHLA-DR (P < 0.0001) compared with recipients with stable graft function or healthy individuals. Receiver operating characteristic curve analysis showed an area under the curve of 0.88 (P < 0.001). At a selected threshold, sensitivity was 80% and specificity 98% for detection of acute renal transplant rejection. sHLA-DR was not exosome-associated and was of lower molecular weight than the heterodimeric HLA-DR expressed on the plasma membrane of antigen-presenting cells. CONCLUSIONS: sHLA-DR excreted into urine is a promising indicator of renal transplant rejection.
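As an illustrative sketch of how such a diagnostic threshold trades sensitivity against specificity, the snippet below runs a receiver operating characteristic analysis on synthetic biomarker values; the simulated data and the Youden-index threshold rule are our assumptions, not the study's data or method:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Synthetic urinary sHLA-DR levels (arbitrary units), NOT the study's data;
# rejection cases are simulated as higher on average than stable grafts.
rng = np.random.default_rng(0)
stable = rng.lognormal(mean=1.0, sigma=0.5, size=60)     # stable graft function
rejection = rng.lognormal(mean=2.2, sigma=0.5, size=40)  # acute rejection

levels = np.concatenate([stable, rejection])
labels = np.concatenate([np.zeros(60), np.ones(40)])

fpr, tpr, thresholds = roc_curve(labels, levels)
print(f"AUC = {auc(fpr, tpr):.2f}")

# Pick the threshold maximizing Youden's J = sensitivity + specificity - 1
best = np.argmax(tpr - fpr)
print(f"threshold = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```

Raising the threshold generally increases specificity at the cost of sensitivity, which is the trade-off behind the reported 80%/98% operating point.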
Abstract:
The toxicity of long-term immunosuppressive therapy has become a major concern in the long-term follow-up of heart transplant recipients. In this respect, the quality of renal function is closely linked to cyclosporin A (CsA) drug levels. In cardiac transplantation, specific CsA trough levels have historically been maintained between 250 and 350 µg/L in many centers, without direct evidence for the necessity of such high levels when using triple-drug immunosuppression. This retrospective analysis compares the incidence of acute and chronic graft rejection, as well as overall mortality, between groups of patients with high (250 to 350 µg/L) and low (150 to 250 µg/L) specific CsA trough levels. A total of 332 patients who underwent heart transplantation between October 1985 and October 1992 with a minimum follow-up of 30 days were included in this study (46 women and 276 men; age, 44 ± 12 years; mean follow-up, 1,122 ± 777 days). Standard triple-drug immunosuppression included first-year specific CsA target trough levels of 250 to 300 µg/L. Patients were grouped according to their average creatinine level in the first postoperative year (group I, < 130 µmol/L, n = 234; group II, ≥ 130 µmol/L, n = 98). The overall 5-year survival, excluding early 30-day mortality, was 92% (group I, 216/232) and 91% (group II, 89/98), with 75% of mortality due to chronic rejection. The rejection rate over the entire follow-up period was similar in both groups (first year: group I, 3.2 ± 2.6 rejections/patient/year; group II, 3.6 ± 2.7 rejections/patient/year; p = not significant). (ABSTRACT TRUNCATED AT 250 WORDS)
Abstract:
The efficacy of everolimus with reduced cyclosporine in de novo heart transplant patients has been demonstrated convincingly in randomized studies. Moreover, everolimus-based immunosuppression in de novo heart transplant recipients has been shown in two randomized trials to reduce the increase in maximal intimal thickness, measured by intravascular ultrasound, indicating attenuation of cardiac allograft vasculopathy (CAV). Randomized trials of everolimus in de novo heart transplantation have also consistently shown reduced cytomegalovirus infection versus antimetabolite therapy. In maintenance heart transplantation, conversion from calcineurin inhibitors to everolimus has demonstrated a sustained improvement in renal function. In de novo patients, a renal benefit may only be achieved if there is an adequate reduction in exposure to calcineurin inhibitor therapy. Delayed introduction of everolimus may be appropriate in patients at high risk of wound healing complications, e.g. diabetic patients or patients with a ventricular assist device. The current evidence base suggests that the most convincing reasons for use of everolimus from the time of heart transplantation are to slow the progression of CAV and to lower the risk of cytomegalovirus infection. A regimen of everolimus with reduced-exposure calcineurin inhibitor and steroids in de novo heart transplant patients represents a welcome addition to the therapeutic armamentarium.
Abstract:
In a randomized, open-label trial, everolimus was compared to cyclosporine in 115 de novo heart transplant recipients. Patients were assigned within 5 days posttransplant to low-exposure everolimus (3–6 ng/mL) with reduced-exposure cyclosporine (n = 56), or standard-exposure cyclosporine (n = 59), with both mycophenolate mofetil and corticosteroids. In the everolimus group, cyclosporine was withdrawn after 7–11 weeks and everolimus exposure increased (6–10 ng/mL). The primary efficacy end point, measured GFR at 12 months posttransplant, was significantly higher with everolimus versus cyclosporine (mean ± SD: 79.8 ± 17.7 mL/min/1.73 m2 vs. 61.5 ± 19.6 mL/min/1.73 m2; p < 0.001). Coronary intravascular ultrasound showed that the mean increase in maximal intimal thickness was smaller (0.03 mm [95% CI 0.01, 0.05 mm] vs. 0.08 mm [95% CI 0.05, 0.12 mm], p = 0.03), and the incidence of cardiac allograft vasculopathy (CAV) was lower (50.0% vs. 64.6%, p = 0.003), with everolimus versus cyclosporine at month 12. Biopsy-proven acute rejection after weeks 7–11 was more frequent with everolimus (p = 0.03). Left ventricular function was not inferior with everolimus versus cyclosporine. Cytomegalovirus infection was less common with everolimus (5.4% vs. 30.5%, p < 0.001); the incidence of bacterial infection was similar. In conclusion, everolimus-based immunosuppression with early elimination of cyclosporine markedly improved renal function after heart transplantation. Since postoperative safety was not jeopardized and development of CAV was attenuated, this strategy may benefit long-term outcome.
Abstract:
AIM: Predictors of renal recovery following conversion from calcineurin inhibitor-based to proliferation signal inhibitor-based therapy are lacking. We hypothesized that plasma NGAL (P-NGAL) could predict improvement in glomerular filtration rate (GFR) after conversion to everolimus. PATIENTS & METHODS: P-NGAL was measured in 88 cardiac transplant patients (median 5 years post-transplant) with renal dysfunction who were randomized to continuation of conventional calcineurin inhibitor-based immunosuppression or switching to an everolimus-based regimen. RESULTS: P-NGAL correlated with measured GFR (mGFR) at baseline (R² = 0.21; p < 0.001). Randomization to everolimus improved mGFR after 1 year (median [25th-75th percentiles]: ΔmGFR 5.5 [-0.5 to 11.5] vs -1 [-7 to 4] ml/min/1.73 m²; p = 0.006). Baseline P-NGAL predicted mGFR after 1 year (R² = 0.18; p < 0.001), but this association disappeared after controlling for baseline mGFR. CONCLUSION: P-NGAL and GFR correlate with renal dysfunction in long-term heart transplant recipients. P-NGAL did not predict improvement of renal function after conversion to everolimus-based immunosuppression.
Abstract:
Post-transplant lymphoproliferative disorder (PTLD) complicates 1 to 10% of all transplantations. Previous clinicopathological studies of PTLD have been limited by small numbers, short follow-up times, outdated data, heterogeneity of pooled solid-organ transplant results, and selective inclusion of early-onset disease. We therefore undertook a retrospective analysis to identify all cases of PTLD that complicated renal transplantation at the Princess Alexandra Hospital between 30 June 1969 and 31 May 2001. Tumour samples were retrieved for pathological review and for Epstein-Barr virus-encoded RNA in situ hybridisation (EBER-ISH). Of 2,030 renal transplantation patients, 29 (1.4%) developed PTLD after a median period of 0.5 years (range 0.1 to 23.3 years). PTLD patients were more likely to have received cyclosporine (76% versus 62%, P
Abstract:
The aim of this study was to determine the most informative sampling time(s) providing a precise prediction of the tacrolimus area under the concentration-time curve (AUC). Fifty-four concentration-time profiles of tacrolimus from 31 adult liver transplant recipients were analyzed. Each profile contained 5 tacrolimus whole-blood concentrations (predose and 1, 2, 4, and 6 or 8 hours postdose), measured using liquid chromatography-tandem mass spectrometry. The concentration at 6 hours was interpolated for each profile, and 54 values of AUC(0-6) were calculated using the trapezoidal rule. The best sampling times were then determined using limited sampling strategies (LSS) and sensitivity analysis. Linear mixed-effects modeling was performed to estimate regression coefficients of equations incorporating each concentration-time point (C0, C1, C2, C4, interpolated C5, and interpolated C6) as a predictor of AUC(0-6). Predictive performance was evaluated by assessment of the mean error (ME) and root mean square error (RMSE). LSS equations based on C2, C4, and C5 provided similar results for prediction of AUC(0-6) (R² = 0.869, 0.844, and 0.832, respectively). These 3 time points were superior to C0 in predicting AUC. The ME was similar for all time points; the RMSE was smallest for C2, C4, and C5. The highest sensitivity index occurred at 4.9 hours postdose at steady state, suggesting that this time point provides the most information about AUC(0-12). The results from the limited sampling strategies and sensitivity analysis supported the use of a single blood sample at 5 hours postdose as a predictor of both AUC(0-6) and AUC(0-12). A jackknife procedure used to evaluate the model's predictive performance demonstrated that a sample collected at 5 hours after dosing can be considered the optimal sampling time for predicting AUC(0-6).
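A minimal sketch of the two calculations described above: a trapezoidal AUC(0-6) computed from a full profile, and a single-point limited-sampling prediction. The concentration values and regression coefficients are hypothetical placeholders, not the study's data or its published equations:

```python
import numpy as np

# Hypothetical tacrolimus whole-blood profile (times in h, conc. in ng/ml)
times = np.array([0.0, 1.0, 2.0, 4.0, 5.0, 6.0])
conc = np.array([6.2, 18.5, 14.1, 9.8, 8.5, 7.4])

# AUC(0-6) by the linear trapezoidal rule, as in the study
auc_0_6 = np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(times))
print(f"AUC(0-6) = {auc_0_6:.1f} ng*h/ml")

# A single-point LSS equation has the form AUC = a + b * C5;
# a and b below are placeholder coefficients, not the published ones.
a, b = 10.0, 5.0
print(f"LSS-predicted AUC(0-6) = {a + b * conc[4]:.1f} ng*h/ml")
```

In practice the coefficients a and b would come from the linear mixed-effects regression of AUC(0-6) on the chosen single time point, and the jackknife step would re-estimate them with each profile left out to check predictive performance.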
Abstract:
Background: We tested the hypothesis that universal myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification of renal transplant candidates (RTCs) than SPECT restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n = 363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with an increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modestly increased risk of events (log-rank p = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank p = 0.008; HR = 3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank p = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank p = 0.02; HR = 3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting all-cause death in RTCs and for selecting patients for invasive coronary testing.