994 results for Post-transplant Malignancy
Abstract:
The selection of liver transplant candidates with hepatocellular carcinoma (HCC) is currently validated based on the Milan criteria. The use of extended criteria has remained a matter of debate, mainly because of the absence of prospective validation. The present prospective study recruited patients according to the previously proposed Total Tumor Volume (TTV ≤115 cm³)/alpha-fetoprotein (AFP ≤400 ng/ml) score. Patients with AFP >400 ng/ml were excluded, and as such the Milan group was modified to include only patients with AFP <400 ng/ml; these patients were compared to patients beyond Milan but within TTV/AFP. From January 2007 to March 2013, 233 patients with HCC were listed for liver transplantation. Of them, 195 patients were within Milan, and 38 beyond Milan but within TTV/AFP. The average follow-up from listing was 33.9 ± 24.9 months. The risk of drop-out was higher for patients beyond Milan but within TTV/AFP (16/38, 42.1%) than for patients within Milan (49/195, 25.1%; p=0.033). In parallel, intent-to-treat survival from listing was lower in the patients beyond Milan (53.8% vs. 71.6% at four years, p<0.001). After a median waiting time of 8 months, 166 patients were transplanted: 134 within Milan criteria and 32 beyond Milan but within TTV/AFP. They demonstrated acceptable and similar recurrence rates (4.5% vs. 9.4%, p=0.138) and post-transplant survival (78.7% vs. 74.6% at four years, p=0.932). CONCLUSION: Based on the present prospective study, HCC liver transplant candidate selection could be expanded to the TTV (≤115 cm³)/AFP (≤400 ng/ml) criteria in centers with at least an 8-month waiting time. An increased risk of drop-out on the waiting list can be expected, but with equivalent and satisfactory post-transplant survival.
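As a worked illustration of the criteria above: TTV is conventionally obtained by summing spherical volumes computed from each nodule's measured diameter, and a candidate passes the extended rule when TTV ≤115 cm³ and AFP ≤400 ng/ml. A minimal Python sketch, with function names and the Milan check written for illustration rather than taken from the study:

```python
import math

def total_tumor_volume(diameters_cm):
    """Total tumor volume (cm^3), summing each nodule as a sphere
    of the measured diameter: V = (4/3) * pi * (d/2)^3."""
    return sum((4.0 / 3.0) * math.pi * (d / 2.0) ** 3 for d in diameters_cm)

def within_milan(diameters_cm):
    """Milan criteria: a single nodule <= 5 cm, or up to 3 nodules each <= 3 cm."""
    n = len(diameters_cm)
    if n == 1:
        return diameters_cm[0] <= 5.0
    return n <= 3 and all(d <= 3.0 for d in diameters_cm)

def within_ttv_afp(diameters_cm, afp_ng_ml):
    """Extended criteria used in the study: TTV <= 115 cm^3 and AFP <= 400 ng/ml."""
    return total_tumor_volume(diameters_cm) <= 115.0 and afp_ng_ml <= 400.0

# Example: a single 6 cm nodule with AFP 120 ng/ml
nodules = [6.0]
print(within_milan(nodules))                  # False (nodule > 5 cm)
print(round(total_tumor_volume(nodules), 1))  # 113.1 cm^3
print(within_ttv_afp(nodules, 120.0))         # True
```

A single 6 cm nodule illustrates the expansion: it fails Milan (>5 cm), but its volume of roughly 113 cm³ still satisfies the TTV/AFP rule.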
Abstract:
BACKGROUND: Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. METHODS: A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence of and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was also analyzed. RESULTS: Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) compared with patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was longer in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) compared with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CONCLUSIONS: CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared to patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and appropriate therapeutic strategies for preventing asymptomatic CMV infection.
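The CAV-free survival comparison reported here is the standard Kaplan-Meier/log-rank setup. A minimal sketch using the lifelines library on invented per-patient records (all values hypothetical, chosen only to mirror the structure of the analysis):

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical per-patient data: years of CAV-free follow-up, an event flag
# (CAV or death), and CMV status in the first post-transplant year.
df = pd.DataFrame({
    "years": [6.7, 10.0, 4.2, 3.1, 5.4, 8.8],
    "event": [1, 0, 1, 1, 1, 0],
    "cmv":   ["none", "none", "disease", "disease", "asymptomatic", "none"],
})

kmf = KaplanMeierFitter()
for group, sub in df.groupby("cmv"):
    kmf.fit(sub["years"], sub["event"], label=group)
    print(group, kmf.median_survival_time_)

# Log-rank test: no CMV infection vs. CMV disease
a, b = df[df.cmv == "none"], df[df.cmv == "disease"]
res = logrank_test(a["years"], b["years"], a["event"], b["event"])
print(res.p_value)
```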
Abstract:
AIM: Predictors of renal recovery following conversion from calcineurin inhibitor-based to proliferation signal inhibitor-based therapy are lacking. We hypothesized that plasma NGAL (P-NGAL) could predict improvement in glomerular filtration rate (GFR) after conversion to everolimus. PATIENTS & METHODS: P-NGAL was measured in 88 cardiac transplantation patients (median 5 years post-transplant) with renal dysfunction who were randomized to continuation of conventional calcineurin inhibitor-based immunosuppression or switching to an everolimus-based regimen. RESULTS: P-NGAL correlated with measured GFR (mGFR) at baseline (R² = 0.21; p < 0.001). Randomization to everolimus improved mGFR after 1 year (median [25th-75th percentiles]: ΔmGFR 5.5 [-0.5 to 11.5] vs. -1 [-7 to 4] ml/min/1.73 m²; p = 0.006). Baseline P-NGAL predicted mGFR after 1 year (R² = 0.18; p < 0.001), but this association disappeared after controlling for baseline mGFR. CONCLUSION: P-NGAL and GFR correlate in long-term heart transplantation recipients with renal dysfunction. P-NGAL did not predict improvement of renal function after conversion to everolimus-based immunosuppression.
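The pivotal analytic step, a baseline biomarker losing its apparent predictive value once baseline mGFR is controlled for, can be reproduced with ordinary least squares. A sketch with statsmodels on simulated data (all numbers invented to mimic the reported pattern):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 88
mgfr0 = rng.normal(45, 12, n)                    # baseline mGFR (hypothetical)
ngal = 200 - 2.0 * mgfr0 + rng.normal(0, 25, n)  # P-NGAL tracks baseline mGFR
mgfr1 = 0.9 * mgfr0 + rng.normal(5, 8, n)        # 1-year mGFR

# Unadjusted: P-NGAL appears to predict 1-year mGFR
m1 = sm.OLS(mgfr1, sm.add_constant(ngal)).fit()
print(m1.rsquared, m1.pvalues[1])

# Adjusted for baseline mGFR: the P-NGAL coefficient loses significance
X = sm.add_constant(np.column_stack([ngal, mgfr0]))
m2 = sm.OLS(mgfr1, X).fit()
print(m2.params[1], m2.pvalues[1])
```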
Abstract:
Alternative measures to trough concentrations [non-trough concentrations and limited area under the concentration-time curve (AUC)] have been shown to better predict tacrolimus AUC. The aim of this study was to determine whether these are also better predictors of adverse outcomes in long-term liver transplant recipients. The associations between tacrolimus trough concentrations (C0), non-trough concentrations (C1, C2, C4, C6/8) and AUC0-12 and the occurrence of hypertension, hyperkalaemia, hyperglycaemia and nephrotoxicity were assessed in 34 clinically stable liver transplant patients. The most common adverse outcome was hypertension, with a prevalence of 36%; hyperkalaemia and hyperglycaemia had prevalences of 21% and 13%, respectively. A sequential population pharmacokinetic/pharmacodynamic approach was implemented. No significant association between predicted C0, C1, C2, C4, C6/8 or AUC0-12 and adverse effects could be found. Tacrolimus concentrations and AUC measures were in the same range in patients with and without adverse effects. Measures reported to provide benefit in preventing graft rejection and minimizing acute adverse effects in the early post-transplant period were not able to predict adverse effects in stable adult liver recipients whose trough concentrations were maintained in the notional target range.
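The AUC0-12 used in such limited-sampling analyses is conventionally approximated from sparse concentration-time points with the linear trapezoidal rule. A dependency-free sketch (the profile values are hypothetical):

```python
def auc_0_12(times_h, concs_ng_ml):
    """Tacrolimus AUC0-12 (ng·h/ml) by the linear trapezoidal rule
    over a 12-hour dosing interval."""
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times_h, concs_ng_ml),
                                  zip(times_h[1:], concs_ng_ml[1:])):
        auc += (t1 - t0) * (c0 + c1) / 2.0
    return auc

# Hypothetical sparse profile: C0, C1, C2, C4, C6 and the end of the interval
times = [0, 1, 2, 4, 6, 12]
concs = [5.2, 18.4, 14.1, 9.8, 7.5, 5.4]
print(auc_0_12(times, concs))  # ~108 ng·h/ml
```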
Abstract:
WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT: • Tacrolimus is currently the mainstay of immunosuppression for most children undergoing liver transplantation (LT). • The clinical use of this agent, however, is complicated by its various adverse effects (mainly nephrotoxicity), its narrow therapeutic index and considerable pharmacokinetic variability. • The low and variable oral bioavailability of tacrolimus is thought to result from the action of the multidrug efflux pump P-glycoprotein, encoded by the ABCB1 gene. WHAT THIS STUDY ADDS: • A significant association between ABCB1 genetic polymorphisms and tacrolimus-associated nephrotoxicity in paediatric patients following LT is reported for the first time. Genotyping such polymorphisms may have the potential to better individualize initial tacrolimus therapy and enhance drug safety. • The long-term effect of ABCB1 polymorphisms on tacrolimus trough concentrations was investigated up to 5 years post-transplantation. A significant effect of intestinal P-glycoprotein genotypes on tacrolimus pharmacokinetics was found at 3 and 4 years post-transplantation, suggesting that the effect is maintained long term. AIMS: The aim of this study was to investigate the influence of genetic polymorphisms in ABCB1 on the incidence of nephrotoxicity and tacrolimus dosage requirements in paediatric patients following liver transplantation. METHODS: Fifty-one paediatric liver transplant recipients receiving tacrolimus were genotyped for the ABCB1 C1236>T, G2677>T and C3435>T polymorphisms. Dose-adjusted tacrolimus trough concentrations and estimated glomerular filtration rates (EGFR) indicative of renal toxicity were determined and correlated with the corresponding genotypes. RESULTS: The present study revealed a higher incidence of the ABCB1 variant alleles examined among patients with renal dysfunction (≥30% reduction in EGFR) at 6 months post-transplantation (1236T allele: 63.3% vs. 37.5% in controls, p = 0.019; 2677T allele: 63.3% vs. 35.9%, p = 0.012; 3435T allele: 60% vs. 39.1%, p = 0.057). Carriers of the 2677T variant allele also had a significant percentage reduction in EGFR at 12 months post-transplant (mean difference = 22.6%; p = 0.031). Haplotype analysis showed a significant association between T-T-T haplotypes and an increased incidence of nephrotoxicity at 6 months post-transplantation (haplotype frequency = 52.9% in nephrotoxic patients vs. 29.4% in controls; p = 0.029). Furthermore, the G2677>T and C3435>T polymorphisms and T-T-T haplotypes were significantly correlated with higher tacrolimus dose-adjusted pre-dose concentrations at various time points long after drug initiation. CONCLUSIONS: These findings suggest that ABCB1 polymorphisms in the native intestine significantly influence tacrolimus dosage requirements in the stable phase after transplantation. In addition, ABCB1 polymorphisms in paediatric liver transplant recipients may predispose them to nephrotoxicity over the first year post-transplantation. Genotyping future transplant recipients for ABCB1 polymorphisms could therefore have the potential to better individualize tacrolimus immunosuppressive therapy and enhance drug safety.
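Allele-frequency comparisons like those reported (e.g., 63.3% vs. 35.9% for the 2677T allele) are typically tested with a chi-square test on a 2x2 contingency table. A sketch with scipy, using counts invented to approximate the reported frequencies:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table of 2677T allele carriage:
#                    variant allele   wild type
# renal dysfunction        19             11
# controls                 23             41
table = [[19, 11], [23, 41]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```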
Abstract:
The cork oak (Quercus suber L.) is considered the national tree of Portugal and has special demands and legal protection regarding silvicultural management (pruning, debarking, thinning). As a slow-growing species, cork oak transplanting can be a valuable asset, for economic or ecological reasons, to relocate trees, re-populate areas affected by high tree mortality, increase tree density to control erosion in montado ecosystems, or support landscape design. This study focuses on the impacts and physiological responses of ten juvenile rain-fed cork oak trees (with diameter at breast height between 6 and 16 cm) subjected to transplant operations. The work was conducted in a cork oak woodland experimental plot at the campus of the University of Évora (SW Portugal) during 2015. Tree transplants were performed with a truck-mounted hydraulic spade transplanter, coupled with a proposed methodology to maximize tree survival rates, addressing techniques to limit canopy transpiration and to improve root systems prior to transplant. Tree ecophysiological indicators (sap flow, leaf water potentials and stomatal conductance) were monitored over the periods before and after transplant operations, and water stress avoidance practices were established to promote post-transplant tree recovery, including irrigation matched to the average daily accumulated sap flow. Transplant operations were considered successful when the tree's water uptake, inferred from sap flow, exhibited a high correlation with solar radiation and returned to its undisturbed, pre-transplant water potential gradients within the following 2 to 3 weeks. The post-transplant follow-up included permanent sap flow measurements and identified the time elapsed after transplantation at which a tree recovered its normal transpiration thresholds and response. Our results suggest that, by following the proposed methodology, the sampled cork oak trees exhibited a transplant success rate of 90%.
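The success criterion used, water uptake inferred from sap flow tracking solar radiation, reduces to a correlation test between the two time series. A sketch on hypothetical half-hourly data for a single tree:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical half-hourly series for one tree on a clear day:
# solar radiation (W/m^2) and sap flow (arbitrary units)
solar = np.array([0, 50, 200, 450, 700, 850, 800, 600, 350, 100, 0], float)
sapflow = 0.004 * solar + np.random.default_rng(1).normal(0, 0.2, solar.size)

r, p = pearsonr(solar, sapflow)
print(f"r={r:.2f}, p={p:.4f}")  # a high r suggests recovered canopy function
```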
Abstract:
INTRODUCTION: The orthotopic left lung transplantation model in rats has been developed to answer a variety of scientific questions in transplant immunology and in the related fields of respiratory diseases. However, its widespread use has been hampered by the complexity of the procedure. AIM OF THE RESEARCH: Our purpose is to provide a detailed description of this technique, including the complications and difficulties from the very first microsurgical step to the successful completion of the transplant procedure. MATERIALS AND METHODS: The transplant procedures were performed by two collaborating transplant surgeons with microsurgical and thoracic surgery skills. A total of 150 left lung transplants in rats were performed: 27 syngeneic (Lewis to Lewis) and 123 allogeneic (Brown-Norway to Lewis) transplants, using the cuff technique. RESULTS: In the first 50 transplant procedures, the post-transplant survival rate was 74%, of which 54% reached the end-point of 3 or 7 days post-transplant; the overall complication rate was 66%. In the subsequent 50 transplant surgeries (51 to 100), the post-transplant survival rate increased to 88%, of which 56% reached the end-point; the overall complication rate was 32%. In the final 50 transplants (101 to 150), the post-transplant survival rate was again 88%, of which 74% reached the end-point; the overall complication rate remained 32%. CONCLUSIONS: One hundred and fifty transplants represent a reasonable number of procedures to obtain a satisfactory surgical outcome. A training period with simpler animal models is mandatory to develop the anaesthesiological and microsurgical skills required to successfully establish this model, and collaboration between at least two microsurgeons is mandatory to perform all the simultaneous procedures required to complete the transplant surgery.
Abstract:
Aim: A positive effect of liver transplantation on health-related quality of life (HRQOL) has been well documented in previous studies using generic instruments. Our aim was to re-evaluate different aspects of HRQOL before and after liver transplantation with a relatively new questionnaire, the 'Liver Disease Quality of Life' (LDQOL) instrument. Methods: The LDQOL and the Short Form 36 (SF-36) questionnaires were applied to ambulatory patients either on the transplant list (n=65) or 6 months to 5 years after liver transplant (n=61). The aetiology of cirrhosis, comorbidities, model for end-stage liver disease (MELD) and Child-Pugh scores, and recurrence of liver disease after liver transplantation were analysed using the Mann-Whitney and Kruskal-Wallis tests. Results: In patients awaiting liver transplantation, MELD scores ≥15 and Child-Pugh class C were associated with statistically significantly worse HRQOL on both the SF-36 and the LDQOL questionnaires. Pretransplant HRQOL was significantly worse in patients with cirrhosis owing to hepatitis C (n=30) compared with other aetiologies (n=35) in 2/7 domains of the SF-36 and in 7/12 domains of the LDQOL. Significant deterioration of HRQOL after recurrence of hepatitis C post-transplant was detected with the LDQOL questionnaire, although not with the SF-36; the statistically significant differences were in the LDQOL domains symptoms of liver disease, concentration, memory and health distress. Conclusions: The LDQOL, a disease-specific instrument for measuring HRQOL, showed greater accuracy in relation to liver symptoms and could demonstrate, with better reliability, impairments before and after liver transplantation.
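Domain-score comparisons such as these rely on the Mann-Whitney U test, appropriate for ordinal, non-normally distributed questionnaire scores. A sketch with invented LDQOL domain scores:

```python
from scipy.stats import mannwhitneyu

# Hypothetical LDQOL "symptoms of liver disease" domain scores (0-100)
hcv_recurrence = [42, 55, 38, 60, 47, 51]
no_recurrence = [70, 65, 82, 58, 77, 69]

u, p = mannwhitneyu(hcv_recurrence, no_recurrence, alternative="two-sided")
print(f"U={u}, p={p:.3f}")
```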
Abstract:
Background: A retrospective analysis was performed on adult renal transplant recipients to evaluate the relationship between tacrolimus trough concentrations and the development of rejection in the first month after transplant. Methods: A total of 349 concentrations from 29 patients, measured by enzyme-linked immunosorbent assay (ELISA), were recorded. Based on an increased serum creatinine, 12 patients were considered to have organ rejection; rejection was confirmed by biopsy in five of these. The median trough concentration of tacrolimus over the first month of therapy, or until the time of first rejection, was compared between rejecters and non-rejecters. Results: Median trough concentrations of tacrolimus were lower in biopsy-proven rejecters vs. non-rejecters (p=0.03) and in all rejecters vs. non-rejecters (p=0.04). The average median concentration (± SD) in the biopsy-proven rejecter group was 5.09 ± 1.16 ng/ml, compared with 9.20 ± 3.52 ng/ml in the non-rejecter group. After exclusion of an outlier, the average median concentration in all rejecters was 5.57 ± 1.47 ng/ml, compared with 9.20 ± 3.52 ng/ml in non-rejecters. A rejection rate of 55% was found for patients with a median trough concentration between 0 and 10 ng/ml, compared with no observed rejection in patients with a median concentration between 10 and 15 ng/ml. Conclusion: A significant relationship exists between organ rejection and median tacrolimus trough concentrations in the first month post-transplant, with patients displaying low concentrations more likely to reject. To minimize rejection in the first month after renal transplantation, trough concentrations greater than 10 ng/ml should be achieved.
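The banding result (55% rejection at 0-10 ng/ml vs. none at 10-15 ng/ml) amounts to tabulating rejection rates by median-trough band. A sketch on invented per-patient data:

```python
import numpy as np

# Hypothetical per-patient median tacrolimus troughs (ng/ml) and rejection flags
troughs = np.array([4.8, 5.5, 6.1, 9.5, 11.2, 12.8, 10.4, 14.0, 8.9, 5.2])
rejected = np.array([1, 1, 1, 0, 0, 0, 0, 0, 1, 1], bool)

for lo, hi in [(0, 10), (10, 15)]:
    band = (troughs >= lo) & (troughs < hi)
    rate = rejected[band].mean() if band.any() else float("nan")
    print(f"{lo}-{hi} ng/ml: rejection rate {rate:.0%} (n={band.sum()})")
```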
Abstract:
Background: In severe aplastic anaemia, the treatment of choice for young patients with a human leucocyte antigen-matched sibling is now established as allogeneic bone marrow transplantation (BMT). In older patients and in those without a matched sibling donor, immunosuppressive therapy is the usual first option, while 'alternative' marrow donors are emerging as an option for those without a matched sibling donor. Aims: To review 10 years of local experience in treating severe aplastic anaemia with BMT and immunosuppressive therapy, with emphasis on long-term outcomes. Methods: A retrospective analysis was performed of all patients with severe aplastic anaemia presenting to the Royal Brisbane and Royal Children's Hospitals between 1989 and 1999. Data were abstracted regarding patient demographics, pretreatment characteristics and outcome measures, including response rates, overall survival and long-term complications. Results: Twenty-seven consecutive patients were identified, 12 treated with immunosuppression alone and 15 with BMT. In these two groups, transfusion independence was attained in 25% and 100%, respectively, with overall survival of 36% and 100%, respectively. Those treated with immunosuppression were significantly older (median 41.5 versus 22 years, p = 0.008). Long-term survivors of either treatment had extremely low morbidity, and three patients carried pregnancies to term post-transplant. Three patients received alternative donor BMT with correspondingly excellent survival. Conclusions: Patients treated with allogeneic BMT for severe aplastic anaemia enjoyed extremely good long-term survival and minimal morbidity. Patients treated with immunosuppressive therapy had a poorer outcome, reflecting their older age and the different usage of therapies over the past decade. Optimal treatment strategies for severe aplastic anaemia remain to be determined.
Abstract:
The purpose of this investigation was to assess changes in total energy expenditure (TEE), body weight (BW) and body composition following a peripheral blood stem cell transplant, and following participation in a 3-month, moderate-intensity, mixed-type exercise programme. The doubly labelled and singly labelled water methods were used to measure TEE and total body water (TBW). Body weight and TBW were then used to calculate percentage body fat (%BF), and fat and fat-free mass (FFM). TEE and body composition were assessed pretransplant (PI), immediately post-transplant (PII) and 3 months post-PII (PIII). Following PII, 12 patients were divided equally between a control group (CG) and an exercise intervention group (EG). While there was no change in TEE between pre- and post-transplant, BW (P
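Deriving FFM and %BF from body weight and TBW rests on the standard two-compartment assumption that fat-free mass is approximately 73% water; the 0.732 hydration constant below is the conventional value, not one stated in the abstract:

```python
def body_composition(bw_kg, tbw_kg, hydration=0.732):
    """Two-compartment model: fat-free mass is assumed to be ~73.2% water.
    Returns (fat-free mass, fat mass, % body fat)."""
    ffm = tbw_kg / hydration
    fat = bw_kg - ffm
    return ffm, fat, 100.0 * fat / bw_kg

ffm, fat, pbf = body_composition(bw_kg=70.0, tbw_kg=38.0)
print(f"FFM={ffm:.1f} kg, fat={fat:.1f} kg, %BF={pbf:.1f}%")  # ~51.9, ~18.1, ~25.9%
```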
Abstract:
Our purpose was to determine the impact of histological factors observed in zero-time biopsies on early post-transplant kidney allograft function, and specifically to compare the semi-quantitative Banff classification of zero-time biopsies with quantification of the percentage of cortical area fibrosis. Sixty-three zero-time deceased donor allograft biopsies were retrospectively scored semi-quantitatively using the Banff classification. By adding the individual chronic parameters, a Banff Chronic Sum (BCS) score was generated. The percentage of cortical area with Picro Sirius Red staining (%PSR) was assessed and calculated with a computer program. A negative linear regression between %PSR and GFR at 3 years post-transplantation was established (Y = 62.08 - 4.6412X; p=0.022). Significant negative correlations with GFR at 3 years were found for arteriolar hyalinosis (rho=-0.375; p=0.005), chronic interstitial damage (rho=-0.296; p=0.02), chronic tubular damage (rho=-0.276; p=0.04), chronic vascular damage (rho=-0.360; p=0.007) and BCS (rho=-0.413; p=0.002). However, no correlation was found between %PSR and Ci, Ct or BCS. In multivariate linear regression, the negative predictive factors for 3-year GFR were BCS in the histological model, and donor kidney age, recipient age and black race in the clinical model. The BCS seems a good and easy-to-perform tool, available to every pathologist, with significant short-term predictive value. The %PSR predicts short-term kidney function in univariate analysis but involves extra, time-consuming work beyond routine practice; we think that %PSR should be regarded as a research instrument.
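Both quantities central to this abstract are simple to compute: the BCS is the sum of the chronic Banff lesion scores, and the univariate fit can be applied directly. A sketch (the negative slope follows from the reported negative regression; function names are illustrative):

```python
def banff_chronic_sum(ci, ct, cv, ah):
    """Banff Chronic Sum: sum of the chronic Banff lesion scores (each 0-3):
    interstitial fibrosis (ci), tubular atrophy (ct), chronic vascular (cv)
    and arteriolar hyalinosis (ah)."""
    return ci + ct + cv + ah

def predicted_gfr_3y(psr_percent):
    """Reported univariate fit of 3-year GFR on % cortical fibrosis (%PSR):
    GFR = 62.08 - 4.6412 * %PSR."""
    return 62.08 - 4.6412 * psr_percent

print(banff_chronic_sum(1, 1, 0, 2))  # BCS = 4
print(predicted_gfr_3y(3.0))          # ~48.2
```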
Abstract:
Aim: To clinically characterise patients with C4d deposits in peritubular capillaries (C4dPTCD) and/or circulating anti-HLA class I/II alloantibodies, and to determine the correlation between positive C4dPTCD and circulating anti-HLA class I/II alloantibodies during episodes of graft dysfunction. Subjects and Methods: C4d staining was performed on biopsies with available frozen tissue obtained between January 2004 and December 2006. The study was prospective from March 2005, when a serum sample was obtained at the time of biopsy to detect circulating anti-HLA class I/II alloantibodies. Results: We studied 109 biopsies from 86 cadaver renal transplant patients. Sixteen of these (14.7%) presented diffuse positive C4dPTCD. The incidence of +C4dPTCD was 13.5% within the first six months of transplantation and 16% after six months (p>0.05). Half of the +C4dPTCD cases in the first six months were associated with acute humoral rejection. After six months, the majority of +C4dPTCD cases (7/8) were present in biopsies with evidence of interstitial fibrosis/tubular atrophy and/or transplant glomerulopathy. C4dPTCD was more frequent in patients with positive anti-HCV antibodies (p<0.0001), a previous renal transplant (p=0.007), and a panel reactive antibody (PRA) level ≥50% (p=0.0098). The anti-HCV+ patients had longer time on dialysis (p=0.0019) and higher PRA (p=0.005). Circulating anti-HLA I/II alloantibodies were screened in 46 serum samples and were positive in 10.9% of samples, all obtained after six months post-transplant. Circulating alloantibodies were absent in 92.5% of the C4d-negative biopsies. Conclusion: We found an association between the presence of C4dPTCD and second transplants, higher PRA and the presence of anti-HCV antibodies. The presence of HCV antibodies is not a risk factor for C4dPTCD per se, but appears to reflect longer time on dialysis and presensitisation. In renal dysfunction, a negative alloantibody screening is associated with a reduced risk of C4dPTCD (<10%).
Abstract:
Introduction: The clinical importance of humoral-mediated acute rejection has been progressively recognised, and early recognition and treatment with plasmapheresis and intravenous immunoglobulin have recently improved short-term prognosis. Case report: We describe the clinical features of three second-transplant patients who developed severe acute humoral rejection during the first week post-transplant while on anti-thymocyte globulin therapy. Treatment with plasmapheresis, intravenous immunoglobulin and rituximab resulted in rapid reversal of oliguria and recovery of renal function within the first week of treatment in two of the three patients. Diagnosis was confirmed by graft biopsies revealing peritubular neutrophils and C4d deposits. Sequential graft biopsies in all three patients revealed complete histological recovery within two weeks. One patient never recovered renal function, and one patient lost his graft at three months following hemorrhagic shock. After two years of follow-up, the remaining patient maintains a serum creatinine of 1.1 mg/dl. Conclusion: The regimen of plasmapheresis plus intravenous immunoglobulin and rituximab was effective in rapidly reversing severe acute humoral rejection.
Abstract:
Background: Primary graft dysfunction is the main cause of early mortality after heart transplantation, and mechanical circulatory support has been used to treat this syndrome. Objective: To describe our experience with extracorporeal membrane oxygenation for post-transplant primary cardiac graft dysfunction. Methods: Between January 2007 and December 2013, a total of 71 orthotopic heart transplantations were performed in patients with advanced heart failure. Eleven (15.5%) of these patients, who presented primary graft dysfunction, constituted the population of this study. Primary graft dysfunction manifested in our population as failure to wean from cardiopulmonary bypass in six (54.5%) patients, severe hemodynamic instability in the immediate postoperative period with severe cardiac dysfunction in three (27.3%), and cardiac arrest in two (18.2%). The average ischemia time was 151 ± 82 minutes. Once the diagnosis of primary graft dysfunction was established, we installed mechanical circulatory support to stabilize the patients' severe hemodynamic condition and followed their progression longitudinally. Results: The average duration of extracorporeal membrane oxygenation support was 76 ± 47.4 hours (range 32 to 144 hours). Weaning with cardiac recovery was successful in nine (81.8%) patients; however, two patients who presented cardiac recovery did not survive to hospital discharge. Conclusion: Mechanical circulatory support with central extracorporeal membrane oxygenation promoted cardiac recovery within a few days in most patients.