924 results for Graft Rejection
Abstract:
Alternative measures to trough concentrations [non-trough concentrations and limited area under the concentration-time curve (AUC)] have been shown to better predict tacrolimus AUC. The aim of this study was to determine whether these are also better predictors of adverse outcomes in long-term liver transplant recipients. The associations between tacrolimus trough concentrations (C-0), non-trough concentrations (C-1, C-2, C-4, C-6/8), and AUC(0-12) and the occurrence of hypertension, hyperkalaemia, hyperglycaemia and nephrotoxicity were assessed in 34 clinically stable liver transplant patients. The most common adverse outcome was hypertension, with a prevalence of 36%. Hyperkalaemia and hyperglycaemia had a prevalence of 21% and 13%, respectively. A sequential population pharmacokinetic/pharmacodynamic approach was implemented. No significant association between predicted C-0, C-1, C-2, C-4, C-6/8 or AUC(0-12) and adverse effects could be found. Tacrolimus concentrations and AUC measures were in the same range in patients with and without adverse effects. Measures reported to provide benefit, preventing graft rejection and minimizing acute adverse effects in the early post-transplant period, were not able to predict adverse effects in stable adult liver recipients whose trough concentrations were maintained in the notional target range.
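The limited-sampling AUC(0-12) mentioned above is typically computed from a handful of timed concentrations with the trapezoidal rule. The following is a minimal sketch of that calculation; the sampling times and concentration values are invented for illustration and are not data from the study.

```python
# Hedged sketch: estimating a tacrolimus AUC(0-12) from sparse concentration
# samples using the linear trapezoidal rule. All values below are
# hypothetical, not taken from the abstract above.

def trapezoidal_auc(times, concentrations):
    """Area under the concentration-time curve by linear trapezoids.

    times: sampling times in hours; concentrations: matching levels in ng/ml.
    """
    auc = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        auc += dt * (concentrations[i] + concentrations[i - 1]) / 2.0
    return auc

# Hypothetical 12-h profile: trough (C-0), then samples at 1, 2, 4, 8, 12 h.
times = [0, 1, 2, 4, 8, 12]
conc = [5.0, 18.0, 14.0, 10.0, 7.0, 5.5]
print(trapezoidal_auc(times, conc))  # AUC(0-12) in ng*h/ml
```

In practice, limited-sampling strategies fit a population pharmacokinetic model rather than relying on raw trapezoids, but the trapezoidal estimate is the usual reference computation.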
Abstract:
Accelerated graft rejection can be used to determine immune memory in the gorgonian coral Swiftia exserta. The extent of the persistence of immune memory was determined in this experiment using replicate sets time-elapsed from 1, 3, and 6 months. Although corals lack circulatory systems, which can be a component of adaptive systemic immunity, this study attempted to determine whether this gorgonian coral is capable of transmitting immune information throughout its colonial body. Results showed that at each of the time points (one, three, and six months) the secondary response group and the primary response group were significantly different (at p = 0.001), demonstrating long-term immune memory. While the primary response group and the third-party specificity response group were similar, both were significantly different (at p = 0.001) from the secondary response group, which shows the response to be specific, with memory applicable to the original antigen. Systemic immunity was not detected at 15 cm and one week after initial sensitization.
Abstract:
Introduction: Tacrolimus is the drug of choice for preventing hepatic graft rejection. Its dose is adjusted on the basis of serum levels, which are measured periodically to ensure the therapeutic range is maintained. Moreover, elevated levels are associated with post-transplant renal dysfunction. However, there is no consensus on the appropriate levels for liver transplant patients. Objective: To determine the relationship between tacrolimus levels and the occurrence of acute hepatic graft rejection in patients transplanted at the Fundación Cardioinfantil – Instituto de Cardiología (FCI-IC), and to determine the relationship between tacrolimus levels and GFR in liver transplant patients at the FCI-IC. Methods: Observational historical cohort study of adult patients who underwent liver transplantation at the FCI-IC between 2009 and 2014. Results: No statistically significant association was found between tacrolimus levels and acute rejection under its different definitions (OR = 1.02, p = 0.14 and OR = 1.01, p = 0.29), even after adjusting for other covariates (OR = 1.03, p = 0.10 and OR = 1.02, p = 0.25). The diagnosis could not be corroborated by biopsy because not all patients had one. Although the relationship between tacrolimus levels and GFR was statistically significant (p ≤ 0.001), its clinical impact is low, since GFR decreased by less than one point for each 1 ng/ml increase in tacrolimus levels. Conclusions: Further studies are needed to establish the relationship between tacrolimus exposure and these outcomes, in order to define whether reducing the dose is safe as a means of reducing adverse events.
Abstract:
Background: Kidney transplantation is the best therapeutic alternative for end-stage chronic kidney disease. Immunosuppressive drugs prevent rejection. Antibody-mediated rejection is frequent and reduces graft function and survival. Objective: To systematically evaluate the available evidence on the efficacy and safety of treatment for antibody-mediated rejection in kidney transplant recipients. Methods: Systematic review of the MEDLINE, EMBASE, Scopus, and Virtual Health Library databases, plus grey literature (Google Scholar, Google Académico, www.clinicaltrialsregister.eu, and https://clinicaltrials.gov/) and a manual search of the references of pre-selected articles and of previously published reviews. PRISMA guideline recommendations were followed for the identification, screening, and selection of potential articles according to the inclusion criteria. Data were extracted according to the study variables, and the quality of the selected articles was assessed using the Cochrane risk-of-bias tool. Results: Nine clinical trials published between 1980 and 2016 were selected, comprising 222 patients (113 in the intervention arm and 109 in the control arm), with a mean follow-up of 16 months. The interventions evaluated were plasmapheresis, immunoadsorption, and rituximab. There was wide heterogeneity in the definition of inclusion criteria, diagnostic criteria for rejection, and measures for evaluating the efficacy of the interventions. Three studies found statistically significant differences between treatment groups. Conclusions: The evidence on the efficacy of treatments for antibody-mediated rejection of renal grafts is of low quality. Controlled clinical trials are needed to define the optimal treatment for these patients.
Abstract:
Extracellular nucleotides (e.g. ATP, UTP, ADP) are released by activated endothelium, leukocytes and platelets within the injured vasculature and bind specific cell-surface type-2 purinergic (P2) receptors. This process drives vascular inflammation and thrombosis within grafted organs. Importantly, there are also vascular ectonucleotidases, i.e. ectoenzymes that hydrolyze extracellular nucleotides in the blood to generate nucleosides (viz. adenosine). Endothelial cell NTPDase1/CD39 has been shown to critically modulate levels of circulating nucleotides. This process tends to limit the activation of platelet- and leukocyte-expressed P2 receptors and also generates adenosine to reverse inflammatory events. This vascular protective CD39 activity is rapidly inhibited by oxidative reactions, such as is observed with liver ischemia reperfusion injury. In this review, we chiefly address the impact of these signaling cascades following liver transplantation. Interestingly, the hepatic vasculature, hepatocytes and all non-parenchymal cell types express several components co-ordinating the purinergic signaling response. With hepatic and vascular dysfunction, we note heightened P2 receptor expression and alterations in ectonucleotidase expression and function that may predispose to progression of disease. In addition to documented impacts upon the vasculature during engraftment, extracellular nucleotides also have direct influences upon liver function and bile flow (both under physiological and pathological states). We have recently shown that alterations in purinergic signaling mediated by altered CD39 expression have major impacts upon hepatic metabolism, repair mechanisms, regeneration and associated immune responses.
Future clinical applications in transplantation might involve new therapeutic modalities using soluble recombinant forms of CD39, altering expression of this ectonucleotidase by drugs and/or using small molecules to inhibit deleterious P2-mediated signaling while augmenting beneficial adenosine-mediated effects within the transplanted liver.
Abstract:
At present, acute vascular rejection (AVR) remains a primary obstacle inhibiting long-term graft survival in the pig-to-non-human primate transplant model. The present study was undertaken to determine whether repetitive injection of low-dose Yunnan-cobra venom factor (Y-CVF), a potent complement inhibitor derived from the venom of Naja kaouthia, can completely abrogate hemolytic complement activity and subsequently improve the results in a pig-to-rhesus monkey heterotopic heart transplant model. Nine adult rhesus monkeys received a heterotopic heart transplant from wild-type pigs, and the recipients were allocated into two groups: group 1 (n = 4) received repetitive injection of low-dose Y-CVF until the end of the study, and group 2 (n = 5) did not receive Y-CVF. All recipients were treated with cyclosporine A (CsA), cyclophosphamide (CyP) and steroids. Repetitive Y-CVF treatment led to a dramatic fall in CH50 and serum C3 levels (CH50 < 3 units; C3 remained undetectable throughout the experiment) and successfully prevented hyperacute rejection (HAR), while three of five animals in group 2 underwent HAR. However, the continuous suppression of circulating complement did not prevent AVR, and the grafts in group 1 survived from 8 to 13 days. Despite undetectable C3 in circulating blood, C3 deposition was present in these grafts. Venular thrombosis was the predominant histopathologic feature of AVR. We conclude that repetitive injection of low-dose Y-CVF can be used to continuously suppress circulating complement in a very potent manner and successfully prevent HAR. However, this therapy did not inhibit complement deposition in the graft and failed to prevent AVR. These data suggest that using alternative pig donors [i.e. human decay accelerating factor (hDAF)-transgenic] in combination with the systemic use of complement inhibitors may be necessary to further control complement activation and improve survival in the pig-to-non-human primate xenotransplant model.
Abstract:
Background. The success of transplantation is hampered by rejection of the graft by alloreactive T cells. Donor dendritic cells (DC) have been shown to be required for direct priming of immune responses to antigens from major histocompatibility complex-mismatched grafts. However, for immune responses to major histocompatibility complex-matched, minor histocompatibility (H) antigen mismatched grafts, the magnitude of the T-cell response to directly presented antigens is reduced, and the indirect pathway is more important. Therefore, we aimed to investigate the requirement for donor DC to directly present antigen from minor H antigen mismatched skin and hematopoietic grafts.
Abstract:
Purpose. To evaluate the long-term graft survival and complications of flexible, open-loop anterior-chamber intraocular lenses in patients with penetrating keratoplasty for pseudophakic or aphakic bullous keratopathy. Methods. We reviewed charts of all consecutive patients who underwent penetrating keratoplasty for pseudophakic or aphakic bullous keratopathy combined with implantation of a flexible, open-loop, anterior-chamber intraocular lens at our institution between 1983 and 1988. One hundred one eyes of 99 patients were evaluated. Graft-survival rates were calculated by using the Kaplan-Meier actuarial method. Results. Mean follow-up was 49.8 months (range, 1-144). The probability of graft survival at 1, 2, 4, 6, and 8 years was 93, 87, 78, 65, and 65%, respectively. A total of 25 (24.8%) grafts failed. Progressive corneal edema without signs of rejection was the most common finding in patients with failed grafts (10 eyes, 40%). The most frequent complication observed was newly diagnosed or worsening of preexisting glaucoma (46 eyes, 45.5%). Conclusions. Our long-term results support flexible, open-loop anterior-chamber intraocular lenses as a reasonable option, at the time of penetrating keratoplasty, in patients with pseudophakic and aphakic bullous keratopathy.
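The Kaplan-Meier actuarial method used above estimates the survival curve as a running product over event times. The following is a minimal sketch of that product-limit estimator; the follow-up times and event indicators are invented for illustration, not data from the study.

```python
# Hedged sketch of the Kaplan-Meier product-limit estimator used for
# graft-survival probabilities. All input values below are hypothetical.

def kaplan_meier(times, events):
    """times: follow-up durations; events: 1 = graft failure, 0 = censored.
    Returns (event_time, survival_probability) pairs at each failure time."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(e for tt, e in pairs if tt == t)    # failures at time t
        n_t = sum(1 for tt, _ in pairs if tt == t)  # subjects leaving risk set
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # multiply by conditional survival
            curve.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return curve

# Hypothetical cohort: failures at years 1, 2, 3; censoring at years 2, 4, 5.
curve = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 0, 1, 1, 0, 0])
print(curve)
```

Censored subjects reduce the risk set without triggering a step in the curve, which is what distinguishes this estimator from a naive failure fraction.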
Abstract:
Objective: To investigate the effect of socioeconomic deprivation on cornea graft survival in the United Kingdom.
Design: Retrospective cohort study.
Participants: All the recipients (n = 13,644) undergoing their first penetrating keratoplasty (PK) registered on the United Kingdom Transplant Registry between April 1999 and March 2011 were included.
Methods: Data of patients' demographic details, indications, graft size, corneal vascularization, surgical complication, rejection episodes, and postoperative medication were collected at the time of surgery and 1, 2, and 5 years postoperatively. Patients with endophthalmitis were excluded from the study. Patients' home postcodes were used to determine the socioeconomic status using a well-validated deprivation index in the United Kingdom: A Classification of Residential Neighborhoods (ACORN). Kaplan–Meier survival and Cox proportional hazards regression were used to evaluate the influence of ACORN categories on 5-year graft survival, and the Bonferroni method was used to adjust for multiple comparisons.
Main Outcome Measures: Patients' socioeconomic deprivation status and corneal graft failure.
Results: A total of 13,644 patients received their first PK during the study period. A total of 1685 patients (13.36%) were lost to follow-up, leaving 11,821 patients (86.64%) for analysis. A total of 138 of the 11,821 patients (1.17%) developed endophthalmitis. The risk of graft failure within 5 years for the patients classified as hard-pressed was 1.3 times that of the least deprived (hazard ratio, 1.3; 95% confidence interval, 1.1–1.5; P = 0.003) after adjusting for confounding factors and indications. There were no statistically significant differences between the causes of graft failure and the level of deprivation (P = 0.14).
Conclusions: Patients classified as hard-pressed had an increased risk of graft failure within 5 years compared with the least deprived patients.
Financial Disclosure(s): The author(s) have no proprietary or commercial interest in any materials discussed in this article.
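Hazard ratios from a Cox model, like the HR of 1.3 (95% CI 1.1–1.5) reported above, are estimated on the log scale, and the confidence interval is exponentiated back. The sketch below shows that relationship; the standard error used is an invented illustrative value, not one reported by the study.

```python
import math

# Hedged sketch: how a Cox-model hazard ratio and its 95% CI relate to the
# log-scale estimate. The standard error (0.078) is hypothetical, chosen only
# so the interval roughly resembles the one reported above.

def hr_confidence_interval(hr, se_log_hr, z=1.96):
    """95% confidence interval for a hazard ratio, built on the log scale."""
    log_hr = math.log(hr)
    lo = math.exp(log_hr - z * se_log_hr)
    hi = math.exp(log_hr + z * se_log_hr)
    return lo, hi

lo, hi = hr_confidence_interval(1.3, 0.078)
print(round(lo, 2), round(hi, 2))
```

Because the interval is symmetric on the log scale, it is asymmetric around the HR itself, which is why published intervals like 1.1–1.5 are not centred on 1.3.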
Abstract:
In this review, we discuss a paradigm whereby changes in the intragraft microenvironment promote or sustain the development of chronic allograft rejection. A key feature of this model involves the microvasculature including (a) endothelial cell (EC) destruction, and (b) EC proliferation, both of which result from alloimmune leukocyte- and/or alloantibody-induced responses. These changes in the microvasculature likely create abnormal blood flow patterns and thus promote local tissue hypoxia. Another feature of the chronic rejection microenvironment involves the overexpression of vascular endothelial growth factor (VEGF). VEGF stimulates EC activation and proliferation and it has potential to sustain inflammation via direct interactions with leukocytes. In this manner, VEGF may promote ongoing tissue injury. Finally, we review how these events can be targeted therapeutically using mTOR inhibitors. EC activation and proliferation as well as VEGF-VEGFR interactions require PI-3K/Akt/mTOR intracellular signaling. Thus, agents that inhibit this signaling pathway within the graft may also target the progression of chronic rejection and thus promote long-term graft survival.
Abstract:
Objective: We investigated the influence of acute inflammation on skin isograft acceptance. Methods: Two mouse lines selected for maximal (AIR(MAX)) or minimal (AIR(MIN)) inflammatory response were transplanted with syngeneic skin. Cellular infiltrates and cytokine production were measured 1, 3, 7 or 14 days post-transplantation. The percentage of CD4(+) CD25(+) Foxp3(+) cells in the lymph nodes was also evaluated. Results: Grafts were totally accepted in 100% of AIR(MAX) and in 26% of AIR(MIN) mice. In the latter, partial acceptance was observed in 74% of the animals. Emigrated cells were predominantly PMN and were enhanced in AIR(MAX) transplants. IL-10 production by graft-infiltrating cells showed no interline differences. IFN-gamma was increased in AIR(MIN) grafts at day 14, and lower percentages of CD4(+)CD25(+)Foxp3(+) cells in the lymph nodes were observed in these mice. Conclusions: Our data suggest that differences in graft acceptance might be due to a lack of appropriate regulation of the inflammatory response in AIR(MIN) mice, compromising self/non-self recognition.
Abstract:
Low doses of irradiation combined with bone marrow cell infusion do not prevent graft-versus-host disease after intestinal transplantation. OBJECTIVE: This study evaluated the potential advantage of extending the immunosuppressive regimen combined with infusion of T-cell-depleted donor bone marrow cells in preventing graft-versus-host disease after intestinal transplantation. METHODS: Heterotopic small bowel transplantation was performed with Lewis rats as recipients and DA rats as donors, distributed into five groups according to the duration of immunosuppression, irradiation, and the use of normal or depleted bone marrow: G1 (n = 6) was not irradiated, while G2 (n = 9), G3 (n = 4), G4 (n = 5) and G5 (n = 6) were irradiated with 250 rad. Groups 1, 2 and 4 were infused with 100 x 10(6) normal bone marrow cells, and G3 and G5 with depleted cells. Animals in G1, G2 and G3 were immunosuppressed with 1 mg/kg FK506 IM for five days, and G4 and G5 for 15 days. Monoclonal antibodies against CD3 cells and magnetic columns were used for bone marrow depletion. The animals were examined for rejection, graft-versus-host disease and chimerism, with intestinal and skin biopsies. RESULTS: Minimal rejection was observed in all groups; however, graft-versus-host disease occurred only in irradiated animals. Extending immunosuppression altered the severity of the reaction in the animals of G4 and G5. Rejection was the cause of death in G1, and graft-versus-host disease in Groups 2, 3, 4 and 5, which was not controlled by the infusion of depleted bone marrow. Total and T-cell donor chimerism was statistically higher in the irradiated groups compared with G1. CONCLUSION: Extending the immunosuppressive regimen combined with low-dose irradiation decreased the severity of graft-versus-host disease, which was not abolished by the use of depleted bone marrow.
Abstract:
Background Post-transplant anemia is multifactorial and highly prevalent. Some studies have associated anemia with mortality and graft failure. The purpose of this study was to assess whether the presence of anemia at 1 year is an independent risk factor for mortality and graft survival. Methods All patients transplanted at a single center who survived at least 1 year after transplantation and showed no graft loss (n = 214) were included. Demographic and clinical data were collected at baseline and at 1 year. Patients were divided into two groups (anemic and nonanemic) based on the presence of anemia (hemoglobin < 130 g/l in men and < 120 g/l in women). Results Baseline characteristics such as age, gender, type of donor, CKD etiology, rejection, and mismatches were similar in both groups. Creatinine clearance was similar in both anemic and nonanemic groups (69.32 ± 29.8 vs. 75.69 ± 30.5 ml/min; P = 0.17). A Kaplan-Meier plot showed significantly poorer death-censored graft survival in the anemic group, P = 0.003. Multivariate analysis revealed that anemic patients had a hazard ratio for graft loss of 3.85 (95% CI: 1.49-9.96; P = 0.005). Conclusions In this study, anemia at 1 year was independently associated with death-censored graft survival, and anemic patients were 3.8-fold more likely to lose the graft. © 2010 Springer Science+Business Media, B.V.
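The sex-specific anemia definition used above (hemoglobin < 130 g/l in men, < 120 g/l in women) is a simple threshold rule. A minimal sketch of how patients would be dichotomized into the anemic and nonanemic groups, with hypothetical patient values:

```python
# Hedged sketch of the anemia classification used in the study above:
# hemoglobin < 130 g/l in men, < 120 g/l in women. Patient values below
# are invented for illustration.

def is_anemic(hemoglobin_g_l, sex):
    """Classify a patient as anemic using sex-specific thresholds."""
    threshold = 130 if sex == "male" else 120
    return hemoglobin_g_l < threshold

# A hemoglobin of 125 g/l is anemic for a man but not for a woman.
print(is_anemic(125, "male"))    # below the 130 g/l male threshold
print(is_anemic(125, "female"))  # above the 120 g/l female threshold
```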
Abstract:
Glomerulitis and peritubular capillaritis have been recognized as important lesions in acute renal rejection (AR). We studied glomerulitis and peritubular capillaritis in AR by two methods and investigated associations with C4d, type/grade of AR, and allograft survival time. Glomerulitis was measured according to Banff scores (glomerulitis by Banff Method [gBM]) and by counting the number of intraglomerular inflammatory cells (glomerulitis by Quantitative Method [gQM]). Capillaritis was classified by the Banff scoring system (peritubular capillaritis by Banff Method [ptcBM]) and by counting the number of cells in peritubular capillaries in 10 high-power fields (hpf; peritubular capillaritis by Quantitative Method [ptcQM]). These quantitative analyses were performed in an attempt to improve our understanding of the role played by glomerulitis and capillaritis in AR. The g0 + g1 group (gBM) was associated with negative C4d (P = .02). For peritubular capillaritis, a larger number of cells per 10 hpf in peritubular capillaries (ptcQM) was observed in C4d-positive cases (P = .03). The g2 + g3 group (gBM) correlated with graft loss (P = .01). Peritubular capillaritis was not significantly related to graft survival time. Our study showed that the Banff scoring system is the best method to study glomerulitis; we also observed that the evaluation of capillaritis in routine biopsies is difficult, and additional studies are required for a better understanding of its meaning in AR biopsy specimens of renal allografts.
Abstract:
No safe ultrasound (US) parameters have been established to differentiate the causes of graft dysfunction. This study aimed to define US parameters and identify predictors of normal graft evolution, delayed graft function (DGF), and rejection in the early period after kidney transplantation. Between June 2012 and August 2013, 79 renal transplant recipients underwent US examination 1-3 days post-transplantation. Resistive index (RI), power Doppler (PD), and RI + PD (quantified PD) were assessed. Patients were allocated into three groups: normal graft evolution, DGF, and rejection. Resistive index of the upper and middle segments and PD were higher in the DGF group than in the normal group. ROC curve analysis revealed that RI + PD was the index that best correlated with DGF (cutoff = 0.84). In the high RI + PD group, time to renal function recovery (6.33 ± 6.5 days) and number of dialysis sessions (2.81 ± 2.8) were greater than in the low RI + PD group (2.11 ± 5.3 days and 0.69 ± 1.5 sessions, respectively), p = 0.0001. Multivariate analysis showed that high donor final creatinine, with a relative risk (RR) of 19.7 (2.01-184.7, p = 0.009), and older donor age (RR = 1.17 (1.04-1.32), p = 0.007) correlated with risk of DGF. Quantified PD (RI + PD) was the best predictor of DGF. PD quantification has not been previously reported.
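An ROC-derived cutoff like the RI + PD threshold of 0.84 reported above is commonly chosen by maximising Youden's J (sensitivity + specificity − 1) over candidate thresholds. The sketch below illustrates that selection; the RI + PD scores are invented for illustration and are not data from the study.

```python
# Hedged sketch: choosing a diagnostic cutoff for a continuous index
# (here, a hypothetical RI + PD score) by maximising Youden's J, as is
# commonly done in ROC analysis. Higher scores are treated as indicating DGF.

def best_cutoff(scores_positive, scores_negative):
    """Return (threshold, Youden's J) maximising sensitivity + specificity - 1.

    scores_positive: index values in patients with the outcome (e.g. DGF);
    scores_negative: index values in patients without it.
    """
    best_t, best_j = None, -1.0
    for t in sorted(set(scores_positive + scores_negative)):
        sens = sum(s >= t for s in scores_positive) / len(scores_positive)
        spec = sum(s < t for s in scores_negative) / len(scores_negative)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical RI + PD values in DGF and normal-evolution patients.
cutoff, j = best_cutoff([0.86, 0.90, 0.84, 0.88], [0.70, 0.78, 0.82, 0.85])
print(cutoff, j)
```

Real ROC software evaluates thresholds between observed values and reports the full sensitivity/specificity curve; this sketch only shows the cutoff-selection idea.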