978 results for graft failure
Abstract:
OBJECTIVES Fontan failure (FF) represents a growing and challenging indication for paediatric orthotopic heart transplantation (OHT). The aim of this study was to identify predictors of the best mid-term outcome in OHT after FF. METHODS Twenty-year multi-institutional retrospective analysis of OHT for FF. RESULTS Between 1991 and 2011, 61 patients, mean age 15.0 ± 9.7 years, underwent OHT for a failing atriopulmonary connection (17 patients; 27.8%) or total cavopulmonary connection (44 patients; 72.2%). Modalities of FF included arrhythmia (14.8%), complex obstructions in the Fontan circuit (16.4%), protein-losing enteropathy (PLE) (22.9%), impaired ventricular function (31.1%) or a combination of the above (14.8%). The mean interval between Fontan completion and OHT was 10.7 ± 6.6 years. Early FF occurred in 18%, requiring OHT 0.8 ± 0.5 years after Fontan. The hospital mortality rate was 18.3%, mainly secondary to infection (36.4%) and graft failure (27.3%). The mean follow-up was 66.8 ± 54.2 months. The overall Kaplan-Meier survival estimate was 81.9 ± 1.8% at 1 year, 73 ± 2.7% at 5 years and 56.8 ± 4.3% at 10 years. The Kaplan-Meier 5-year survival estimate was 82.3 ± 5.9% in late FF and 32.7 ± 15.0% in early FF (P = 0.0007). Late FF with poor ventricular function exhibited a 91.5 ± 5.8% 5-year post-OHT survival. PLE was cured in 77.7% of hospital survivors, but the 5-year Kaplan-Meier survival estimate in PLE was 46.3 ± 14.4% vs 84.3 ± 5.5% in non-PLE (P = 0.0147). Cox proportional hazards analysis identified early FF (P = 0.0005), complex Fontan pathway obstruction (P = 0.0043) and PLE (P = 0.0033) as independent predictors of 5-year mortality. CONCLUSIONS OHT is an excellent surgical option for late FF with impaired ventricular function. Protein loss improves with OHT, but PLE negatively affects the mid-term OHT outcome, mainly because of early infective complications.
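For readers unfamiliar with the survival methods cited above, the following minimal sketch shows how a Kaplan-Meier estimate and a Cox proportional hazards model of this kind can be fitted with the Python lifelines library. The data frame, follow-up times and predictor columns are hypothetical illustrations, not the study's data.

```python
# Minimal sketch (hypothetical data, not the study's): Kaplan-Meier survival
# estimation and a Cox proportional hazards model with the three candidate
# predictors named in the abstract (early FF, Fontan pathway obstruction, PLE).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months":      [12, 60,  3, 84, 30, 96, 48, 120, 18, 72,  9, 54],  # follow-up time
    "died":        [ 0,  1,  1,  0,  1,  0,  1,   0,  1,  0,  1,  0],  # event indicator
    "early_ff":    [ 0,  0,  1,  0,  0,  1,  0,   0,  1,  0,  1,  1],
    "obstruction": [ 0,  0,  0,  0,  1,  1,  1,   0,  0,  0,  1,  0],
    "ple":         [ 0,  1,  0,  0,  1,  0,  0,   0,  1,  1,  0,  1],
})

# Overall Kaplan-Meier survival estimate.
km = KaplanMeierFitter()
km.fit(durations=df["months"], event_observed=df["died"])
print(km.survival_function_)       # estimated survival probability over time

# Cox proportional hazards model; the remaining columns are used as covariates.
cox = CoxPHFitter()
cox.fit(df, duration_col="months", event_col="died")
cox.print_summary()                # hazard ratios, confidence intervals, p-values
```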
Abstract:
Field: Surgery. Abstract: Background: Preservation of cardiac grafts for transplantation is not standardized and most centers use a single administration of crystalloid solution at the time of harvesting. We investigated possible benefits of an additional dose of cardioplegia dispensed immediately before implantation. Methods: Consecutive adult cardiac transplantations (2005–2012) were reviewed. Hearts were harvested following a standard protocol (Celsior 2 L, 4–8°C). In 2008, 100 ml of crystalloid cardioplegic solution was added and administered immediately before implantation. Univariate and logistic regression analyses were used to investigate risk factors for post-operative graft failure and mid-term outcome. Results: A total of 81 patients, 44 standard vs. 37 with additional cardioplegia ("CardioC"), were analyzed. Recipients and donors were comparable in both groups. CardioC patients demonstrated a reduced need for defibrillation (24 vs. 48%, p = 0.03), post-operative ratio of CK-MB/CK (10.1 ± 3.9 vs. 13.3 ± 4.2%, p = 0.001), intubation time (2.0 ± 1.6 vs. 7.2 ± 11.5 days, p = 0.05), and ICU stay (3.9 ± 2.1 vs. 8.5 ± 7.8 days, p = 0.001). Actuarial survival was reduced when graft ischemic time was >180 min in the standard group but not in CardioC patients (p = 0.033). Organ ischemic time >180 min (OR: 5.48, CI: 1.08–27.75), donor female gender (OR: 5.84, CI: 1.13–33.01), and recipient/donor age >60 (OR: 6.33, CI: 0.86–46.75), but not the additional cardioplegia or the observation period, appeared to be independent predictors of post-operative acute graft failure. Conclusion: An additional dose of cardioplegia administered immediately before implantation may be a simple way to improve the early and late outcome of cardiac transplantation, especially in situations of prolonged graft ischemia. A large, ideally multicentric, randomized study is desirable to verify this preliminary observation.
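The odds ratios and confidence intervals above come from logistic regression; the sketch below shows, on simulated data, how such a model is fitted and how ORs with 95% CIs are read off as exp(coefficient). Variable names, values and effect sizes are illustrative assumptions, not the study's dataset.

```python
# Minimal sketch (simulated data): logistic regression for post-operative acute
# graft failure, reporting odds ratios with 95% confidence intervals (OR = exp(beta)).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "ischemia_gt_180min": rng.integers(0, 2, n),   # graft ischemic time >180 min
    "donor_female":       rng.integers(0, 2, n),
    "age_gt_60":          rng.integers(0, 2, n),   # recipient/donor age >60 (definition per the study)
})
# Simulated outcome: higher risk with prolonged ischemia (illustrative only).
logit = -2.0 + 1.2 * df["ischemia_gt_180min"] + 0.8 * df["donor_female"]
df["graft_failure"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["ischemia_gt_180min", "donor_female", "age_gt_60"]])
fit = sm.Logit(df["graft_failure"], X).fit(disp=0)

odds_ratios = np.exp(fit.params)   # OR = exp(coefficient)
ci = np.exp(fit.conf_int())        # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```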
Abstract:
Orthotopic liver transplantation began in Brisbane in January 1985. During the first two years of the programme an assessment committee evaluated 55 patients (38 adults, 17 children). Patients were either accepted for transplantation, rejected as unsuitable or deferred for elective reassessment. All of the 10 adults who were rejected for transplantation because they had 'too advanced' disease died within four months of assessment. Six children who were accepted for transplantation died before a suitable donor liver could be found. In the first two years, 21 orthotopic liver transplantations were performed on 18 patients (adults, 13 patients; children, five patients). Fifteen of 21 grafts were procured from within Queensland. Twelve (67%) patients are alive at three to 23 months and all have been discharged from hospital. Deaths in adults were due to sepsis (three patients), aspiration pneumonitis (one patient), rejection and hepatic artery thrombosis (one patient) and the recurrence of a hepatocellular carcinoma five months after discharge from hospital (one patient). Two patients underwent a second transplantation procedure because of chronic rejection at four months and at 11 months, respectively, after the initial operation. One patient received a second transplant for primary graft failure at four days after the operation. A scoring system which considered the presence of pre-operative patient factors, such as coma, ascites, malnutrition and previous abdominal surgery, partly predicted the operative blood loss and patient survival. In conclusion, orthotopic liver transplantation is being performed in Australia with survival rates that are comparable with those of established overseas units.
Abstract:
The Molecular Adsorbent Recirculating System (MARS) is an extracorporeal albumin dialysis device which is used in the treatment of liver failure patients. This treatment was first utilized in Finland in 2001, and since then, over 200 patients have been treated. The aim of this thesis was to evaluate the impact of the MARS treatment on patient outcome, the clinical and biochemical variables, as well as on the psychological and economic aspects of the treatment in Finland. This thesis encompasses 195 MARS-treated patients (including patients with acute liver failure (ALF), acute-on-chronic liver failure (AOCLF) and graft failure), and a historical control group of 46 ALF patients who did not undergo MARS. All patients received a similar standard medical therapy at the same intensive care unit. The baseline data (demographics, laboratory and clinical variables) and MARS treatment-related and health-related quality-of-life data were recorded before and after treatment. The direct medical costs were determined for a period of 3.5 years. Additionally, the outcome of patients (survival, native liver recovery and need for liver transplantation) and survival-predicting factors were investigated. In the outcome analysis, for the MARS-treated ALF patients, their 6-month survival (75% vs. 61%, P=0.07) and their native liver recovery rate (49% vs. 17%, P<0.001) were higher, and their need for transplantation was lower (29% vs. 57%, P=0.001) than for the historical controls. However, the etiological distribution of the ALF patients referred to our unit has changed considerably over the past decade and the percentage of patients with a more favorable prognosis has increased. The etiology of liver failure was the most important predictor of the outcome. Other survival-predicting factors in ALF included hepatic encephalopathy, the coagulation factors and the liver enzyme levels prior to MARS treatment. In terms of prognosis, the MARS treatment of the cirrhotic AOCLF patient seems meaningful only when the patient is eligible for transplantation. The MARS treatment appears to halt the progression of encephalopathy and reduce the blood concentration of neuroactive amino acids, albumin-bound and water-soluble toxins. In general, the effects of the MARS treatment seem to stabilize the patients, thus allowing additional time either for the native liver to recover, or for the patients to endure the prolonged waiting for transplantation. Furthermore, for the ALF patients, the MARS treatment appeared to be less costly and more cost-efficient than the standard medical therapy alone. In conclusion, the MARS treatment appears to have a beneficial effect on the patient outcome in ALF and in those AOCLF patients who can be bridged to transplantation.
Abstract:
Anterior cruciate ligament (ACL) tear is a common sports injury of the knee. Arthroscopic reconstruction using autogenous graft material is widely used for patients with ACL instability. The grafts most commonly used are the patellar and the hamstring tendons, with various fixation techniques. Although clinical evaluation and conventional radiography are routinely used in follow-up after ACL surgery, magnetic resonance imaging (MRI) plays an important role in the diagnosis of complications after ACL surgery. The aim of this thesis was to study the clinical outcome of patellar and hamstring tendon ACL reconstruction techniques. In addition, the postoperative appearance of the ACL graft was evaluated using several MRI sequences. Of the 175 patients who underwent an arthroscopically assisted ACL reconstruction, 99 patients were randomized into patellar tendon (n=51) or hamstring tendon (n=48) groups. In addition, 62 patients with hamstring graft ACL reconstruction were randomized into either cross-pin (n=31) or interference screw (n=31) fixation groups. Follow-up evaluation determined knee laxity, isokinetic muscle performance and several knee scores. Lateral and anteroposterior view radiographs were obtained. Several MRI sequences were obtained with a 1.5-T imager. The appearance and enhancement pattern of the graft and periligamentous tissue, and the location of the bone tunnels, were evaluated. After MRI, arthroscopy was performed on 14 symptomatic knees. The results revealed no significant differences in the 2-year outcome between the groups. In the hamstring tendon group, the average femoral and tibial bone tunnel diameters increased during 2 years of follow-up by 33% and 23%, respectively. In the asymptomatic knees, the graft showed homogeneous, low signal intensity with periligamentous streaks of intermediate signal intensity on T2-weighted MR images. In the symptomatic knees, arthroscopy revealed 12 abnormal grafts and two meniscal tears, each with an intact graft. In the 3 lax grafts seen at arthroscopy, MRI showed an intact graft and improper bone tunnel placement. For diagnosing graft failure, all MRI findings combined gave a specificity of 90% and a sensitivity of 81%. In conclusion, all techniques appeared to improve patients' performance and were therefore considered good choices for ACL reconstruction. In follow-up, MRI permits direct evaluation of the ACL graft, the bone tunnels, and additional disorders of the knee. Bone tunnel enlargement and contrast enhancement of the periligamentous tissue were non-specific MRI findings that did not signify ACL deficiency. With an intact graft and optimal femoral bone tunnel placement, graft deficiency is unlikely, and the MRI examination should be carefully scrutinized for other possible causes of the patient's symptoms.
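The reported diagnostic performance (specificity 90%, sensitivity 81%) follows from a 2×2 comparison of MRI against the arthroscopic reference standard. The short sketch below shows the arithmetic with hypothetical counts chosen only to land near those figures; it is not the study's data.

```python
# Minimal sketch: sensitivity and specificity of MRI for graft failure computed
# from a 2x2 table against the arthroscopic reference standard.
# The counts below are hypothetical, not the study's data.
true_positives  = 13   # graft failure at arthroscopy, MRI abnormal
false_negatives = 3    # graft failure at arthroscopy, MRI read as intact
true_negatives  = 18   # intact graft, MRI read as intact
false_positives = 2    # intact graft, MRI abnormal

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
# With counts like these, the values land near the reported 81% and 90%.
```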
Abstract:
OBJECTIVES: Identification of patient subpopulations susceptible to developing myocardial infarction (MI) or, conversely, those displaying intrinsic cardioprotective phenotypes or those highly responsive to protective interventions remains a high-priority knowledge gap. We sought to identify novel common genetic variants associated with perioperative MI in patients undergoing coronary artery bypass grafting using genome-wide association methodology. SETTING: 107 secondary and tertiary cardiac surgery centres across the USA. PARTICIPANTS: We conducted a stage I genome-wide association study (GWAS) in 1433 ethnically diverse patients of both genders (112 cases/1321 controls) from the Genetics of Myocardial Adverse Outcomes and Graft Failure (GeneMAGIC) study, and a stage II analysis in an expanded population of 2055 patients (225 cases/1830 controls) combined from the GeneMAGIC and Duke Perioperative Genetics and Safety Outcomes (PEGASUS) studies. Patients undergoing primary non-emergent coronary bypass grafting were included. PRIMARY AND SECONDARY OUTCOME MEASURES: The primary outcome variable was perioperative MI, defined as creatine kinase MB isoenzyme (CK-MB) values ≥10× the upper limit of normal during the first postoperative day, and not attributable to preoperative MI. Secondary outcomes included postoperative CK-MB as a quantitative trait, or a dichotomised phenotype based on extreme quartiles of the CK-MB distribution. RESULTS: Following quality control and adjustment for clinical covariates, we identified 521 single nucleotide polymorphisms in the stage I GWAS analysis. Among these, 8 common variants in 3 genes or intergenic regions met p<10⁻⁵ in stage II. A secondary analysis using CK-MB as a quantitative trait (minimum p=1.26×10⁻³ for rs609418), or a dichotomised phenotype based on extreme CK-MB values (minimum p=7.72×10⁻⁶ for rs4834703), supported these findings. Pathway analysis revealed that genes harbouring top-scoring variants cluster in pathways of biological relevance to extracellular matrix remodelling, endoplasmic reticulum-to-Golgi transport and inflammation. CONCLUSIONS: Using a two-stage GWAS and pathway analysis, we identified and prioritised several potential susceptibility loci for perioperative MI.
Abstract:
BACKGROUND: High serum phosphate has been identified as an important contributor to the vascular calcification seen in patients with chronic kidney disease (Block et al., Am J Kidney Dis 1998; 31: 607). In patients on hemodialysis, elevated serum phosphate levels are an independent predictor of mortality (Block et al., Am J Kidney Dis 1998; 31: 607; Block, Curr Opin Nephrol Hypertens 2001; 10: 741). The aim of this study was to investigate whether an elevated serum phosphate level was an independent predictor of mortality in patients with a renal transplant.
METHODS: Three hundred seventy-nine asymptomatic renal transplant recipients were recruited between June 2000 and December 2002. Serum phosphate was measured at baseline and prospective follow-up data were collected at a median of 2441 days after enrolment.
RESULTS: Serum phosphate was significantly higher in those renal transplant recipients who died during follow-up than in those who were still alive at follow-up (P<0.001). In Kaplan-Meier analysis, serum phosphate concentration was a significant predictor of mortality (P=0.0001). In multivariate Cox regression analysis, serum phosphate concentration remained a statistically significant predictor of all-cause mortality after adjustment for traditional cardiovascular risk factors, estimated glomerular filtration rate and high-sensitivity C-reactive protein (P=0.036), and after adjustment for renal graft failure (P=0.001).
CONCLUSIONS: The results of this prospective study are the first to show that a higher serum phosphate is a predictor of mortality in patients with a renal transplant and suggest that serum phosphate provides additional, independent prognostic information to that provided by traditional risk factors in the risk assessment of patients with a renal transplant.
Abstract:
Recent advances in corneal graft technology, including donor tissue retrieval, storage and surgical techniques, have greatly improved the clinical outcome of corneal grafts. Despite these advances, immune mediated corneal graft rejection remains the single most important cause of corneal graft failure. Several host factors have been identified as conferring a "high risk" status to the host. These include: more than two quadrant vascularisation, with associated lymphatics, which augment the afferent and efferent arc of the immune response; herpes simplex keratitis; uveitis; silicone oil keratopathy; previous failed (rejected) grafts; "hot eyes"; young recipient age; and multiple surgical procedures at the time of grafting. Large grafts, by virtue of being closer to the host limbus, with its complement of vessels and antigen-presenting Langerhans cells, also are more susceptible to rejection. The diagnosis of graft rejection is entirely clinical and in its early stages the clinical signs could be subtle. Graft rejection is largely mediated by the major histocompatibility antigens, minor antigens and perhaps blood group ABO antigens and some cornea-specific antigens. Just as rejection is mediated by active immune mediated events, the lack of rejection (tolerance) is also sustained by active immune regulatory mechanisms. The anterior chamber associated immune deviation (ACAID) and probably, conjunctiva associated lymphoid tissue (CALT) induced mucosal tolerance, besides others, play an important role. Although graft rejection can lead to graft failure, most rejections can be readily controlled if appropriate management is commenced at the proper time. Topical steroids are the mainstay of graft rejection management. In the high-risk situations however, systemic steroids, and other immunosuppressive drugs such as cyclosporin and tacrolimus (FK506) are of proven benefit, both for treatment and prevention of rejection.
Abstract:
BACKGROUND: It is now common for individuals to require dialysis following the failure of a kidney transplant. Management of complications and preparation for dialysis are suboptimal in this group. To aid planning, it is desirable to estimate the time to dialysis requirement. The rate of decline in the estimated glomerular filtration rate (eGFR) may be used to this end.
METHODS: This study compared the rate of eGFR decline prior to dialysis commencement between individuals with failing transplants and transplant-naïve patients. The rate of eGFR decline was also compared between transplant recipients with and without graft failure. eGFR was calculated using the four-variable MDRD equation with rate of decline calculated by least squares linear regression.
RESULTS: The annual rate of eGFR decline in incident dialysis patients with graft failure exceeded that of the transplant-naïve incident dialysis patients. In the transplant cohort, the mean annual rate of eGFR decline prior to graft failure was 7.3 ml/min/1.73 m² compared to 4.8 ml/min/1.73 m² in the transplant-naïve group (p < 0.001) and 0.35 ml/min/1.73 m² in recipients without graft failure (p < 0.001). Factors associated with eGFR decline were recipient age, decade of transplantation, HLA mismatch and histological evidence of chronic immunological injury.
CONCLUSIONS: Individuals with graft failure have a rapid decline in eGFR prior to dialysis commencement. To improve outcomes, dialysis planning and management of chronic kidney disease complications should be initiated earlier than in the transplant-naïve population.
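The Methods above compute eGFR with the four-variable MDRD equation and estimate its rate of decline by least-squares linear regression. A minimal sketch of both steps follows; the creatinine series and patient characteristics are hypothetical, and the constant 175 is the IDMS-traceable form of the MDRD formula (earlier versions used 186).

```python
# Minimal sketch (hypothetical values): eGFR from the four-variable MDRD
# equation and the annual rate of decline by least-squares linear regression.
import numpy as np

def mdrd_egfr(creatinine_mg_dl, age_years, female, black):
    """Four-variable MDRD estimate of GFR in ml/min/1.73 m^2 (IDMS-traceable constant 175)."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Hypothetical serial creatinine values (mg/dL) for one recipient with a
# failing graft, measured yearly before dialysis commencement.
years      = np.array([0.0, 1.0, 2.0, 3.0])
creatinine = np.array([1.8, 2.3, 3.0, 4.1])
egfr = mdrd_egfr(creatinine, age_years=50, female=False, black=False)

# Least-squares slope gives the annual rate of eGFR decline.
slope, intercept = np.polyfit(years, egfr, deg=1)
print(f"annual eGFR decline: {-slope:.1f} ml/min/1.73 m^2 per year")
```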
Abstract:
We previously reported a randomized trial comparing Cyclosporin-A (CsA) and short-term methotrexate versus CsA alone for graft-versus-host disease (GvHD) prophylaxis in 71 patients undergoing allogeneic haematopoietic stem cell transplantation (HSCT) from a human leucocyte antigen-identical sibling for severe aplastic anaemia (SAA). We found a better survival in the group receiving the two-drug prophylaxis regimen with no significant difference in the probability of developing GvHD between the two groups. The present study details chimaerism analysis and its influence on survival and GvHD occurrence in 45 of the original 71 patients in whom serial samples were available. Analysis was carried out in a blinded prospective manner. Seventy-two per cent achieved complete donor chimaerism (DC), 11% stable mixed chimaerism (SMC) and 17% progressive mixed chimaerism (PMC). The overall 5-year survival probability was 82% (±11%), with a significant survival advantage (P = 0.0009) in DC or SMC compared to those with PMC. Chronic GvHD was more frequent in DC patients, whereas no patient with SMC developed chronic GvHD. Graft failure occurred in 50% of the PMC group. This study demonstrates the relevance of chimaerism analysis in patients receiving HSCT for SAA and confirms the occurrence of mixed chimaerism in a significant proportion of recipients.
Abstract:
Secondary or late graft failure has been defined as the development of inadequate marrow function after initial engraftment has been achieved. We describe a case of profound marrow aplasia occurring 13 years after sibling allogeneic bone marrow transplantation for chronic myeloid leukaemia (CML) in first chronic phase. Although the patient remained a complete donor chimera, thereby suggesting that an unselected infusion of donor peripheral blood stem cells (PBSC) or bone marrow might be indicated, the newly acquired aplasia was thought to be immune in aetiology and some immunosuppression was therefore considered appropriate. Rapid haematological recovery was achieved after the infusion of unselected PBSC from the original donor following conditioning with anti-thymocyte globulin (ATG).
Abstract:
Graft-versus-host disease (GVHD) remains a significant complication in patients undergoing allogeneic stem cell transplantation (SCT) using a reduced intensity conditioning regimen. Although T-cell depletion (TCD) reduces the risk of GVHD after a myeloablative conditioning regimen, it is associated with an increased risk of graft failure. We have therefore examined whether TCD compromises engraftment using a fludarabine-based conditioning regimen. Fifteen patients have been transplanted using such a regimen, of whom 13 underwent ex vivo TCD. All but one patient demonstrated durable engraftment and no patient receiving a TCD product developed severe GVHD. Thus, TCD may play a role in GVHD prophylaxis using such regimens.
Abstract:
A 3-year old child with juvenile chronic myeloid leukaemia received a T cell-depleted BMT from a male unrelated donor. There was early graft failure associated with increasing splenomegaly and hypersplenism. Splenectomy was performed 53 days post-transplant and was followed by autologous marrow recovery with return of leukaemia. A second unrelated donor BMT was performed 9 months later using T cell-replete marrow from a similarly matched female donor. Grade 2 GVHD involving the skin and gut responded to treatment with steroids. Chimaerism was assessed using Y-specific polymerase chain reaction (PCR) and microsatellites. Samples taken at the time of splenectomy showed no donor marrow engraftment but there was significant engraftment in the spleen. Following the second transplant, donor-type haematopoiesis was documented using a panel of microsatellite probes. The patient remains well 6 months after transplant. Splenectomy should be considered prior to transplant in patients with significant splenomegaly and hypersplenism. Partial chimaerism in the spleen, but not bone marrow, post-BMT, has not previously been documented. PCR technology is a useful and highly sensitive way to assess chimaerism post-BMT and is informative in sex-matched cases, whilst the small amount of material required is advantageous in paediatric patients.
Abstract:
Orthotopic liver transplantation is an accepted therapy for selected cases of terminal liver failure. The procedure has steadily improved, as shown by the rise in 5-year survival from 30 to 75%, but 13 to 27% of liver grafts develop primary non-function (PNF) or primary dysfunction (PDF) after transplantation, with devastating consequences for the survival of both patient and graft. The etiology is multifactorial, including factors related to the donor and the recipient, ischemic times, surgical aggression and the histological characteristics of the graft. Ischemia/reperfusion (I/R) injury remains an intraoperative risk factor with direct implications for the whole course of the transplant: there is a close interrelation between PNF and PDF, graft preservation, I/R injury and graft failure. In addition, there is evidence suggesting that I/R injury makes the allograft more vulnerable by increasing its immunogenicity, raising the probability of early and late rejection episodes.

Based on the daily clinical practice at CHBPT/HCC, 54 liver transplantations were studied, grouped by graft allocation: Group 1 (n=27), deceased donor to cirrhotic recipient; Group 2 (n=15), deceased donor to FAP recipient; Group 3 (n=12), FAP living donor to cirrhotic recipient. The histological and molecular changes in the graft were observed until the end of the recipient operation, together with their clinical consequences, in order to evaluate: the different capacity of each graft to resist I/R injury; the situations in which recipient factors outweigh graft factors in determining prognosis, and vice versa; and the relevance of early histological and molecular lesions of the hepatic tissue to the clinical outcome of graft and recipient. Needle biopsies were obtained from the 54 grafts (42 from brain-dead deceased donors and 12 from FAP living donors) at three times during procurement and transplantation: the first (T0) before clamping of the donor aorta, the second (T1) at the end of cold ischemia, and the third (T2) after reperfusion of the graft, during closure of the abdominal wall. Total RNA was extracted from these samples and converted to cDNA by reverse transcription, and expression of the CTLA4, IL-1β, IL-4, IL-6, IL-13, TNF-α, perforin, E-selectin (SELE), Fas ligand, granzyme B, heme oxygenase 1 (HO1) and nitric oxide synthase (iNOS2A) genes was analysed by quantitative PCR according to the comparative Ct method, using the expression of the non-ischemic sample (T0) as reference. Fragments of all biopsies were divided so that a comparative sample could undergo routine histological processing, following the usual protocol of the Transplantation Unit of Hospital Curry Cabral. The presence of defined histological parameters (steatosis, necrosis, vacuolization, sinusoidal congestion and neutrophilic infiltration) was recorded and scored numerically. Clinical and laboratory follow-up, including any complications, was recorded and correlated with the data from organ procurement and from the biopsies. The variables considered most relevant and objective for interpreting the clinical course, and compared statistically with the collected clinical and laboratory data, were: graft dysfunction, post-operative complications, two or more readmissions, and chronic rejection and/or recipient death.

Several unfavorable clinical characteristics were identified in particular circumstances: female recipient gender (especially combined with a male graft, p=0.077), cold ischemia time longer than 500 minutes (p=0.074) and warm ischemia time longer than 90 minutes (p=0.099). In the laboratory analysis, two irreversible histological features stood out as markers of poor prognosis: necrosis and ballooning (p=0.029); in the gene panel chosen for this study, baseline expression of IL-1β (p=0.028), SELE (p=0.013) and FAS-L (p=0.079) was related to worse prognosis. Some intrinsic protective characteristics of the grafts were revealed only indirectly, such as less neutrophilic infiltration and higher expression of HO1 and iNOS in FAP grafts, and a direct effect on clinical results could not be proved. No measurable expression of the anti-inflammatory genes IL-13 and IL-4 was obtained, so with the methodology used it was not possible to derive a gene-expression profile associated with a favorable clinical outcome; the inverse profile was suggested only by the baseline expression of the three genes mentioned (FAS-L, IL-1β and SELE), under the protocol followed in this group of 54 patients.

Recipient characteristics outweighed those of the graft in the case of a FAP diagnosis in the recipient, which conferred a greater predisposition to graft dysfunction and, in turn, shorter graft survival, although FAP recipients showed a more favorable overall survival curve; and in recipients with a low balance-of-risk (BAR) score, for whom grafts with low or moderate steatosis behaved favorably, so that this supposedly adverse characteristic not only had no clinical impact but even appeared protective. Graft characteristics outweighed those of the recipient in the case of cold ischemia time longer than 500 minutes, and of ballooning, necrosis, and FAS-L, IL-1β and SELE expression at T0. Integrating the molecular and morphological results with the clinical course highlights the role of early neutrophil mobilization in the poorer outcomes of liver grafts.
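Gene expression in the biopsies was quantified by the comparative Ct method relative to the non-ischemic T0 sample. The sketch below shows the underlying 2^-ΔΔCt calculation; the Ct values and the housekeeping reference gene are hypothetical illustrations, not the thesis data.

```python
# Minimal sketch of relative quantification by the comparative Ct (2^-ddCt)
# method; Ct values and the housekeeping gene are hypothetical, and T0
# (pre-clamping, non-ischemic biopsy) serves as the calibrator.
def fold_change(ct_target_sample, ct_ref_sample, ct_target_calibrator, ct_ref_calibrator):
    """Return the 2^-ddCt fold change relative to the calibrator sample."""
    d_ct_sample     = ct_target_sample - ct_ref_sample          # normalize to the reference gene
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    dd_ct = d_ct_sample - d_ct_calibrator
    return 2.0 ** -dd_ct

# Example: IL-1beta expression after reperfusion (T2) relative to the T0 biopsy,
# both normalized to a housekeeping gene (illustrative Ct values).
print(fold_change(ct_target_sample=24.1, ct_ref_sample=18.0,
                  ct_target_calibrator=27.3, ct_ref_calibrator=18.2))
# ddCt = (24.1 - 18.0) - (27.3 - 18.2) = -3.0, i.e. 2^3 = 8-fold up-regulation vs. T0.
```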
Abstract:
INTRODUCTION: Heart transplantation is an effective therapy for patients with end-stage heart failure. To date there are no updated, published Colombian registries of the morbidity and mortality of heart transplantation in relation to episodes of cellular rejection. MATERIALS AND METHODS: Retrospective descriptive study of patients over 18 years of age transplanted at Clínica Shaio. Frequencies, means and Kaplan-Meier curves were calculated. RESULTS: Mean age was 46.7 ± 13 years. Indications were dilated cardiomyopathy 45%, coronary (ischemic) cardiomyopathy 30%, myocarditis 9%, Chagas disease 9% and valvular disease 7%. Cellular rejection occurred in 57%, 54% and 41% in the first three years, falling to 25% between three and five years. Causes of death were transplant failure 46%, infections and heart failure 23%, and rejection 8%. Mortality in the first year was 65%, mainly from graft failure. Thirty-day mortality was 29.5%, with survival of 90% at one year, 64% at 5 years, 48% at 10 years and 15% at 13 years. Mean survival was 8 years. Graft vasculopathy occurred in 33%. Ventricular function was 53.5 ± 12.6% in the first year, 58.4 ± 5.4% in the third year, 51.7 ± 11.9% in the fifth year and 46 ± 15.8% in the tenth year. DISCUSSION: A higher frequency of cellular rejection was observed, leading to more heart failure and death.