200 results for Renal allograft survival
in SciELO Saúde Pública - SP
Abstract:
A major problem in renal transplantation is identifying a grading system that can predict long-term graft survival. The present study determined the extent to which the two existing grading systems (Banff 97 and the chronic allograft damage index, CADI) correlate with each other and with graft loss. A total of 161 transplant patient biopsies with chronic allograft nephropathy (CAN) were studied. The samples were coded and evaluated blindly by two pathologists using the two grading systems. Logistic regression analyses were used to identify the best predictive index for renal allograft loss. Patients with higher Banff 97 and CADI scores had higher rates of graft loss. Moreover, these measures also correlated with worse renal function and higher proteinuria levels at the time of CAN diagnosis. Logistic regression analyses showed that the use of angiotensin-converting enzyme inhibitors (ACEI), hepatitis C virus (HCV) infection, tubular atrophy, and the use of mycophenolate mofetil (MMF) were associated with graft loss in the CADI analysis, while the use of ACEI, HCV infection, moderate interstitial fibrosis and tubular atrophy, and the use of MMF were associated with graft loss in the Banff 97 analysis. Although Banff 97 and CADI analyze different parameters in different renal compartments, only some isolated parameters correlated with graft loss. This suggests that the CAN grading systems need to be reviewed in order to devise a system that includes all parameters able to predict long-term graft survival, including chronic glomerulopathy, glomerular sclerosis, vascular changes, and the severity of chronic interstitial fibrosis and tubular atrophy.
Abstract:
Interstitial fibrosis and tubular atrophy (IF/TA) is the most common cause of renal graft failure. Chronic transplant glomerulopathy (CTG) is present in approximately 1.5-3.0% of all renal grafts. We retrospectively studied the contribution of CTG and recurrent post-transplant glomerulopathies (RGN) to graft loss. We analyzed 123 patients with chronic renal allograft dysfunction and divided them into three groups: CTG (N = 37), RGN (N = 21), and IF/TA (N = 65). Demographic data were analyzed and the variables related to graft function were identified by statistical methods. The CTG group had significantly lower allograft survival than the IF/TA group. In a multivariate analysis, use of angiotensin-converting enzyme inhibitor (ACEI; hazard ratio (HR) = 0.12, P = 0.001) and mycophenolate mofetil (MMF; HR = 0.17, P = 0.026) were protective, whereas hepatitis C virus (HR = 7.29, P = 0.003) and delayed graft function (HR = 5.32, P = 0.016) increased risk; serum creatinine ≥1.5 mg/dL at the 1st year post-transplant (HR = 0.20, P = 0.011) and proteinuria ≥0.5 g/24 h at the 1st year post-transplant (HR = 0.14, P = 0.004) were also associated with outcome. The presence of glomerular damage is a risk factor for allograft loss (HR = 4.55, P = 0.015). The presence of some degree of chronic glomerular damage in addition to the diagnosis of IF/TA was the most important risk factor associated with allograft loss, since it could indicate chronic active antibody-mediated rejection. ACEI and MMF were associated with better outcomes, indicating that they might improve graft survival.
Abstract:
Trypanosoma cruzi, the causative agent of Chagas disease, assumes two distinct forms in vertebrate hosts: the circulating trypomastigote and the tissue amastigote. The latter form predominantly infects the myocardium, smooth and skeletal muscle, and the central nervous system. The present work describes for the first time the detection of amastigote forms of T. cruzi in the renal parenchyma of a kidney graft recipient one month after transplantation. The patient was serologically negative for Chagas disease and had received no blood transfusion prior to transplant. The cadaver donor was from an area endemic for Chagas disease. The recipient developed the acute form of the disease, with detection of amastigote forms of T. cruzi in the renal allograft biopsy and of circulating trypomastigote forms. The present report demonstrates that T. cruzi can infect the renal parenchyma. This mode of transmission warrants attention in areas endemic for Chagas disease.
Abstract:
Experimental data and a few non-randomized clinical studies have shown that inhibition of the renin-angiotensin system by angiotensin-converting enzyme (ACE) inhibitors, associated or not with the use of mycophenolate mofetil (MMF), could delay or even halt the progression of chronic allograft nephropathy (CAN). In this retrospective historical study, we investigated whether ACE inhibition (ACEI), associated or not with the use of MMF, has the same effect in humans as in experimental studies, and what factors are associated with a clinical response. A total of 160 transplant patients with biopsy-proven CAN were enrolled. Eighty-one of them were on ACEI therapy (G1) and 80 on ACEI-free therapy (G2). Patients were further stratified for the use of MMF. G1 patients showed a marked decrease in proteinuria and stabilized serum creatinine over time. Five-year graft survival after CAN diagnosis was more frequent in G1 (86.9 vs 67.7%; P < 0.05). In patients on ACEI-free therapy, the use of MMF was associated with better graft survival. The use of ACEI therapy protected patients against graft loss (OR = 0.079, 95%CI = 0.015-0.426; P = 0.003). ACEI and MMF, or the use of MMF alone after CAN diagnosis, conferred protection against graft loss. This finding correlates well with experimental studies in which ACEI and MMF interrupt the progression of chronic allograft dysfunction and injury. The use of ACEI alone or in combination with MMF significantly reduced proteinuria and stabilized serum creatinine, consequently improving renal allograft survival.
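The protective effect above is reported as an odds ratio from a logistic regression model. As a minimal sketch (not the authors' code), the relation between a fitted coefficient and the reported OR can be made explicit; the only input is the OR = 0.079 quoted in the abstract:

```python
import math

# Illustrative only: in logistic regression, the odds ratio for a predictor
# equals exp(beta), where beta is the fitted coefficient on the log-odds scale.
or_acei = 0.079                    # OR for ACEI therapy, from the abstract
beta_acei = math.log(or_acei)      # implied model coefficient (~ -2.54)
odds_reduction = 1.0 - or_acei     # fractional reduction in the odds of graft loss

print(beta_acei, odds_reduction)
```

On the odds scale, an OR of 0.079 corresponds to roughly a 92% reduction in the odds of graft loss for patients on ACEI therapy.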
Abstract:
The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on the suspected diagnosis and the time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis between the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, and in 66.7% and 50% of the minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and in 16.7% of the unsatisfactory specimens. For the entire sample, full agreement was found in 71.4%, partial agreement in 20.4%, and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25, showing that, probably due to the small sample size, the results were not statistically significant. Specimen adequacy may be a determinant of diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making when dealing with biopsy reports based on minimal or unsatisfactory specimens.
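Cohen's kappa, used above to quantify inter-reader agreement, corrects the observed agreement for the agreement expected by chance. A minimal from-scratch sketch (illustrative only; the study's own computation is not shown, and the confusion matrix below is a toy example):

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square confusion matrix of two readers' labels."""
    n_cat = len(confusion)
    total = sum(sum(row) for row in confusion)
    # observed agreement: proportion of specimens where both readers agree
    p_obs = sum(confusion[i][i] for i in range(n_cat)) / total
    # chance agreement: sum over categories of the product of marginal proportions
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(confusion[i][j] for i in range(n_cat)) for j in range(n_cat)]
    p_chance = sum(row_tot[i] * col_tot[i] for i in range(n_cat)) / total ** 2
    return (p_obs - p_chance) / (1 - p_chance)

# hypothetical two-category example: 70% raw agreement, kappa = 0.4
kappa = cohens_kappa([[20, 5], [10, 15]])
```

A kappa of 0.97, as reported above, indicates near-perfect agreement after chance correction.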
Abstract:
Cytomegalovirus (CMV) is the single most important infectious agent affecting recipients of organ transplants. To evaluate the incidence and the clinical importance of CMV infection in renal transplants in Brazil, 37 patients submitted to renal allograft transplants were tested periodically for the presence of cytomegalovirus DNA in urine using the polymerase chain reaction (PCR), and for the presence of IgM and IgG antibodies against CMV by enzyme-linked immunosorbent assay (ELISA) and indirect immunofluorescence (IIF). The PCR-amplified products were detected by gel electrophoresis and confirmed by dot-blot hybridization with oligonucleotide probes. Thirty-two of the 37 patients (86.4%) were positive by at least one of the three methods. In six patients, PCR was the only test which detected the probable CMV infection. Ten patients had a positive result by PCR before transplantation. In general, the diagnosis was achieved earlier by PCR than by serologic tests. Active infection occurred more frequently during the first four months after transplantation. Sixteen of the 32 patients (50%) with active CMV infection presented clinical symptoms consistent with CMV infection. Five patients without evidence of active CMV infection by the three tests had only minor clinical manifestations during follow-up. Our results indicate that PCR is a highly sensitive procedure for the early detection of CMV infection and that CMV infection in renal transplant patients is a frequent problem in Brazil.
Abstract:
Organ transplantation can be considered replacement therapy for patients with end-stage organ failure. One-year allograft survival rates have increased owing, among other factors, to a better understanding of the rejection process and to new immunosuppressive drugs. Immunosuppressive therapy used in transplantation prevents the activation and proliferation of alloreactive T lymphocytes, although it does not fully prevent chronic rejection. Recognition by recipient T cells of alloantigens expressed by donor tissues initiates the immune destruction of allogeneic transplants. However, there is controversy concerning the relative contribution of CD4+ and CD8+ T cells to allograft rejection. Some animal models indicate that there is an absolute requirement for CD4+ T cells in allogeneic rejection, whereas in others CD4-depleted mice reject certain types of allografts. Moreover, there is evidence that CD8+ T cells are more resistant to immunotherapy and tolerance induction protocols. An intense focal infiltration of mainly CD8+CTLA4+ T lymphocytes during kidney rejection has been described in patients, suggesting that CD8+ T cells could escape immunosuppression and participate in the rejection process. Our group is primarily interested in the immune mechanisms involved in allograft rejection. We believe that a better understanding of the role of CD8+ T cells in allograft rejection could indicate new targets for immunotherapy in transplantation. Therefore, the objective of the present review was to focus on the role of the CD8+ T cell population in the rejection of allogeneic tissue.
Abstract:
We studied the effect of oral and portal vein administration of alloantigens on mouse skin allograft survival. Recipient BALB/c mice received spleen cells (30, 90, 150 or 375 x 10^6) from donor C57BL/6 mice intragastrically on three successive days, starting seven days before the skin graft. Allograft survival was significantly increased by feeding 150 x 10^6 allogeneic spleen cells in a single gavage (median survival of 12 vs 14 days, P <= 0.005) or by giving 300 x 10^6 cells in six gavages (12 vs 14 days, P < 0.04). A similar effect was observed when 150 x 10^6 spleen cells were injected into the portal vein (12 vs 14 days, P <= 0.03). Furthermore, prolonged allograft survival was observed with subcutaneous (12 vs 16 days, P <= 0.002) or systemic (12 vs 15 days, P <= 0.016) administration of murine interleukin-4 (IL-4), alone or in combination with spleen cell injection into the portal vein (12 vs 18 days, P <= 0.0018). Taken together, these results show that tolerance induction with spleen cells expressing fully incompatible antigens, by oral administration or intraportal injection, partially down-modulates skin allograft rejection. Furthermore, these findings demonstrate for the first time the effect of subcutaneous or systemic IL-4 administration on skin allograft survival, suggesting its use as a beneficial supportive therapy in combination with a tolerance induction protocol.
Abstract:
Brazil has the third largest contingent of patients on maintenance hemodialysis (HD) worldwide. However, little is known regarding the survival rate and predictors of mortality risk in that population, which are the purposes of this study. A total of 3,082 patients starting HD between 2000 and 2004 at 25 dialysis facilities, distributed among 7 of the 26 states of Brazil, were followed up until 2009. Patients were 52 ± 16 years old, 57.8% men, and 20.4% diabetics. The primary outcome was all-cause mortality. Data were censored at five years of follow-up. The overall five-year survival rate was 58.2%. In the Cox proportional hazards model, variables associated with risk of death were: age (hazard ratio, HR = 1.44 per decade, p < 0.0001), diabetes (HR = 1.51, p < 0.0001), serum albumin (HR = 0.76 per g/dL, p = 0.001), creatinine (HR = 0.92 per mg/dL, p < 0.0001), and phosphorus (HR = 1.06 per mg/dL, p = 0.04). The present results show that the mortality rate on HD in this Brazilian cohort was relatively low, although the population is younger and has a lower prevalence of diabetes than those reported for developed countries.
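The Cox model above expresses age as a hazard ratio per decade. Under the proportional-hazards assumption (log-hazard linear in age), the HR for any age difference follows by exponentiating the scaled coefficient; a small sketch using only the abstract's figure of HR = 1.44 per decade:

```python
import math

HR_PER_DECADE = 1.44  # age effect from the abstract's Cox model

def hazard_ratio(age_diff_years, hr_per_decade=HR_PER_DECADE):
    """HR between two patients whose ages differ by age_diff_years,
    assuming the log-hazard is linear in age (proportional hazards)."""
    beta_per_year = math.log(hr_per_decade) / 10.0  # coefficient per year
    return math.exp(beta_per_year * age_diff_years)

# a 30-year age gap multiplies the hazard by 1.44 ** 3
hr_30 = hazard_ratio(30)
```

This is how "HR = 1.44 per decade" translates into risk for patients of different ages; the linearity-in-age assumption is the model's, not an additional result of the study.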
Abstract:
Vascular complications after liver transplantation include occlusion or stenosis at the sites of anastomosis in the hepatic artery, portal vein, and vena cava. Balloon angioplasty of these stenoses carries little risk and is a useful procedure for the treatment of these problems. The purpose of this paper was to assess whether percutaneous transluminal angioplasty can help prolong allograft survival and improve allograft function in patients with hepatic artery stenosis after liver transplantation. We report a 43-year-old male with stenosis of the hepatic artery anastomosis after liver transplantation. An abrupt elevation of liver enzymes and serum bilirubin levels was noted in the fifth postoperative month. The patient underwent percutaneous liver biopsy, which revealed marked ductal depletion due to hypoperfusion, even though Doppler ultrasound examination demonstrated arterial flow. An angiogram confirmed severe stenosis of the arterial anastomosis with a poor intraparenchymal arterial perfusion pattern. In an attempt to preserve the graft, percutaneous transluminal angioplasty was performed using microballoons mounted on a hydrophilic micro guidewire. The intervention proceeded without complications. Liver enzymes and bilirubin levels decreased within twenty-four hours of angioplasty. Normal levels were achieved after one week. Seven months after angioplasty, the patient is in optimal clinical condition with no signs of graft impairment. We conclude that percutaneous transluminal angioplasty of hepatic artery stenosis after liver transplantation is relatively safe and may help decrease allograft loss.
Abstract:
The use of sirolimus (SRL) in combination with full doses of cyclosporin A (CsA) results in reduced one-year kidney allograft function, which is associated with shorter long-term allograft survival. We determined the effect of reduced CsA exposure on graft function in patients receiving SRL and prednisone. Ninety recipients of living kidney transplants receiving SRL (2 mg/day, po) were compared to 35 recipients receiving azathioprine (AZA, 2 mg/kg per day, po). All patients also received CsA (8-10 mg/kg per day, po) and prednisone (0.5 mg/kg per day). The efficacy end-point was a composite of biopsy-confirmed acute rejection, graft loss, or death at one year. Graft function was measured by creatinine, creatinine clearance, and graft function deterioration between 3 and 12 months (Δ1/Cr). CsA concentrations in patients receiving SRL were 26% lower. No differences in the one-year composite efficacy end-point were observed between the SRL and AZA groups (18 vs 20%), or in the incidence of biopsy-proven acute rejection (14.4 vs 14.3%). There were no differences in mean ± SD creatinine (1.65 ± 0.46 vs 1.60 ± 0.43 mg/dL, P = 0.48) or calculated creatinine clearance (61 ± 15 vs 62 ± 13 mL/min, P = 0.58) at one year. Mean ± SD Δ1/Cr (-11 ± 17 vs -14 ± 15%, P = 0.7) and the percentage of patients with >20% (26 vs 31%, P = 0.6) or >30% Δ1/Cr (19 vs 17%, P = 1) did not differ between the two groups. The use of 2-mg fixed oral doses of SRL with reduced CsA exposure was effective in preventing acute rejection and preserving allograft function.
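The Δ1/Cr end-point above tracks graft function as a change in reciprocal creatinine. Assuming it denotes the percent change in 1/creatinine between months 3 and 12 (a common definition, not spelled out in the abstract), it can be computed as:

```python
def delta_inv_cr(cr_month3, cr_month12):
    """Percent change in reciprocal serum creatinine (mg/dL) between
    months 3 and 12 post-transplant; negative values indicate
    deteriorating graft function (rising creatinine)."""
    inv3 = 1.0 / cr_month3
    inv12 = 1.0 / cr_month12
    return (inv12 - inv3) / inv3 * 100.0

# hypothetical patient: creatinine rising from 1.2 to 1.5 mg/dL
# gives a -20% change in 1/Cr
change = delta_inv_cr(1.2, 1.5)
```

Reciprocal creatinine is used because 1/Cr falls roughly in proportion to the loss of glomerular filtration, making percent declines comparable across patients with different baseline creatinine.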
Abstract:
Although radical nephrectomy alone is widely accepted as the standard of care in localized treatment for renal cell carcinoma (RCC), it is not sufficient for the treatment of metastatic RCC (mRCC), which invariably leads to an unfavorable outcome despite the use of multiple therapies. Currently, sequential targeted agents are recommended for the management of mRCC, but the optimal drug sequence is still debated. This case was a 57-year-old man with clear-cell mRCC who received multiple therapies following his first operation in 2003 and has survived for over 10 years with a satisfactory quality of life. The treatments given included several surgeries, immunotherapy, and sequentially administered sorafenib, sunitinib, and everolimus regimens. In the course of mRCC treatment, well-planned surgeries, effective sequential targeted therapies and close follow-up are all of great importance for optimal management and a satisfactory outcome.
Abstract:
OBJECTIVE: To analyze the cost-effectiveness of treatment regimens with cyclosporine or tacrolimus five years after renal transplantation. METHODS: This cost-effectiveness analysis was based on historical cohort data obtained between 2000 and 2004 and involved 2,022 patients treated with cyclosporine or tacrolimus, matched 1:1 for gender, age, and type and year of transplantation. Graft survival and the direct costs of medical care obtained from the National Health System (SUS) databases were used as outcome measures. RESULTS: Most of the patients were women, with a mean age of 36.6 years. The most frequent diagnosis underlying chronic renal failure was glomerulonephritis/nephritis (27.7%). Over five years, the tacrolimus group had an average life-expectancy gain of 3.96 years at an annual cost of R$78,360.57, compared with a gain of 4.05 years at an annual cost of R$61,350.44 in the cyclosporine group. CONCLUSIONS: After matching, the study indicated better survival of patients treated with tacrolimus-based regimens. However, regimens containing cyclosporine were more cost-effective.
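The comparison above can be made concrete as a cost per life-year gained over the five-year horizon. A back-of-the-envelope sketch using only the figures quoted in the abstract (this simple ratio is illustrative and not the study's full cost-effectiveness methodology):

```python
def cost_per_life_year(annual_cost, life_years_gained, horizon_years=5):
    """Total cost over the horizon divided by life-years gained (R$ per year)."""
    return annual_cost * horizon_years / life_years_gained

# figures from the abstract: annual cost (R$) and average life-years gained
tacrolimus = cost_per_life_year(78_360.57, 3.96)
cyclosporine = cost_per_life_year(61_350.44, 4.05)

# cyclosporine yields the lower cost per life-year, consistent with the
# conclusion that cyclosporine-containing regimens were more cost-effective
```

With these inputs, cyclosporine costs roughly R$76,000 per life-year versus roughly R$99,000 for tacrolimus.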
Abstract:
Tuberculosis (TB) was diagnosed in 25 of 466 patients who underwent renal transplantation over a period of 15 years. TB developed from 1 month to 9 years post-transplant; in 56% of the cases the onset was within the first post-transplant year. TB affected several organs, either isolated or in combination. Pulmonary involvement was present in 76% of cases, either as isolated pleuro-pulmonary disease (56%) or associated with other sites (20%). The non-pulmonary sites were: skin, joints, testes, urinary tract, central nervous system, and lymph nodes. The diagnosis was confirmed by biopsy in 64% of the cases, by identification of tubercle bacilli in 24%, and only at necropsy in 12%. Biopsy specimens could be classified into three histological forms: exudative, which occurred in early-onset and more severe cases; granulomatous, in late-onset and benign cases; and mixed, in intermediate cases. Azathioprine dosages were similar across post-transplant time periods in TB patients and in the control groups, and in TB patients who were cured and those who died. The number of steroid-treated rejection crises was greater in TB patients than in the control group. Prednisone doses were higher and the number of rejection crises was greater in TB patients who died than in those who were cured. Fifteen patients were cured and ten died, two of them of causes unrelated to TB. Six of the eight TB-related deaths occurred in the first 6 post-transplant months. The outcome was poor in patients in whom TB arose early in the post-transplant period and in whom the exudative or mixed forms were present, whereas the prognosis was good in patients with late onset and the granulomatous form of TB. In one patient, TB was transmitted by the allograft.
Abstract:
Several drugs and drug combinations are being used for adjuvant or complementary chemotherapy with the aim of improving the results of gastric cancer treatment. The objective of this study was to verify the impact of these drugs on nutrition and on the survival rate after radical treatment of 53 patients with gastric cancer in stage III of the TNM classification. A control group of 28 patients who had undergone radical resection alone was compared to a group of 25 patients who underwent the same operative technique followed by adjuvant polychemotherapy with FAM (5-fluorouracil, Adriamycin, and mitomycin C). In the latter group, chemotherapy toxicity in relation to hepatic, renal, cardiologic, neurological, hematologic, gastrointestinal, and dermatological functions was also studied. There was no significant difference on admission between the two groups in relation to gender, race, macroscopic tumor type according to the Borrmann classification, location of the tumor in the stomach, extent of the gastric resection, or response to delayed-hypersensitivity skin tests. Chemotherapy was started, on average, 2.3 months after surgical treatment. Clinical and laboratory follow-up of all patients continued for 5 years. The following conclusions were reached: 1) The nutritional status and incidence of gastrointestinal manifestations were similar in both groups; 2) There was no occurrence of cardiac, renal, neurological, or hepatic toxicity, or of death due to the chemotherapeutic method per se; 3) Dermatological alterations and hematological toxicity occurred exclusively in patients who underwent polychemotherapy; 4) There was no significant difference between the two groups in the rate and site of tumor recurrence, the disease-free interval, or the survival rate; 5) Therefore, we conclude that, after a 5-year follow-up, chemotherapy with the FAM regimen did not increase the survival rate.