111 results for ALLOGRAFT SURVIVAL
in SciELO Saúde Pública - SP
Abstract:
A major problem in renal transplantation is identifying a grading system that can predict long-term graft survival. The present study determined the extent to which the two existing grading systems (Banff 97 and the chronic allograft damage index, CADI) correlate with each other and with graft loss. A total of 161 transplant patient biopsies with chronic allograft nephropathy (CAN) were studied. The samples were coded and evaluated blindly by two pathologists using the two grading systems. Logistic regression analyses were used to identify the best predictor index for renal allograft loss. Patients with higher Banff 97 and CADI scores had higher rates of graft loss. Moreover, these scores also correlated with worse renal function and higher proteinuria levels at the time of CAN diagnosis. Logistic regression analyses showed that the use of an angiotensin-converting enzyme inhibitor (ACEI), hepatitis C virus (HCV), tubular atrophy, and the use of mycophenolate mofetil (MMF) were associated with graft loss under the CADI, while the use of ACEI, HCV, moderate interstitial fibrosis and tubular atrophy, and the use of MMF were associated with graft loss under the Banff 97 index. Although Banff 97 and CADI analyze different parameters in different renal compartments, only some isolated parameters correlated with graft loss. This suggests that the CAN grading systems need to be reviewed in order to devise a system that includes all parameters able to predict long-term graft survival, including chronic glomerulopathy, glomerular sclerosis, vascular changes, and the severity of chronic interstitial fibrosis and tubular atrophy.
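For readers unfamiliar with the analysis named above, the following is a minimal sketch of a logistic regression for graft-loss predictors in Python (statsmodels). The data frame and the column names (graft_loss, acei, hcv, tubular_atrophy, mmf) are illustrative assumptions standing in for the study's actual records, which are not reproduced here.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort: one row per biopsied patient (the study had 161).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "graft_loss":      rng.binomial(1, 0.3, 161),  # 1 = graft was lost
    "acei":            rng.binomial(1, 0.5, 161),  # on ACEI therapy
    "hcv":             rng.binomial(1, 0.2, 161),  # HCV-positive
    "tubular_atrophy": rng.binomial(1, 0.4, 161),  # moderate/severe atrophy
    "mmf":             rng.binomial(1, 0.5, 161),  # on MMF
})

# Fit the logistic model and report odds ratios (OR < 1 suggests protection).
model = smf.logit("graft_loss ~ acei + hcv + tubular_atrophy + mmf", data=df).fit()
print(np.exp(model.params))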
Abstract:
Organ transplantation can be considered a replacement therapy for patients with end-stage organ failure. One-year allograft survival rates have increased due, among other factors, to a better understanding of the rejection process and to new immunosuppressive drugs. Immunosuppressive therapy used in transplantation prevents activation and proliferation of alloreactive T lymphocytes, although it does not fully prevent chronic rejection. Recognition by recipient T cells of alloantigens expressed by donor tissues initiates immune destruction of allogeneic transplants. However, there is controversy concerning the relative contribution of CD4+ and CD8+ T cells to allograft rejection. Some animal models indicate that there is an absolute requirement for CD4+ T cells in allogeneic rejection, whereas in others CD4-depleted mice reject certain types of allografts. Moreover, there is evidence that CD8+ T cells are more resistant to immunotherapy and tolerance induction protocols. An intense focal infiltration of mainly CD8+CTLA4+ T lymphocytes during kidney rejection has been described in patients. This suggests that CD8+ T cells could escape immunosuppression and participate in the rejection process. Our group is primarily interested in the immune mechanisms involved in allograft rejection. We believe that a better understanding of the role of CD8+ T cells in allograft rejection could indicate new targets for immunotherapy in transplantation. Therefore, the objective of the present review was to focus on the role of the CD8+ T cell population in the rejection of allogeneic tissue.
Abstract:
We studied the effect of oral and portal vein administration of alloantigens on mouse skin allograft survival. Recipient BALB/c mice received spleen cells (30, 90, 150 or 375 × 10⁶) from donor C57BL/6 mice intragastrically on three successive days, starting seven days before the skin graft. Allograft survival was significantly increased by feeding 150 × 10⁶ allogeneic spleen cells in one gavage (median survival of 12 vs 14 days, P ≤ 0.005) or by giving 300 × 10⁶ cells in six gavages (12 vs 14 days, P < 0.04). A similar effect was observed when 150 × 10⁶ spleen cells were injected into the portal vein (12 vs 14 days, P ≤ 0.03). Furthermore, prolonged allograft survival was observed with subcutaneous (12 vs 16 days, P ≤ 0.002) or systemic (12 vs 15 days, P ≤ 0.016) application of murine interleukin-4 (IL-4), alone or in combination with spleen cell injection into the portal vein (12 vs 18 days, P ≤ 0.0018). Taken together, these results show that tolerance induction by oral administration or intraportal injection of spleen cells expressing fully incompatible antigens partially down-modulates skin allograft rejection. Furthermore, these findings demonstrate for the first time the effect of subcutaneous or systemic IL-4 application on skin allograft survival, suggesting its use as a beneficial supporting therapy in combination with a tolerance induction protocol.
Abstract:
Experimental data and a few non-randomized clinical studies have shown that inhibition of the renin-angiotensin system by angiotensin-converting enzyme (ACE) inhibitors, associated or not with the use of mycophenolate mofetil (MMF), could delay or even halt the progression of chronic allograft nephropathy (CAN). In this retrospective historical study, we investigated whether ACE inhibition (ACEI), associated or not with the use of MMF, has the same effect in humans as in experimental studies, and what factors are associated with a clinical response. A total of 160 transplant patients with biopsy-proven CAN were enrolled. Eighty-one of them were on ACEI therapy (G1) and 80 on ACEI-free therapy (G2). Patients were further stratified by the use of MMF. G1 patients showed a marked decrease in proteinuria and stabilized serum creatinine over time. Five-year graft survival after CAN diagnosis was higher in G1 (86.9 vs 67.7%; P < 0.05). In patients on ACEI-free therapy, the use of MMF was associated with better graft survival. ACEI therapy was strongly protective against graft loss (OR = 0.079, 95%CI = 0.015-0.426; P = 0.003). ACEI and MMF, or the use of MMF alone after CAN diagnosis, conferred protection against graft loss. This finding correlates well with experimental studies in which ACEI and MMF interrupt the progression of chronic allograft dysfunction and injury. The use of ACEI alone or in combination with MMF significantly reduced proteinuria and stabilized serum creatinine, consequently improving renal allograft survival.
Abstract:
Interstitial fibrosis and tubular atrophy (IF/TA) is the most common cause of renal graft failure. Chronic transplant glomerulopathy (CTG) is present in approximately 1.5-3.0% of all renal grafts. We retrospectively studied the contribution of CTG and recurrent post-transplant glomerulopathies (RGN) to graft loss. We analyzed 123 patients with chronic renal allograft dysfunction and divided them into three groups: CTG (N = 37), RGN (N = 21), and IF/TA (N = 65). Demographic data were analyzed and the variables related to graft function were identified by statistical methods. The CTG group had significantly lower allograft survival than the IF/TA group. In a multivariate analysis, the use of an angiotensin-converting enzyme inhibitor (ACEI; hazard ratio (HR) = 0.12, P = 0.001), the use of mycophenolate mofetil (MMF; HR = 0.17, P = 0.026), serum creatinine ≥1.5 mg/dL at the 1st year post-transplant (HR = 0.20, P = 0.011), and proteinuria ≥0.5 g/24 h at the 1st year post-transplant (HR = 0.14, P = 0.004) were protective factors for allograft outcomes, whereas hepatitis C virus (HR = 7.29, P = 0.003) and delayed graft function (HR = 5.32, P = 0.016) were associated with worse outcomes. The presence of glomerular damage is a risk factor for allograft loss (HR = 4.55, P = 0.015). The presence of some degree of chronic glomerular damage in addition to the diagnosis of IF/TA was the most important risk factor associated with allograft loss, since it could indicate chronic active antibody-mediated rejection. ACEI and MMF were associated with better outcomes, indicating that they might improve graft survival.
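The hazard ratios quoted above come from a multivariate Cox proportional-hazards model. As a minimal sketch of that kind of fit, the following Python code uses the lifelines package on a simulated data frame; the column names and values are illustrative assumptions, not the study's records.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort of 123 grafts: follow-up time, event flag, covariates.
rng = np.random.default_rng(1)
n = 123
df = pd.DataFrame({
    "months":                 rng.exponential(60, n),   # time to loss or censoring
    "graft_lost":             rng.binomial(1, 0.4, n),  # 1 = graft loss observed
    "acei":                   rng.binomial(1, 0.5, n),
    "mmf":                    rng.binomial(1, 0.5, n),
    "hcv":                    rng.binomial(1, 0.2, n),
    "delayed_graft_function": rng.binomial(1, 0.3, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="graft_lost")
# HR < 1 marks a protective factor; HR > 1 marks a risk factor.
print(cph.hazard_ratios_)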
Abstract:
Vascular complications after liver transplantation include occlusion or stenosis at the sites of anastomosis in the hepatic artery, portal vein, and vena cava. Balloon angioplasty of these stenoses carries little risk and is a useful procedure for the treatment of these problems. The purpose of this paper was to assess whether percutaneous transluminal angioplasty can help to prolong allograft survival and improve allograft function in patients with hepatic artery stenosis after liver transplantation. We report a 43-year-old male with stenosis of the hepatic artery anastomosis after liver transplantation. An abrupt elevation of liver enzymes and serum bilirubin levels was noted in the fifth postoperative month. The patient underwent percutaneous liver biopsy, which revealed important ductal depletion due to hypoperfusion, even though Doppler ultrasound examination demonstrated arterial flow. An angiogram confirmed severe stenosis of the arterial anastomosis with a poor intraparenchymal arterial perfusion pattern. In an attempt to preserve the graft, percutaneous transluminal angioplasty was performed using microballoons mounted on a hydrophilic micro guidewire. The intervention proceeded without complications. Liver enzymes and bilirubin levels decreased within twenty-four hours of angioplasty. Normal levels were achieved after one week. Seven months after angioplasty, the patient is in optimal clinical condition with no signs of graft impairment. We conclude that percutaneous transluminal angioplasty of hepatic artery stenosis after liver transplantation is relatively safe and may help decrease allograft loss.
Abstract:
The use of sirolimus (SRL) in combination with full doses of cyclosporin A (CsA) results in reduced one-year kidney allograft function, which is associated with shorter long-term allograft survival. We determined the effect of reduced CsA exposure on graft function in patients receiving SRL and prednisone. Ninety recipients of living-donor kidney transplants receiving SRL (2 mg/day, po) were compared to 35 recipients receiving azathioprine (AZA, 2 mg kg⁻¹ day⁻¹, po). All patients also received CsA (8-10 mg kg⁻¹ day⁻¹, po) and prednisone (0.5 mg kg⁻¹ day⁻¹). The efficacy end point was a composite of biopsy-confirmed acute rejection, graft loss, or death at one year. Graft function was measured by creatinine, creatinine clearance, and graft function deterioration between 3 and 12 months (Δ1/Cr). CsA concentrations in patients receiving SRL were 26% lower. No differences in the one-year composite efficacy end point were observed between the SRL and AZA groups (18 vs 20%) or in the incidence of biopsy-proven acute rejection (14.4 and 14.3%). There were no differences in mean ± SD creatinine (1.65 ± 0.46 vs 1.60 ± 0.43 mg/dL, P = 0.48) or calculated creatinine clearance (61 ± 15 vs 62 ± 13 mL/min, P = 0.58) at one year. Mean ± SD Δ1/Cr (-11 ± 17 vs -14 ± 15%, P = 0.7) and the percentage of patients with >20% (26 vs 31%, P = 0.6) or >30% Δ1/Cr (19 vs 17%, P = 1) did not differ between the two groups. The use of 2-mg fixed oral doses of SRL with reduced CsA exposure was effective in preventing acute rejection and preserving allograft function.
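Two of the graft-function measures above are easy to make concrete in code. The Python sketch below computes an estimated creatinine clearance, assuming the Cockcroft-Gault formula (the abstract does not state which formula the authors used), and the percent change in reciprocal creatinine (Δ1/Cr) between months 3 and 12; all input values are illustrative.

def cockcroft_gault(age_years, weight_kg, serum_cr_mg_dl, female=False):
    # Estimated creatinine clearance in mL/min (Cockcroft-Gault formula).
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

def delta_reciprocal_cr(cr_month3, cr_month12):
    # Percent change in 1/Cr between months 3 and 12; negative = deterioration.
    return ((1 / cr_month12) - (1 / cr_month3)) / (1 / cr_month3) * 100

print(cockcroft_gault(45, 70, 1.65))   # ~56 mL/min
print(delta_reciprocal_cr(1.5, 1.7))   # ~-11.8%, near the group means above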
Abstract:
Introduction: Cardiac allograft vasculopathy (CAV) is a major limitation for long-term survival of patients undergoing heart transplantation (HT). Some immunosuppressants can reduce the risk of CAV. Objectives: The primary objective was to evaluate the variation in the volumetric growth of the intimal layer measured by intracoronary ultrasound (IVUS) after 1 year in patients who received basiliximab compared with that in a control group. Methods: Thirteen patients treated at a single center between 2007 and 2009 were analyzed retrospectively. Evaluations were performed with IVUS, measuring the volume of a coronary segment within the first 30 days and 1 year after HT. Vasculopathy was characterized by the volume of the intima of the vessel. Results: Thirteen patients were included (7 in the basiliximab group and 6 in the control group). On IVUS assessment, the control group was found to have greater vessel volume (120–185.43 mm³ vs. 127.77–131.32 mm³; p = 0.051). Intimal layer growth (i.e., CAV) was also higher in the control group (27.30–49.15 mm³ [∆80%] vs. 20.23–26.69 mm³ [∆33%]; p = 0.015). Univariate regression analysis revealed that plaque volume and prior atherosclerosis of the donor were not related to intima growth (r = 0.15, p = 0.96), whereas positive remodeling was directly proportional to the volumetric growth of the intima (r = 0.85, p < 0.001). Conclusion: Routine induction therapy with basiliximab was associated with reduced growth of the intima of the vessel during the first year after HT.
Abstract:
Survival analysis is applied when the time until the occurrence of an event is of interest. Such data are routinely collected in plant disease studies, although applications of the method are uncommon. The objective of this study was to use two studies on post-harvest diseases of peaches, considering the two harvests together and a random effect shared by fruits of the same tree, to illustrate the main techniques of survival analysis. The nonparametric Kaplan-Meier method, the log-rank test, and the semi-parametric Cox proportional hazards model were used to estimate the effect of cultivar and of the number of days after full bloom on survival until the appearance of brown rot symptoms, and on the instantaneous risk of expressing them, in two consecutive harvests. The joint analysis with a baseline effect varying between harvests, and the confirmation of the tree effect as a grouping factor with a random effect, were appropriate for interpreting the phenomenon (disease) evaluated and can be important tools to replace or complement conventional analysis, respecting the nature of the variable and the phenomenon.
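As a minimal sketch of the nonparametric techniques named above, the following Python code fits a Kaplan-Meier curve and runs a log-rank test with the lifelines package; the fruit-level data are simulated for illustration, and the shared tree-level random effect (frailty) is beyond this sketch.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Simulated days until brown rot symptoms for two cultivars;
# event = 1 means the symptom was observed, 0 means the fruit was censored.
rng = np.random.default_rng(2)
days_a, events_a = rng.exponential(8, 50), rng.binomial(1, 0.8, 50)
days_b, events_b = rng.exponential(12, 50), rng.binomial(1, 0.8, 50)

kmf = KaplanMeierFitter()
kmf.fit(days_a, event_observed=events_a, label="cultivar A")
print(kmf.median_survival_time_)  # nonparametric median survival

# Log-rank test: do the two cultivars share the same survival curve?
result = logrank_test(days_a, days_b,
                      event_observed_A=events_a, event_observed_B=events_b)
print(result.p_value)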
Abstract:
OBJECTIVE: To evaluate the influence of sociodemographic, clinical, and epidemiological factors on the survival of AIDS patients in a reference hospital. METHODS: A sample of 502 adult AIDS patients out of 1,494 AIDS cases registered in a hospital in Fortaleza, Brazil, was investigated between 1986 and 1998. Sixteen cases were excluded due to death at the moment of the AIDS diagnosis, and 486 were analyzed in the study. Socioeconomic and clinical-epidemiological variables were studied. Statistical analysis was conducted using Kaplan-Meier survival analysis and the Cox proportional hazards model. RESULTS: Three hundred and sixty-two of the 486 patients studied took at least one antiretroviral drug, and their survival was ten times longer than that of those who did not take any drug (746 and 79 days, respectively, p <0.001). Patients who took two nucleoside reverse transcriptase inhibitors (NRTI) plus a protease inhibitor were found to have higher survival rates (p <0.001). The risk of dying in the first year was significantly lower for patients who took NRTI and a protease inhibitor compared to those who took only NRTI. In addition, this risk was much lower from the second year on (0.10; 95%CI: 0.42-0.23). The risk of dying in the first year was significantly higher for less educated patients (15.58; 95%CI: 6.64-36.58) and for those who had two or more systemic diseases (3.03; 95%CI: 1.74-5.25). After the first year post-diagnosis, there was no risk difference for these factors. CONCLUSIONS: Higher education was found to exert a significant influence on first-year survival. Antiretroviral drugs had a greater impact on survival from the second year on. More aggressive antiretroviral therapy started earlier could benefit these patients.
Abstract:
OBJECTIVE: To assess the overall survival of women with cervical cancer and describe the associated prognostic factors. METHODS: A total of 3,341 cases of invasive cervical cancer diagnosed at the Brazilian Cancer Institute, Rio de Janeiro, southeastern Brazil, between 1999 and 2004 were selected. Clinical and pathological characteristics and follow-up data were collected. Survival analysis was performed using Kaplan-Meier curves, and multivariate analysis was performed with the Cox model. RESULTS: Of all cases analyzed, 68.3% had locally advanced disease at the time of diagnosis. The 5-year overall survival was 48%. After multivariate analysis, tumor staging at diagnosis was the single variable significantly associated with prognosis (p<0.001). A dose-response relationship was seen between mortality and clinical staging, ranging from 27.8 to 749.6 per 1,000 case-years in women with stage I and stage IV disease, respectively. CONCLUSIONS: The study showed that early detection through prevention programs is crucial to increasing cervical cancer survival.
Abstract:
In an attempt to be as close as possible to the infected and treated patients of the endemic areas of schistosomiasis (S. mansoni), and in order to achieve a long period of follow-up, mice were repeatedly infected with a low number of cercariae. Survival data and histological variables such as schistosomal granuloma, portal changes, hepatocellular necrosis, hepatocellular regeneration, schistosomotic pigment, periductal fibrosis and, chiefly, bile duct changes were analyzed in the infected treated and untreated mice. Oxamniquine chemotherapy in repeatedly infected mice prolonged survival significantly when compared to untreated animals (chi-square 9.24, p = 0.0024), thus confirming previous results with a similar experimental model but a shorter follow-up. Furthermore, mortality decreased rapidly after treatment, suggesting an abrupt reduction in the severity of hepatic lesions. A morphological and immunohistochemical study of the liver was carried out. Portal fibrosis, with a pattern resembling human Symmers fibrosis, was present at a late phase in the infected animals. Bile duct lesions were quite close to those described in human Mansonian schistosomiasis. Schistosomal antigen was observed in one isolated altered bile duct cell. The pathogenesis of the bile duct changes and their relation to the parasite infection and/or its antigens are discussed.
Abstract:
Systemic disease caused by Cryptococcus neoformans (C. neoformans) is a common opportunistic infection in immunodeficient patients. Cellular immunity seems to be the most important determinant of resistance. The aim of this study was to assess the effect of recombinant rat interferon gamma (IFN-gamma) in murine cryptococcosis (Balb/c mice infected by the IP route with the Rivas strain of C. neoformans), evaluating survival time, macroscopic and microscopic examination of the organs, and massive seeding of brain homogenate. IFN-gamma treatment, at a daily dose of 10,000 IU, did not significantly modify these variables when mice were challenged with a high inoculum (10⁷ yeasts) and treatment was delayed until 5 days after infection (median survival 21 days in control mice vs. 23 days in IFN-treated mice). Another set of experiments suggested that IFN-gamma treatment, at a dose of 10,000 IU/day, begun at the moment of infection could be useful (it prolonged survival from 20 to 28 days, although the difference did not achieve statistical significance). When used simultaneously with infection by 3.5 × 10⁵ yeasts, IFN-gamma at 10,000 IU/day for 15 days significantly prolonged the survival of mice (p = 0.004). These results suggest that, depending on the experimental conditions, IFN-gamma can improve the survival of mice infected with a lethal dose of C. neoformans.
Abstract:
Trypanosoma cruzi, the causative agent of Chagas disease, assumes two distinct forms in vertebrate hosts: the circulating trypomastigote and the tissular amastigote. The latter form infects predominantly the myocardium, smooth and skeletal muscle, and the central nervous system. The present work describes for the first time the detection of amastigote forms of T. cruzi in the renal parenchyma of a kidney graft recipient one month after transplantation. The patient was serologically negative for Chagas disease and received no blood transfusion prior to transplant. The cadaver donor was from an area endemic for Chagas disease. The recipient developed the acute form of the disease, with detection of amastigote forms of T. cruzi in the renal allograft biopsy and circulating trypomastigote forms. The present report demonstrates that T. cruzi can infect the renal parenchyma. This mode of transmission warrants attention in areas endemic for Chagas disease.
Abstract:
Opportunistic diseases (OD) are the most common cause of death in AIDS patients. To assess the incidence of OD and survival in advanced immunodeficiency, we included 79 patients with AIDS treated at Hospital Evandro Chagas (FIOCRUZ) from September 1997 to December 1999 with at least one CD4 count ≤100 cells/mm³. The incidence of OD was analyzed by Poisson regression, and survival by Kaplan-Meier and Cox analysis, considering a retrospective (before CD4 ≤100 cells/mm³) and a prospective (after CD4 ≤100 cells/mm³) period, and controlling for demographic, clinical and laboratory characteristics. The confidence interval stipulated was 95%. The mean follow-up period was 733 days (CI = 683-782). During the study, 9 (11.4%) patients died. Survival from AIDS diagnosis was a mean of 2,589 days (CI = 2363-2816), and from the date of the CD4 count ≤100 cells/mm³ it was a mean of 1,376 (CI = 1181-1572) days. The incidence of OD was 0.51 episodes per person-year (pp/y) before CD4 ≤100 cells/mm³ and 0.29 pp/y after CD4 ≤100 cells/mm³. A lower number of ODs before CD4 ≤100 cells/mm³ was associated with lower incidence rates after CD4 ≤100 cells/mm³. AIDS diagnosis based on CD4+ counts ≤200 cells/mm³ was associated with lower incidence rates after CD4 ≤100 cells/mm³. Baseline CD4 counts above 50 cells/mm³ (HR = 0.13) and restoration of CD4+ counts to above 100 cells/mm³ (HR = 0.16) were associated with a lower risk of death. Controlling for both variables, only restoration of counts above baseline was statistically significant (HR = 0.22, p = 0.04). We found a very low incidence of OD and long survival after CD4 ≤100 cells/mm³. Survival was significantly associated with restoration of CD4 counts to above 100 cells/mm³.
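The incidence figures above are event rates over person-time. As a minimal sketch of that computation, the Python snippet below derives episodes per person-year from assumed counts; the episode totals are illustrative, chosen only to produce rates of roughly the reported magnitude.

# Incidence rate = number of OD episodes / person-years at risk.
def incidence_per_person_year(n_episodes, total_follow_up_days):
    return n_episodes / (total_follow_up_days / 365.25)

# Illustrative inputs: 79 patients followed a mean of 733 days each.
person_days = 79 * 733
print(round(incidence_per_person_year(82, person_days), 2))  # ~0.52 pp/y
print(round(incidence_per_person_year(46, person_days), 2))  # ~0.29 pp/y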