Abstract:
Introduction. Biliary atresia (BA) is the leading indication for orthotopic liver transplantation (OLT) among children. However, there are technical difficulties, including the limited dimensions of anatomical structures, hypoplasia and/or thrombosis of the portal vein, and previous portoenterostomy procedures. Objective. The objective of this study was to present our experience with 239 children with BA who underwent OLT between September 1989 and June 2010, compared with OLT performed for other causes. Methods. We performed a retrospective analysis of patient charts and an analysis of complications and survival. Results. BA was the most common indication for OLT (207/409; 50.6%). The median age of subjects was 26 months (range, 7-192). Their median weight was 11 kg (range, 5-63), with 110 children (53.1%) weighing <= 10 kg. We performed 126 transplantations from cadaveric donors (60.8%) and 81 from living-related donors (LRD) (39.2%). Retransplantation was required for 31 recipients (14.9%), primarily due to hepatic artery thrombosis (HAT; 64.5%). Other complications included the following: portal vein thrombosis (PVT; 13.0%), biliary stenosis and/or fistula (22.2%), bowel perforation (7.0%), and posttransplantation lymphoproliferative disorder (PTLD; 5.3%). Among the cases of OLT for other causes, the median age of recipients was 81 months (range, 11-17 years), which was higher than that for children with BA. Retransplantation was required in 3.5% of these patients (P < .05), mostly due to HAT. The incidences of PVT, bowel perforation, and PTLD were significantly lower (P < .05). There was no significant difference in biliary complications between the 2 groups. The overall survival rates at 1 and 5 years were 79.7% and 68.1% for BA, and 81.2% and 75.7% for other causes, respectively. Conclusions. Children who undergo OLT for BA are younger than those engrafted for other causes and display a higher risk of complications and retransplantation.
Abstract:
Background/Purpose. Posttransplantation portal vein thrombosis (PVT) can have severe health consequences, and portal hypertension and other consequences of long-term deprivation of portal inflow to the graft may be hazardous, especially in young children. The Rex shunt has been used successfully to treat PVT patients since 1998. In 2007, we started to perform this surgery in patients with idiopathic PVT and late posttransplantation PVT. Herein we report our experience with this technique in acute posttransplantation PVT. Methods. Three patients aged 12, 15, and 18 months underwent cadaveric (n = 1) or living donor (n = 2) orthotopic liver transplantation (OLT). All patients had biliary atresia with portal vein hypoplasia; they developed acute PVT on the first postoperative day. They underwent a mesenteric-portal surgical shunt (Rex shunt) using a left internal jugular vein autograft (n = 2) or a cadaveric iliac vein graft (n = 1) on the first postoperative day. Results. The 8-month follow-up has confirmed shunt patency by postoperative Doppler ultrasound. There have been no biliary complications to date. Conclusions. The mesenteric-portal shunt (Rex shunt) using an autograft of the left internal jugular vein or a cadaveric vein graft should be considered for children with acute PVT after OLT. These children usually have small portal veins, and reanastomosis is often unsuccessful. In addition, this technique has the advantage of avoiding manipulation of the hepatic hilum and biliary anastomosis. Although this study was based on limited experience, we conclude that this technique is feasible, with great benefit and low risk for these patients.
Abstract:
The present study aimed to evaluate the effects of immunization with soluble amastigote (AmaAg) and promastigote (ProAg) antigens from Leishmania (Viannia) shawi on the course of infection in BALB/c mice. After immunization with AmaAg, the challenged group showed greater lesion size and parasite load in the skin and lymph nodes, associated with diminished interleukin (IL)-2, IL-4, IL-10, interferon (IFN)-gamma and nitrate levels in the supernatant of lymph node cell cultures, together with increases in transforming growth factor (TGF)-beta concentrations and in the humoral immune response. In contrast, immunization with ProAg led to smaller lesion size with reduced numbers of viable parasites in the skin. Protection was associated with increases in IL-12, IFN-gamma, TGF-beta and nitrate levels and decreases in IL-4 and IL-10 levels. Concerning the humoral immune response, a significant reduction in anti-Leishmania immunoglobulin G was observed in the ProAg-challenged group. These results suggest that AmaAg induced a suppressive cellular immune response in mice, favouring the spread of infection, whereas ProAg induced partial protection associated with an increased cellular immune response.
Abstract:
Multiple sclerosis (MS) is an autoimmune disease characterized by an inflammatory immune response directed against myelin antigens of the central nervous system. In its murine model, experimental autoimmune encephalomyelitis (EAE), Th17 cells play an important role in disease pathogenesis. These cells can induce blood-brain barrier disruption and CNS immune cell activation, owing to their capacity to secrete high levels of IL-17 and IL-22 in an IL-6 + TGF-beta dependent manner. Thus, using the oral tolerance model, in which 200 μg of MOG 35-55 is given orally to C57BL/6 mice prior to immunization, we showed that the percentage of Th17 cells as well as IL-17 secretion is reduced both in the periphery and in the CNS of orally tolerated animals. Altogether, our data corroborate the pathogenic role of IL-17 and IFN-gamma in EAE, as their reduction after oral tolerance leads to an overall reduction of pro-inflammatory cytokines, such as IL-1 alpha, IL-6, IL-9 and IL-12p70, and of the chemokines MIP-1 beta, RANTES, Eotaxin and KC in the CNS. It is noteworthy that this was associated with an increase in IL-10 levels. Thus, our data clearly show that disease suppression after oral tolerance induction correlates with reduced target-organ inflammation, which may be caused by a reduced Th1/Th17 response.
Abstract:
Background. Abdominal hernias are a common disease among cirrhotic patients because of malnutrition and persistently high intra-abdominal pressure due to ascites. When tense ascites is present, life-threatening complications are likely to occur. In such cases, the morbidity and mortality rates are high. Objective. We describe 3 cirrhotic patients with rare complicated hernias that needed surgical repair. We discuss the optimal timing for surgical approaches and the necessity of ascites control before surgery, as well as the technical details of the procedures. Method. Review of hospital charts of selected rare cases of hernias in cirrhotic patients. Conclusion. Elective surgical approaches can treat even uncommon hernias in cirrhotic patients with good results.
Abstract:
Posttransplantation lymphoproliferative disorder (PTLD) is a serious complication following solid organ transplantation that has been linked to Epstein-Barr virus (EBV) infection. The aim of this article was to describe a single-center experience with the multiplicity of clinical presentations of PTLD. Among 350 liver transplantations performed in 303 children, 13 surviving children had a histological diagnosis of PTLD (13/242 survivors; 5.4%). The age at diagnosis ranged from 12 to 258 months (median, 47), and the time from transplantation ranged from 1 to 84 months (median, 13). Ten of these children (76.9%) were EBV-naive prior to transplantation. Fever was present in all cases. The clinical signs at presentation were anemia (92.3%), diarrhea and vomiting (69.2%), recurrent upper airway infections (38.4%), Waldeyer ring lymphoid tissue hypertrophy (23.0%), abdominal mass lesions (30.7%), massive cervical and mediastinal adenopathy (15.3%), or gastrointestinal and respiratory symptoms (30.7%). One child developed fulminant hepatic allograft failure secondary to graft involvement by PTLD. Polymorphic PTLD was diagnosed in 6 patients; 7 had a diagnosis of lymphoma. Treatment consisted of stopping immunosuppression as well as starting intravenous ganciclovir and anti-CD20 monoclonal antibody therapy. The mortality rate was 53.8%. The clinical presentation of PTLD varied from fever of unknown origin to fulminant hepatic failure. Other symptoms that may be linked to the diagnosis of PTLD are pancytopenia, tonsil and adenoid hypertrophy, cervical or mediastinal lymph node enlargement, and abdominal masses. Despite numerous advances, the optimal treatment approach for PTLD is not completely defined, and the mortality rate is still high.
Abstract:
Background It is noteworthy that there is a clear clinical, epidemiological and pathophysiological association between upper and lower airway inflammation in rhinitis and asthma. Objective The aim of this cross-sectional study was to compare eosinophil counts in induced sputum and nasal lavage fluid in asthma, assessing their association and the accuracy of nasal eosinophilia as a predictor of sputum eosinophilia. Methods Clinical evaluation, the asthma control questionnaire (ACQ), pre- and post-bronchodilator spirometry, and nasal and sputum sampling were performed. Nasal eosinophilia was analysed using a receiver operating characteristic curve and a logistic regression model. Results In 140 adults, the post-bronchodilator forced expiratory volume in 1 s (FEV(1)) did not differ between patients with or without sputum eosinophilia (P = .18). After adjustment for upper airway symptoms, age, ACQ score and post-bronchodilator FEV(1), sputum eosinophilia was associated with a 52-fold increase in the odds of nasal eosinophilia, whereas each 1% increase in bronchodilator response was associated with a 7% increase in the odds of nasal eosinophilia. Conclusion This study provides further evidence that upper airway disease is an important component of the asthma syndrome. Furthermore, monitoring of nasal eosinophilia by quantitative cytology may be useful as a surrogate for sputum cytology as a component of a composite measurement for determining airway inflammation.
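As an illustration of the type of analysis described in this abstract (an adjusted logistic regression yielding odds ratios, plus an ROC curve for nasal eosinophilia as a predictor of sputum eosinophilia), a minimal sketch is given below. The data file and column names are hypothetical, and the model specification is an assumption based only on the covariates listed in the abstract.

    # Hypothetical sketch of the analysis outlined in the abstract; the file and
    # column names are illustrative, not the study's actual data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from sklearn.metrics import roc_curve, auc

    df = pd.read_csv("asthma_cohort.csv")  # one row per patient (hypothetical file)

    # Logistic regression: odds of nasal eosinophilia, adjusted for sputum
    # eosinophilia, age, ACQ score and post-bronchodilator FEV1.
    X = sm.add_constant(df[["sputum_eos", "age", "acq_score", "post_bd_fev1"]])
    fit = sm.Logit(df["nasal_eos"], X).fit()
    print(np.exp(fit.params))  # exponentiated coefficients = odds ratios

    # ROC curve: nasal eosinophil percentage as a predictor of sputum eosinophilia.
    fpr, tpr, thresholds = roc_curve(df["sputum_eos"], df["nasal_eos_pct"])
    print("AUC:", auc(fpr, tpr))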
Abstract:
In organ transplantation, withdrawal of immunosuppression leads, in most cases, to rejection. Nonetheless, a special group of patients maintain stable graft function after complete withdrawal of immunosuppression, achieving a state called "operational tolerance." The study of such patients may be important to understand the mechanisms involved in human transplantation tolerance. We compared the profile of CD4(+)CD25(+)Foxp3(+) T cells and the IL-6/STAT3 (signal transducers and activators of transcription) and IL-4/STAT6 signaling pathways in peripheral blood mononuclear cells of four kidney transplant groups and healthy individuals: (i) operational tolerance (OT), (ii) chronic allograft nephropathy (CR), (iii) stable graft function under standard immunosuppression (Sta), (iv) stable graft function under low immunosuppression, and (v) healthy individuals. Both CR and Sta displayed lower numbers and percentages of CD4(+)CD25(+)Foxp3(+) T cells compared with all other groups (p < 0.05). The OT patients displayed reduced activation of the IL-4/STAT6 pathway in monocytes compared with all other groups (p < 0.05). The lower numbers of CD4(+)CD25(+)Foxp3(+) T cells observed in CR individuals may be a feature of chronic allograft nephropathy. The differential OT signaling profile, with reduced phosphorylation of STAT6 in the monocyte region, suggests that altered STAT6 signaling may be important for the operational tolerance state.
Abstract:
Dendritic cell (DC)-based vaccines have been demonstrated to increase HIV-specific cellular immune responses; however, in some HIV-infected patients, the response to the vaccine was not effective. In order to understand whether the outcome of the vaccination may be influenced by the host's genome and natural immunity, we studied the innate immune genome of HIV-infected patients previously vaccinated with DCs. We identified 15 SNPs potentially associated with the response to the immunotreatment and two SNPs significantly associated with the modulation of the response to the DC vaccine: MBL2 rs10824792 and NOS1 rs693534. These two SNPs were also studied in different ethnic groups (Brazilian, African and Caucasian) of HIV-infected, exposed uninfected and unexposed uninfected subjects. The HIV-positive Caucasian patients were also characterized by different disease progressions. Our findings suggest that, independently of and/or in addition to other variables, the host's genome could significantly contribute to the modulation of the response to the DC vaccine.
Abstract:
Introduction. Only about 15% of potential candidates for lung donation are considered suitable for transplantation. A new method for ex vivo lung perfusion (EVLP) can be used to evaluate and recondition "marginal," nonacceptable lungs. We herein describe an initial experience with ex vivo perfusion of 8 donor lungs deemed nonacceptable. Materials and Methods. After harvesting, the lungs were perfused ex vivo with Steen Solution, an extracellular matrix with high colloid osmotic pressure. A membrane oxygenator connected to the circuit received gas from a mixture of nitrogen and carbon dioxide, maintaining a normal mixed venous blood gas level in the perfusate. The lungs were gradually rewarmed, reperfused, and ventilated for evaluation through analyses of oxygenation capacity, pulmonary vascular resistance (PVR), lung compliance (LC), and biopsy. Results. The arterial oxygen pressure (with an inspired oxygen fraction of 100%) increased from a mean of 206 mm Hg in the organ donor at the referring hospital to a mean of 498 mm Hg during the ex vivo evaluation. After 1 hour of EVLP, PVR varied from 440 to 1454 dynes/sec/cm(5); LC was in the range of 26-90 mL/cmH(2)O. There was no histological deterioration after 10 hours of cold ischemia and 1 hour of EVLP. Conclusions. The ex vivo evaluation model can improve the oxygenation capacity of "marginal" lungs rejected for transplantation. It has great potential to increase lung donor availability and, possibly, reduce time on the waiting list.
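For reference, pulmonary vascular resistance in the units reported above (dynes/sec/cm(5)) is conventionally computed from the pressure drop across the pulmonary circulation and the perfusate flow; the sketch below shows that generic conversion with hypothetical example values and is not taken from the authors' perfusion protocol.

    # Conventional PVR calculation in dynes/sec/cm^5; the example pressures and
    # flow are hypothetical, not measurements from the study.
    def pulmonary_vascular_resistance(mean_pap_mmhg, outflow_pressure_mmhg, flow_l_min):
        """PVR = 80 * (mean pulmonary artery pressure - outflow pressure) / flow."""
        return 80.0 * (mean_pap_mmhg - outflow_pressure_mmhg) / flow_l_min

    # Example: 15 mm Hg inflow pressure, 3 mm Hg outflow pressure, 2.0 L/min flow.
    print(pulmonary_vascular_resistance(15, 3, 2.0))  # 480 dynes/sec/cm^5

The example result falls within the 440-1454 dynes/sec/cm(5) range reported in the abstract, but the inputs themselves are illustrative only.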
Abstract:
Introduction. Cytomegalovirus (CMV) infection, a common complication in lung transplant (LT) patients, is associated with worse outcomes. Therefore, prophylaxis and surveillance with preemptive treatment are recommended. Objectives. To describe the epidemiology and impact on mortality of CMV infection in LT patients receiving CMV prophylaxis. Methods. Single-center retrospective cohort of LT recipients from August 2003 to March 2008. We excluded patients with survival or follow-up shorter than 30 days. We reviewed medical charts and all CMV pp65 antigen results. Results. Forty-seven patients met the inclusion criteria, and 19 (40%) developed a CMV event: 8 CMV infections, 7 CMV syndromes, and 15 CMV diseases. The mean number of CMV events per patient was 1.68 +/- 0.88. Twelve patients developed CMV events during prophylaxis (5/12 had CMV serology D+/R-). Forty-six of the 47 patients had at least one episode of acute rejection (mean 2.23 +/- 1.1). Median follow-up was 22 months (range = 3-50). There were seven deaths. Upon univariate analysis, CMV events were related to greater mortality (P = .04), especially if the patient experienced more than two events (P = .013) and if the first event occurred during the first 3 months after LT (P = .003). Nevertheless, a marginally significant relationship between a CMV event during the first 3 months after LT and mortality was observed in the multivariate analysis (hazard ratio: 7.46; 95% confidence interval: 0.98-56.63; P = .052). Patients with CMV events more than 3 months post-LT showed the same survival as those who remained CMV-free. Conclusion. Prophylaxis and preemptive treatment are safe and effective; however, patients who develop CMV events during prophylaxis experience a worse prognosis.
Abstract:
Background. Lung transplantation is the procedure of choice in several end-stage lung diseases. Despite improvements in surgical techniques and immunosuppression, early postoperative complications occur frequently. Objective. To evaluate the pleural inflammatory response after surgery. Patients and Methods. Twenty patients aged 18 to 63 years underwent unilateral or bilateral lung transplantation between August 2006 and March 2008. Proinflammatory cytokines interleukin (IL)-1 beta, IL-6, and IL-8 and vascular endothelial growth factor in pleural fluid and serum were analyzed. For cytokine evaluation, 20-mL samples of pleural fluid and blood (right, left, or both chest cavities) were obtained at 6 hours after surgery and daily until removal of the chest tube or for a maximum of 10 days. Data were analyzed using analysis of variance followed by the Holm-Sidak test. Results. All effusions were exudates according to Light's criteria. Pleural fluid cytokine concentrations were highest at 6 hours after surgery. Serum concentrations were lower than those in pleural fluid, and IL-1 beta, IL-6, and IL-8 were undetectable at all time points. Conclusions. There is a peak concentration of inflammatory cytokines in the first 6 hours after transplantation, probably reflecting the effects of surgical manipulation. The decrease observed from postoperative day 1 and thereafter suggests the action of the immunosuppression agents and a temporal reduction in pleural inflammation.
Abstract:
Background. Orthotopic heart transplantation (OHT) has traditionally been contraindicated in the presence of severe pulmonary hypertension (PH), as detected by right heart catheterization. Noninvasive methods are still not reliably accurate for this evaluation. Objectives. To determine the efficacy of echo Doppler analysis for the diagnosis of severe PH. Methods. One hundred thirty patients (mean age = 42 +/- 15 years, 82 men) showed severe left ventricular dysfunction (mean ejection fraction = 29 +/- 12%; functional class III-IV). We excluded patients with atrial fibrillation, heart failure secondary to congenital disease, and valvulopathy. The parameters defining severe PH were: systolic pulmonary artery pressure (sPAP) >= 60 mm Hg; a mean transpulmonary gradient >= 15; or pulmonary vascular resistance >= 5 Wood units. Patients underwent right heart catheterization using a Swan-Ganz catheter to measure hemodynamic parameters, and right-sided pressures were estimated noninvasively from spectral Doppler recordings of tricuspid regurgitation velocity (right ventricular systolic pressure [RVsP]). A Pearson correlation of sPAP with RVsP was obtained, and the sensitivity of RVsP for the diagnosis of PH was determined by a receiver operating characteristic (ROC) curve. Results. A good correlation between sPAP and RVsP was obtained by Pearson correlation analysis (r = 0.64; 95% confidence interval [CI] 0.50-0.75; P < .001). The ROC curve analysis showed a sensitivity of 100% and a specificity of 37.2% (95% CI 0.69-0.83; P < .0001) for an RVsP cutoff of < 45 mm Hg in the exclusion of severe PH. Conclusions. The RVsP cutoff of < 45 mm Hg on noninvasive echo Doppler evaluation of PH is an efficient method to replace invasive heart catheterization in OHT candidates.
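The screening logic reported above (an echocardiographic RVsP threshold evaluated against catheterization-defined severe PH) can be sketched as follows; the data file and column names are hypothetical, and only the severe-PH definitions and the 45 mm Hg cutoff are taken from the abstract.

    # Hypothetical sketch: sensitivity/specificity of an RVsP cutoff for detecting
    # severe PH defined by right heart catheterization (illustrative data file).
    import pandas as pd
    from sklearn.metrics import confusion_matrix, roc_auc_score

    df = pd.read_csv("oht_candidates.csv")  # one row per patient (hypothetical)

    # Catheterization reference standard, per the definitions in the abstract.
    severe_ph = (
        (df["spap_mmhg"] >= 60)
        | (df["transpulmonary_gradient"] >= 15)
        | (df["pvr_wood"] >= 5)
    )

    # Echo-derived test: RVsP >= 45 mm Hg is treated as positive for severe PH,
    # so RVsP < 45 mm Hg is the exclusion criterion discussed in the abstract.
    test_positive = df["rvsp_mmhg"] >= 45

    tn, fp, fn, tp = confusion_matrix(severe_ph, test_positive).ravel()
    print("sensitivity:", tp / (tp + fn))
    print("specificity:", tn / (tn + fp))
    print("AUC:", roc_auc_score(severe_ph, df["rvsp_mmhg"]))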
Abstract:
Introduction. Orthotopic heart transplantation renders the recipient heart denervated. This remodeling of the intrinsic cardiac nervous system should be taken into account during functional evaluation for allograft coronary artery disease. Dobutamine stress echocardiography (DSE) has been used to detect patients at greater risk. The aim of this study was to determine whether patients with various autonomic response levels, and presumed reinnervation patterns, show the same response to DSE. Methods. We studied 20 patients who had survived more than 5 years after orthotopic heart transplantation. All patients underwent a Holter evaluation. We considered patients with low variability to be those with less than a 40-bpm variation from the lowest to the highest heart rate, so-called "noninnervated" (group NI). Patients who had 40-bpm or more variation were considered to show high variability and were called "reinnervated" (group RI). After that, all patients underwent an ergometric test and DSE. Results. Groups were defined as NI (n = 9) and RI (n = 11). Ergometric tests confirmed this response, with NI patients showing less variability when compared to RI patients (P = .0401). During DSE, patients showed similar median heart rate responses according to the dobutamine dose. Spearman correlation showed r = 1.0 (P = .016). Conclusions. DSE was effective in reaching higher heart rates, probably related to catecholamine infusion. These findings may justify a better response when evaluating cardiac allograft vasculopathy in heart transplant patients.
Abstract:
Background. Renal failure is the most important comorbidity in heart transplant patients and is associated with increased mortality. The major cause of renal dysfunction is the toxic effect of calcineurin inhibitors (CNI). Sirolimus, a proliferation signal inhibitor, is an immunosuppressant recently introduced in cardiac transplantation. Its nonnephrotoxic properties make it an attractive immunosuppressive agent for patients with renal dysfunction. In this study, we evaluated the improvement in renal function after switching from CNI to sirolimus among patients with new-onset kidney dysfunction after heart transplantation. Methods. The study included orthotopic cardiac transplant (OHT) patients who required discontinuation of CNI due to worsening renal function (creatinine clearance <50 mL/min). We excluded subjects who had another indication for initiation of sirolimus, that is, rejection, malignancy, or allograft vasculopathy. The patients were followed for 6 months. The creatinine clearance (CrCl) was estimated according to the Cockcroft-Gault equation using the baseline weight and the serum creatinine at the time of introduction of sirolimus and 6 months thereafter. Results. Nine patients were included; 7 (78%) were male, the overall mean age was 60.1 +/- 12.3 years, and the mean time since transplantation was 8.7 +/- 6.1 years. The allograft was beyond 1 year in all patients. There was a significant improvement in serum creatinine (2.98 +/- 0.9 to 1.69 +/- 0.5 mg/dL, P = .01) and CrCl (24.9 +/- 6.5 to 45.7 +/- 17.2 mL/min, P = .005) at 6-month follow-up. Conclusion. The replacement of CNI by sirolimus for immunosuppressive therapy in patients with renal failure after OHT was associated with a significant improvement in renal function after 6 months.
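The Cockcroft-Gault estimate referenced above is a standard formula; a minimal sketch follows. The abstract reports only group means, so the example inputs below are hypothetical, and the 0.85 adjustment for women is part of the standard formula rather than a detail stated in the abstract.

    # Standard Cockcroft-Gault creatinine clearance estimate (mL/min); example
    # inputs are hypothetical, chosen near the group means reported above.
    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
        crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
        return 0.85 * crcl if female else crcl

    # A 60-year-old, 70-kg man at the baseline and 6-month mean creatinine values:
    print(cockcroft_gault(60, 70, 2.98))  # ~26 mL/min
    print(cockcroft_gault(60, 70, 1.69))  # ~46 mL/min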