Abstract:
Previous studies showed that melatonin or dehydroepiandrosterone (DHEA) enhances the immune response against parasitic pathogens. The present study investigated the in vitro activity of melatonin combined with DHEA over a 24-hr period and their effects during the course of in vivo T. cruzi infection. The in vitro activity of melatonin and DHEA, alone and in combination, was tested against trypomastigote forms (concentrations ranging from 0.5 to 128 μM). In vitro, neither melatonin nor DHEA alone had any activity against trypomastigote forms, although the highest concentration of combined melatonin and DHEA was active against the trypomastigote forms of the parasite. At this concentration, however, considerable toxicity to peritoneal macrophages was observed. For in vivo evaluation, male Wistar rats were infected with the Y strain of T. cruzi and were treated orally with 10 mg/kg body weight/day of melatonin and subcutaneously with 40 mg/kg body weight/day of DHEA. Treatment with melatonin, DHEA or their combination produced a significant reduction in the number of blood trypomastigotes during the acute phase of infection compared with untreated animals (P < 0.05). A significant increase in the number of macrophages and in nitric oxide (NO) concentrations was observed during the peak of parasitemia with melatonin alone or combined with DHEA; the highest NO concentration, however, was observed with DHEA alone (P < 0.05). Moreover, DHEA treatment increased TNF-alpha levels during the infection (P < 0.05). These results show that melatonin, DHEA or their combination reduces parasitemia during the acute phase of infection. The combination of the two molecules did not exert a synergistic action on the host's ability to fight infection, and among all treatments DHEA appears to induce the most efficient immune response.
Abstract:
Mast cells (MCs) express Toll-like receptor 2 (TLR2), a receptor known to be triggered by several major mycobacterial ligands and involved in resistance against Mycobacterium tuberculosis (MTB) infection. This study investigated whether adoptive transfer of TLR2-positive (TLR2+/+) MCs corrects the increased susceptibility of TLR2-/- mice to MTB infection. TLR2-/- mice displayed an increased mycobacterial burden, diminished myeloid cell recruitment and proinflammatory cytokine production, and defective granuloma formation. Reconstitution of these mice with TLR2+/+ MCs, but not TLR2-/- MCs, conferred better control of the infection and normalized myeloid cell recruitment, with reestablishment of granuloma formation. In addition, adoptive transfer of TLR2+/+ MCs to TLR2-/- mice regulated the pulmonary levels of IL-1beta, IL-6 and TNF-alpha, enhanced the Th1 response and promoted homing of activated CD8+ T cells to the lungs. Our results suggest that activation of MCs via TLR2 is required to compensate for the defect in protective immunity and the inability of TLR2-/- mice to control MTB infection.
Abstract:
The aim of this study was to investigate the role of interleukin 12 (IL-12) during Strongyloides venezuelensis infection. IL-12-/- and wild-type C57BL/6 mice were subcutaneously infected with 1500 larvae of S. venezuelensis. On days 7, 14, and 21 post-infection, we determined eosinophil and mononuclear cell numbers in the blood and bronchoalveolar lavage fluid (BALF), Th2 cytokine secretion in the lung parenchyma, and serum antibody levels. The numbers of eggs in the feces and of worms in the duodenum were also quantified. Eosinophil and mononuclear cell counts and the concentrations of IL-3, IL-5, IL-10, IL-13, and IgG1 and IgE antibodies increased significantly in infected IL-12-/- and wild-type mice compared with uninfected controls. However, the numbers of eosinophils and mononuclear cells in the blood and BALF and the Th2 cytokine levels in the lungs of infected IL-12-/- mice were greater than in infected wild-type C57BL/6 mice. In addition, serum IgE and IgG1 levels were significantly enhanced in the infected mice lacking IL-12, while parasite burden and fecal egg counts were significantly decreased in infected IL-12-/- mice. Together, our results show that the absence of IL-12 upregulates the Th2 immune response, which is important for the control of S. venezuelensis infection.
Abstract:
We determined the prophylactic effect of both the D-mannose-binding lectin ArtinM, extracted from the seeds of Artocarpus integrifolia (jackfruit), and its recombinant counterpart during the course of experimental paracoccidioidomycosis induced in BALB/c mice. Four prophylaxis protocols were employed to identify the most protective regimen of ArtinM administration. The best effect was obtained by administering two ArtinM doses on days 10 and 3 before challenge with Paracoccidioides brasiliensis. Under this protocol, the lungs of mice that received native or recombinant ArtinM exhibited reduced fungal burden and granuloma incidence, together with augmented levels of IL-12, IFN-gamma, TNF-alpha and NO. In contrast, the control group of untreated infected mice had higher pulmonary levels of IL-4 and IL-10. In conclusion, prophylaxis with ArtinM significantly reproduces the effect of its therapeutic administration, i.e., it confers resistance to P. brasiliensis infection in mouse models by promoting IL-12 production and favouring Th1 immunity.
Abstract:
KM+ is a mannose-binding lectin from Artocarpus integrifolia that induces interleukin (IL)-12 production by macrophages and a protective T helper 1 immune response against Leishmania major infection. In this study, we performed experiments to evaluate the therapeutic activity of jackfruit KM+ (jfKM+) and its recombinant counterpart (rKM+) in experimental paracoccidioidomycosis. To this end, jfKM+ or rKM+ was administered to BALB/c mice 10 days after infection with Paracoccidioides brasiliensis. Thirty days post-infection, lungs from the KM+-treated mice contained significantly fewer colony-forming units and little to no organized granulomas compared with controls. In addition, lung homogenates from the KM+-treated mice presented higher levels of nitric oxide, IL-12, interferon-gamma, and tumor necrosis factor-alpha, whereas higher levels of IL-4 and IL-10 were detected in the control group. Using mice deficient in IL-12, Toll-like receptor (TLR) 2, TLR4, or the TLR adaptor molecule MyD88, we demonstrated that KM+ conferred protection against P. brasiliensis infection through IL-12 production in a TLR2-dependent manner. These results demonstrate a beneficial effect of KM+ on the severity of P. brasiliensis infection and may expand its potential use as a novel immunotherapeutic molecule.
Abstract:
Antibody-mediated rejection (AMR) requires specific diagnostic tools and treatment and is associated with lower graft survival. We prospectively screened for C4d in pancreas (n = 35, in 27 patients) and kidney (n = 33, in 21 patients) for-cause biopsies. Serum amylase and lipase, amylasuria, fasting blood glucose (FBG) and 2-h capillary glucose (CG) were also analysed. We found that 27.3% of kidney biopsies and 43% of pancreas biopsies showed C4d staining (66.7% and 53.3% diffuse in peritubular and interacinar capillaries, respectively). Isolated exocrine dysfunction was the main indication for pancreas biopsy (54.3%), followed by combined exocrine and endocrine dysfunction (37.1%) and isolated endocrine dysfunction (8.6%). Laboratory parameters were comparable between T-cell-mediated rejection and AMR: amylase 151.5 vs. 149 U/l (P = 0.075), lipase 1120 vs. 1288.5 U/l (P = 0.83), amylasuria variation 46.5 vs. 61% (P = 0.97), FBG 69 vs. 97 mg/dl (P = 0.20) and maximum 2-h CG 149.5 vs. 197.5 mg/dl (P = 0.49), respectively. Amylasuria values after treatment correlated with pancreas allograft loss (P = 0.015). These data suggest that C4d staining should be investigated routinely whenever pancreas allograft dysfunction is present, given its high detection rate in cases of rejection.
Abstract:
Background: Candiduria is a hospital-associated infection and a daily problem in the intensive care unit. The treatment of asymptomatic candiduria is not well established, and the use of amphotericin B bladder irrigation (ABBI) is controversial. The aim of this systematic review was to determine the place of this therapy in practice. Methods: The databases searched included MEDLINE, EMBASE, Web of Science, and LILACS (January 1960-June 2007). We included manuscripts with data on the treatment of candiduria using ABBI. The studies were classified as comparative, dose-finding, or non-comparative. Results: Of 213 studies, nine articles (377 patients) met our inclusion criteria. ABBI showed a higher clearance of candiduria 24 hours after the end of therapy than fluconazole (odds ratio (OR) 0.57, 95% confidence interval (CI) 0.32-1.00). Fungal culture 5 days after the end of either therapy showed a similar response (OR 1.51, 95% CI 0.81-2.80). Evaluation of ABBI delivered intermittently or continuously showed early candiduria clearance (24 hours after therapy) of 80% and 82%, respectively (OR 0.87, 95% CI 0.52-1.36). Candiduria clearance at >5 days after therapy showed a superior response with continuous bladder irrigation with amphotericin B (OR 0.52, 95% CI 0.29-0.94). The use of continuous ABBI for more than 5 days gave a better result (88% vs. 78%) than ABBI for less than 5 days, but the difference was not significant (OR 0.55, 95% CI 0.34-1.04). Conclusion: Although the strength of the underlying literature is not sufficient to allow definitive conclusions, ABBI appears to be as effective as fluconazole; however, it does not offer systemic antifungal therapy and should be used only for asymptomatic candiduria.
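For readers unfamiliar with how the pooled effect sizes in this abstract are constructed, the following minimal Python sketch computes an odds ratio and its 95% confidence interval from a 2x2 table using the standard Woolf (log-normal) method. The counts are hypothetical, not figures taken from the review.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf method) for a 2x2 table:
    a = events in group 1, b = non-events in group 1,
    c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 80/100 cleared with ABBI vs 70/100 with fluconazole
print(odds_ratio_ci(80, 20, 70, 30))  # -> (1.71, 0.89, 3.29) approximately
```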
Abstract:
Methods. Data from the Beginning and Ending Supportive Therapy for the Kidney (BEST Kidney) study, a prospective observational study of critically ill patients with severe AKI from 54 ICUs in 23 countries, were analysed. The RIFLE class was determined using observed (o) pre-morbid and estimated (e) baseline SCr values. Agreement was evaluated by correlation coefficients and Bland-Altman plots. Sensitivity analysis by chronic kidney disease (CKD) status was performed. Results. Seventy-six percent of patients (n = 1327) had a pre-morbid baseline SCr, and 1314 had complete data for evaluation; 46% had CKD. The median (IQR) values were 97 μmol/L (79-150) for oSCr and 88 μmol/L (71-97) for eSCr. The oSCr and eSCr determined at ICU admission and at study enrolment showed only a modest correlation (r = 0.49 and r = 0.39, respectively). At ICU admission and study enrolment, eSCr misclassified 18.8% and 11.7% of patients as having AKI compared with oSCr. Excluding CKD patients improved the correlation between oSCr and eSCr at ICU admission and study enrolment (r = 0.90 and r = 0.84), with 6.6% and 4.0% misclassified, respectively. Conclusions. Estimating baseline SCr by the MDRD equation when pre-morbid SCr is unavailable appears to perform reasonably well for determining RIFLE categories, but only when pre-morbid GFR is near normal. In patients with suspected CKD, using the MDRD equation to estimate baseline SCr overestimates the incidence of AKI and probably should not be used. Improved methods to estimate baseline SCr are needed.
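As background to the estimation step above: when pre-morbid SCr is unavailable, a common approach (suggested by the ADQI group) is to back-solve the MDRD equation for SCr under an assumed near-normal GFR of 75 mL/min/1.73 m². A minimal sketch of that calculation, assuming the 186-coefficient four-variable form of MDRD and SCr in mg/dL (multiply by 88.4 for the μmol/L units reported in the study):

```python
def estimate_baseline_scr(age, female=False, black=False, assumed_gfr=75.0):
    """Back-calculate baseline serum creatinine (mg/dL) from the 4-variable
    MDRD equation, assuming a near-normal GFR:
    eGFR = 186 * SCr^-1.154 * age^-0.203 * 0.742 (if female) * 1.212 (if black).
    Solving for SCr gives SCr = (k / eGFR)^(1/1.154), with k the non-SCr terms."""
    k = 186.0 * age ** -0.203
    if female:
        k *= 0.742
    if black:
        k *= 1.212
    return (k / assumed_gfr) ** (1.0 / 1.154)

# e.g. a 60-year-old white male: roughly 1.07 mg/dL (~95 umol/L)
print(round(estimate_baseline_scr(60), 2))
```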
Abstract:
Background and objectives: Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD. Design, setting, participants, & measurements: Seventy-two nondialyzed CKD patients (age 52 +/- 11.7 years, 70% male, 42% diabetic, creatinine clearance 40.4 +/- 18.2 ml/min per 1.73 m²) were studied. VBD and CAC were quantified by computed tomography. Results: CAC > 10 Agatston units (AU) was observed in 50% of the patients (median 120 AU [interquartile range 32 to 584 AU]), and a calcification score >= 400 AU was found in 19% (736 [527 to 1012] AU). VBD (190 +/- 52 Hounsfield units) correlated inversely with age (r = -0.41, P < 0.001) and calcium score (r = -0.31, P = 0.01); no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, or diabetes. Patients in the lowest tertile of VBD had a markedly higher calcium score than the middle and highest tertile groups. In multiple logistic regression analysis adjusted for confounding variables, low VBD was independently associated with the presence of CAC. Conclusions: Low VBD was associated with CAC in nondialyzed CKD patients, suggesting that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD. Clin J Am Soc Nephrol 6: 1456-1462, 2011. doi: 10.2215/CJN.10061110
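As an illustration of the tertile analysis mentioned above, the short sketch below splits a VBD sample into tertiles; the lowest tertile would then be compared against the others for calcium score. The VBD values are simulated (the patient-level data are not given in the abstract; only the mean of 190 +/- 52 Hounsfield units is).

```python
import numpy as np

# Simulated VBD values (Hounsfield units) for 72 patients,
# drawn to match the reported mean 190 +/- 52.
vbd = np.random.default_rng(0).normal(190, 52, 72)

# Split into tertiles, as done for the VBD/CAC comparison in the study.
t1, t2 = np.percentile(vbd, [100 / 3, 200 / 3])
tertile = np.where(vbd <= t1, 1, np.where(vbd <= t2, 2, 3))
print(np.bincount(tertile)[1:])  # roughly 24 patients per tertile
```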
Abstract:
Purpose: The aim of this study was to evaluate the relationship between the timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries, enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by the median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission as early (<2 days), delayed (2-5 days), or late (>5 days). RRT timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea > 24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine > 309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT, a longer hospital stay and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival; however, this largely depended on its definition. Late RRT (by days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence.
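To make the three stratification schemes above concrete, here is a schematic sketch. The cut-offs (24.2 mmol/L urea, 309 μmol/L creatinine) are the cohort medians reported in the abstract, not general-purpose thresholds.

```python
def classify_rrt_timing(urea_mmol_l, creat_umol_l, days_from_icu_admission):
    """Reproduce, schematically, the study's three stratifications of RRT timing:
    by median urea, by median creatinine, and temporally from ICU admission."""
    by_urea = "early" if urea_mmol_l <= 24.2 else "late"
    by_creatinine = "early" if creat_umol_l <= 309 else "late"
    if days_from_icu_admission < 2:
        temporal = "early"
    elif days_from_icu_admission <= 5:
        temporal = "delayed"
    else:
        temporal = "late"
    return by_urea, by_creatinine, temporal

print(classify_rrt_timing(30.0, 250, 3))  # -> ('late', 'early', 'delayed')
```

Note how the same patient can be "late" by one definition and "early" by another, which is exactly why the abstract's conclusions depend on the definition used.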
Abstract:
Eight hundred and seventy-nine patients with acute kidney injury (AKI) were studied retrospectively over a period of one year and eleven months to evaluate urine volume as a risk factor for death. They were divided into five groups according to 24-h urine volume (UV): anuric (UV <= 50 mL/24 h, group 1), oliguric (UV > 50 mL/24 h and < 400 mL/24 h, group 2), and non-oliguric (UV >= 400 mL/24 h). The non-oliguric group was subdivided into three subgroups: UV > 400 mL/24 h and <= 1000 mL/24 h (group 3, reference group), UV > 1000 mL/24 h and <= 2000 mL/24 h (group 4), and UV > 2000 mL/24 h (group 5). A linear tendency test (Mantel extension) showed a significant increase in mortality with decreasing UV (p < 0.001), confirmed by multivariate analysis. Anuric and oliguric patients had increased risks of death of 95% and 76%, respectively, compared with the reference group (p < 0.05). Patients in groups 4 and 5 presented reduced risks of death of 50% and 70%, respectively (p = 0.004 and p = 0.001). In conclusion, urine volume was a strong independent factor for mortality in this cohort of AKI patients.
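The grouping rule above translates directly into code; a minimal sketch follows. Boundary handling follows the abstract's definitions, which place a UV of exactly 400 mL/24 h in the non-oliguric reference group.

```python
def urine_volume_group(uv_ml_24h):
    """Assign the five 24-h urine volume groups used in the study.
    Group 3 (>400-1000 mL/24 h) was the reference group."""
    if uv_ml_24h <= 50:
        return 1  # anuric
    if uv_ml_24h < 400:
        return 2  # oliguric
    if uv_ml_24h <= 1000:
        return 3  # non-oliguric, reference group
    if uv_ml_24h <= 2000:
        return 4  # non-oliguric, 1000-2000 mL
    return 5      # non-oliguric, >2000 mL

for uv in (30, 200, 800, 1500, 2500):
    print(uv, "->", urine_volume_group(uv))
```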
Abstract:
Background. The pathogenesis of the hyponatraemia caused by fluoxetine (Fx) use in the treatment of depression is not well understood. It has been attributed to SIADH, although an enhanced plasma ADH level has not been demonstrated in all the cases reported in humans. This experiment aimed to investigate the effect of fluoxetine on the kidney and, more specifically, on the inner medullary collecting duct (IMCD). Methods. (1) In vivo study: (a) 10 rats were injected i.p. daily with 10 mg/kg doses of fluoxetine; after 10 days, the rats were sacrificed and blood and kidneys were collected. (b) Immunoblotting studies of AQP2 protein expression were performed in IMCDs from the injected rats and in IMCD tubule suspensions from 10 normal rats incubated with 10^-7 M fluoxetine. (2) In vitro microperfusion study: the osmotic water permeability (Pf, μm/s) was determined in IMCDs from normal rats (n = 6), isolated and perfused by standard methods. Results. In vivo study: (a) Rats injected with fluoxetine lost about 12% of body weight; the plasma Na+ level decreased from 139.3 +/- 0.78 mEq/l to 134.9 +/- 0.5 mEq/l (p < 0.01), while plasma K+ and ADH levels remained unchanged. (b) Densitometric analysis of the immunoblotting assays showed an increase in AQP2 protein abundance of about 40%, both in IMCDs from injected rats (control 99.6 +/- 5.2 versus Fx 145.6 +/- 16.9, p < 0.05) and in tubule suspensions incubated with fluoxetine (control 100.0 +/- 3.5 versus 143.0 +/- 2.0, p < 0.01). In the in vitro microperfusion study, fluoxetine increased Pf in the IMCD in the absence of ADH, from 7.24 +/- 2.07 (control) to 15.77 +/- 3.25 (Fx) (p < 0.01). Conclusion. After fluoxetine use, weight and plasma Na+ level decreased, plasma K+ and ADH levels remained unchanged, and AQP2 protein abundance and water absorption in the IMCD increased, leading us to conclude that a direct effect of fluoxetine on the IMCD could explain, at least in part, the hyponatraemia found after use of this drug in humans.
Abstract:
Recently, even mild AKI has been considered a risk factor for mortality in different scenarios. We conducted a retrospective analysis of the risk factors for two distinct definitions of AKI after elective repair of aortic aneurysms. Logistic regression was carried out to identify independent risk factors for AKI (defined as a >= 25% or >= 50% increase in baseline SCr within 48 h after surgery, AKI 25% and AKI 50%, respectively) and for mortality. Of the 77 patients studied (mean age 68 +/- 10 years, 83% male), 57% developed AKI 25% and 33.7% AKI 50%. There were no differences between the AKI and control groups regarding comorbidities and aneurysm diameter; however, AKI patients more frequently needed supra-renal aortic cross-clamping and were more severely ill. Overall in-hospital mortality was 27.3% and was markedly higher in those requiring supra-renal aortic cross-clamping. The risk factors for AKI 25% were supra-renal aortic cross-clamping (odds ratio [OR] 5.51, 95% CI 1.05-36.12, p = 0.04) and duration of operation (OR 6.67, 95% CI 2.23-19.9, p < 0.001). For AKI 50%, in addition to those factors, post-operative use of vasoactive drugs remained an independent factor (OR 6.13, 95% CI 1.64-22.8, p = 0.005). The risk factors associated with mortality were the need for supra-renal aortic cross-clamping (OR 9.6, 95% CI 1.37-67.88, p = 0.02), development of AKI 50% (OR 8.84, 95% CI 1.31-59.39, p = 0.02), baseline GFR lower than 49 mL/min (OR 17.07, 95% CI 2.00-145.23, p = 0.009), and serum glucose > 118 mg/dL in the post-operative period (OR 19.99, 95% CI 2.32-172.28, p = 0.006). An increase of at least 50% in baseline SCr is a common event after surgical repair of aortic aneurysms, particularly when supra-renal aortic cross-clamping is needed. Along with baseline moderate chronic renal failure, AKI is an independent factor contributing to the high mortality found in this setting.
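The two AKI definitions used above reduce to a simple relative-change test on serum creatinine; a minimal sketch:

```python
def aki_flags(baseline_scr, postop_scr):
    """Flag the two AKI definitions used in the study: a >=25% (AKI 25%)
    or >=50% (AKI 50%) rise over baseline serum creatinine within 48 h
    of surgery. Units cancel out, so mg/dL and umol/L both work."""
    rise = (postop_scr - baseline_scr) / baseline_scr
    return {"AKI_25": rise >= 0.25, "AKI_50": rise >= 0.50}

print(aki_flags(1.0, 1.6))  # -> {'AKI_25': True, 'AKI_50': True}
print(aki_flags(1.0, 1.3))  # -> {'AKI_25': True, 'AKI_50': False}
```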
Abstract:
Acute kidney injury (AKI) is now well recognized as an independent risk factor for increased morbidity and mortality, particularly when dialysis is needed. Although renal replacement therapy (RRT) has been used in AKI for more than five decades, there is no standard methodology to predict which AKI patients will need dialysis and which will recover renal function without it. The lack of consensus on what parameters should guide the decision to start dialysis has led to wide variation in dialysis utilization. A contributing factor is the lack of studies in the modern era evaluating the relationship between the timing of dialysis initiation and outcomes. Although listed as one of the top priorities in AKI research, the timing of dialysis initiation has not been included as a factor in large randomized controlled trials in this area. In this review we discuss the criteria that have been used to define early vs. late initiation in previous studies of dialysis initiation. In addition, we propose a patient-centered approach to defining early and late initiation that could serve as a framework for managing patients and for future studies in this area.