958 results for End-stage renal failure
Abstract:
Erythropoietin (EPO) has recently been shown to exert important cytoprotective and anti-apoptotic effects in experimental brain injury and cisplatin-induced nephrotoxicity. The aim of the present study was to determine whether EPO administration is also renoprotective in both in vitro and in vivo models of ischaemic acute renal failure. Methods. Primary cultures of human proximal tubule cells (PTCs) were exposed to either vehicle or EPO (6.25–400 IU/ml) in the presence of hypoxia (1% O2), normoxia (21% O2) or hypoxia followed by normoxia for up to 24 h. The end-points evaluated included cell apoptosis (morphology and in situ end labelling [ISEL]), viability (lactate dehydrogenase [LDH] release), cell proliferation (proliferating cell nuclear antigen [PCNA]) and DNA synthesis (thymidine incorporation). The effects of EPO pre-treatment (5000 U/kg) on renal morphology and function were also studied in rat models of unilateral and bilateral ischaemia–reperfusion (IR) injury. Results. In the in vitro model, hypoxia (1% O2) induced a significant degree of PTC apoptosis, which was substantially reduced by co-incubation with EPO at 24 h (vehicle 2.5±0.5% vs 25 IU/ml EPO 1.8±0.4% vs 200 IU/ml EPO 0.9±0.2%, n = 9, P
Abstract:
Administration of human recombinant erythropoietin (EPO) at the time of acute ischemic renal injury (IRI) inhibits apoptosis, enhances tubular epithelial regeneration, and promotes renal functional recovery. The present study aimed to determine whether darbepoetin-alfa (DPO) exhibits renoprotection comparable to that afforded by EPO, whether pro- or anti-apoptotic Bcl-2 proteins are involved, and whether delayed administration of EPO or DPO 6 h following IRI ameliorates renal dysfunction. The model of IRI involved bilateral renal artery occlusion for 45 min in rats (N = 4 per group), followed by reperfusion for 1-7 days. Controls were sham-operated. Rats were treated at the time of ischemia or sham operation (T0), or post-treated (6 h after the onset of reperfusion, T6) with EPO (5000 IU/kg), DPO (25 μg/kg), or the appropriate vehicle by intraperitoneal injection. Renal function, structure, and immunohistochemistry for Bcl-2, Bcl-XL, and Bax were analyzed. DPO or EPO at T0 significantly abrogated renal dysfunction in IRI animals (serum creatinine for IRI 0.17 ± 0.05 mmol/l vs DPO-IRI 0.08 ± 0.03 mmol/l vs EPO-IRI 0.04 ± 0.01 mmol/l, P = 0.01). Delayed administration of DPO or EPO (T6) also significantly abrogated subsequent renal dysfunction (serum creatinine for IRI 0.17 ± 0.05 mmol/l vs DPO-IRI 0.06 ± 0.01 mmol/l vs EPO-IRI 0.03 ± 0.03 mmol/l, P = 0.01). There was also significantly decreased tissue injury (apoptosis, P < 0.05), decreased proapoptotic Bax, and increased regenerative capacity, especially in the outer stripe of the outer medulla, with DPO or EPO at T0 or T6. These results reaffirm the potential clinical application of DPO and EPO as novel renoprotective agents for patients at risk of ischemic acute renal failure or after having sustained an ischemic renal insult.
Abstract:
Aborigines in remote areas of Australia have much higher rates of renal disease, as well as hypertension and cardiovascular disease, than non-Aboriginal Australians. We compared kidney findings in Aboriginal and non-Aboriginal people in one remote region. Glomerular number and mean glomerular volume were estimated with the disector/fractionator combination in the right kidney of 19 Aborigines and 24 non-Aboriginal people undergoing forensic autopsy for sudden or unexpected death in the Top End of the Northern Territory. Aborigines had 30% fewer glomeruli than non-Aborigines (202,000 fewer glomeruli per kidney, or an estimated 404,000 fewer per person; P=0.036). Their mean glomerular volume was 27% larger (P=0.016). Glomerular number was significantly correlated with adult height, suggesting a relationship with birthweight, which, on average, is much lower in Aboriginal than non-Aboriginal people. Aboriginal people with a history of hypertension had 30% fewer glomeruli than those without (250,000 fewer per kidney, P=0.03, or 500,000 fewer per person), and their mean glomerular volume was about 25% larger. The lower nephron number in Aboriginal people is compatible with their susceptibility to renal failure. The additional nephron deficit associated with hypertension is compatible with other reports. Lower nephron numbers are probably due in part to reduced nephron endowment, which is related to a suboptimal intrauterine environment. Compensatory glomerular hypertrophy in people with fewer nephrons, while minimizing loss of total filtering surface area, might be exacerbating nephron loss. Optimization of fetal growth should ultimately reduce the florid epidemic of renal disease, hypertension, and cardiovascular disease.
Abstract:
The PAlliative Care in chronic Kidney diSease study (PACKS study) is examining quality of life, decision making and decisional conflict, costs, and mortality in patients with advanced chronic kidney disease who have opted for palliative care. It is also exploring the impact of the decision on the quality of life of carers. The study includes adult patients with end-stage (stage 5) chronic kidney disease who have opted for palliative care, adult carers of these patients, and renal physicians/clinical nurse specialists who have experience of treating patients with end-stage chronic kidney disease who have opted for palliative care.
Early initial findings relate to clinician perspectives on patient decisional conflict in making complex decisions between dialysis and conservative management. Interviews were conducted with nephrologists and clinical nurse specialists across 10 renal centres in the UK. Themes with associated subthemes include "Frequent changing of mind regarding treatment options," "A paternalistic approach to decision-making," and "Intricacy of the decision." These findings will be presented and recommendations made for future research and education. Clinicians need to take a more patient-centred approach to decision-making. Interventions aimed at increasing understanding of renal disease and its treatments may reduce decisional conflict and raise decisional quality, but they require testing in the renal specialty.
Abstract:
BACKGROUND: The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. There are few reports studying the correlation between MELD and long-term posttransplantation survival. AIM: To assess the value of pretransplant MELD in the prediction of posttransplant survival. METHODS: Adult patients (age >18 years) who underwent liver transplantation were examined in a retrospective longitudinal cohort drawn from a prospective database. We excluded acute liver failure, retransplantation, and reduced or split livers. Liver donors were evaluated according to: age, sex, weight, creatinine, bilirubin, sodium, aspartate aminotransferase, personal antecedents, cause of brain death, steatosis, expanded-criteria donor number, and donor risk index. The recipients' data were: sex, age, weight, chronic hepatic disease, Child-Turcotte-Pugh points, pretransplant and initial MELD score, pretransplant creatinine clearance, sodium, cold and warm ischemia times, hospital length of stay, blood requirements, and alanine aminotransferase (ALT >1,000 IU/L = liver dysfunction). The Kaplan-Meier method with the log-rank test was used for univariable analyses of posttransplant patient survival. For the multivariable analyses, the Cox proportional hazards regression method with the stepwise procedure was used, stratifying by sodium and MELD. ROC curves were used to define the area under the curve for MELD and Child-Turcotte-Pugh. RESULTS: A total of 232 patients with 10 years of follow-up were available. The MELD cutoff was 20 and the Child-Turcotte-Pugh cutoff was 11.5. For MELD score > 20, the risk factors for death were: red cell requirements, liver dysfunction, and donor's sodium. For the patients with hyponatremia, the risk factors were: negative delta-MELD score, red cell requirements, liver dysfunction, and donor's sodium.
Univariate regression analyses identified the following risk factors for death: MELD score > 25, blood requirements, recipient pretransplant creatinine clearance, and donor age > 50. After stepwise analyses, only red cell requirement was predictive. Patients with a MELD score < 25 had a 68.86%, 50.44% and 41.50% chance of 1-, 5- and 10-year survival, and those > 25 had 39.13%, 29.81% and 22.36%, respectively. Patients without hyponatremia had 65.16%, 50.28% and 41.98%, and those with hyponatremia 44.44%, 34.28% and 28.57%, respectively. Patients with a donor risk index > 1.7 showed 53.7%, 27.71% and 13.85%, and a donor risk index < 1.7 showed 63.62%, 51.4% and 44.08%, respectively. Donor age > 50 years showed 38.4%, 26.21% and 13.1%, and donor age < 50 years showed 65.58%, 26.21% and 13.1%. Association with delta-MELD score did not show any significant difference. Expanded-criteria donors were associated with primary non-function and severe liver dysfunction. Predictive factors for death were blood requirements, hyponatremia, liver dysfunction, and donor's sodium. CONCLUSION: MELD over 25, recipient's hyponatremia, blood requirements, and donor's sodium were associated with poor survival.
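The MELD cutoffs of 20 and 25 above are used without stating the score's definition. As a point of reference only, the standard (sodium-free) MELD formula used by UNOS before the MELD-Na revision can be sketched as follows; the function name and the rounding convention are illustrative and not taken from this study:

```python
import math

def meld_score(bilirubin_mg_dl, inr, creatinine_mg_dl, on_dialysis=False):
    """Standard (pre-MELD-Na) MELD score sketch.

    Laboratory values below 1.0 are floored at 1.0; creatinine is capped
    at 4.0 mg/dL (and set to 4.0 for patients on dialysis). UNOS also
    caps the final score at 40, omitted here for brevity.
    """
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = 4.0 if on_dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    raw = (3.78 * math.log(bili)
           + 11.2 * math.log(inr)
           + 9.57 * math.log(crea)
           + 6.43)
    return round(raw)

# A patient with all labs at 1.0 scores the minimum of 6; for example,
# bilirubin 3.0 mg/dL, INR 1.5 and creatinine 2.0 mg/dL give 22,
# above the study's cutoff of 20.
```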
Abstract:
Background: Chronic Chagas disease cardiomyopathy (CCC) is an inflammatory dilated cardiomyopathy with a worse prognosis than other cardiomyopathies. CCC occurs in 30% of individuals infected with Trypanosoma cruzi, which is endemic in Latin America. Heart failure is associated with impaired energy metabolism, which may be correlated with contractile dysfunction. We thus analyzed the myocardial gene and protein expression, as well as activity, of key mitochondrial enzymes related to ATP production in myocardial samples of end-stage CCC, idiopathic dilated (IDC) and ischemic (IC) cardiomyopathies. Methodology/Principal Findings: Myocardium homogenates from CCC (N = 5), IC (N = 5) and IDC (N = 5) patients, as well as from heart donors (N = 5), were analyzed for protein and mRNA expression of mitochondrial creatine kinase (CKMit) and muscular creatine kinase (CKM) and ATP synthase subunits alpha and beta by immunoblotting and real-time RT-PCR. Total myocardial CK activity was also assessed. Protein levels of CKM and CK activity were reduced in all three cardiomyopathy groups. However, total CK activity, as well as ATP synthase alpha chain protein levels, were significantly lower in CCC samples than in IC and IDC samples. CCC myocardium displayed selective reduction of protein levels and activity of enzymes crucial for maintaining cytoplasmic ATP levels. Conclusions/Significance: The selective impairment of the CK system may be associated with the loss of inotropic reserve observed in CCC. Reduction of ATP synthase alpha levels is consistent with a decrease in myocardial ATP generation through oxidative phosphorylation. Together, these results suggest that the energetic deficit is more intense in the myocardium of CCC patients than in the other dilated cardiomyopathies tested.
Abstract:
Objectives: To describe current practice for the discontinuation of continuous renal replacement therapy in a multinational setting and to identify variables associated with successful discontinuation. The approach used to discontinue continuous renal replacement therapy may affect patient outcomes, yet there is a lack of information on how and under what conditions it is discontinued. Design: Post hoc analysis of a prospective observational study. Setting: Fifty-four intensive care units in 23 countries. Patients: Five hundred twenty-nine patients (52.6%) who survived initial therapy among 1006 patients treated with continuous renal replacement therapy. Interventions: None. Measurements and Main Results: Three hundred thirteen patients were removed successfully from continuous renal replacement therapy and did not require any renal replacement therapy for at least 7 days; they were classified as the "success" group, and the rest (216 patients) as the "repeat-RRT" (renal replacement therapy) group. Patients in the "success" group had lower hospital mortality (28.5% vs. 42.7%, p < .0001) than patients in the "repeat-RRT" group. They also had lower creatinine and urea concentrations and a higher urine output at the time of stopping continuous renal replacement therapy. Multivariate logistic regression analysis for successful discontinuation of continuous renal replacement therapy identified urine output (during the 24 hrs before stopping continuous renal replacement therapy: odds ratio, 1.078 per 100 mL/day increase) and creatinine (odds ratio, 0.996 per μmol/L increase) as significant predictors of successful cessation. The area under the receiver operating characteristic curve to predict successful discontinuation of continuous renal replacement therapy was 0.808 for urine output and 0.635 for creatinine.
The predictive ability of urine output was negatively affected by the use of diuretics (area under the receiver operating characteristic curve, 0.671 with diuretics and 0.845 without diuretics). Conclusions: We report on the current practice of discontinuing continuous renal replacement therapy in a multinational setting. Urine output at the time of initial cessation of continuous renal replacement therapy was the most important predictor of successful discontinuation, especially when achieved without the administration of diuretics. (Crit Care Med 2009; 37:2576-2582)
Abstract:
Purpose: The aim of this study is to evaluate the relationship between timing of renal replacement therapy (RRT) in severe acute kidney injury and clinical outcomes. Methods: This was a prospective multicenter observational study conducted at 54 intensive care units (ICUs) in 23 countries, enrolling 1238 patients. Results: Timing of RRT was stratified into "early" and "late" by median urea and creatinine at the time RRT was started. Timing was also categorized temporally from ICU admission into early (<2 days), delayed (2-5 days), and late (>5 days). RRT timing by serum urea showed no significant difference in crude (63.4% for urea <= 24.2 mmol/L vs 61.4% for urea > 24.2 mmol/L; odds ratio [OR], 0.92; 95% confidence interval [CI], 0.73-1.15; P = .48) or covariate-adjusted mortality (OR, 1.25; 95% CI, 0.91-1.70; P = .16). When stratified by creatinine, late RRT was associated with lower crude (53.4% for creatinine > 309 μmol/L vs 71.4% for creatinine <= 309 μmol/L; OR, 0.46; 95% CI, 0.36-0.58; P < .0001) and covariate-adjusted mortality (OR, 0.51; 95% CI, 0.37-0.69; P < .001). However, for timing relative to ICU admission, late RRT was associated with greater crude (72.8% vs 62.3% vs 59%, P < .001) and covariate-adjusted mortality (OR, 1.95; 95% CI, 1.30-2.92; P = .001). Overall, late RRT was associated with a longer duration of RRT and stay in hospital and greater dialysis dependence. Conclusion: Timing of RRT, a potentially modifiable factor, might exert an important influence on patient survival. However, this largely depended on its definition. Late RRT (days from admission) was associated with a longer duration of RRT, longer hospital stay, and higher dialysis dependence. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
There are few studies on the relationship between the morphology of acute tubular necrosis (ATN) in native kidneys and late functional recovery. Eighteen patients with acute renal failure (ARF) who had undergone renal biopsy were studied. All had a histological diagnosis of ATN and were followed for at least six months. Clinical characteristics of ARF were analyzed, and histological features were semi-quantitatively evaluated (tubular atrophy, interstitial inflammatory infiltrate, interstitial fibrosis, and ATN). According to the maximal GFR achieved during follow-up, patients were divided into two groups: complete recovery (GFR >= 90 mL/min/1.73 m²) and partial recovery (GFR < 90 mL/min/1.73 m²). Only 39% of the patients achieved complete recovery. Patients with partial recovery achieved their maximal GFR (63 ± 9 mL/min/1.73 m²) 37 ± 14 months after ARF, a period similar to that of patients with complete recovery (54 ± 22 months). Patients with partial recovery had more severe ARF: oliguria was more frequent (90 versus 17%, p < 0.01), and they had a higher peak creatinine (13.85 ± 1.12 versus 8.95 ± 1.30 mg/dL, p = 0.01) and longer hospitalization (45 ± 7 versus 20 ± 4 days, p = 0.03). No single histological parameter was associated with partial recovery, but the sum of all was when expressed as an injury index [4.00 (2.73-5.45) versus 2.00 (1.25-3.31), p < 0.05]. In conclusion, among patients with an atypical ATN course, those with more severe ARF and tubulointerstitial lesions are more prone to partial recovery.
Abstract:
Background. Renal failure is the most important comorbidity in patients with heart transplantation and is associated with increased mortality. The major cause of renal dysfunction is the toxic effect of calcineurin inhibitors (CNIs). Sirolimus, a proliferation signal inhibitor, is an immunosuppressant recently introduced in cardiac transplantation. Its non-nephrotoxic properties make it an attractive immunosuppressive agent for patients with renal dysfunction. In this study, we evaluated the improvement in renal function after switching from a CNI to sirolimus among patients with new-onset kidney dysfunction after heart transplantation. Methods. The study included orthotopic cardiac transplant (OHT) patients who required discontinuation of CNI due to worsening renal function (creatinine clearance <50 mL/min). We excluded subjects who had another indication for initiation of sirolimus, that is, rejection, malignancy, or allograft vasculopathy. The patients were followed for 6 months. Creatinine clearance (CrCl) was estimated according to the Cockcroft-Gault equation using the baseline weight and the serum creatinine at the time of introduction of sirolimus and 6 months thereafter. Nine patients were included; 7 (78%) were male, the overall mean age was 60.1 ± 12.3 years, and the mean time since transplantation was 8.7 ± 6.1 years. The allograft was beyond 1 year in all patients. There was a significant improvement in serum creatinine (2.98 ± 0.9 to 1.69 ± 0.5 mg/dL, P = .01) and CrCl (24.9 ± 6.5 to 45.7 ± 17.2 mL/min, P = .005) at 6 months of follow-up. Conclusion. The replacement of CNI by sirolimus as immunosuppressive therapy for patients with renal failure after OHT was associated with a significant improvement in renal function after 6 months.
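The Cockcroft-Gault equation cited above can be sketched as follows; the function name is illustrative, and the formula is the standard one rather than anything specific to this study:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by Cockcroft-Gault.

    CrCl = (140 - age) * weight / (72 * SCr), multiplied by 0.85 for women.
    """
    crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

# For instance, a hypothetical 60-year-old, 72 kg man with a serum
# creatinine of 2.98 mg/dL has an estimated clearance of about 27 mL/min,
# in the range of the baseline values reported above.
```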
Abstract:
Most patients with chronic kidney disease experience abnormalities in serum calcium, phosphorus, parathyroid hormone, and vitamin D metabolism. These can lead to vascular calcification (VC), which has been associated with increased risk for cardiovascular disease and mortality. Although hyperphosphatemia is believed to be a risk factor for mortality and VC, no randomized trial was ever designed to demonstrate that lowering phosphate reduces mortality. Nonetheless, binders have been used extensively, and the preponderance of evidence shows that sevelamer slows the development of VC whereas calcium salts do not. Four studies have demonstrated a slower progression of VC with sevelamer than with calcium-containing binders, although a fifth study showed nonsuperiority. Conversely, the results on mortality with sevelamer have been variable, and data on calcium-based binders are nonexistent. Improved survival with sevelamer was demonstrated in a small randomized clinical trial, whereas a larger randomized trial failed to show a benefit. In addition, preclinical models of renal failure and preliminary clinical data on hemodialysis patients suggest a potential benefit for bone with sevelamer. Meanwhile, several randomized and observational studies suggested no improvement in bone density and fracture rate, and a few noted an increase in total and cardiovascular mortality in the general population given calcium supplements. Although additional studies are needed, there are at least indications that sevelamer may improve vascular and bone health and, perhaps, mortality in hemodialysis patients, whereas data on calcium-based binders are lacking. Clin J Am Soc Nephrol 5: S31-S40, 2010. doi: 10.2215/CJN.05880809
Abstract:
Thanks to the technological development of peritoneal dialysis (PD) during the last three decades, the most important problem nowadays for nephrologists is maintaining the long-term function of the peritoneal membrane. Although PD may confer an early survival benefit compared with hemodialysis (HD), long-term PD is often associated with histopathological alterations in the peritoneal membrane that are linked to peritoneal ultrafiltration deficit and increased mortality risk. These alterations are closely related to the presence of a chronically activated (local and systemic) inflammatory response. Other factors associated with PD itself, such as the bioincompatibility of dialysis solutions, fluid overload, and changes in body composition, may further modulate the inflammatory response. Understanding the pathophysiology of inflammation in PD is essential for adopting adequate strategies to improve both membrane and patient survival. Copyright (C) 2009 S. Karger AG, Basel
Abstract:
Renal ischemia/reperfusion (I/R) injury is one of the frequent causes of acute renal failure (ARF), owing to a complex, interrelated sequence of events that results in damage to, and death of, kidney cells. Cells of the proximal tubular epithelium are especially susceptible to I/R injury, leading to acute tubular necrosis, which plays a pivotal role in the pathogenesis of ARF. Several models have been described for assessing the resulting morphological changes, including those of Jablonski et al. and Goujon et al. We compared the 2 models for histopathological evaluation of 30- or 120-minute periods of renal ischemia followed by 24-hour reperfusion in rats. Several changes were observed after application of the 2 models: proximal tubular cell necrosis, loss of brush border, vacuolization, denudation of the tubular basement membrane as a consequence of flattening of basal cells, and the presence of intratubular exfoliated cells in the lumen of proximal convoluted tubules at various stages of degeneration (karyorrhexis, karyopyknosis and karyolysis). Evaluating tubular lesions after the 2 periods of experimental ischemia with light microscopy allowed us to conclude that the Goujon classification better characterized the main changes in cortical renal tubules after ischemia.
Abstract:
Objective: To evaluate the determinants of total plasma homocysteine levels and their relationships with nutritional parameters, inflammatory status, and traditional risk factors for cardiovascular disease in renal failure patients on dialysis treatment. Design: The study was conducted on 70 clinically stable patients, 50 of them on hemodialysis (70% men; 55.3 ± 14.5 years) and 20 on peritoneal dialysis (50% men; 62 ± 13.7 years). Patients were analyzed in terms of biochemical parameters (serum lipids, creatinine, homocysteine [Hcy], creatine kinase [CK], folic acid, and vitamin B12), anthropometric data, markers of inflammatory status (tumor necrosis factor-alpha, C-reactive protein, interleukin-6), and an adapted subjective global assessment. Results: The total prevalence of hyperhomocysteinemia (>15 μmol/L) was 85.7%. Plasma folic acid and plasma vitamin B12 were within the normal range. Multiple regression analysis (r² = 0.20) revealed that the determinants of total Hcy were type of dialysis, creatinine, CK, folic acid, and total cholesterol. Hcy was positively correlated with albumin and creatinine and negatively correlated with total cholesterol, high-density lipoprotein cholesterol, folic acid, and vitamin B12. Conclusions: The determinants of total Hcy in the study sample were type of dialysis, creatinine, CK, folic acid, and total cholesterol. Evidently, the small sample size might have had an effect on the statistical analyses, and further studies are needed. However, Hcy in patients on dialysis treatment may not have the same effect as observed in the general population. In this respect, the association between malnutrition and inflammation may be a confounding factor in the determination of the true relationship between Hcy, nutritional status, and cardiovascular risk factors in this group. (C) 2011 by the National Kidney Foundation, Inc. All rights reserved.
Abstract:
Background. Cisplatin (CP)-induced renal damage is associated with inflammation. Hydrogen sulphide (H2S) is involved in models of inflammation. This study evaluates the effect of DL-propargylglycine (PAG), an inhibitor of endogenous H2S formation, on the renal damage induced by CP. Methods. The rats were injected with CP (5 mg/kg, i.p.) or PAG (5 mg/kg twice a day, i.p.) for 4 days, starting 1 h before CP injection. Control rats were injected with 0.15 M NaCl or PAG only. Blood and urine samples were collected 5 days after saline or CP injections for renal function evaluation. The kidneys were removed for tumour necrosis factor (TNF)-alpha quantification and for histological, immunohistochemical and Western blot analysis. Cystathionine gamma-lyase (CSE) activity and expression were assessed. The direct toxicity of H2S in renal tubular cells was evaluated by incubating these cells with NaHS, an H2S donor. Results. CP-treated rats presented increases in plasma creatinine levels and in sodium and potassium fractional excretions, associated with tubulointerstitial lesions in the outer medulla. Increased expression of TNF-alpha, macrophages, neutrophils and T lymphocytes, associated with an increased H2S formation rate and CSE expression, was also observed in the outer medulla of CP-injected rats. All these alterations were reduced by treatment with PAG. Direct toxicity of NaHS for renal tubular epithelial cells was not observed. Conclusions. Treatment with PAG reduces the renal damage induced by CP. This effect seems to be related to H2S formation and the restriction of inflammation in the kidneys of PAG+CP-treated rats.
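The sodium and potassium fractional excretions reported above follow the standard formula, which normalizes the solute's urine-to-plasma ratio by that of creatinine as a filtration marker; a minimal sketch (function and argument names illustrative, not from the study):

```python
def fractional_excretion(urine_solute, plasma_solute,
                         urine_creatinine, plasma_creatinine):
    """Fractional excretion (%) of a solute, e.g. FENa.

    FE_x = 100 * (U_x * P_Cr) / (P_x * U_Cr): the percentage of the
    filtered solute load that appears in the urine. Units cancel, so any
    consistent concentration units work for each solute/creatinine pair.
    """
    return 100.0 * (urine_solute * plasma_creatinine) / (plasma_solute * urine_creatinine)

# e.g. urine Na 40 mmol/L, plasma Na 140 mmol/L, urine creatinine
# 100 mg/dL, plasma creatinine 1 mg/dL -> FENa of about 0.29%
```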