70 results for chronic kidney disease, daily activities of living, haemodialysis, renal nursing, transplantation
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
BACKGROUND/AIMS The use of antihypertensive medicines has been shown to reduce proteinuria, morbidity, and mortality in patients with chronic kidney disease (CKD). Despite pharmacodynamic differences between drug classes, no specific recommendation for a class of antihypertensive drugs is available in this population. We therefore analysed the association between antihypertensive medicines and survival of patients with CKD. METHODS Out of 2687 consecutive patients undergoing kidney biopsy, a cohort of 606 subjects with retrievable medical therapy was included in the analysis. Kidney function was assessed by glomerular filtration rate (GFR) estimation at the time of kidney biopsy. The main outcome variable was death. RESULTS Overall, 114 (18.7%) patients died. In univariate regression analysis, the use of alpha-blockers and calcium channel antagonists, progression of disease, diabetes mellitus (DM) types 1 and 2, arterial hypertension, coronary heart disease, peripheral vascular disease, male sex and age were associated with mortality (all p < 0.05). In a multivariate Cox regression model, the use of calcium channel blockers (HR 1.89), age (HR 1.04), DM type 1 (HR 8.43), DM type 2 (HR 2.17) and chronic obstructive pulmonary disease (HR 1.66) were associated with mortality (all p < 0.05). CONCLUSION The use of calcium channel blockers, but not of other antihypertensive medicines, is associated with mortality in patients with CKD, primarily due to glomerulonephritis (GN).
Abstract:
Animal studies suggest that renal tissue hypoxia plays an important role in the development of renal damage in hypertension and renal diseases, yet human data were scarce due to the lack of noninvasive methods. Over the last decade, blood oxygenation level-dependent magnetic resonance imaging (BOLD-MRI), detecting deoxyhemoglobin in hypoxic renal tissue, has become a powerful tool to assess kidney oxygenation noninvasively in humans. This paper provides an overview of BOLD-MRI studies performed in patients suffering from essential hypertension or chronic kidney disease (CKD). In line with animal studies, acute changes in cortical and medullary oxygenation have been observed after the administration of medication (furosemide, blockers of the renin-angiotensin system) or alterations in sodium intake in these patient groups, underlining the important role of renal sodium handling in kidney oxygenation. In contrast, no BOLD-MRI studies have convincingly demonstrated that renal oxygenation is chronically reduced in essential hypertension or in CKD or chronically altered after long-term medication intake. More studies are required to clarify this discrepancy and to further unravel the role of renal oxygenation in the development and progression of essential hypertension and CKD in humans.
Abstract:
BACKGROUND: Kinetic assessment of urea, the main end product of protein metabolism, could serve to assess protein catabolism in dogs with chronic kidney disease (CKD). Protein malnutrition and catabolism are poorly documented in CKD and are often neglected clinically because of a lack of appropriate evaluation tools. HYPOTHESIS: Generation and excretion of urea are altered in dogs with CKD. ANIMALS: Nine dogs with spontaneous CKD (IRIS stages 2-4) and 5 healthy research dogs. METHODS: Endogenous renal clearance (Clrenal) of urea and creatinine was measured first. Exogenous plasma clearance (Clplasma, total body clearance) of the 2 markers was then determined by an IV infusion of urea (250-1,000 mg/kg over 20 minutes) and an IV bolus of creatinine (40 mg/kg). Extrarenal clearance (Clextra) was defined as the difference between Clplasma and Clrenal. Endogenous urea generation was computed assuming steady-state conditions. RESULTS: Median Clrenal and Clextra of urea were 2.17 and 0.21 mL/min/kg in healthy dogs and 0.37 and 0.28 mL/min/kg in CKD dogs. The proportion of urea cleared by the extrarenal route was markedly higher in dogs with a glomerular filtration rate < 1 mL/kg/min than in normal dogs, reaching up to 85% of the total clearance. A comparable pattern was observed for creatinine excretion; except in 1 dog, Clextra remained < 20% of Clplasma. CONCLUSION: Extrarenal pathways of urea excretion are predominant in dogs with advanced CKD and justify exploring adjunctive therapies based on enteric nitrogen excretion in dogs. A trend toward increased urea generation may indicate increased catabolism in advanced CKD.
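The clearance arithmetic described above (Clextra defined as Clplasma minus Clrenal, with the extrarenal fraction Clextra/Clplasma) can be sketched as follows. The function names and the worked example are illustrative, not the study's analysis code; the example values are loosely based on the CKD-dog medians reported in the abstract:

```python
def extrarenal_clearance(cl_plasma, cl_renal):
    """Extrarenal clearance: total (plasma) clearance minus renal clearance."""
    return cl_plasma - cl_renal

def extrarenal_fraction(cl_plasma, cl_renal):
    """Fraction of total clearance handled by extrarenal routes."""
    return extrarenal_clearance(cl_plasma, cl_renal) / cl_plasma

# Illustrative values in mL/min/kg: a CKD dog with Clrenal 0.37 and Clextra 0.28,
# so Clplasma = 0.37 + 0.28 = 0.65
cl_plasma_urea = 0.37 + 0.28
print(round(extrarenal_fraction(cl_plasma_urea, 0.37), 2))  # → 0.43
```

With these numbers roughly 43% of urea clearance is extrarenal, consistent with the abstract's observation that the extrarenal route becomes predominant as GFR falls.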
Abstract:
OBJECTIVES: The aim of this study was to determine whether the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI)- or Cockcroft-Gault (CG)-based estimated glomerular filtration rates (eGFRs) perform better in the cohort setting for predicting moderate/advanced chronic kidney disease (CKD) or end-stage renal disease (ESRD). METHODS: A total of 9521 persons in the EuroSIDA study contributed 133 873 eGFRs. Poisson regression was used to model the incidence of moderate and advanced CKD (confirmed eGFR < 60 and < 30 mL/min/1.73 m², respectively) or ESRD (fatal/nonfatal) using CG and CKD-EPI eGFRs. RESULTS: Of 133 873 eGFR values, the ratio of CG to CKD-EPI was ≥ 1.1 in 22 092 (16.5%) and the difference between them (CG minus CKD-EPI) was ≥ 10 mL/min/1.73 m² in 20 867 (15.6%). Differences between CKD-EPI and CG were much greater when CG was not standardized for body surface area (BSA). A total of 403 persons developed moderate CKD using CG [incidence 8.9/1000 person-years of follow-up (PYFU); 95% confidence interval (CI) 8.0-9.8] and 364 using CKD-EPI (incidence 7.3/1000 PYFU; 95% CI 6.5-8.0). CG-derived eGFRs performed equally well as CKD-EPI-derived eGFRs in predicting ESRD (n = 36) and death (n = 565), as measured by the Akaike information criterion. CG-based moderate and advanced CKD were associated with ESRD [adjusted incidence rate ratio (aIRR) 7.17; 95% CI 2.65-19.36 and aIRR 23.46; 95% CI 8.54-64.48, respectively], as were CKD-EPI-based moderate and advanced CKD (aIRR 12.41; 95% CI 4.74-32.51 and aIRR 12.44; 95% CI 4.83-32.03, respectively). CONCLUSIONS: Differences between eGFRs using CG adjusted for BSA or CKD-EPI were modest. In the absence of a gold standard, the two formulae predicted clinical outcomes with equal precision and can be used to estimate GFR in HIV-positive persons.
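The Cockcroft-Gault estimate and its BSA standardization mentioned above can be sketched as follows. This is a minimal illustration of the published CG formula and the Du Bois BSA equation, not the study's actual analysis code; the example patient is hypothetical:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Cockcroft-Gault creatinine clearance in mL/min."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def du_bois_bsa(height_cm, weight_kg):
    """Du Bois body surface area in m²."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def cg_standardized(crcl_ml_min, bsa_m2):
    """Standardize a CG clearance to 1.73 m² BSA (mL/min/1.73 m²),
    as done in the comparison with CKD-EPI above."""
    return crcl_ml_min * 1.73 / bsa_m2

# Hypothetical example: 40-year-old man, 70 kg, 175 cm, serum creatinine 1.0 mg/dL
crcl = cockcroft_gault(40, 70, 1.0, female=False)  # ≈ 97.2 mL/min
print(round(cg_standardized(crcl, du_bois_bsa(175, 70)), 1))
```

Note that raw CG is in mL/min while CKD-EPI is in mL/min/1.73 m², which is why the abstract stresses BSA standardization before comparing the two.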
Abstract:
Experimentally, renal tissue hypoxia appears to play an important role in the pathogenesis of chronic kidney disease (CKD) and arterial hypertension (AHT). In this study we measured renal tissue oxygenation and its determinants in humans using blood oxygenation level-dependent magnetic resonance imaging (BOLD-MRI) under standardized hydration conditions. Four coronal slices were selected, and a multi-gradient-echo sequence was used to acquire T2*-weighted images. The mean cortical and medullary R2* values (R2* = 1/T2*) were calculated before and after administration of IV furosemide, a low R2* indicating high tissue oxygenation. We studied 195 subjects (95 with CKD, 58 with treated AHT, and 42 healthy controls). Mean cortical and medullary R2* were not significantly different between the groups at baseline. Under stimulated conditions (furosemide injection), the decrease in R2* was significantly blunted in patients with CKD and AHT. In multivariate linear regression analyses, neither cortical nor medullary R2* was associated with eGFR or blood pressure, but cortical R2* correlated positively with male gender, blood glucose and uric acid levels. In conclusion, our data show that kidney oxygenation is tightly regulated in CKD and hypertensive patients at rest. However, the metabolic response to acute changes in sodium transport is altered in CKD and in AHT, despite preserved renal function in the latter group. This suggests the presence of early renal metabolic alterations in hypertension. The correlations between cortical R2* values, male gender, glycemia and uric acid levels suggest that these factors interfere with the regulation of renal tissue oxygenation.
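The R2* = 1/T2* relation used above is a simple reciprocal: with T2* measured in milliseconds, R2* in s⁻¹ is 1000/T2*. A minimal sketch (function names are illustrative, and the example values are hypothetical, not study data):

```python
def r2_star_per_s(t2_star_ms):
    """Convert a T2* value in ms to R2* = 1/T2*, expressed in s^-1.
    Higher R2* corresponds to lower tissue oxygenation."""
    return 1000.0 / t2_star_ms

def furosemide_response(r2_baseline, r2_post):
    """Decrease in R2* after furosemide; a blunted decrease suggests an
    impaired oxygenation response, as reported for the CKD/AHT groups."""
    return r2_baseline - r2_post

# A T2* of 50 ms corresponds to R2* = 20 s^-1
print(r2_star_per_s(50))  # → 20.0
```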
Abstract:
OBJECTIVES This study sought to evaluate: 1) the effect of impaired renal function on long-term clinical outcomes in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES); and 2) the safety and efficacy of new-generation compared with early-generation DES in women with chronic kidney disease (CKD). BACKGROUND The prevalence and effect of CKD in women undergoing PCI with DES are unclear. METHODS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized by creatinine clearance (CrCl): <45 ml/min, 45 to 59 ml/min, and ≥60 ml/min. The primary endpoint was the 3-year rate of major adverse cardiovascular events (MACE). Participants for whom baseline creatinine was missing were excluded from the analysis. RESULTS Of 4,217 women in the pooled cohort treated with DES and with serum creatinine available, 603 (14%) had a CrCl <45 ml/min, 811 (19%) had a CrCl of 45 to 59 ml/min, and 2,803 (66%) had a CrCl ≥60 ml/min. A significant stepwise gradient in the risk of MACE was observed with worsening renal function (26.6% vs. 15.8% vs. 12.9%; p < 0.01). After multivariable adjustment, CrCl <45 ml/min was independently associated with a higher risk of MACE (adjusted hazard ratio: 1.56; 95% confidence interval: 1.23 to 1.98) and all-cause mortality (adjusted hazard ratio: 2.67; 95% confidence interval: 1.85 to 3.85). Compared with early-generation DES, new-generation DES were associated with a reduced risk of cardiac death, myocardial infarction, or stent thrombosis in women with CKD. The effect of new-generation DES on outcomes was uniform between women with and without CKD, with no evidence of interaction. CONCLUSIONS Among women undergoing PCI with DES, CKD is a common comorbidity associated with a strong and independent risk of MACE that is durable over 3 years. The benefits of new-generation DES are uniform in women with and without CKD.
Abstract:
A variety of chronic kidney diseases tend to progress towards end-stage kidney disease. Progression is largely due to factors unrelated to the initial disease, including arterial hypertension and proteinuria. Intensive treatment of these two factors can potentially slow the progression of kidney disease. Blockers of the renin-angiotensin-aldosterone system, either converting-enzyme inhibitors or angiotensin II receptor antagonists, reduce both blood pressure and proteinuria and appear superior to conventional antihypertensive treatment regimens in preventing progression to end-stage kidney disease. The most recent recommendations state that in children with chronic kidney disease without proteinuria the blood pressure goal is the 75th centile for body length, age and gender, whereas the 50th centile should be targeted in children with chronic kidney disease and pathologically increased proteinuria.
Abstract:
BACKGROUND Neutrophil gelatinase-associated lipocalin (NGAL) is a protein that is used in human medicine as a real-time indicator of acute kidney injury (AKI). HYPOTHESIS Dogs with AKI have significantly higher plasma NGAL concentrations and urine NGAL-to-creatinine ratios (UNCR) compared with healthy dogs and dogs with chronic kidney disease (CKD). ANIMALS 18 healthy control dogs, 17 dogs with CKD, and 48 dogs with AKI. METHODS Over a period of 1 year, all dogs with renal azotemia were prospectively included. Urine and plasma samples were collected during the first 24 hours after presentation or after development of renal azotemia. Plasma and urine NGAL concentrations were measured with a commercially available canine NGAL ELISA kit (Bioporto® Diagnostic) and UNCR was calculated. A single-injection plasma inulin clearance was performed in the healthy dogs. RESULTS Median (range) plasma NGAL concentrations in healthy dogs, dogs with CKD, and dogs with AKI were 10.7 ng/mL (2.5-21.2), 22.0 ng/mL (7.7-62.3), and 48.3 ng/mL (5.7-469.0), respectively. UNCR was 2 × 10⁻⁸ (0-46), 1,424 × 10⁻⁸ (385-18,347), and 2,366 × 10⁻⁸ (36-994,669), respectively. Dogs with renal azotemia had significantly higher NGAL concentrations and UNCR than did healthy dogs (P < .0001 for both). Plasma NGAL concentration was significantly higher in dogs with AKI compared with dogs with CKD (P = .027). CONCLUSIONS AND CLINICAL IMPORTANCE Plasma NGAL could be helpful to differentiate AKI from CKD in dogs with renal azotemia.
Abstract:
OBJECTIVES HIV infection has been associated with an increased risk of chronic kidney disease (CKD). Little is known about the prevalence of CKD in individuals with high CD4 cell counts prior to initiation of antiretroviral therapy (ART). We sought to address this knowledge gap. METHODS We describe the prevalence of CKD among 4637 ART-naïve adults (mean age 36.8 years) with CD4 cell counts > 500 cells/μL at enrolment in the Strategic Timing of AntiRetroviral Treatment (START) study. CKD was defined by estimated glomerular filtration rate (eGFR) < 60 mL/min/1.73 m² and/or dipstick urine protein ≥ 1+. Logistic regression was used to identify baseline characteristics associated with CKD. RESULTS Among 286 [6.2%; 95% confidence interval (CI) 5.5%, 6.9%] participants with CKD, the majority had isolated proteinuria. A total of 268 participants had urine protein ≥ 1+, including 41 with urine protein ≥ 2+. Only 22 participants (0.5%) had an eGFR < 60 mL/min/1.73 m², including four who also had proteinuria. Baseline characteristics independently associated with CKD included diabetes [adjusted odds ratio (aOR) 1.73; 95% CI 1.05, 2.85], hypertension (aOR 1.82; 95% CI 1.38, 2.38), and race/ethnicity (aOR 0.59; 95% CI 0.37, 0.93 for Hispanic vs. white). CONCLUSIONS We observed a low prevalence of CKD associated with traditional CKD risk factors among ART-naïve clinical trial participants with CD4 cell counts > 500 cells/μL.
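The CKD definition quoted above (eGFR < 60 mL/min/1.73 m² and/or dipstick urine protein ≥ 1+) maps directly onto a simple predicate; the function name is illustrative, not part of the study:

```python
def meets_ckd_definition(egfr_ml_min_173m2, dipstick_protein_grade):
    """CKD per the START-study definition quoted above:
    eGFR < 60 mL/min/1.73 m^2 and/or urine dipstick protein >= 1+."""
    return egfr_ml_min_173m2 < 60 or dipstick_protein_grade >= 1

print(meets_ckd_definition(90, 1))  # isolated proteinuria → True
print(meets_ckd_definition(55, 0))  # reduced eGFR alone → True
print(meets_ckd_definition(90, 0))  # → False
```

The "and/or" wording matters: as the abstract notes, most participants qualified through isolated proteinuria, which a definition requiring both criteria would have missed.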
Abstract:
The efficacy of mammalian target of rapamycin (mTOR) inhibitors is currently being tested in patients affected by autosomal dominant polycystic kidney disease. Treatment with mTOR inhibitors has been associated with numerous side effects. However, the renal-specific effect of ceasing mTOR inhibitor treatment in polycystic kidney disease is currently unknown. We therefore compared pulse and continuous everolimus treatment in Han:SPRD rats. Four-week-old male heterozygous polycystic and wild-type rats were administered everolimus or vehicle by gavage feeding either for 5 wk followed by 7 wk without treatment, or continuously for 12 wk. Cessation of everolimus did not result in the appearance of renal cysts up to 7 wk after withdrawal, despite the reemergence of S6 kinase activity coupled with an overall increase in cell proliferation. Pulse everolimus treatment resulted in striking noncystic renal parenchymal enlargement and glomerular hypertrophy that was not associated with compromised kidney function. Both treatment regimens improved kidney function, preserved the glomerular-tubular connection, and reduced proteinuria. Pulse treatment at an early age delays cyst development but leads to striking glomerular and parenchymal hypertrophy. Our data may be relevant when long-term treatment with mTOR inhibitors is considered in patients with autosomal dominant polycystic kidney disease.
Abstract:
In the past decade, several arm rehabilitation robots have been developed to assist neurological patients during therapy. Early devices were limited in their number of degrees of freedom and range of motion, whereas newer robots such as the ARMin robot can support the entire arm. These devices are often combined with virtual environments to integrate motivating game-like scenarios. Several studies have shown a positive effect of game-playing on therapy outcome through increased motivation. In addition, we assume that practicing highly functional movements can further enhance therapy outcome by facilitating the transfer of motor abilities acquired in therapy to daily life. We therefore present a rehabilitation system that enables the training of activities of daily living (ADL) with the support of an assistive robot. Important ADL tasks have been identified and implemented in a virtual environment. A patient-cooperative control strategy with adaptable freedom in timing and space was developed to assist the patient during the task. The technical feasibility and usability of the system were evaluated with seven healthy subjects and three chronic stroke patients.
Abstract:
BACKGROUND The number of older adults in the global population is increasing. This demographic shift leads to an increasing prevalence of age-associated disorders, such as Alzheimer's disease and other types of dementia. As the disease progresses, the risk of institutional care increases, which contrasts with the desire of most patients to stay in their home environment. Despite doctors' and caregivers' awareness of the patient's cognitive status, they are often uncertain about its consequences for activities of daily living (ADL). To provide effective care, they need to know how patients cope with ADL, in particular the risks associated with cognitive decline. The occurrence, performance, and duration of different ADL are important indicators of functional ability. The patient's ability to cope with these activities is traditionally assessed with questionnaires, which have disadvantages (e.g., lack of reliability and sensitivity). Several groups have proposed sensor-based systems to recognize and quantify these activities in the patient's home. Combined with Web technology, these systems can inform caregivers about their patients in real time (e.g., via smartphone). OBJECTIVE We hypothesized that a non-intrusive system that uses no body-mounted sensors, video-based imaging, or microphone recordings would be better suited for use in dementia patients. Since it does not require the patient's attention or compliance, such a system might be well accepted by patients. We present a passive, Web-based, non-intrusive assistive technology system that recognizes and classifies ADL. METHODS The components of this novel assistive technology system were wireless sensors distributed in every room of the participant's home and a central computer unit (CCU). The environmental data were acquired for 20 days (per participant) and then stored and processed on the CCU. In consultation with medical experts, eight ADL were classified.
RESULTS In this study, 10 healthy participants (6 women, 4 men; mean age 48.8 years; SD 20.0 years; age range 28-79 years) were included. For explorative purposes, one female Alzheimer's disease patient (Montreal Cognitive Assessment score=23, Timed Up and Go=19.8 seconds, Trail Making Test A=84.3 seconds, Trail Making Test B=146 seconds) was measured in parallel with the healthy subjects. In total, 1317 ADL were performed by the participants; 1211 ADL were classified correctly and 106 ADL were missed. This led to an overall sensitivity of 91.27% and a specificity of 92.52%. Each subject performed an average of 134.8 ADL (SD 75). CONCLUSIONS The non-intrusive wireless sensor system can acquire the environmental data essential for the classification of activities of daily living. By analyzing the retrieved data, it is possible to distinguish and assign data patterns to subjects' specific activities and to identify eight different activities of daily living. The Web-based technology allows the system to improve care and provides valuable information about the patient in real time.
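The sensitivity and specificity reported above follow the standard confusion-matrix formulas. A minimal sketch; the counts below are generic illustrations (the abstract does not report the full confusion matrix, so these are not the study's data):

```python
def sensitivity(true_pos, false_neg):
    """Proportion of actual events that were detected: TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of non-events correctly rejected: TN / (TN + FP)."""
    return true_neg / (true_neg + false_pos)

# Generic example: 90 of 100 activities detected, 92 of 100 non-events rejected
print(round(sensitivity(90, 10), 2))  # → 0.9
print(round(specificity(92, 8), 2))   # → 0.92
```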
Abstract:
Left-sided spatial neglect is a common neurological syndrome following right-hemispheric stroke. The presence of spatial neglect is a powerful predictor of poor rehabilitation outcome. In one influential account of spatial neglect, interhemispheric inhibition is impaired and leads to a pathological hyperactivity in the contralesional hemisphere, resulting in a biased attentional allocation towards the right hemifield. Inhibitory transcranial magnetic stimulation can reduce the hyperactivity of the contralesional, intact hemisphere and thereby improve spatial neglect symptoms. However, it is not known whether this improvement is also relevant to the activities of daily living during spontaneous behaviour. The primary aim of the present study was to investigate whether the repeated application of continuous theta burst stimulation trains could ameliorate spatial neglect on a quantitative measure of the activities of daily living during spontaneous behaviour. We applied the Catherine Bergego Scale, a standardized observation questionnaire that can validly and reliably detect the presence and severity of spatial neglect during the activities of daily living. Eight trains of continuous theta burst stimulation were applied over two consecutive days on the contralesional, left posterior parietal cortex in patients suffering from subacute left spatial neglect, in a randomized, double-blind, sham-controlled design, which also included a control group of neglect patients without stimulation. The results showed a 37% improvement in the spontaneous everyday behaviour of the neglect patients after the repeated application of continuous theta burst stimulation. Remarkably, the improvement persisted for at least 3 weeks after stimulation. The amelioration of spatial neglect symptoms in the activities of daily living was also generally accompanied by significantly better performance in the neuropsychological tests. 
No significant amelioration in symptoms was observed after sham stimulation or in the control group without stimulation. These results provide Class I evidence that continuous theta burst stimulation is a viable add-on therapy in neglect rehabilitation that facilitates recovery of normal everyday behaviour.