11 results for INSUFFICIENCY
at Duke University
Abstract:
Oxidative stress is a deleterious stressor associated with a plethora of disease and aging manifestations, including neurodegenerative disorders, yet very few factors and mechanisms promoting the neuroprotection of photoreceptor and other neurons against oxidative stress are known. Insufficiency of RAN-binding protein-2 (RANBP2), a large, mosaic protein with pleiotropic functions, suppresses apoptosis of photoreceptor neurons upon aging and light-elicited oxidative stress, and promotes age-dependent tumorigenesis by mechanisms that are not well understood. Here we show that, by downregulating selective partners of RANBP2, such as RAN GTPase, UBC9 and ErbB-2 (HER2; Neu), and blunting the upregulation of a set of orphan nuclear receptors and the light-dependent accumulation of ubiquitylated substrates, light-elicited oxidative stress and Ranbp2 haploinsufficiency have a selective effect on protein homeostasis in the retina. Among the nuclear orphan receptors affected by insufficiency of RANBP2, we identified an isoform of COUP-TFI (Nr2f1) as the only receptor stably co-associating in vivo with RANBP2 and distinct isoforms of UBC9. Strikingly, most changes in proteostasis caused by insufficiency of RANBP2 in the retina are not observed in the supporting tissue, the retinal pigment epithelium (RPE). Instead, insufficiency of RANBP2 in the RPE prominently suppresses the light-dependent accumulation of lipophilic deposits, and it has divergent effects on the accumulation of free cholesterol and free fatty acids despite the genotype-independent increase of light-elicited oxidative stress in this tissue. Thus, the data indicate that insufficiency of RANBP2 results in the cell-type-dependent downregulation of protein and lipid homeostasis, acting on functionally interconnected pathways in response to oxidative stress. These results provide a rationale for the neuroprotection from light damage of photosensory neurons by RANBP2 insufficiency and for the identification of novel therapeutic targets and approaches promoting neuroprotection.
Abstract:
The use of stem cells for tissue regeneration and repair is advancing both at the bench and bedside. Stem cells isolated from bone marrow are currently being tested for their therapeutic potential in a variety of clinical conditions including cardiovascular injury, kidney failure, cancer, and neurological and bone disorders. Despite the advantages, stem cell therapy is still limited by low survival, engraftment, and homing to the damaged area, as well as inefficiencies in differentiating into fully functional tissues. Genetic engineering of mesenchymal stem cells is being explored as a means to circumvent some of these problems. This review presents the current understanding of the use of genetically engineered mesenchymal stem cells in human disease therapy, with emphasis on genetic modifications aimed at improving survival, homing, angiogenesis, and heart function after myocardial infarction. Advancements in other disease areas are also discussed.
Abstract:
BACKGROUND: Invasive aspergillosis (IA) is an important cause of morbidity and mortality in hematopoietic stem cell transplant (HSCT) and solid organ transplant (SOT) recipients. The purpose of this study was to evaluate factors associated with mortality in transplant patients with IA. METHODS: Transplant patients from 23 US centers were enrolled from March 2001 to October 2005 as part of the Transplant Associated Infection Surveillance Network. IA cases were identified prospectively in this cohort through March 2006, and data were collected. Factors associated with 12-week all-cause mortality were determined by logistic regression analysis and Cox proportional hazards regression. RESULTS: Six hundred forty-two cases of proven or probable IA were evaluated, of which 317 (49.4%) died by the study endpoint. All-cause mortality was greater in HSCT patients (239 [57.5%] of 415) than in SOT patients (78 [34.4%] of 227; P<.001). Independent poor prognostic factors in HSCT patients were neutropenia, renal insufficiency, hepatic insufficiency, early-onset IA, proven IA, and methylprednisolone use. In contrast, white race was associated with decreased risk of death. Among SOT patients, hepatic insufficiency, malnutrition, and central nervous system disease were poor prognostic indicators, whereas prednisone use was associated with decreased risk of death. Among HSCT or SOT patients who received antifungal therapy, use of an amphotericin B preparation as part of initial therapy was associated with increased risk of death. CONCLUSIONS: There are multiple variables associated with survival in transplant patients with IA. Understanding these prognostic factors may assist in the development of treatment algorithms and clinical trials.
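For readers unfamiliar with the modeling approach named in the METHODS above, the sketch below shows what a Cox proportional hazards analysis of 12-week all-cause mortality might look like in Python with the lifelines package. It is only an illustration on simulated data; the covariate names (neutropenia, renal_insufficiency, hsct) and effect sizes are hypothetical placeholders, not the study's data or code.

```python
# Illustrative sketch only: a Cox proportional hazards model for 12-week
# all-cause mortality on simulated data. Covariate names mirror some of the
# prognostic factors mentioned above, but the data and effects are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "neutropenia": rng.integers(0, 2, n),
    "renal_insufficiency": rng.integers(0, 2, n),
    "hsct": rng.integers(0, 2, n),  # 1 = HSCT recipient, 0 = SOT recipient
})
# Simulate time to death (weeks), with a higher hazard when a factor is present.
risk = np.exp(0.7 * df["neutropenia"] + 0.5 * df["renal_insufficiency"] + 0.6 * df["hsct"])
time_to_death = rng.exponential(20.0, n) / risk
df["weeks"] = np.minimum(time_to_death, 12.0)     # administrative censoring at 12 weeks
df["died"] = (time_to_death <= 12.0).astype(int)  # 1 = death observed by the endpoint

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="died")
cph.print_summary()  # hazard ratios and 95% CIs for each candidate prognostic factor
```

A logistic regression on the binary 12-week death indicator (the other method named above) could be fit analogously with statsmodels on the same kind of data frame.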
Abstract:
The autosomal recessive kidney disease nephronophthisis (NPHP) constitutes the most frequent genetic cause of terminal renal failure in the first 3 decades of life. Ten causative genes (NPHP1-NPHP9 and NPHP11), whose products localize to the primary cilia-centrosome complex, support the unifying concept that cystic kidney diseases are "ciliopathies". Using genome-wide homozygosity mapping, we report here what we believe to be a new locus (NPHP-like 1 [NPHPL1]) for an NPHP-like nephropathy. In 2 families with an NPHP-like phenotype, we detected homozygous frameshift and splice-site mutations, respectively, in the X-prolyl aminopeptidase 3 (XPNPEP3) gene. In contrast to all known NPHP proteins, XPNPEP3 localizes to mitochondria of renal cells. However, in vivo analyses also revealed a likely cilia-related function; suppression of zebrafish xpnpep3 phenocopied the developmental phenotypes of ciliopathy morphants, and this effect was rescued by human XPNPEP3 that was devoid of a mitochondrial localization signal. Consistent with a role for XPNPEP3 in ciliary function, several ciliary cystogenic proteins were found to be XPNPEP3 substrates, for which resistance to N-terminal proline cleavage resulted in attenuated protein function in vivo in zebrafish. Our data highlight an emerging link between mitochondria and ciliary dysfunction, and suggest that further understanding the enzymatic activity and substrates of XPNPEP3 will illuminate novel cystogenic pathways.
Abstract:
BACKGROUND: There have been major changes in the management of anemia in US hemodialysis patients in recent years. We sought to determine the influence of clinical trial results, safety regulations, and changes in reimbursement policy on practice. METHODS: We examined indicators of anemia management among incident and prevalent hemodialysis patients from a medium-sized dialysis provider over three time periods: (1) 2004 to 2006, (2) 2007 to 2009, and (3) 2010. Trends across the three time periods were compared using generalized estimating equations. RESULTS: Prior to 2007, the median proportion of patients with monthly hemoglobin >12 g/dL was 42%, 55%, and 46% for patients on dialysis 0 to 3, 4 to 6, and 7 to 18 months, respectively; these proportions declined to 41%, 54%, and 40% after 2007, and declined more sharply in 2010 to 34%, 41%, and 30%. Median weekly Epoetin alfa doses for the same groups were 18,000, 12,400, and 9,100 units before 2007; remained relatively unchanged from 2007 to 2009; and decreased sharply in 2010 to 10,200 and 7,800 units in patients 4 to 6 and 7 to 18 months on dialysis, respectively. Iron doses, serum ferritin, and transferrin saturation levels increased over time, with more pronounced increases in 2010. CONCLUSION: Modest changes in anemia management occurred between 2007 and 2009, followed by more dramatic changes in 2010. Studies are needed to examine the effects of declining erythropoietin use and hemoglobin levels and increasing intravenous iron use on quality of life, transplantation rates, infection rates, and survival.
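The trend comparison described in the METHODS (generalized estimating equations across the three calendar periods, with repeated monthly measurements per patient) could be set up roughly as below. This is a hedged sketch using statsmodels on simulated data; the column names and prevalence values are invented for illustration and do not come from the study.

```python
# Illustrative sketch (not the study's code): compare an anemia indicator
# across calendar periods with a logistic GEE that accounts for repeated
# monthly measurements within patients. All names/values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_patients, n_months = 300, 6
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_patients), n_months),
    "period": np.repeat(rng.choice(["2004-2006", "2007-2009", "2010"], n_patients), n_months),
})
# Simulated indicator: monthly hemoglobin > 12 g/dL, less common in later periods.
p = df["period"].map({"2004-2006": 0.45, "2007-2009": 0.40, "2010": 0.32})
df["hgb_gt_12"] = rng.binomial(1, p)

# Logistic GEE with an exchangeable working correlation within each patient.
model = smf.gee(
    "hgb_gt_12 ~ C(period, Treatment(reference='2004-2006'))",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())  # odds ratios for later periods vs. 2004-2006
```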
Abstract:
Systematic reviews comparing the effectiveness of strategies to prevent, detect, and treat chronic kidney disease are needed to inform patient care. We engaged stakeholders in the chronic kidney disease community to prioritize topics for future comparative effectiveness research systematic reviews. We developed a preliminary list of suggested topics, and stakeholders refined and ranked topics based on their importance. Among 46 topics identified, stakeholders nominated 18 as 'high' priority. Most pertained to strategies to slow disease progression, including: (a) treat proteinuria, (b) improve access to care, (c) treat hypertension, (d) use health information technology, and (e) implement dietary strategies. Most (15 of 18) topics had been previously studied with two or more randomized controlled trials, indicating feasibility of rigorous systematic reviews. Chronic kidney disease topics rated by stakeholders as 'high priority' are varied in scope and may lead to high-quality systematic reviews that impact practice and policy.
Abstract:
BACKGROUND: Several observational studies have evaluated the effect of a single exposure window with blood pressure (BP) medications on outcomes in incident dialysis patients, but it is unclear whether BP medication prescription patterns remain stable or whether a single exposure window design is adequate to evaluate effects on outcomes. METHODS: We described patterns of BP medication prescription over 6 months after dialysis initiation in hemodialysis and peritoneal dialysis patients, stratified by cardiovascular comorbidity, diabetes, and other patient characteristics. The cohort included 13,072 adult patients (12,159 hemodialysis, 913 peritoneal dialysis) who initiated dialysis in Dialysis Clinic, Inc. facilities between January 1, 2003, and June 30, 2008, and remained on the original modality for at least 6 months. We evaluated monthly patterns in BP medication prescription over 6 months and at 12 and 24 months after initiation. RESULTS: Prescription patterns varied by dialysis modality over the first 6 months; substantial proportions of patients with prescriptions for beta-blockers, renin-angiotensin system agents, and dihydropyridine calcium channel blockers in month 6 no longer had prescriptions for these medications by month 24. Prescription of specific medication classes varied by comorbidity, race/ethnicity, and age, but little by sex. The mean number of medications was 2.5 at month 6 in both the hemodialysis and peritoneal dialysis cohorts. CONCLUSIONS: This study evaluates BP medication patterns in both hemodialysis and peritoneal dialysis patients over the first 6 months of dialysis. Our findings highlight the challenges of assessing the comparative effectiveness of a single BP medication class in dialysis patients. Longitudinal designs should be used to account for changes in BP medication management over time, and designs that incorporate common combinations should be considered.
Abstract:
BACKGROUND: Early preparation for renal replacement therapy (RRT) is recommended for patients with advanced chronic kidney disease (CKD), yet many patients initiate RRT urgently and/or are inadequately prepared. METHODS: We conducted audio-recorded, qualitative, directed telephone interviews of nephrology health care providers (n = 10; nephrologists, physician assistants, and nurses) and primary care physicians (PCPs, n = 4) to identify modifiable challenges to optimal RRT preparation, with the goal of informing future interventions. We recruited providers from public safety-net hospital-based and community-based nephrology and primary care practices. We asked providers open-ended questions to assess their perceived challenges and their views on the role of PCPs and nephrologist-PCP collaboration in patients' RRT preparation. Two independent and trained abstractors coded transcribed audio-recorded interviews and identified major themes. RESULTS: Nephrology providers identified several factors contributing to patients' suboptimal RRT preparation, including health system resources (e.g., limited time for preparation, referral process delays, and poorly integrated nephrology and primary care), provider skills (e.g., their difficulty explaining CKD to patients), and patient attitudes and cultural differences (e.g., their poor understanding and acceptance of their CKD and its treatment options, their low perceived urgency for RRT preparation, their negative perceptions about RRT, lack of trust, or language differences). PCPs desired more involvement in preparation to ensure RRT transitions could be as "smooth as possible", including providing patients with emotional support, helping patients weigh RRT options, and affirming nephrologist recommendations. Both nephrology providers and PCPs desired improved collaboration, including better information exchange and delineation of roles during the RRT preparation process. CONCLUSIONS: Nephrology and primary care providers identified health system resources, provider skills, and patient attitudes and cultural differences as challenges to patients' optimal RRT preparation. Interventions to improve these factors may improve patients' preparation and initiation of optimal RRTs.
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function reported alongside chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years, with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not significantly associated with implementation. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, which suggests that decisions to implement this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors related to more timely implementation and lead to better healthcare delivery.
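For context on what "automated eGFR reporting" computes, the sketch below applies the 4-variable MDRD Study equation, one of the estimating equations commonly used for automated laboratory reporting in this period. The abstract does not state which equation the VHA software implemented, so treat this as an illustrative assumption rather than a description of the VHA system.

```python
# Illustrative eGFR calculation from routine chemistry results using the
# 4-variable (IDMS-traceable) MDRD Study equation. This is an assumption for
# illustration; the abstract does not specify the VHA software's equation.
def egfr_mdrd(serum_creatinine_mg_dl: float, age_years: float,
              is_female: bool, is_black: bool) -> float:
    """Return estimated GFR in mL/min/1.73 m^2."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if is_female:
        egfr *= 0.742
    if is_black:
        egfr *= 1.212
    return egfr

# Example: the value a laboratory IT system might append to a creatinine result.
print(round(egfr_mdrd(1.4, 65, is_female=False, is_black=False)))  # ~51 mL/min/1.73 m^2
```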
Abstract:
The Dietary Approaches to Stop Hypertension (DASH) trial showed that a diet rich in fruits, vegetables, and low-fat dairy products, with reduced total and saturated fat, cholesterol, and sugar-sweetened products, effectively lowers blood pressure in individuals with prehypertension and stage I hypertension. Limited evidence is available on the safety and efficacy of the DASH eating pattern in special patient populations that were excluded from the trial. Caution should be exercised before initiating the DASH diet in patients with chronic kidney disease, chronic liver disease, and those who are prescribed renin-angiotensin-aldosterone system antagonists, but these conditions are not strict contraindications to DASH. Modifications to the DASH diet may be necessary to facilitate its use in patients with chronic heart failure, uncontrolled type 2 diabetes mellitus, lactose intolerance, and celiac disease. In general, the DASH diet can be adopted by most patient populations and initiated simultaneously with medication therapy and other lifestyle interventions.
Abstract:
To identify patients at increased risk of cardiovascular (CV) outcomes, apparent treatment-resistant hypertension (aTRH) is defined as blood pressure above goal despite the use of 3 or more antihypertensive therapies of different classes at maximally tolerated doses, ideally including a diuretic. Recent epidemiologic studies in selected populations estimated the prevalence of aTRH at 10% to 15% among patients with hypertension and found that aTRH is associated with elevated risk of CV and renal outcomes. Additionally, aTRH and CKD are associated. Although the pathogenesis of aTRH is multifactorial, the kidney is believed to play a significant role. Increased volume expansion, aldosterone concentration, mineralocorticoid receptor activity, arterial stiffness, and sympathetic nervous system activity are central to the pathogenesis of aTRH and are targets of therapies. Although diuretics form the basis of therapy in aTRH, pathophysiologic and clinical data suggest an important role for aldosterone antagonism. Interventional techniques, such as renal denervation and carotid baroreceptor activation, modulate the sympathetic nervous system and are currently in phase III trials for the treatment of aTRH. These technologies are as yet unproven and have not been investigated in relation to CV outcomes or in patients with CKD.
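As a concrete reading of the working definition in the first sentence, the sketch below flags apparent treatment resistance from a patient's BP and medication classes. The Patient fields and the 140/90 mm Hg goal are illustrative placeholders; BP goals vary by guideline and patient population, and the definition above notes that a diuretic is only "ideally" included, so it is reported separately here rather than required.

```python
# Minimal sketch of the aTRH working definition above: BP above goal despite
# >= 3 antihypertensive classes at maximally tolerated doses. Field names and
# the 140/90 mm Hg goal are illustrative placeholders, not from the article.
from dataclasses import dataclass, field

@dataclass
class Patient:
    systolic: float
    diastolic: float
    drug_classes: set = field(default_factory=set)  # e.g. {"ACE inhibitor", "CCB", "thiazide diuretic"}

def is_apparent_trh(p: Patient, goal=(140, 90)) -> bool:
    above_goal = p.systolic >= goal[0] or p.diastolic >= goal[1]
    return above_goal and len(p.drug_classes) >= 3

def includes_diuretic(p: Patient) -> bool:
    # Reported separately because the definition says "ideally including a diuretic".
    return any("diuretic" in c.lower() for c in p.drug_classes)

pt = Patient(152, 88, {"ACE inhibitor", "CCB", "thiazide diuretic"})
print(is_apparent_trh(pt), includes_diuretic(pt))  # True True
```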