211 results for Chronic Nephropathy
Abstract:
We show that imatinib, nilotinib, and dasatinib possess weak off-target activity against RAF and therefore drive paradoxical activation of BRAF and CRAF in a RAS-dependent manner. Critically, because RAS is activated by BCR-ABL, RAS activity persists in the presence of these drugs in drug-resistant chronic myeloid leukemia (CML) cells, driving paradoxical activation of BRAF, CRAF, MEK, and ERK and leading to an unexpected dependency on the pathway. Consequently, nilotinib synergizes with MEK inhibitors to kill drug-resistant CML cells and block tumor growth in mice. Thus, we show that imatinib, nilotinib, and dasatinib drive paradoxical RAF/MEK/ERK pathway activation, and we have uncovered a synthetic lethal interaction that can be used to kill drug-resistant CML cells in vitro and in vivo.
Abstract:
ABL inhibitors have revolutionized the clinical management of chronic myeloid leukemia, but the BCR-ABL(T315I) mutation confers resistance to currently approved drugs. Chan et al. show, in this issue of Cancer Cell, that "switch-control" inhibitors block BCR-ABL(T315I) activity by preventing ABL from switching from the inactive to the active conformation.
Abstract:
Mismatch negativity (MMN) is a component of the event-related potential elicited by deviant auditory stimuli. It is presumed to index pre-attentive monitoring of changes in the auditory environment. MMN amplitude is smaller in groups of individuals with schizophrenia than in healthy controls. We compared duration-deviant MMN in 16 recent-onset and 19 chronic schizophrenia patients versus age- and sex-matched controls. Reduced frontal MMN was found in both patient groups; the reduction involved diminished hemispheric asymmetry and was correlated with Global Assessment of Functioning (GAF) and negative symptom ratings. A cortically constrained LORETA analysis, incorporating anatomical data from each individual's MRI, was performed to generate a current source density (CSD) model of the MMN response over time. This model suggested MMN generation within a temporal, parietal and frontal network, which was right-hemisphere dominant only in controls. An exploratory analysis revealed reduced CSD in patients in superior and middle temporal cortex, inferior and superior parietal cortex, precuneus, anterior cingulate, and superior and middle frontal cortex. A region of interest (ROI) analysis was performed. For the early phase of the MMN, patients had reduced bilateral temporal and parietal responses and no lateralisation in frontal ROIs. For the late MMN, patients had reduced bilateral parietal responses and no lateralisation in temporal ROIs. In patients, correlations revealed a link between GAF and the MMN response in parietal cortex. In controls, the frontal response onset was 17 ms later than the temporal and parietal responses. In patients, the onset latency of the MMN response was delayed in secondary, but not primary, auditory cortex. However, amplitude reductions were observed in both primary and secondary auditory cortex. These latency delays may indicate relatively intact information processing upstream of the primary auditory cortex, with impaired primary auditory cortex or impaired cortico-cortical or thalamo-cortical communication with higher auditory cortices as a core deficit in schizophrenia.
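In signal-processing terms, the MMN described above is the difference wave obtained by subtracting the averaged standard-tone ERP from the averaged deviant-tone ERP. Below is a minimal illustrative sketch of that subtraction in Python on synthetic NumPy arrays; it is not the study's pipeline, which used cortically constrained LORETA source modelling on individual MRIs.

```python
import numpy as np

def mmn_difference_wave(standard_epochs, deviant_epochs):
    """Compute an MMN difference wave from epoched EEG data.

    Inputs have shape (n_trials, n_channels, n_samples). The MMN is
    the deviant-minus-standard difference of the trial-averaged ERPs;
    its amplitude is conventionally read off fronto-central channels
    roughly 100-250 ms after stimulus onset.
    """
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Synthetic example: 32 channels, 300 samples per epoch.
rng = np.random.default_rng(0)
standards = rng.normal(size=(400, 32, 300))  # frequent standard tones
deviants = rng.normal(size=(80, 32, 300))    # rare duration deviants
mmn = mmn_difference_wave(standards, deviants)
print(mmn.shape)  # (32, 300): one difference wave per channel
```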
Abstract:
Background. Escherichia coli O25b:H4-ST131 represents a predominant clone of multidrug-resistant uropathogens currently circulating worldwide in hospitals and the community. Urinary tract infections (UTIs) caused by E. coli ST131 are typically associated with limited treatment options and are often recurrent. Methods. Using established mouse models of acute and chronic UTI, we mapped the pathogenic trajectory of the reference E. coli ST131 UTI isolate, strain EC958. Results. We demonstrated that E. coli EC958 can invade bladder epithelial cells and form intracellular bacterial communities early during acute UTI. Moreover, E. coli EC958 persisted in the bladder and established chronic UTI. Prophylactic antibiotic administration failed to prevent E. coli EC958–mediated UTI. However, 1 oral dose of a small-molecular-weight compound that inhibits FimH, the type 1 fimbriae adhesin, significantly reduced bacterial colonization of the bladder and prevented acute UTI. Treatment of chronically infected mice with the same FimH inhibitor lowered their bladder bacterial burden by >1000-fold. Conclusions. In this study, we provide novel insight into the pathogenic mechanisms used by the globally disseminated E. coli ST131 clone during acute and chronic UTI and establish the potential of FimH inhibitors as an alternative treatment against multidrug-resistant E. coli.
Abstract:
The chlamydiae are obligate intracellular parasites that have evolved specific interactions with their various hosts and host cell types to ensure their successful survival and consequential pathogenesis. The species Chlamydia pneumoniae is ubiquitous, with serological studies showing that most humans are infected at some stage in their lifetime. While most human infections are asymptomatic, C. pneumoniae can cause more-severe respiratory disease and pneumonia and has been linked to chronic diseases such as asthma, atherosclerosis, and even Alzheimer's disease. The widely dispersed animal-adapted C. pneumoniae strains cause an equally wide range of diseases in their hosts. It is emerging that the ability of C. pneumoniae to survive inside its target cells, including evasion of the host's immune attack mechanisms, is linked to the acquisition of key metabolites. Tryptophan and arginine are key checkpoint compounds in this host-parasite battle. Interestingly, the animal strains of C. pneumoniae have a slightly larger genome, enabling them to cope better with metabolite restrictions. It therefore appears that as the evolutionarily more ancient animal strains have evolved to infect humans, they have selectively become more "susceptible" to the levels of key metabolites, such as tryptophan. While this might initially appear to be a weakness, it allows these human C. pneumoniae strains to exquisitely sense host immune attack and respond by rapidly reverting to a persistent phase. During persistence, they reduce their metabolic levels, halting progression of their developmental cycle, waiting until the hostile external conditions have passed before they reemerge.
Abstract:
Background Chronic kidney disease (CKD) is a complex health problem that requires individuals to invest considerable time and energy in managing their health and adhering to multifaceted treatment regimens. Objectives To review studies delivering self-management interventions to people with CKD (Stages 1–4) and assess whether these interventions improve patient outcomes. Design Systematic review. Methods Nine electronic databases (MEDLINE, CINAHL, EMBASE, ProQuest Health & Medical Complete, ProQuest Nursing & Allied Health, The Cochrane Library, The Joanna Briggs Institute EBP Database, Web of Science and PsycINFO) were searched using relevant terms for papers published between January 2003 and February 2013. Results The search strategy identified 2,051 papers, of which 34 were retrieved in full; only 5 studies, involving 274 patients, met the inclusion criteria. Three studies were randomised controlled trials, a variety of methods were used to measure outcomes, and four studies included a nurse on the self-management intervention team. There was little consistency in the delivery, intensity, duration and format of the self-management programmes. There is some evidence that knowledge and health-related quality of life improved. Generally, small effects were observed for levels of adherence and for progression of CKD according to physiologic measures. Conclusion The effectiveness of self-management programmes in CKD (Stages 1–4) cannot be conclusively ascertained, and further research is required. It is desirable that individuals with CKD are supported to effectively self-manage the day-to-day aspects of their health.
Abstract:
Advances in modern information and communication technology (ICT) continue to address the challenges of chronic diseases such as prostate cancer and to improve health outcomes for their survivors. Survivorship management is increasingly becoming an important need for survivors managing their chronic conditions. Technology interventions such as telehealth, as well as self-managed technology applications, have shown potential to improve survivorship outcomes. However, the application of these tools should be supported by strong health-economics evidence. This work discusses the challenges of technology-led survivorship care models and presents an integrated approach to address these challenges.
Abstract:
Chronic kidney disease (CKD) in ageing is a burden on health systems worldwide. Rat models of age-related CKD linked with obesity and hypertension were used to investigate alterations in oxidant handling and energy metabolism, with the aim of identifying gene targets or markers for age-related CKD. Young adult (3 months) and old (21–24 months) spontaneously hypertensive (SHR), normotensive Wistar-Kyoto (WKY) and Wistar rats (normotensive, obese in ageing) were compared for renal functional and physiological parameters, renal fibrosis and inflammation, oxidative stress (heme oxygenase-1/HO-1), apoptosis and cell injury (including Bax:Bcl-2), phosphorylated and non-phosphorylated forms of oxidant- and energy-sensing proteins (p66Shc, AMPK), signal transduction proteins (ERK1/2, PKB), and transcription factors (NF-κB, FoxO1). All old rats were normoglycemic. Renal fibrosis, tubular epithelial apoptosis, interstitial macrophages and myofibroblasts (all p < 0.05), p66Shc/phospho-p66 (p < 0.05), the Bax:Bcl-2 ratio (p < 0.05) and NF-κB expression (p < 0.01) were highest in old obese Wistars. Expression of phospho-FoxO/FoxO was elevated in old Wistars (p < 0.001) and WKYs (p < 0.01); SHRs had high levels at both ages. Expression of PKB, phospho-PKB, ERK1/2 and phospho-ERK1/2 was significantly elevated in all aged animals. These results suggest that obesity and hypertension engage differing oxidant-handling and signalling pathways that act in the pathogenesis of age-related CKD.
Abstract:
The prevalence of leg ulcers is 0.12%–1.1%, and >3,000 lower-limb amputations are performed yearly in Australia due to non-healing leg or foot ulcers. Although evidence on leg ulcer management is available, a significant evidence-practice gap exists. To identify current leg ulcer management, a cross-sectional retrospective study was undertaken in Brisbane, Australia. A sample of 104 clients was recruited from a community specialist wound clinic and a tertiary hospital outpatient specialist wound clinic. All clients had had an ulcer below the knee or on the foot for ≥4 weeks. Data were collected on ulcer care, health service usage and clinical history for the year prior to admission. On admission, participants reported having had their ulcer for a median of 25 weeks (range 2–728 weeks), with 51% (53/104) reporting an ulcer duration of ≥24 weeks. Including the wound clinic, participants sought ulcer care from a median of 3 health care providers (range 2–7). General practitioners provided ulcer care to 82% of participants. Nearly half (42%) had self-cared for their ulcer; 29% (30/104) received treatment from a community nurse. A gap was found between the community-based ulcer care experienced by this population and evidence-based guidelines with regard to assessment, management, advice, and referrals.
Abstract:
Rationale Nutritional support is effective in managing malnutrition in COPD (Collins et al., 2012), leading to functional improvements (Collins et al., 2013). However, comparative trials of first-line interventions are lacking. This randomised trial compared the effectiveness of individualised dietary advice from a dietitian (DA) versus oral nutritional supplements (ONS). Methods A target sample of 200 stable COPD outpatients at risk of malnutrition ('MUST'; medium + high risk) were randomised to either a 12-week intervention of ONS (~400 kcal/d, ~40 g/d protein) or DA with supportive written advice. The primary outcome was quality of life (QoL) measured using St George's Respiratory Questionnaire, with secondary outcomes including handgrip strength, body weight and nutritional intake. Both the change from baseline and the differences between groups were analysed using SPSS version 20. Results 84 outpatients were recruited (ONS: 41 vs. DA: 43), and 72 completed the intervention (ONS: 33 vs. DA: 39). Mean BMI was 18.2 SD 1.6 kg/m2, age 72.6 SD 10 years, FEV1 % predicted 36 SD 15% (severe COPD). Compared with the DA group, the ONS group experienced significantly greater improvements in protein intake above baseline values at both week 6 (+21.0 SEM 4.3 g/d vs. +0.52 SEM 4.3 g/d; p < 0.001) and week 12 (+19.0 SEM 5.0 g/d vs. +1.0 SEM 3.6 g/d; p = 0.033; ANOVA). QoL and secondary outcomes remained stable at 12 weeks in both groups, with slight improvements in the ONS group but no differences between groups. Conclusion In outpatients with severe COPD at risk of malnutrition, nutritional support involving either ONS or DA appears to maintain nutritional status, functional capacity and QoL. However, larger trials, and earlier, multi-modal nutritional interventions of extended duration, should be explored.
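For context, the 'MUST' screen referenced above combines a BMI score, an unplanned weight-loss score, and an acute-disease-effect score into low-, medium- and high-risk categories; this trial recruited the medium- and high-risk groups. A minimal Python sketch of the published scoring rules, with hypothetical inputs for illustration:

```python
def must_score(bmi, weight_loss_pct, acutely_ill_no_intake=False):
    """'Malnutrition Universal Screening Tool' ('MUST') score.

    bmi: body mass index in kg/m^2.
    weight_loss_pct: unplanned weight loss over the past 3-6 months (%).
    acutely_ill_no_intake: acutely ill and no nutritional intake
        expected for more than 5 days.
    Returns (total_score, risk_category).
    """
    bmi_score = 0 if bmi > 20.0 else (1 if bmi >= 18.5 else 2)
    loss_score = 0 if weight_loss_pct < 5 else (1 if weight_loss_pct <= 10 else 2)
    acute_score = 2 if acutely_ill_no_intake else 0
    total = bmi_score + loss_score + acute_score
    risk = "low" if total == 0 else ("medium" if total == 1 else "high")
    return total, risk

# A BMI at the trial's mean of 18.2 kg/m^2 alone already scores 2 (high risk).
print(must_score(bmi=18.2, weight_loss_pct=0))  # (2, 'high')
```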
Abstract:
The evidence for nutritional support in COPD is based almost entirely on oral nutritional supplements (ONS), yet despite this, dietary counselling and food fortification (DA) are often used as the first-line treatment for malnutrition. This study aimed to investigate the effectiveness of ONS vs. DA in improving nutritional intake in malnourished outpatients with COPD. 70 outpatients (BMI 18.4 SD 1.6 kg/m2, age 73 SD 9 years, severe COPD) were randomised to receive a 12-week intervention of either ONS or DA (n = 33 ONS vs. n = 37 DA). Paired t-test analysis revealed that total energy intake significantly increased with ONS at week 6 (+302 SD 537 kcal/d; p = 0.002), with a slight reduction at week 12 (+243 SD 718 kcal/d; p = 0.061), returning to baseline levels on stopping supplementation. DA resulted in small increases in energy intake that only reached significance 3 months post-intervention (week 6: +48 SD 623 kcal/d, p = 0.640; week 12: +157 SD 637 kcal/d, p = 0.139; week 26: +247 SD 592 kcal/d, p = 0.032). Protein intake was significantly higher in the ONS group at both weeks 6 and 12 (ONS: +19.0 SD 25.0 g/d vs. DA: +1.0 SD 13.0 g/d; p = 0.033, ANOVA), but no differences were found at week 26. Vitamin C, iron and zinc intakes significantly increased only in the ONS group. ONS significantly increased energy, protein and several micronutrient intakes in malnourished COPD patients, but only during the period of supplementation. Trials investigating the effects of combined nutritional interventions are required.
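The analysis just described pairs within-group change tests (paired t-tests) with a between-group comparison (ANOVA). A minimal Python sketch of the same pattern using SciPy and hypothetical intake data; the original analysis was run in SPSS, and with two groups the one-way ANOVA is equivalent to an unpaired t-test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical daily energy intakes (kcal/d) at baseline and week 6
# for the ONS group (same patients at both time points: paired design).
baseline = rng.normal(1600, 300, size=33)
week6 = baseline + rng.normal(300, 500, size=33)

# Within-group change from baseline: paired t-test.
t, p = stats.ttest_rel(week6, baseline)
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")

# Between-group comparison of the changes (ONS vs. DA): one-way ANOVA.
ons_change = week6 - baseline
da_change = rng.normal(50, 600, size=37)  # hypothetical DA-group changes
f, p = stats.f_oneway(ons_change, da_change)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.3f}")
```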
Abstract:
Deprivation has previously been shown to be an independent risk factor for the high prevalence of malnutrition observed in COPD (Collins et al., 2010). It has been suggested that the socioeconomic gradient observed in COPD is greater than in any other chronic disease (Prescott & Vestbo, 1999). The current study aimed to examine the influence of disease severity and social deprivation on malnutrition risk in outpatients with COPD. 424 COPD outpatients were screened using the 'Malnutrition Universal Screening Tool' ('MUST'). COPD disease severity was recorded in accordance with the GOLD criteria, and deprivation was established from the patient's geographical location (postcode) at the time of nutritional screening using the UK Government's Index of Multiple Deprivation (IMD). The IMD ranks postcodes from 1 (most deprived) to 32,482 (least deprived). Disease severity was positively associated with an increased prevalence of malnutrition risk (p < 0.001) both within and between groups, whilst IMD rank was negatively associated with malnutrition (p = 0.020), i.e. those residing in less deprived areas were less likely to be malnourished. Within each category of disease severity, the prevalence of malnutrition was two-fold greater in those residing in the most deprived areas compared with those residing in the least deprived areas. This study suggests that deprivation and disease severity are independent risk factors for malnutrition in COPD, both contributing to its widely variable prevalence. Consideration of these issues could assist with the targeted nutritional management of these patients.
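One common way to test the kind of association reported above (malnutrition risk falling as IMD rank rises) is a logistic regression of the binary risk outcome on IMD rank. The abstract does not specify the exact test used, so the following Python sketch with simulated data is illustrative only:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Simulated data: IMD rank (1 = most deprived, 32,482 = least deprived)
# and a binary malnutrition-risk flag for 424 outpatients.
imd_rank = rng.integers(1, 32_483, size=424).astype(float)
true_logit = 0.5 - 4e-5 * imd_rank  # built-in protective effect of rank
malnourished = (rng.random(424) < 1 / (1 + np.exp(-true_logit))).astype(int)

# Logistic regression: a negative rank coefficient means residents of
# less deprived (higher-rank) areas are less likely to be malnourished.
X = sm.add_constant(imd_rank)
fit = sm.Logit(malnourished, X).fit(disp=False)
print(fit.params)   # intercept and rank coefficient
print(fit.pvalues)  # p-value on the rank coefficient tests the association
```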
Abstract:
SETTING National household survey of adults in South Africa, a middle-income country. OBJECTIVE To determine the prevalence and predictors of chronic bronchitis. DESIGN A stratified national probability sample of households was selected. All adults in the selected households were interviewed. Chronic bronchitis was defined as chronic productive cough. Socio-demographic predictors were wealth, education, race, age and urban residence. Personal and exposure variables included history of tuberculosis, domestic exposure to smoky fuels, occupational exposures, smoking and body mass index. RESULTS The overall prevalence of chronic bronchitis was 2.3% in men and 2.8% in women. The strongest predictor of chronic bronchitis was a history of tuberculosis (men, odds ratio [OR] 4.9; 95% confidence interval [CI] 2.6-9.2; women, OR 6.6; 95% CI 3.7-11.9). Other risk factors were smoking, occupational exposure (in men), domestic exposure to smoky fuel (in women) and (in univariate analysis only) being underweight. Wealth and particularly education were protective. CONCLUSION The pattern of chronic bronchitis in South Africa suggests a combination of risk factors that includes not only smoking but also tuberculosis, occupational exposures in men and domestic fuel exposure in women. Control of these risk factors requires public health action across a broad front. The protective role of education requires elucidation.
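Odds ratios and confidence intervals of the kind quoted above follow the standard 2x2-table calculation, OR = (a*d)/(b*c), with a Wald 95% CI computed on the log scale. A minimal Python sketch with hypothetical counts (the published figures come from the survey's own models):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
        a = exposed cases,    b = exposed non-cases,
        c = unexposed cases,  d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: chronic bronchitis by history of tuberculosis.
print(odds_ratio_ci(a=30, b=170, c=60, d=1740))  # OR ~ 5.1
```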