6 results for SYMPTOM ASSESSMENT SCALE
Abstract:
BACKGROUND: Vascular dementia is the second most common cause of dementia affecting over seven million people worldwide, yet there are no licensed treatments. There is an urgent need for a clinical trial in this patient group. Subcortical ischaemic vascular dementia is the most common variant of vascular dementia. This randomised trial will investigate whether use of calcium channel blockade with amlodipine, a commonly used agent, can provide the first evidence-based pharmacological treatment for subcortical ischaemic vascular dementia.
METHODS/DESIGN: This is a randomised controlled trial of calcium channel blockade with Amlodipine For the treatment oF subcortical ischaEmic vasCular demenTia (AFFECT) to test the hypothesis that treatment with amlodipine can improve outcomes for these patients in a phase IIb, multi-centre, double-blind, placebo-controlled randomised trial. The primary outcome is the change from baseline to 12 months in the Vascular Dementia Assessment Scale cognitive subscale (VADAS-cog). Secondary outcomes include cognitive function, executive function, clinical global impression of change, change in blood pressure, quantitative evaluation of lesion accrual based on magnetic resonance imaging (MRI), health-related quality of life, activities of daily living, non-cognitive dementia symptoms, care-giver burden and care-giver health-related quality of life, cost-effectiveness and institutionalisation. A total of 588 patients will be randomised in a 1:1 ratio to either amlodipine or placebo, recruited from sites across the UK and enrolled in the trial for 104 weeks.
DISCUSSION: There are no treatments licensed for vascular dementia. The most common subtype is subcortical ischaemic vascular dementia (SIVD). This study is designed to investigate whether amlodipine can produce benefits compared to placebo in established SIVD. It is estimated that the number of people with vascular dementia and SIVD will increase globally in the future, and the results of this study should inform important treatment decisions.
Abstract:
The branched vs. isoprenoid tetraether (BIT) index is based on the relative abundance of branched tetraether lipids (brGDGTs) and the isoprenoidal GDGT crenarchaeol. In Lake Challa sediments the BIT index has been applied as a proxy for local monsoon precipitation on the assumption that the primary source of brGDGTs is soil washed in from the lake's catchment. Since then, microbial production within the water column has been identified as the primary source of brGDGTs in Lake Challa sediments, meaning that either an alternative mechanism links BIT index variation with rainfall or that the proxy's application must be reconsidered. We investigated GDGT concentrations and BIT index variation in Lake Challa sediments at a decadal resolution over the past 2200 years, in combination with GDGT time-series data from 45 monthly sediment-trap samples and a chronosequence of profundal surface sediments.
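The BIT index described above is conventionally computed as the summed abundance of the branched GDGTs divided by that sum plus the crenarchaeol abundance, yielding a value between 0 and 1. A minimal sketch, assuming the brGDGT abundances have already been summed (the function name and interface are illustrative, not from the study):

```python
def bit_index(br_gdgts: float, crenarchaeol: float) -> float:
    """Branched vs. isoprenoid tetraether (BIT) index.

    br_gdgts: summed abundance of the branched GDGTs
    crenarchaeol: abundance of the isoprenoidal GDGT crenarchaeol
    Returns 0.0 (crenarchaeol only) to 1.0 (brGDGTs only).
    """
    total = br_gdgts + crenarchaeol
    if total == 0:
        raise ValueError("no GDGTs measured")
    return br_gdgts / total

# Equal abundances give an intermediate value:
print(bit_index(10.0, 10.0))  # 0.5
```

Because crenarchaeol appears only in the denominator, a drop in thaumarchaeotal crenarchaeol production drives the index up even when brGDGT input is unchanged, which is the behaviour the record described below relies on.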
Our 2200-year geochemical record reveals high-frequency variability in GDGT concentrations, and therefore in the BIT index, superimposed on distinct lower-frequency fluctuations at multi-decadal to century timescales. These changes in BIT index are correlated with changes in the concentration of crenarchaeol but not with those of the brGDGTs. A clue for understanding the indirect link between rainfall and crenarchaeol concentration (and thus thaumarchaeotal abundance) was provided by the observation that surface sediments collected in January 2010 show a distinct shift in GDGT composition relative to sediments collected in August 2007. This shift is associated with increased bulk flux of settling mineral particles with high Ti / Al ratios during March–April 2008, reflecting an event of unusually high detrital input to Lake Challa concurrent with intense precipitation at the onset of the principal rain season that year. Although brGDGT distributions in the settling material are initially unaffected, this soil-erosion event is succeeded by a massive dry-season diatom bloom in July–September 2008 and a concurrent increase in the flux of GDGT-0. Complete absence of crenarchaeol in settling particles during the austral summer following this bloom indicates that no Thaumarchaeota bloom developed at that time. We suggest that increased nutrient availability, derived from the eroded soil washed into the lake, caused the massive bloom of diatoms and that the higher concentrations of ammonium (formed from breakdown of this algal matter) resulted in a replacement of nitrifying Thaumarchaeota, which in typical years prosper during the austral summer, by nitrifying bacteria. The decomposing dead diatoms passing through the suboxic zone of the water column probably also formed a substrate for GDGT-0-producing archaea. Hence, through a cascade of events, intensive rainfall affects thaumarchaeotal abundance, resulting in high BIT index values.
Decade-scale BIT index fluctuations in Lake Challa sediments exactly match the timing of three known episodes of prolonged regional drought within the past 250 years. Additionally, the principal trends of inferred rainfall variability over the past two millennia are consistent with the hydroclimatic history of equatorial East Africa, as documented in other (but less well dated) regional lake records. We therefore propose that variation in GDGT production originating from the episodic recurrence of strong soil-erosion events, when integrated over (multi-)decadal and longer timescales, generates a stable positive relationship between the sedimentary BIT index and monsoon rainfall at Lake Challa. Applying this paleoprecipitation proxy at other sites requires ascertaining the local processes that affect the production of crenarchaeol by Thaumarchaeota and of the brGDGTs.
Abstract:
With the new academic year structure encouraging more in-term assessment to replace end-of-year examinations, one of the problems we face is assessing students and keeping track of individual student learning without overloading students and staff with excessive assessment burdens.
In the School of Electronics, Electrical Engineering and Computer Science, we have constructed a system that allows students to self-assess their capability on a simple Yes/No/Don't Know scale against fine-grained learning outcomes for a module. As the term progresses, students update their record as appropriate, including selecting a Learnt option to reflect improvements they have gained as part of their studies.
In the system, each learning outcome is linked to the relevant teaching sessions (lectures and labs) and to online resources that students can access at any time. Students can structure their own learning experience around their needs and preferences in order to attain the learning outcomes.
The system keeps a history of each student's record, allowing the lecturer to observe how students' abilities progress over the term and to compare this progression with assessment results. The system also keeps a record of any resource links the student has clicked on, together with the related learning outcome.
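One possible shape for such a per-outcome record, combining the Yes/No/Don't Know/Learnt rating with a timestamped history, is sketched below. The abstract does not describe the system's actual data model, so all names here are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Rating(Enum):
    """Self-assessment options offered to the student (hypothetical names)."""
    YES = "Yes"
    NO = "No"
    DONT_KNOW = "Don't Know"
    LEARNT = "Learnt"  # recorded when a student reports an improvement


@dataclass
class OutcomeRecord:
    """One student's self-assessment history against one learning outcome."""
    outcome_id: str
    history: list = field(default_factory=list)  # (date, Rating) pairs, oldest first

    def update(self, rating: Rating, on: date) -> None:
        # Append rather than overwrite, so the lecturer can see progression.
        self.history.append((on, rating))

    def current(self) -> Rating:
        return self.history[-1][1]
```

Keeping the full history as an append-only list is what lets the lecturer compare the trajectory of self-ratings against later assessment results, rather than seeing only the latest state.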
Initial work is comparing the accuracy of the student self-assessments with their performance on the related questions in the traditional end-of-year examination.
Abstract:
The assessment of adolescent drinking behavior is a complex task, complicated by variability in drinking patterns, the transitory and developmental nature of the behavior, and the reliance (for large-scale studies) on self-report questionnaires. The Adolescent Alcohol Involvement Scale (Mayer & Filstead, 1979) is a 14-item screening tool designed to help identify alcohol misusers or more problematic drinkers. The present study utilized a large sample (n = 4066) of adolescents from Northern Ireland. Results of confirmatory factor analyses and reliability estimates revealed that the 14 items share sufficient common variance that scores can be considered reliable, and that the 14 items can be scored to provide a composite alcohol use score.
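The abstract does not name the reliability estimate used, but Cronbach's alpha is the standard internal-consistency statistic for justifying a summed composite score of this kind. A minimal sketch of its computation (not the study's actual analysis code):

```python
def cronbach_alpha(items: list) -> float:
    """Cronbach's alpha for k questionnaire items.

    items: list of k equal-length lists, one list of scores per item (k >= 2).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

For a 14-item instrument like the one above, the call is the same with 14 score lists; alpha near 1 indicates the items share enough common variance to be summed into a single composite.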
Abstract:
Setting: Psychological stress is increasingly recognised within emergency medicine, given the environmental and clinical stressors associated with the specialism. The current study assessed whether psychological distress is experienced by emergency medical staff and, if so, what the expressed need is within this population. Participants: Participants included ambulance personnel, nursing staff, doctors and ancillary support staff within two Accident and Emergency (A&E) departments and twelve ambulance bases within one Trust locality in Northern Ireland (N = 107). Primary and secondary outcome measures: The General Health Questionnaire (GHQ-12; Goldberg, 1972, 1978), the Secondary Traumatic Stress Scale (STSS; Bride, 2004) and an assessment-of-need questionnaire were completed and explored using mixed-methods analysis. Results: Results showed elevated levels of psychological distress within each profession except ambulance service clinical support officers (CSOs). Elevated levels of secondary trauma symptomatology were also found; the highest were within some nursing grades and junior doctors. Decreased enjoyment of the job over time was significantly associated with higher scores. Analysis of qualitative data identified sources of stress, including low morale. A total of 65% of participants thought that work-related stressors had negatively affected their mental health. Participants explored what they felt could decrease psychological distress, including improved resources and psychoeducation. Conclusion: There were elevated levels of distress and secondary traumatic stress within this population, as well as an expressed level of need at both systemic and support levels.
Abstract:
Situation Background Assessment and Recommendation (SBAR): Undergraduate Perspectives. C Morgan, L Adams, J Murray, R Dunlop, IK Walsh. Ian K Walsh, Centre for Medical Education, Queen's University Belfast, Mulhouse Building, Royal Victoria Hospital, Grosvenor Road, Belfast BT12 6DP. Background and Purpose: Structured communication tools are used to improve the quality of team communication.1,2 The Situation Background Assessment and Recommendation (SBAR) tool is widely adopted within patient safety.3 Reports of SBAR effectiveness are equivocal, suggesting its use is not sustained beyond initial training.4-6 Understanding the perspectives of those using SBAR may further improve clinical communication. We investigated senior medical undergraduate perspectives on SBAR, particularly when communicating with senior colleagues. Methodology: Mixed-methods data collection was used. A previously piloted questionnaire with 12 five-point Likert-scale questions and 3 open questions was given to all final-year medical students. A subgroup also participated in 10 focus groups, using strictly structured, audio-recorded questions. Selection was by convenience sampling; data were gathered from open-text questions, and comments were transcribed verbatim. In-vivo coding (iterative, towards data saturation) preceded thematic analysis. Results: 233 of 255 students (91%) completed the survey. 1. There were clearly contradictory viewpoints on SBAR usage; a recurrent theme was a desire for formal feedback and a relative lack of practice/experience with SBAR. 2. Students reported that SBAR was interpreted variably between individuals, limiting its use as a shared mental model. 3. Brief training sessions are insufficient to embed the tool. 4. Most students reported that SBAR helped effective communication, especially by providing structure in stressful situations. 5. Only 18.5% of students felt an alternative resource might be needed. Sub-analysis of the themes highlighted: A. lack of clarity regarding what information to include and where to place it within the acronym; B. negative responses from senior colleagues to SBAR; C. lack of conciseness with the tool. Discussion and Conclusions: Despite a wide range of contradictory interpretations of SBAR utility, most students wish to retain the resource. More practice opportunities and feedback may enhance user confidence and understanding. References: (1) Leonard M, Graham S, Bonacum D. The human factor: the critical importance of effective teamwork and communication in providing safe care. Quality & Safety in Health Care 2004 Oct;13(Suppl 1):85-90. (2) d'Agincourt-Canning LG, Kissoon N, Singal M, Pitfield AF. Culture, communication and safety: lessons from the airline industry. Indian J Pediatr 2011 Jun;78(6):703-708. (3) Dunsford J. Structured communication: improving patient safety with SBAR. Nurs Womens Health 2009 Oct;13(5):384-390. (4) Compton J, Copeland K, Flanders S, Cassity C, Spetman M, Xiao Y, et al. Implementing SBAR across a large multihospital health system. Jt Comm J Qual Patient Saf 2012 Jun;38(6):261-268. (5) Ludikhuize J, de Jonge E, Goossens A. Measuring adherence among nurses one year after training in applying the Modified Early Warning Score and Situation-Background-Assessment-Recommendation instruments. Resuscitation 2011 Nov;82(11):1428-1433. (6) Cunningham NJ, Weiland TJ, van Dijk J, Paddle P, Shilkofski N, Cunningham NY. Telephone referrals by junior doctors: a randomised controlled trial assessing the impact of SBAR in a simulated setting. Postgrad Med J 2012 Nov;88(1045):619-626.