191 results for Active older adults
Abstract:
In 2007, a comprehensive review of the extant research on nonpharmacological interventions for persons with early-stage dementia was conducted. More than 150 research reports, centered on six major domains, were included: early-stage support groups, cognitive training and enhancement programs, exercise programs, exemplar programs, health promotion programs, and “other” programs not fitting into previous categories. Theories of neural regeneration and plasticity were most often used to support the tested interventions. Recommendations for practice, research, and health policy are outlined, including evidence-based, nonpharmacological treatment protocols for persons with mild cognitive impairment and early-stage dementia. A tested, community-based, multimodal treatment program is also described. Overall, findings identify well-supported nonpharmacological treatments for persons with early-stage dementia and implications for a national health care agenda to optimize outcomes for this growing population of older adults.
Abstract:
Objective: To investigate the acute effects of isolated eccentric and concentric calf muscle exercise on Achilles tendon sagittal thickness.
Design: Within-subject, counterbalanced, mixed design.
Setting: Institutional.
Participants: 11 healthy, recreationally active male adults.
Interventions: Participants performed an exercise protocol involving isolated eccentric loading of the Achilles tendon of a single limb and isolated concentric loading of the contralateral limb, both with the addition of 20% bodyweight.
Main outcome measurements: Sagittal sonograms were acquired prior to, immediately following, and 3, 6, 12 and 24 h after exercise. Tendon thickness was measured 2 cm proximal to the superior aspect of the calcaneus.
Results: Both loading conditions resulted in an immediate decrease in normalised Achilles tendon thickness. Eccentric loading induced a significantly greater decrease than concentric loading despite a similar impulse (−0.21 vs −0.05, p<0.05). Post-exercise, eccentrically loaded tendons recovered exponentially, with a recovery time constant of 2.5 h. The same exponential function did not adequately model changes in tendon thickness resulting from concentric loading. Even so, recovery pathways subsequent to the 3 h time point were comparable. Regardless of the exercise protocol, full tendon thickness recovery was not observed until 24 h.
Conclusions: Eccentric loading invokes a greater reduction in Achilles tendon thickness immediately after exercise, but the tendon appears to recover fully in a similar time frame to concentric loading.
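The reported recovery follows a simple exponential decay toward baseline with a time constant of roughly 2.5 h. The sketch below illustrates how such a time constant could be estimated from serial thickness measurements; the intermediate data points and starting guesses are hypothetical, not the study's data.

```python
# Illustrative sketch (not the authors' analysis code): fitting an exponential
# recovery curve to the normalised Achilles tendon thickness change after loading.
# Scan times follow the protocol; the deficit values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, delta0, tau):
    """Normalised thickness deficit that decays exponentially toward baseline."""
    return delta0 * np.exp(-t / tau)

t_hours = np.array([0.0, 3.0, 6.0, 12.0, 24.0])        # scan times after exercise
deficit = np.array([-0.21, -0.07, -0.02, -0.01, 0.0])  # hypothetical normalised changes

params, _ = curve_fit(recovery, t_hours, deficit, p0=(-0.2, 2.0))
delta0_hat, tau_hat = params
print(f"estimated immediate change: {delta0_hat:.2f}, time constant: {tau_hat:.1f} h")
```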
Abstract:
Vitamin D, along with calcium, may help decrease the risk of falls and fractures in older adults. Sunlight and other sources of ultraviolet radiation are not recommended because they increase the risk of skin cancers and sun-induced eye disorders. Rather, vitamin D and calcium needs should be met through foods and dietary supplements. As a preventive measure to reduce the risk of falls and fractures, it is recommended that older adults meet the 2005 Dietary Guidelines and consume 1000 IU of vitamin D, preferably as vitamin D3.
Abstract:
Hazard perception in driving involves a number of different processes. This paper reports the development of two measures designed to separate these processes. A Hazard Perception Test was developed to measure how quickly drivers could anticipate hazards overall, incorporating detection, trajectory prediction, and hazard classification judgements. A Hazard Change Detection Task was developed to measure how quickly drivers could detect a hazard in a static image, regardless of whether they considered it hazardous. For the Hazard Perception Test, young novices were slower than mid-age experienced drivers, consistent with differences in crash risk, and test performance correlated with scores on pre-existing Hazard Perception Tests. For drivers aged 65 and over, scores on the Hazard Perception Test declined with age and correlated with both contrast sensitivity and a Useful Field of View measure. For the Hazard Change Detection Task, novices responded more quickly than the experienced drivers, contrary to crash risk trends, and test performance did not correlate with measures of overall hazard perception. However, for drivers aged 65 and over, test performance declined with age and correlated with both hazard perception and Useful Field of View. Overall, we concluded that there was support for the validity of the Hazard Perception Test for all ages, but that the Hazard Change Detection Task might only be appropriate for use with older drivers.
Abstract:
Comorbid depression and anxiety in late life present challenges for geriatric mental health care providers. These challenges include identifying the often complex diagnostic presentations both clinically and in a research context. This potent comorbidity can be conceived as double jeopardy in older adults, further diminishing their quality of life. Geriatric health care providers need to understand psychiatric comorbidity of this type for accurate diagnosis and early referral to specialists, and to coordinate interdisciplinary care. Researchers in the field also need to recognize potential multiple impacts of comorbidities with respect to assessment and treatment domains. This article describes the prevalence of late-life depression and anxiety disorders and reviews studies on this comorbidity in older adults. Risk factors and protective factors for anxiety and depression in later life are reviewed, and information is provided about comparative symptoms, the selection of assessment tools, and challenges to the provision of interdisciplinary, evidence-based care.
Abstract:
Background: Clinical practice and clinical research have made a concerted effort to move beyond the use of clinical indicators alone and embrace patient-focused care through the use of patient-reported outcomes such as health-related quality of life. However, unless patients give consistent consideration to the health states that give meaning to the measurement scales used to evaluate these constructs, longitudinal comparison of these measures may be invalid. This study aimed to investigate whether patients give consideration to a standard health state rating scale (EQ-VAS) and whether consideration of good and poor health state descriptors immediately changes their self-report. Methods: A randomised crossover trial was implemented amongst hospitalised older adults (n = 151). Patients were asked to consider descriptions of extremely good (Description-A) and poor (Description-B) health states. The EQ-VAS was administered as a self-report at baseline, after the first descriptors (A or B), then again after the remaining descriptors (B or A respectively). At baseline patients were also asked if they had considered either EQ-VAS anchor. Results: Overall, 106/151 (70%) participants changed their self-evaluation by ≥5 points on the 100-point VAS, with a mean (SD) change of +4.5 (12) points (p < 0.001). A total of 74/151 (49%) participants did not consider the best health VAS anchor; of the 77 who did, 59 (77%) thought the good health descriptors were more extreme (better) than they had previously considered. Similarly, 85/151 (56%) participants did not consider the worst health anchor; of the 66 who did, 63 (95%) thought the poor health descriptors were more extreme (worse) than they had previously considered. Conclusions: Health state self-reports may not be well considered. An immediate, significant shift in response can be elicited by exposure to a mere description of an extreme health state, despite no actual change in underlying health state occurring. Caution should be exercised in research and clinical settings when interpreting subjective patient-reported outcomes that are dependent on brief anchors for meaning. Trial Registration: Australian and New Zealand Clinical Trials Registry (#ACTRN12607000606482) http://www.anzctr.org.au
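The headline result is a within-person shift in EQ-VAS scores after exposure to the descriptors: the proportion shifting by at least 5 points and the mean paired change. A minimal sketch of that style of paired analysis is shown below, using simulated scores rather than the trial's data.

```python
# Sketch of the paired before/after comparison described above.
# Baseline and post-descriptor EQ-VAS scores are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.uniform(40, 90, size=151)                       # hypothetical baseline EQ-VAS
post = np.clip(baseline + rng.normal(4.5, 12.0, size=151), 0, 100)  # hypothetical post scores

change = post - baseline
shifted = np.mean(np.abs(change) >= 5)                         # proportion shifting >= 5 points
t_stat, p_value = stats.ttest_rel(post, baseline)              # paired t-test on the shift

print(f"proportion shifting >= 5 points: {shifted:.0%}")
print(f"mean change: {change.mean():.1f}, paired t-test p = {p_value:.3g}")
```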
Abstract:
Objective: To assemble expected values for free-living steps/day in special populations living with chronic illnesses and disabilities. Method: Studies identified since 2000 were categorized into similar illnesses and disabilities, capturing the original reference, sample descriptions, descriptions of instruments used (i.e., pedometers, piezoelectric pedometers, accelerometers), number of days worn, and mean and standard deviation of steps/day. Results: Sixty unique studies represented: 1) heart and vascular diseases, 2) chronic obstructive lung disease, 3) diabetes and dialysis, 4) breast cancer, 5) neuromuscular diseases, 6) arthritis, joint replacement, and fibromyalgia, 7) disability (including mental retardation/intellectual difficulties), and 8) other special populations. A median steps/day was calculated for each category. Waist-mounted and ankle-mounted instruments were considered separately due to fundamental differences in assessment properties. For waist-mounted instruments, the lowest median values for steps/day were found in disabled older adults (1214 steps/day), followed by people living with COPD (2237 steps/day). The highest values were seen in individuals with Type 1 diabetes (8008 steps/day), mental retardation/intellectual disability (7787 steps/day), and HIV (7545 steps/day). Conclusion: This review will be useful to researchers/practitioners who work with individuals living with chronic illness and disability and require such information for surveillance, screening, intervention, and program evaluation purposes. Keywords: Exercise; Walking; Ambulatory monitoring
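The review's core summary is a median steps/day per illness/disability category, kept separate by instrument placement. A brief sketch of that aggregation is below; the table rows are illustrative placeholders, not the review's extracted values.

```python
# Sketch of the aggregation described above: per-category medians of
# study-level mean steps/day, grouped by instrument placement.
import pandas as pd

studies = pd.DataFrame({
    "category":           ["COPD", "COPD", "Type 1 diabetes", "Disability", "HIV", "Arthritis"],
    "placement":          ["waist", "waist", "waist", "waist", "waist", "ankle"],
    "mean_steps_per_day": [2100, 2400, 7900, 1300, 7500, 5200],   # hypothetical study means
})

median_by_group = (
    studies.groupby(["category", "placement"])["mean_steps_per_day"]
           .median()
           .sort_values()
)
print(median_by_group)
```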
Abstract:
Venous leg ulceration is a serious condition affecting 1–3% of the population. Decline in the function of the calf muscle pump is correlated with venous ulceration. Many previous studies have reported an improvement in the function of the calf muscle pump, endurance of the calf muscle and increased range of ankle motion after structured exercise programs. However, there is a paucity of published research assessing whether these improvements translate into improved healing rates of venous ulcers. The primary purpose of this pilot study was to establish the feasibility of a home-based progressive resistance exercise program and examine whether there was any clinical significance or trend toward healing. The secondary aims were to examine the benefit of a home-based progressive resistance exercise program on calf muscle pump function and physical parameters. The methodology used was a randomised controlled trial in which eleven participants were randomised into an intervention (n = 6) or control (n = 5) group. Participants randomised to receive the 12-week home-based progressive resistance exercise program were instructed by the author through weekly face-to-face consultations during their wound clinic appointments. Control group participants received standard wound care and compression therapy. Changes in ulcer parameters were measured fortnightly at the clinic (number healed at 12 weeks, percentage change in area and Pressure Ulcer Scale for Healing score). An air plethysmography test was performed at baseline and following the 12 weeks of training to determine changes in calf muscle pump function. Functional measures included maximum number of heel raises (endurance), maximal isometric plantar flexion (strength) and range of ankle motion (ROAM); these tests were conducted at baseline, week 6 and week 12. The sample for the study was drawn from the Princess Alexandra Hospital in Brisbane, Australia. Participants with venous leg ulceration who met the inclusion criteria were recruited. Participants were screened via duplex scanning and ankle brachial pressure index (ABPI) to ensure they did not have any arterial complications, and were excluded if there was evidence of cellulitis. Demographic data were obtained from each participant, and details regarding medical history, quality of life and geriatric depression scores were collected at baseline. Both the intervention and control groups were required to complete a weekly exercise diary to monitor activity levels between groups. To test for the effect of the intervention over time, a repeated measures analysis of variance was conducted on the major outcome variables. Group (intervention versus control) was the between-subject factor and time (baseline, week 6, week 12) was the within-subject (repeated measures) factor. Due to the small sample size, further tests were conducted to check the assumptions of the statistical test. Mauchly's test showed that the sphericity assumption of the repeated measures ANOVA was met, and further tests confirmed that the homogeneity of variance assumption was also satisfied. Data analysis was conducted using the software package SPSS for Windows Release 17.0. The pilot study proved feasible, with all intervention participants (n = 6) continuing the resistance program for the full 12-week duration and no deleterious effects noted.
Clinical significance was observed in the intervention group, with a 32% greater change in ulcer size (p = 0.26) than the control group and a 10% greater difference in the number of ulcers healed (p = 0.74) compared with the control group. Statistical significance was observed for the ejection fraction (p = 0.05), residual volume fraction (p = 0.04) and ROAM (p = 0.01), which all improved significantly in the intervention group over time. These results are encouraging; nevertheless, further investigation seems warranted to examine the effect exercise has on the healing rates of venous leg ulcers, with multiple study sites, a larger sample size and a longer follow-up period.
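The analysis described above is a 2 (group) × 3 (time) mixed design: group is the between-subject factor and time the repeated within-subject factor. The thesis ran this in SPSS 17.0; the sketch below shows an analogous mixed ANOVA in Python using the pingouin package, on simulated placeholder data rather than the trial's measurements.

```python
# Sketch of a group (between) x time (within) mixed ANOVA, analogous to the
# SPSS repeated measures analysis described above. Outcome values are simulated.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
groups = ["intervention"] * 6 + ["control"] * 5            # n = 6 vs n = 5, as in the pilot
rows = []
for i, grp in enumerate(groups):
    for week in (0, 6, 12):
        trend = 0.6 * week if grp == "intervention" else 0.1 * week   # simulated improvement
        rows.append({
            "id": f"P{i}",
            "group": grp,
            "time": f"week_{week}",
            "ejection_fraction": 50 + trend + rng.normal(0, 3),
        })
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="ejection_fraction",
                     within="time", subject="id", between="group")
print(aov)
```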
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the ethnic, cultural, and language differences among Singaporean older adults, the results from non-Asian validation studies may not be applicable. Therefore, it is important to identify validated population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore. The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward.
The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression. The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutrition screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing a decline in nutritional status (defined by weight loss >1% per week). Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutrition screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore.
As older adults may have developed malnutrition prior to hospital admission, or may experience clinically significant weight loss of >1% per week during hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
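Two quantitative steps recur in this abstract: validating assessment tools against clinical outcomes with multivariate logistic regression, and validating screening tools against the SGA via AUC, sensitivity and specificity. The sketch below illustrates the second of these on simulated data; the scores, cut-off and labels are hypothetical, not the thesis data.

```python
# Illustrative sketch of validating a screening score against a reference
# standard (SGA): AUC from the continuous score, then sensitivity/specificity
# at a chosen cut-off. All values are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(2)
n = 281
sga_malnourished = rng.random(n) < 0.35                    # ~35% prevalence, as reported
score = rng.normal(2.0, 1.0, n) + 2.0 * sga_malnourished   # hypothetical screening scores

auc = roc_auc_score(sga_malnourished, score)

cutoff = 3.0                                               # hypothetical screening cut-off
flagged = score >= cutoff
tn, fp, fn, tp = confusion_matrix(sga_malnourished, flagged).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(f"AUC: {auc:.3f}, sensitivity: {sensitivity:.0%}, specificity: {specificity:.0%}")
```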
Abstract:
Background and aim: Falls are the leading cause of injury in older adults. Identifying people at risk before they experience a serious fall requiring hospitalisation allows an opportunity to intervene earlier and potentially reduce further falls and subsequent healthcare costs. The purpose of this project was to develop a referral pathway to a community falls-prevention team for older people who had experienced a fall attended by a paramedic service and who were not transported to hospital. It was also hypothesised that providing intervention to this group of clients would reduce future falls-related ambulance call-outs, emergency department presentations and hospital admissions. Methods: An education package, referral pathway and follow-up procedures were developed. Both services had regular meetings, and work shadowing with the paramedics was also trialled to encourage more referrals. A range of demographic and other outcome measures were collected to compare people referred through the paramedic pathway and through traditional pathways. Results: Internal data from the Queensland Ambulance Service indicated that there were approximately six falls per week by community-dwelling older persons in the eligible service catchment area (south west Brisbane metropolitan area) who were attended to by Queensland Ambulance Service paramedics but not transported to hospital during the 2-year study period (2008–2009). Of the potential 638 eligible patients, only 17 (2.6%) were referred for a falls assessment. Conclusion: Although this pilot programme had support from all levels of management as well as from the service providers, it did not translate into actual referrals. Several explanations are provided for these preliminary findings.
Abstract:
Being a grandparent is an important and valued role for many older adults, who often have strong views about the type of grandparent they will be and what they will teach their grandchild. When their grandchild has a disability, grandparents may have to significantly adjust their expectations and interactions. This research explores whether and how having a grandchild with a disability influences grandparents' sense of identity and enactment of the grandparent role. Using qualitative purposive sampling, semi-structured interviews were conducted with 22 grandparents of children with an intellectual and/or physical disability residing in Brisbane, Australia. A thematic analysis identified three key themes characterising grandparents' views: formation of grandparenting identity, styles of grandparenting, and role enactment. The results highlight the critical role of grandparents when a child has a disability, illustrating that the grandparenting experience and role enactment may be universal, with only the context and delivery varying.
Abstract:
Purpose. The Useful Field of View (UFOV®) test has been shown to be highly effective in predicting crash risk among older adults. An important question that we examined in this study is whether this association is due to the ability of the UFOV to predict difficulties in attention-demanding driving situations that involve either visual or auditory distracters. Methods. Participants included 92 community-living adults (mean age 73.6 ± 5.4 years; range 65-88 years) who completed all three subtests of the UFOV involving assessment of visual processing speed (subtest 1), divided attention (subtest 2), and selective attention (subtest 3); driving safety risk was also classified using the UFOV scoring system. Driving performance was assessed separately on a closed-road circuit while driving under three conditions: no distracters, visual distracters, and auditory distracters. Driving outcome measures included road sign recognition, hazard detection, gap perception, time to complete the course, and performance on the distracter tasks. Results. Those rated as safe on the UFOV (safety rating categories 1 and 2), as well as those responding faster than the recommended cut-off on the selective attention subtest (350 msec), performed significantly better in terms of overall driving performance and also experienced less interference from distracters. Of the three UFOV subtests, the selective attention subtest best predicted overall driving performance in the presence of distracters. Conclusions. Older adults who were rated as higher risk on the UFOV, particularly on the selective attention subtest, demonstrated the poorest driving performance in the presence of distracters. This finding suggests that the selective attention subtest of the UFOV may be differentially more effective in predicting driving difficulties in situations of divided attention, which are commonly associated with crashes.
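The analysis dichotomises drivers in two ways: by the UFOV overall safety rating (categories 1 and 2 treated as safe) and by the 350 ms cut-off on the selective attention subtest. A minimal sketch of those groupings is below; the function name and example values are illustrative only, not part of the UFOV instrument.

```python
# Minimal sketch of the two dichotomies used in the analysis above.
def ufov_groups(safety_category: int, selective_attention_ms: float) -> dict:
    """Return the safety-rating and selective-attention groupings described in the abstract."""
    return {
        "safe_rating": safety_category in (1, 2),           # categories 1-2 = rated safe
        "below_cutoff": selective_attention_ms < 350.0,      # faster than the 350 ms cut-off
    }

print(ufov_groups(safety_category=2, selective_attention_ms=310.0))
print(ufov_groups(safety_category=4, selective_attention_ms=420.0))
```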
Abstract:
Introduction: The suitability of video conferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of the diagnosis of dementia via VC. Methods: This was a multisite, noninferiority, prospective cohort study. Patients, aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (the reference standard, reflecting usual clinical practice) and an additional assessment (either a usual FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented). Results: The 205 patients were allocated to either the videoconference group (n = 100) or the standard practice group (n = 105); 106 were men. The average age was 76 (SD 9, range 51–95) and the average Standardized Mini-Mental State Examination score was 23.9 (SD 4.7, range 9–30). Agreement for the videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and agreement for the standard practice group (P0 = 0.70; Kw = 0.50; P < .0001) were both statistically significant (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment. Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC. This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating either locally or remotely administered standardized assessments, together with remote specialist assessment, is a reliable process for enabling the diagnosis of dementia for isolated older adults.
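Agreement here is summarised by percentage agreement (P0) and a linearly weighted kappa (Kw) over a three-level rating (normal / impaired / demented). The sketch below shows how those two statistics can be computed; the paired ratings are hypothetical examples, not study data.

```python
# Sketch of the agreement statistics named in the abstract: percentage agreement
# and a linearly weighted Cohen's kappa on a three-level cognitive rating.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# 0 = cognition normal, 1 = impaired, 2 = demented (hypothetical rating pairs)
reference_rating = np.array([2, 2, 1, 0, 1, 2, 0, 1, 2, 1])
video_rating     = np.array([2, 1, 1, 0, 1, 2, 0, 2, 2, 1])

percent_agreement = np.mean(reference_rating == video_rating)
kappa_linear = cohen_kappa_score(reference_rating, video_rating, weights="linear")

print(f"P0 = {percent_agreement:.2f}, weighted kappa (linear) = {kappa_linear:.2f}")
```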
Abstract:
Objectives: Malnutrition is common in older hospitalised patients, and barriers to adequate intake in hospital limit the effectiveness of hospital-based nutrition interventions. This pilot study was undertaken to determine whether nutrition-focussed care at discharge and in the early post-hospital period is feasible and acceptable to patients and carers, and improves nutritional status. Design: Prospective cohort study. Setting: Internal medicine wards of a tertiary teaching hospital in Brisbane, Australia. Participants: Patients aged 65 and older admitted for at least 3 days, identified as malnourished or at risk of malnutrition using the Mini Nutritional Assessment (MNA). Interventions: An interdisciplinary discharge team (specialist discharge planning nurse and accredited practising dietitian) provided nutrition-focussed education, advice, service coordination and follow-up (home visits and telephone) for 6 weeks following hospitalisation. Measurements: Nutritional intake, weight, functional status and MNA were recorded 6 and 12 weeks after discharge. Service intensity and changes to care were noted, and hospital readmissions recorded. Service feedback from patients and carers was sought using a brief questionnaire. Results: 12 participants were enrolled during the 6-week pilot (mean age 82 years, 50% male). All received 1-2 home visits and 3-8 telephone calls. Four participants had new community services arranged, 4 were commenced on oral nutritional supplements, and 7 were referred to community dietetics services for follow-up. Two participants had a decline in MNA score of more than 10% at 12-week follow-up, while the remainder improved by at least 10%. Individualised care, including community service coordination, was valued by participants. Conclusion: The proposed model of care for older adults was feasible, acceptable to patients and carers, and associated with improved nutritional status at 12 weeks for most participants. The pilot data will be useful for the design of intervention trials.