Abstract:
Objective Substance-related expectancies are associated with substance use and post-use thoughts, feelings and behaviours. The expectancies held by specific cultural or sub-cultural groups have rarely been investigated. This research maps expectancies specific to gay and other men who have sex with men (MSM) and their relationship with substance use patterns and behaviours following use, including sexual practices (e.g., unprotected anal intercourse). This study describes the development of a measure of such beliefs for cannabis, the Cannabis Expectancy Questionnaire for Men who have Sex with Men (CEQ-MSM). Method Items selected through a focus group and interviews were piloted on 180 self-identified gay or other MSM via an online questionnaire. Results Factor analysis revealed six distinct substance reinforcement domains (“Enhanced sexual experience”, “Sexual negotiation”, “Cognitive impairment”, “Social and emotional facilitation”, “Enhanced sexual desire”, and “Sexual inhibition”). The scale was associated with patterns of cannabis consumption and, in a crucial test of discriminant validity, not with the consumption of alcohol or stimulants. Conclusions The CEQ-MSM represents a reliable and valid measure of outcome expectancies related to cannabis use among MSM. Future applications of the CEQ-MSM in health promotion, clinical settings and research may contribute to reducing harm associated with substance use among MSM, including HIV transmission.
Abstract:
Background: Caring for family members with dementia can be a long-term, burdensome task resulting in physical and emotional distress and impairment. Research has demonstrated significantly lower levels of self-efficacy among family caregivers of people with dementia (CGs) than among caregivers of relatives with non-dementia diseases. Intervention studies have also suggested that the mental and physical health of dementia CGs could be improved through the enhancement of their self-efficacy. However, studies are limited in terms of the influences of caregiver self-efficacy on caregiver behaviour, subjective burden and health-related quality of life. Of particular note is that there are no studies on the applicability of caregiver self-efficacy in the social context of China. Objective: The purpose of this thesis was to undertake theoretical exploration using Bandura’s (1997) self-efficacy theory to 1) revise the Revised Caregiving Self-Efficacy Scale (Steffen, McKibbin, Zeiss, Gallagher-Thompson, & Bandura, 2002) into a Chinese version (C-RCSES), and 2) explore determinants of caregiver self-efficacy and the role of caregiver self-efficacy and other conceptual constructs (including CGs’ socio-demographic characteristics, care recipients’ (CRs’) impairment and CGs’ social support) in explaining and predicting caregiver behaviour, subjective burden and health-related quality of life among CGs in China. Methodology: Two studies were undertaken: a qualitative elicitation study with 10 CGs, and a cross-sectional survey with 196 CGs. In the first study, semi-structured interviews were conducted to explore caregiver behaviours and the corresponding challenges to their performance. The findings of the study assisted in the development of the initial items and domains of the C-RCSES.
Following changes to items in the scale, the second study, a cross-sectional survey with 196 CGs, was conducted to evaluate the psychometric properties of the C-RCSES and to test a hypothesised self-efficacy model of family caregiving adapted from Bandura’s theory (1997). Results: 35 items were generated from the qualitative data. The content validity of the C-RCSES was assessed and ensured in Study One before being used for the cross-sectional survey. Eight items were removed and five subscales (caregiver self-efficacy for gathering information about treatment, symptoms and health care; obtaining support; responding to problematic behaviours; management of household, personal and medical care; and controlling upsetting thoughts about caregiving) were identified after principal component factor analysis of the cross-sectional survey data. The reliability of the scale was acceptable: the Cronbach’s alpha coefficients for the whole scale and for each subscale were all over .80, and the four-week test-retest reliabilities for the whole scale and for each subscale ranged from .64 to .85. The concurrent, convergent and divergent validity were also acceptable. CGs reported moderate levels of caregiver self-efficacy. Furthermore, the level of self-efficacy for management of household, personal and medical care was relatively high in comparison to those of the other four domains of caregiver self-efficacy. Caregiver self-efficacy was also significantly influenced by CGs’ socio-demographic characteristics and the external caregiving factors (CR impairment and the social support that CGs obtained). The level of caregiver behaviour that CGs reported was higher than that reported in other Chinese research. CGs’ socio-demographics significantly influenced caregiver behaviour, whereas caregiver self-efficacy did not.
Regarding the two external factors, CGs who cared for highly impaired relatives reported high levels of caregiver behaviour, but social support did not influence caregiver behaviour. Regarding caregiver subjective burden and health-related quality of life, CGs reported moderate levels of subjective burden, and their level of health-related quality of life was significantly lower than that of the general population in China. The findings also indicated that CGs’ subjective burden and health-related quality of life were influenced by all major factors in the hypothesised model, including CGs’ socio-demographics, CRs’ impairment, the social support that CGs obtained, caregiver self-efficacy and caregiver behaviour. Of these factors, caregiver self-efficacy and social support significantly reduced subjective burden and improved health-related quality of life, whereas caregiver behaviour and CRs’ impairment were detrimental to CGs, increasing subjective burden and worsening health-related quality of life. Conclusion: Although requiring further exploration, the qualitative study was the first conducted in China to provide an in-depth understanding of CGs’ caregiving experience, including their major caregiver behaviours and the corresponding challenges. Meanwhile, although the C-RCSES needs further psychometric testing, it is a useful tool for assessing caregiver self-efficacy in Chinese populations. Results of the qualitative and quantitative studies provide useful information for future studies regarding the explanatory power of caregiver self-efficacy for caregiver behaviour, subjective burden and health-related quality of life. Additionally, integrated with Bandura’s theory, the findings from the quantitative study also suggest a further study exploring the role of outcome expectations in caregiver behaviour, subjective burden and health-related quality of life.
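The internal-consistency figures reported in this abstract (Cronbach’s alpha coefficients over .80) follow the standard formula alpha = k/(k−1) × (1 − Σ item variances / variance of totals). A minimal sketch of that computation, using purely hypothetical item scores rather than the study’s data:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha for a set of respondents' item scores.

    responses: list of lists, one inner list of item scores per respondent.
    Uses population variances, per the classic formula
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
    """
    k = len(responses[0])  # number of items
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([r[i] for r in responses]) for i in range(k)]
    total_var = var([sum(r) for r in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical scores for 4 respondents on a 3-item subscale
scores = [[1, 2, 1], [2, 2, 3], [3, 4, 4], [4, 5, 4]]
alpha = cronbach_alpha(scores)
```

When all items move together (high inter-item covariance), the variance of the totals dominates the sum of item variances and alpha approaches 1; identical items give exactly 1.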
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, such evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian elderly populations and none has been validated in Singapore. Due to the multi-ethnic, cultural, and linguistic differences among Singapore's older adults, the results from non-Asian validation studies may not be applicable. Therefore, it is important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, “Do you often feel sad or depressed?”), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutritional screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing a decline in nutritional status (defined by weight loss >1% per week).
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutritional screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may develop malnutrition prior to hospital admission, or experience clinically significant weight loss of >1% per week during hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.
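The diagnostic performance figures quoted for the TTSH NST (sensitivity 84%, specificity 79% against the SGA) come from the standard 2×2 confusion-table definitions. A minimal sketch of that calculation, using made-up screening results rather than the study’s patient data:

```python
def screening_performance(screen_positive, reference_positive):
    """Sensitivity and specificity of a screening tool against a
    reference standard (here, hypothetically, SGA malnutrition ratings).

    Both arguments are parallel lists of booleans, one entry per patient.
    """
    pairs = list(zip(screen_positive, reference_positive))
    tp = sum(1 for s, r in pairs if s and r)          # flagged, truly malnourished
    fn = sum(1 for s, r in pairs if not s and r)      # missed malnutrition
    tn = sum(1 for s, r in pairs if not s and not r)  # correctly cleared
    fp = sum(1 for s, r in pairs if s and not r)      # false alarm
    sensitivity = tp / (tp + fn)  # proportion of malnourished correctly flagged
    specificity = tn / (tn + fp)  # proportion of well-nourished correctly cleared
    return sensitivity, specificity

# Hypothetical results for 8 patients: tool flags vs SGA reference
tool = [True, True, True, False, False, False, True, False]
sga  = [True, True, False, False, False, True, True, False]
sens, spec = screening_performance(tool, sga)
```

The AUC additionally summarises this trade-off across every possible cut-point of the tool’s score, rather than at a single threshold.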
Abstract:
Thirty-four elementary school teachers and 32 education students from Canada rated their reactions towards vignettes describing children who met attention-deficit/hyperactivity disorder (ADHD) symptom criteria and that either included or did not include the label “ADHD.” “ADHD”-labeled vignettes elicited greater perceptions of the child's impairment, more negative emotions, and less confidence among participants, although the label also increased participants' willingness to implement treatment interventions. Ratings were similar for vignettes of boys and girls; however, important differences in ratings between teachers and education students emerged and are discussed. Finally, we investigated the degree to which teachers' professional backgrounds influenced bias based on the label “ADHD.” Training specific to ADHD consistently predicted label bias, whereas teachers' experience working with children with ADHD did not.
Abstract:
Introduction and Aims. Alcohol expectancies are associated with drinking behaviour and post-drinking thoughts, feelings and behaviours. The expectancies held by specific cultural or sub-cultural groups have rarely been investigated. This research maps expectancies specific to gay and other men who have sex with men (MSM) and their relationship with substance use. This study describes the development of a measure of such beliefs for alcohol, the Drinking Expectancy Questionnaire for Men who have Sex with Men (DEQ-MSM). Design and Methods. Items selected through a focus group and interviews were piloted on 220 self-identified gay or other MSM via an online questionnaire. Results. Factor analysis revealed three distinct substance reinforcement domains ('Cognitive impairment', 'Sexual activity' and 'Social and emotional facilitation'). These factors were associated with consumption patterns of alcohol and, in a crucial test of discriminant validity, were not associated with the consumption of cannabis or stimulants. Similarities and differences with existing measures are also discussed. Discussion and Conclusions. The DEQ-MSM represents a reliable and valid measure of outcome expectancies related to alcohol use among MSM, and represents an important advance, as no known existing alcohol expectancy measure has to date been developed and/or normed for use among this group. Future applications of the DEQ-MSM in health promotion, clinical settings and research may contribute to reducing harm associated with alcohol use among MSM, including the development of alcohol use among young gay men.
Abstract:
OBJECTIVE: To assess the psychometric properties and health correlates of the Geriatric Anxiety Inventory (GAI) in a cohort of Australian community-residing older women. METHOD: Cross-sectional study of a population-based cohort of women aged 60 years and over (N = 286). RESULTS: The GAI exhibited sound internal consistency and demonstrated good concurrent validity against the state half of the Spielberger State-Trait Anxiety Inventory and the neuroticism domain of the NEO Five-Factor Inventory. GAI score was significantly associated with self-reported sleep difficulties and perceived memory impairment, but not with age or cognitive function. Women with current DSM-IV Generalized Anxiety Disorder (GAD) had significantly higher GAI scores than women without such a history. In this cohort, the optimal cut-point to detect current GAD was 8/9. Although the GAI was designed to have few somatic items, women with a greater number of general medical problems, or who rated their general health as worse, had higher GAI scores. CONCLUSION: The GAI is a new scale designed specifically to measure anxiety in older people. In this Australian cohort of older women, the instrument had sound psychometric properties.
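This abstract reports an optimal GAI cut-point of 8/9 for detecting current GAD but does not name the selection criterion. One common criterion for choosing such a cut-point (assumed here purely for illustration) is maximising Youden's J, sensitivity + specificity − 1; a sketch with hypothetical scores and diagnoses:

```python
def best_cutpoint(scores, has_disorder):
    """Choose the screening cut-point maximising Youden's J
    (sensitivity + specificity - 1). A score >= cut counts as positive.
    Assumes both cases and non-cases are present in the sample.
    """
    best = (-1.0, None)  # (J, cut-point)
    pairs = list(zip(scores, has_disorder))
    for cut in sorted(set(scores)):
        tp = sum(1 for s, d in pairs if s >= cut and d)
        fn = sum(1 for s, d in pairs if s < cut and d)
        tn = sum(1 for s, d in pairs if s < cut and not d)
        fp = sum(1 for s, d in pairs if s >= cut and not d)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best[0]:
            best = (j, cut)
    return best[1], best[0]

# Hypothetical GAI scores paired with GAD diagnostic status
scores = [2, 4, 5, 7, 9, 10, 12, 14]
gad    = [False, False, False, False, True, False, True, True]
cut, j = best_cutpoint(scores, gad)
```

A reported "8/9" cut-point means scores of 9 and above count as positive, which corresponds to the `s >= cut` convention above with `cut = 9`.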
Abstract:
PKU is a genetically inherited inborn error of metabolism caused by a deficiency of the enzyme phenylalanine hydroxylase. The failure of this enzyme causes incomplete metabolism of protein ingested in the diet, specifically the conversion of one amino acid, phenylalanine, to tyrosine, which is a precursor to the neurotransmitter dopamine. Rising levels of phenylalanine are toxic to the developing brain, disrupting the formation of white matter tracts. The impact of tyrosine deficiency is not as well understood, but is hypothesised to lead to a low-dopamine environment for the developing brain. Detection in the newborn period and continuous treatment (a low-protein, phenylalanine-restricted diet supplemented with phenylalanine-free protein formulas) has resulted in children with early and continuously treated PKU now developing normal IQ. However, deficits in executive function (EF) are common, leading to a rate of Attention Deficit Hyperactivity Disorder (ADHD) up to five times the norm. EF worsens with exposure to higher phenylalanine levels; however, recent research has demonstrated that a high phenylalanine-to-tyrosine (phenylalanine:tyrosine) ratio, which is hypothesised to lead to poorer dopamine function, has a more negative impact on EF than phenylalanine levels alone. Research and treatment of PKU are currently phenylalanine-focused, with little investigation of the impact of tyrosine on neuropsychological development. There is no current consensus as to the value of tyrosine monitoring or treatment in this population. Further, research in this population has focused primarily on EF impairment alone, even though additional neuropsychological skills may be compromised (e.g., mood, visuospatial deficits).
The aim of this PhD research was to identify residual neuropsychological deficits in a cohort of children with early and continuously treated phenylketonuria at two time points in development (early childhood and early adolescence), separated by eight years. In addition, this research sought to determine which biochemical markers were associated with neuropsychological impairments. A clinical practice survey was also undertaken to ascertain the current level of monitoring/treatment of tyrosine in this population. Thirteen children with early and continuously treated PKU were tested at a mean age of 5.9 years and again at a mean age of 13.95 years on several neuropsychological measures. Four children with hyperphenylalaninemia (a milder version of PKU) were also tested at both time points and provide a comparison group in analyses. Associations between neuropsychological function and biochemical markers were analysed. A between-groups analysis in adolescence was also conducted (children with PKU compared with their siblings) on parent-report measures of EF and mood. Minor EF impairments were evident in the PKU group by age 6 years, and these persisted into adolescence. Life-long exposure to a high phenylalanine:tyrosine ratio and/or low tyrosine independent of phenylalanine was significantly associated with EF impairments at both time points. Over half the children with PKU showed severe impairment on a visuospatial task, and this was associated only with concurrent levels of tyrosine in adolescence. Children with PKU also showed a statistically significant decline in a language comprehension task from 6 years to adolescence (going from normal to subnormal); this deficit was associated with lifetime levels of phenylalanine. In comparison, the four children with hyperphenylalaninemia demonstrated normal function at both time points, across all measures.
No statistically significant differences were detected between children with PKU and their siblings on the parent report of EF and mood. However, depressive symptoms were significantly correlated with EF, long-term high phenylalanine:tyrosine exposure, and low tyrosine levels independent of phenylalanine. The practice survey of metabolic clinics from 12 countries indicated a high level of variability in the monitoring/treatment of tyrosine in this population. Whilst over 80% of clinics surveyed routinely monitored tyrosine levels in their child patients, only 25% reported treatment strategies to increase tyrosine (and thereby lower the phenylalanine:tyrosine ratio) under a variety of patient presentation conditions. Overall, these studies have shown that the EF impairments associated with PKU provide support for the dopamine-deficiency model. A language comprehension task showed a different trajectory, serving as a timely reminder that non-EF functions also remain vulnerable in this population, and that normal function in childhood does not guarantee normal function by adolescence. Mood impairments were associated with EF impairments as well as with long-term measures of phenylalanine:tyrosine and/or tyrosine. Given varied current practice, the implications of this research for enhanced clinical guidelines are discussed.
Abstract:
Background: It is predicted that China will have the largest number of cases of dementia in the world by 2025 (Ferri et al., 2005). Research has demonstrated that caring for family members with dementia can be a long-term, burdensome activity resulting in physical and emotional distress and impairment (Pinquart & Sorensen, 2003b). The establishment of family caregiver supportive services in China can be considered urgent, and knowledge of the caregiving experience and related influencing factors is necessary to inform such services. Nevertheless, in the context of rapid demographic and socioeconomic change, the impact of caregiving for rural and urban Chinese adult-child caregivers may be different, and different needs in supportive services may therefore be expected. Objectives: The aims of this research were 1) to examine the potential differences in the caregiving experience between rural and urban adult-child caregivers caring for parents with dementia in China; and 2) to examine the potential differences in the influencing factors of the caregiving experience for rural as compared with urban adult-child caregivers caring for parents with dementia in China. Based on the literature review and Kramer’s (1997) caregiver adaptation model, six concepts of the caregiving experience and their relationships were studied: severity of the care receivers’ dementia, caregivers’ appraisal of role strain and role gain, negative and positive well-being outcomes, and health-related quality of life. Furthermore, four influencing factors (i.e., filial piety, social support, resilience, and personal mastery) were studied respectively. Methods: A cross-sectional, comparative design was used to achieve the aims of the study.
A questionnaire, which was designed based on the literature review and on Kramer’s (1997) caregiver adaptation model, was completed by 401 adult-child caregivers caring for their parents with dementia, recruited from the mental health outpatient departments of five hospitals in Yunnan province, P.R. China. Structural equation modelling (SEM) was employed as the main statistical technique for data analyses. Other statistical techniques (e.g., t-tests and Chi-Square tests) were also used to compare the demographic characteristics and the measured variables between the rural and urban groups. Results: For the first research aim, the results indicated that urban adult-child caregivers in China experienced significantly greater strain and more negative well-being outcomes than their rural peers, whereas the difference in the appraisal of role gain and positive outcomes was nonsignificant between the two groups. The results also indicated that the measures of the severity of care receivers’ dementia and of caregivers’ health-related quality of life did not carry the same meanings across the two groups; thus, the levels of these two concepts were not comparable between the rural and urban groups in this study. Moreover, the results demonstrated that the negative direct effect of gain on negative outcomes was stronger in urban caregivers than in rural caregivers, suggesting that urban caregivers tended to use the appraisal of role gain to protect themselves from negative well-being outcomes to a greater extent. In addition, the unexplained variance in strain in the urban group was significantly greater than that in the rural group, suggesting that there were other unmeasured variables, besides the severity of care receivers’ dementia, that would predict strain in urban caregivers compared with their rural peers.
For the second research aim, the results demonstrated that rural adult-child caregivers reported a significantly higher level of filial piety and more social support than their urban counterparts, although the two groups did not significantly differ in their levels of resilience and personal mastery. Furthermore, although the mediation effects of these four influencing factors on both positive and negative aspects remained constant across rural and urban adult-child caregivers, urban caregivers tended to be more effective in using personal mastery to protect themselves from role strain than rural caregivers, which in turn protected them from negative well-being outcomes to a greater extent than was the case for their rural peers. Conclusions: The study extends the application of Kramer’s caregiver adaptation model (Kramer, 1997) to a sample of adult-child caregivers in China by demonstrating that both positive and negative aspects of caregiving may impact on caregivers’ health-related quality of life, suggesting that both aspects should be targeted in supportive interventions for Chinese family caregivers. Moreover, by demonstrating partial mediation effects, the study identifies four influencing factors (i.e., filial piety, social support, resilience, and personal mastery) as specific targets for clinical interventions. Furthermore, the study found evidence that urban adult-child caregivers had more negative but similar positive experiences compared with their rural peers, suggesting that the establishment of supportive services for urban caregivers may be more urgent at the present stage in China. Additionally, since urban caregivers tended to use the appraisal of role gain and personal mastery to protect themselves from negative well-being outcomes to a greater extent than rural caregivers, interventions targeting the use of role gain and/or personal mastery to decrease negative outcomes might be more effective in urban caregivers than in rural caregivers.
On the other hand, as cultural expectations and the expression of filial piety tend to be more traditional in rural areas, interventions targeting filial piety could be more effective among rural caregivers. Last but not least, as rural adult-child caregivers have more existing natural social support than their urban counterparts, mobilising existing natural social support resources may be more beneficial for rural caregivers, whereas formal supports (e.g., counselling services, support groups and adult day care centres) should be enhanced for urban caregivers.
Abstract:
PURPOSE: To examine the visual predictors of falls and injurious falls among older adults with glaucoma. METHODS: Prospective falls data were collected for 71 community-dwelling adults with primary open-angle glaucoma (mean age 73.9 ± 5.7 years) for one year using monthly falls diaries. Baseline assessment of central visual function included high-contrast visual acuity and Pelli-Robson contrast sensitivity. Binocular integrated visual fields were derived from monocular Humphrey Field Analyser plots. Rate ratios (RR) for falls and injurious falls with 95% confidence intervals (CIs) were based on negative binomial regression models. RESULTS: During the one-year follow-up, 31 (44%) participants experienced at least one fall and 22 (31%) experienced falls that resulted in an injury. Greater visual impairment was associated with an increased falls rate, independent of age and gender. In a multivariate model, more extensive field loss in the inferior region was associated with a higher rate of falls (RR 1.57, 95% CI 1.06-2.32) and of falls with injury (RR 1.80, 95% CI 1.12-2.98), adjusted for all other vision measures and potential confounding factors. Visual acuity, contrast sensitivity, and superior field loss were not associated with the rate of falls; topical beta-blocker use was also not associated with increased falls risk. CONCLUSIONS: Falls are common among older adults with glaucoma and occur more frequently in those with greater visual impairment, particularly in the inferior field region. This finding highlights the importance of the inferior visual field region in falls risk and assists in identifying older adults with glaucoma at risk of future falls, for whom potential interventions should be targeted. KEY WORDS: glaucoma, visual field, visual impairment, falls, injury
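The rate ratios above come from negative binomial regression, which adjusts for covariates and over-dispersed fall counts; unadjusted, a rate ratio is simply the ratio of two incidence rates (events per unit of person-time). A minimal sketch of that underlying comparison, with hypothetical counts rather than the study's data:

```python
def incidence_rate_ratio(events_a, person_time_a, events_b, person_time_b):
    """Crude (unadjusted) incidence-rate ratio of group A vs group B.

    The study itself used negative binomial regression to adjust for age,
    gender and other vision measures; this shows only the rate comparison
    that an RR summarises.
    """
    rate_a = events_a / person_time_a  # e.g., falls per person-year, group A
    rate_b = events_b / person_time_b  # falls per person-year, group B
    return rate_a / rate_b

# Hypothetical: 30 falls over 20 person-years (greater inferior field loss)
# vs 12 falls over 24 person-years (lesser field loss)
rr = incidence_rate_ratio(30, 20, 12, 24)
```

An RR above 1 (as for inferior field loss here) means the first group accumulates falls at a proportionally higher rate per unit of follow-up time.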
Abstract:
Muscle physiologists often describe fatigue simply as a decline of muscle force and infer that this causes an athlete to slow down. In contrast, exercise scientists describe fatigue during sport competition more holistically as an exercise-induced impairment of performance. The aim of this review is to reconcile these different views by evaluating the many performance symptoms/measures and mechanisms of fatigue. We describe how fatigue is assessed with muscle, exercise or competition performance measures. Muscle performance (single-muscle test measures) declines due to peripheral fatigue (reduced muscle cell force) and/or central fatigue (reduced motor drive from the CNS). Peak muscle force seldom falls by >30% during sport, but the decline is often greater during electrical stimulation and laboratory exercise tasks. Exercise performance (whole-body exercise test measures) reveals impaired physical/technical abilities and subjective fatigue sensations. Exercise intensity is initially sustained by the recruitment of new motor units and help from synergistic muscles before it declines. Technique/motor skill execution deviates as exercise proceeds in order to maintain outcomes before they deteriorate (e.g., reduced accuracy or velocity). The sensation of fatigue incorporates an elevated rating of perceived exertion (RPE) during submaximal tasks, due to a combination of peripheral and higher CNS inputs. Competition performance (sport symptoms) is affected more by decision-making and psychological aspects, since there are opponents and a greater importance placed on the result. Laboratory-based decision-making is generally faster or unimpaired. Motivation, self-efficacy and anxiety can change during exercise to modify RPE and, hence, alter physical performance. Symptoms of fatigue during racing, team-game or racquet sports are largely anecdotal, but are sometimes assessed with time-motion analysis.
Fatigue during brief all-out racing is described biomechanically as a decline of peak velocity, along with altered kinematic components. Longer sport events involve pacing strategies, central and peripheral fatigue contributions and elevated RPE. During match play, the work rate can decline late in a match (or tournament) and/or transiently after intense exercise bursts. Repeated sprint ability, agility and leg strength become slightly impaired. Technique outcomes, such as velocity and accuracy for throwing, passing, hitting and kicking, can deteriorate. Physical and subjective changes are both less severe in real, rather than simulated, sport activities. Little objective evidence exists to support exercise-induced mental lapses during sport. A model depicting mind-body interactions during sport competition shows that the RPE centre-motor cortex-working muscle sequence drives overall performance levels and, hence, fatigue symptoms. The sporting outputs from this sequence can be modulated by interactions with muscle afferent and circulatory feedback, and with psychological and decision-making inputs. Importantly, compensatory processes exist at many levels to protect against performance decrements. Small changes in putative fatigue factors can also be protective. We show that individual fatigue factors, including diminished carbohydrate availability, elevated serotonin, hypoxia, acidosis, hyperkalaemia, hyperthermia, dehydration and reactive oxygen species, each contribute to several fatigue symptoms. Thus, multiple symptoms of fatigue can occur simultaneously, and the underlying mechanisms overlap and interact. Based on this understanding, we reinforce the proposal that fatigue is best described globally as an exercise-induced decline of performance, as this is inclusive of all viewpoints.
Resumo:
OBJECTIVES: To identify the prevalence of geriatric syndromes in the premorbid (preadmission for falls), admission and discharge assessment periods, and the incidence of new syndromes and significant worsening of existing syndromes at admission and discharge. DESIGN: Prospective cohort study. SETTING: Three acute care hospitals in Brisbane, Australia. PARTICIPANTS: Five hundred seventy-seven general medical patients aged 70 and older admitted to the hospital. MEASUREMENTS: Prevalence of syndromes in the premorbid (or preadmission for falls), admission and discharge periods; incidence of new syndromes at admission and discharge; and significant worsening of existing syndromes at admission and discharge. RESULTS: The most frequently reported premorbid syndromes were bladder incontinence (44%) and impairment in any activity of daily living (ADL) (42%). A high proportion (42%) experienced at least one fall in the 90 days before admission. Two-thirds of the participants experienced between one and five syndromes (cognitive impairment, dependence in any ADL item, bladder and bowel incontinence, pressure ulcer) before admission, at admission and at discharge. A majority experienced one or two syndromes during the premorbid (49.4%), admission (57.0%) or discharge (49.0%) assessment period. The syndromes with the highest incidence of significant worsening at discharge (out of the proportion with the syndrome present premorbidly) were ADL limitation (33%), cognitive impairment (9%) and bladder incontinence (8%). Of the syndromes examined at discharge, the highest proportions of patients experienced the following new syndromes at discharge (absent premorbidly): ADL limitation (22%) and bladder incontinence (13%). CONCLUSION: Geriatric syndromes were highly prevalent. Many patients did not return to their premorbid function, and acquired new syndromes.
Resumo:
The main aim of this paper is to outline a proposed program of research that will attempt to quantify the extent of the problem of alcohol and other drug (AOD) use in the Australian construction industry and, furthermore, to develop an appropriate industry-wide policy and cultural change management program and implementation plan to address the problem. This paper also presents preliminary results from the study. The study will use qualitative and quantitative methods (in the form of interviews and surveys, respectively) to evaluate the extent of the problem of alcohol and other drug use in this industry, to ascertain the feasibility of an industry-wide policy and cultural change management program, and to develop an appropriate implementation plan. The study will be undertaken in several construction organisations, at selected sites in South Australia, Victoria and the Northern Territory. It is anticipated that approximately 500 employees from the participating organisations across Australia will take part in the study. The World Health Organisation's Alcohol Use Disorders Identification Test (AUDIT) will be used to measure the extent of alcohol use in the industry. Illicit drug use, "readiness to change", impediments to reducing impairment, feasibility of proposed interventions, and employee attitudes and knowledge regarding workplace AOD impairment will also be measured through a combination of interviews and surveys. Among the preliminary findings, the AUDIT scores of 51% (n=127) of respondents indicated alcohol use at hazardous levels. Of the respondents who were using alcohol at hazardous levels, 76% (n=97) reported that they do not have a problem with drinking and 54% (n=68) reported that it would be easy to "cut down" or stop drinking. Nearly half (49%) of all respondents (n=122) had used marijuana/cannabis at some time prior to being surveyed. The use of other illicit substances was reported much less frequently. 
Preliminary interview findings indicated a lack of adequate employee knowledge regarding the physical effects of alcohol and other drugs in the workplace. In conclusion, the proposed study will address a major gap in the literature regarding the extent of the problem of alcohol and other drug use in the construction industry in Australia. The study will also develop and implement a national, evidence-based workplace policy, with the aim of mitigating the deleterious effects of alcohol and other drugs in this industry.
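The abstract above classifies respondents as drinking at hazardous levels based on AUDIT scores. As a minimal sketch of how such a classification is typically computed, the following assumes the standard 10-item AUDIT (each item scored 0-4, total 0-40) and the widely used cut-off of 8; the abstract does not state which cut-off the study applied, and the example respondents below are invented:

```python
# Commonly used AUDIT threshold for hazardous drinking (total score >= 8).
HAZARDOUS_CUTOFF = 8

def audit_total(item_scores):
    """Sum the 10 AUDIT item scores (each item is scored 0-4)."""
    if len(item_scores) != 10:
        raise ValueError("AUDIT has exactly 10 items")
    if any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("each AUDIT item is scored 0-4")
    return sum(item_scores)

def is_hazardous(item_scores, cutoff=HAZARDOUS_CUTOFF):
    """Classify a respondent against the hazardous-drinking cut-off."""
    return audit_total(item_scores) >= cutoff

# Two hypothetical respondents (scores invented for illustration):
low_risk = [1, 0, 1, 0, 0, 0, 0, 1, 0, 0]   # total 3, below the cut-off
hazardous = [3, 2, 2, 1, 0, 1, 0, 1, 0, 2]  # total 12, above the cut-off

print(audit_total(low_risk), is_hazardous(low_risk))
print(audit_total(hazardous), is_hazardous(hazardous))
```

With per-respondent classifications like this, the reported prevalence (e.g. 51% at hazardous levels) is simply the proportion of respondents whose totals meet the cut-off.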
Resumo:
Purpose: To determine the effect of moderate levels of refractive blur and simulated cataracts on nighttime pedestrian conspicuity in the presence and absence of headlamp glare. Methods: The ability to recognize pedestrians at night was measured in 28 young adults (mean age 27.6 years) under three visual conditions: normal vision, refractive blur and simulated cataracts; mean acuity was 20/40 or better in all conditions. Pedestrian recognition distances were recorded while participants drove an instrumented vehicle along a closed road course at night. Pedestrians wore one of three clothing conditions, and oncoming headlamps were present for 16 participants and absent for 12 participants. Results: Simulated visual impairment and glare significantly reduced the frequency with which drivers recognized pedestrians and the distance at which the drivers first recognized them. Simulated cataracts were significantly more disruptive than blur, even though photopic visual acuity levels were matched. With normal vision, drivers responded to pedestrians at distances that were, on average, 3.6x and 5.5x longer than for the blur and cataract conditions, respectively. Even in the presence of visual impairment and glare, pedestrians were recognized more often and at longer distances when they wore a “biological motion” reflective clothing configuration than when they wore a reflective vest or black clothing. Conclusions: Drivers’ ability to recognize pedestrians at night is degraded by common visual impairments, even when the drivers’ mean visual acuity meets licensing requirements. To maximize drivers’ ability to see pedestrians, drivers should wear their optimum optical correction, and cataract surgery should be performed early enough to avoid potentially dangerous reductions in visual performance.