622 results for adults with developmental disability
Abstract:
Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of the total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures have brought to light the importance of the management of chronic disease among the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, limited studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional study design was employed for this research. Three studies were undertaken. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, which examines potential mechanisms of the mediation effects of social support and self-efficacy beliefs. One hundred and fifty-seven patients aged 65 years and older with a medical admission and at least one risk factor for readmission were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant influential predictors for the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: The findings of Study One suggested that malnutrition continues to be a concern in older adults, with a prevalence of 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared with the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (k = .74, p < .001; 95% CI .62-.86) (these validity measures are illustrated in the sketch following this abstract). Analysis of the data for Study Two found that depressive symptoms and perceived social support were the two strongest influential factors for self-efficacy in managing chronic disease in a hierarchical multiple regression.
Results of multivariable regression models suggested that advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease. The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, the structural equation modelling found that self-efficacy partially mediated the effect of health characteristics and depression on health-related quality of life. The health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The models fitted the data well, with a relatively high proportion of variance explained, implying that the hypothesized constructs were highly relevant and hence supporting the application of social cognitive theory in this context. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the application of social cognitive theory to chronic disease self-management in older adults, with the goal of improving their nutritional status, functional status and health-related quality of life.
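The validity statistics reported for the Malnutrition Screening Tool above (sensitivity, specificity and Cohen's kappa) can all be derived from a single 2×2 screening-versus-reference contingency table. The short Python sketch below shows the arithmetic only; the function name and the cell counts are hypothetical placeholders, not the study's data.

```python
# Illustrative only: sensitivity, specificity and Cohen's kappa derived from a
# 2x2 screening-vs-reference contingency table (e.g. a screening tool such as
# the MST against a reference standard such as the SGA). The cell counts are
# hypothetical placeholders, not the study's data.

def screening_validity(tp, fp, fn, tn):
    """Return (sensitivity, specificity, kappa) for a 2x2 table.

    tp/fn: screen-positive/negative among reference-positive cases
    fp/tn: screen-positive/negative among reference-negative cases
    """
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed_agreement = (tp + tn) / n
    # Chance agreement from the marginal totals of both methods
    expected_agreement = ((tp + fp) / n) * ((tp + fn) / n) + \
                         ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)
    return sensitivity, specificity, kappa

sens, spec, kappa = screening_validity(tp=40, fp=10, fn=5, tn=95)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, kappa={kappa:.2f}")
```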
Abstract:
Background The number of middle-aged working individuals being diagnosed with cancer is increasing, and so too will disruptions to their employment. The aim of the Working After Cancer Study is to examine the changes to work participation in the 12 months following a diagnosis of primary colorectal cancer. The study will identify barriers to work resumption, describe limitations on workforce participation, and evaluate the influence of these factors on health-related quality of life. Methods/Design An observational population-based study has been designed involving 260 adults newly diagnosed with colorectal cancer between January 2010 and September 2011 who were in paid employment at the time they were diagnosed. These cancer cases will be compared to a nationally representative comparison group of 520 adults with no history of cancer from the general population. Eligible cases will have a histologically confirmed diagnosis of colorectal cancer and will be identified through the Queensland Cancer Registry. Data on the comparison group will be drawn from the Household, Income and Labour Dynamics in Australia (HILDA) Survey. Data collection for the cancer group will occur at 6 and 12 months after diagnosis, with work questions also asked about the time of diagnosis, while retrospective data on the comparison group will come from HILDA Waves 2009 and 2010. Using validated instruments administered via telephone and postal surveys, data will be collected on socio-demographic factors, work status and circumstances, and health-related quality of life (HRQoL) for both groups, while the cases will have additional data collected on cancer treatment and symptoms, work productivity and cancer-related HRQoL. Primary outcomes include change in work participation at 12 months, time to work re-entry, work limitations and change in HRQoL status. Discussion This study will address the reasons for work cessation after cancer, the mechanisms people use to remain working, existing workplace support structures, and the implications for individuals, families and workplaces. It may also provide key information for governments on productivity losses.
Abstract:
Aims: To investigate the relationship between retinal nerve fibre layer thickness and peripheral neuropathy in patients with Type 2 diabetes, particularly in those who are at higher risk of foot ulceration. Methods: Global and sectoral retinal nerve fibre layer thicknesses were measured at 3.45 mm diameter around the optic nerve head using optical coherence tomography (OCT). The level of neuropathy was assessed in 106 participants (82 with Type 2 diabetes and 24 healthy controls) using the 0–10 neuropathy disability score. Participants were stratified into four neuropathy groups: none (0–2), mild (3–5), moderate (6–8), and severe (9–10). A neuropathy disability score ≥ 6 was used to define those at higher risk of foot ulceration. Multivariable regression analysis was performed to assess the effect of neuropathy disability scores, age, disease duration and retinopathy on retinal nerve fibre layer thickness. Results: Inferior (but not global or other sectoral) retinal nerve fibre layer thinning was associated with higher neuropathy disability scores (P = 0.03). The retinal nerve fibre layer was significantly thinner for the group with neuropathy disability scores ≥ 6 in the inferior quadrant (P < 0.005). Age, duration of disease and retinopathy levels did not significantly influence retinal nerve fibre layer thickness. Control participants did not show any significant differences in thickness measurements from the group with diabetes and no neuropathy (P > 0.24 for global and all sectors). Conclusions: Inferior quadrant retinal nerve fibre layer thinning is associated with peripheral neuropathy in patients with Type 2 diabetes, and is more pronounced in those at higher risk of foot ulceration.
Abstract:
In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well documented. Other risk factors, including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety, have also been suggested to contribute to poor nutritional status. In Parkinson's disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition. However, the risks for malnutrition in this population are not well understood. The current study's aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data about age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke's Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with an average age of 70.2±9.3 (range 35-92) years and an average time since diagnosis of 7.3±5.9 (range 0–31) years. Average body mass index (BMI) was 26.0±5.5 kg/m2. Of these, 15% (n=19) were malnourished (SGA-B). Multivariate logistic regression analysis revealed that older age (OR=1.16, CI=1.02-1.31), more depressive symptoms (OR=1.26, CI=1.07-1.48), lower levels of anxiety (OR=.90, CI=.82-.99), and higher LDED per kg body weight (OR=1.57, CI=1.14-2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes, and proactively addressing the risk factors can help prevent declines in nutritional status. In the current study, older people with PD who had more depressive symptoms and took larger amounts of levodopa per kilogram of body weight were at increased malnutrition risk.
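As a rough illustration of how odds ratios such as those above are obtained, the sketch below fits a multivariable logistic regression with statsmodels and exponentiates the coefficients and their confidence limits. The variable names (`age`, `bdi_score`, `ldd_per_kg`) and the simulated data are hypothetical, not the study's.

```python
# A minimal sketch (not the study's analysis) of how odds ratios and 95%
# confidence intervals are obtained from a multivariable logistic regression.
# The variable names and the simulated data below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.integers(65, 95, n),          # years
    "bdi_score": rng.integers(0, 30, n),     # depressive symptoms
    "ldd_per_kg": rng.uniform(2, 14, n),     # levodopa dose per kg body weight
})
# Simulate a binary malnutrition outcome that depends on the predictors
logit_p = -8 + 0.05 * df["age"] + 0.10 * df["bdi_score"] + 0.20 * df["ldd_per_kg"]
df["malnourished"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["age", "bdi_score", "ldd_per_kg"]])
model = sm.Logit(df["malnourished"], X).fit(disp=False)

summary = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
})
print(summary.round(2))
```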
Abstract:
Current discussions regarding the relationship between welfare governance systems and employment promotion in disability policy appeal to a rejuvenated neo-liberal and paternalistic understanding of welfare governance. At the core of this rationality is the argument that people with disabilities not only have rights, but also duties, in relation to the State. In the Australian welfare system, policy tools are deployed to produce a form of self-discipline, whereby the State emphasises personal responsibility via assessment tools, ‘mutual obligation’ policy, and motivational strategies. Drawing on a two-year semi-longitudinal study with 80 people with a disability accessing welfare benefits, we examine how welfare governance subjects recipients to strategies intended to produce productive citizens who are able to contribute to the national goal of maintaining competitiveness in the global economy. Participants’ interviews reveal the intended and unintended effects of this activation policy, including some acceptance of the logic of welfare-to-work and counter-hegemonic resistance to de-valued social identities.
Abstract:
There is growing and converging evidence that cannabis may be a major risk factor in people with psychotic disorders and prodromal psychotic symptoms. The lack of available pharmacological treatments for cannabis use indicates that psychological interventions should be a high priority, especially among people with psychotic disorders. However, there have been few randomised controlled trials (RCTs) of psychological interventions among this group. In the present study we provide a critical overview of RCTs of psychological and pharmacological interventions among people with psychotic disorders, giving particular attention to those studies which report cannabis use outcomes. We then review data regarding treatment preferences among this group. RCTs of interventions within "real world" mental health systems among adults with severe mental disorders suggest that cannabis use is amenable to treatment in real world settings among people with psychotic disorders. RCTs of manual-guided interventions among cannabis users indicate that while brief interventions are associated with reductions in cannabis use, longer interventions may be more effective. Additionally, the RCTs reviewed suggest that treatment with antipsychotic medication is not associated with a worsening of cannabis cravings or use and may be beneficial. The development of cannabinoid agonist medication may be an effective strategy for cannabis dependence and suitable for people with psychotic disorders. The development of cannabis use interventions for people with psychotic disorders should also consider patients' treatment preferences. Initial results indicate that face-to-face interventions focussed on cannabis use may be preferred. Further research investigating the treatment preferences of people with psychotic disorders using cannabis is needed.
Abstract:
Background: Chronic leg ulcers cause long-term ill-health for older adults and the condition places a significant burden on health service resources. Although evidence on effective management of the condition is available, a significant evidence-practice gap is known to exist, with many suggested reasons, e.g. multiple care providers and the costs of care and treatments. This study aimed to identify effective health service pathways of care which facilitated evidence-based management of chronic leg ulcers. Methods: A sample of 70 patients presenting with a lower limb (leg or foot) ulcer at specialist wound clinics in Queensland, Australia were recruited for an observational study and survey. Retrospective data were collected on demographics, health, medical history, treatments, costs and health service pathways in the previous 12 months. Prospective data were collected on health service pathways, pain, functional ability, quality of life, treatments, wound healing and recurrence outcomes for 24 weeks from admission. Results: Retrospective data indicated that evidence-based guidelines were poorly implemented prior to admission to the study, e.g. only 31% of participants with a lower limb ulcer had an ankle brachial pressure index (ABPI) or duplex assessment in the previous 12 months. On average, participants accessed care 2–3 times/week for 17 weeks from multiple health service providers in the twelve months before admission to the study clinics. Following admission to specialist wound clinics, participants accessed care on average once per week for 12 weeks from a smaller range of providers. The median ulcer duration on admission to the study was 22 weeks (range 2–728 weeks). Following admission to wound clinics, implementation of key indicators of evidence-based care increased (p<0.001) and Kaplan-Meier survival analysis found the median time to healing was 12 weeks (95% CI 9.3–14.7). Implementation of evidence-based care was significantly related to improved healing outcomes (p<0.001). Conclusions: This study highlights the complexities involved in accessing expertise and evidence-based wound care for adults with chronic leg or foot ulcers. Results demonstrate that access to wound management expertise can promote streamlined health services and evidence-based wound care, leading to efficient use of health resources and improved health.
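For readers unfamiliar with the Kaplan-Meier estimate used above, the sketch below shows how a median time to healing (with censoring at the end of follow-up) might be computed with the lifelines package; the simulated durations and the 24-week censoring rule are illustrative assumptions, not the study's data or code.

```python
# Illustrative sketch (not the study's code) of estimating a median time to
# healing with a Kaplan-Meier curve, using the lifelines package. The simulated
# durations and the 24-week censoring rule are assumptions for illustration.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.utils import median_survival_times

rng = np.random.default_rng(1)
n = 70
weeks = rng.exponential(scale=14, size=n)   # simulated weeks until healing
healed = weeks <= 24                        # healing observed within follow-up?
weeks = np.minimum(weeks, 24)               # censor at the 24-week study end

kmf = KaplanMeierFitter()
kmf.fit(durations=weeks, event_observed=healed, label="time to healing")
print(f"median time to healing: {kmf.median_survival_time_:.1f} weeks")
print(median_survival_times(kmf.confidence_interval_))  # 95% CI for the median
```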
Abstract:
Background Overweight and obesity have become a serious public health problem in many parts of the world. Studies suggest that making small changes in daily activity levels, such as “breaking up” sedentary time (i.e., standing), may help mitigate the health risks of sedentary behavior. The aim of the present study was to examine time spent in standing (determined by count threshold), lying, and sitting postures (determined by inclinometer function) via the ActiGraph GT3X among sedentary adults with differing weight status based on body mass index (BMI) categories. Methods Participants included 22 sedentary adults (14 men, 8 women; mean age 26.5 ± 4.1 years). All subjects completed the self-report International Physical Activity Questionnaire to determine time spent sitting over the previous 7 days. Participants were included if they spent seven or more hours sitting per day. Postures were determined with the ActiGraph GT3X inclinometer function. Participants were instructed to wear the accelerometer for 7 consecutive days (24 h a day). BMI was categorized as: 18.5 to <25 kg/m2 as normal, 25 to <30 kg/m2 as overweight, and ≥30 kg/m2 as obese. Results After adjustment for moderate-to-vigorous intensity physical activity and wear-time, participants in the normal weight (n = 10) and overweight (n = 6) groups spent significantly more time standing (6.7 h and 7.3 h, respectively) and less time sitting (7.1 h and 6.9 h, respectively) than those in the obese group (n = 6; 5.5 h standing and 8.0 h sitting) (p < 0.001). There were no significant differences in standing and sitting time between the normal weight and overweight groups (p = 0.051 and p = 0.670, respectively). Differences among groups in lying time were not significant (p = 0.55). Conclusion This study described postural allocations (standing, lying, and sitting) among normal weight, overweight, and obese sedentary adults. The results provide additional evidence for the use of increased standing time in obesity prevention strategies.
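The BMI cut-points listed in the Methods translate directly into a simple classification rule; a minimal sketch follows (the function name and example values are ours, not the study's):

```python
# The BMI cut-points in the Methods expressed as a simple classifier; the
# function name and example values are illustrative only.
def bmi_category(weight_kg: float, height_m: float) -> str:
    """Classify weight status using the cut-points quoted in the abstract."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"   # below the 'normal' band used in the study
    if bmi < 25:
        return "normal"        # 18.5 to <25 kg/m2
    if bmi < 30:
        return "overweight"    # 25 to <30 kg/m2
    return "obese"             # >=30 kg/m2

print(bmi_category(70, 1.75))  # BMI ~22.9 -> 'normal'
```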
Abstract:
The present study examined the historical basis of the Australian disability income support system from 1908 to 2007. Although designed as a safety net for people with a disability, the disability income support system within Australia has been highly targeted. The original eligibility criteria of "permanently incapacitated for work", the medical criteria and the later "partially capacitated for work" criteria potentially contained ideological inferences that permeated across the time period. This represents an important area for study given the potential for disability income support to marginalise people with a disability. Social policy and disability policy theorists, including Saunders (2007, Social Policy Research Centre [SPRC]) and Gibilisco (2003), have provided valuable insight into some of the effects of disability policy and poverty. Yet while these theorists argued for some form of income support, they did not propose a specific form of income security for further exploration. Few studies have undertaken a comprehensive review of the history of disability income support within the Australian context. This thesis sought to redress these gaps by examining disability income support policy within Australia. The research design consisted of an in-depth critical historical-comparative policy analysis methodology. The use of critical historical-comparative policy analysis allowed the researcher to trace the construction of disability within Australian disability income support policy across four major historical epochs. A framework was developed specifically to guide analysis of the data. The critical discourse analysis method helped to understand the underlying ideological dimensions that led to the predominance of one particular approach over another. Given this, the research purpose of the study centred on: i. tracing the history of the Australian disability income support system; ii. examining the historical patterns and ideological assumptions over time; and iii. exploring the historical patterns and ideological assumptions underpinning an alternative model (Basic Income) and the extent to which each model promotes the social citizenship of people with a disability. The research commitment to a social-relational ontology and the quest for social change centred on the idea that "there has to be a better way" in the provision of disability income support. This theme of searching for an alternative reality in disability income support policy resonated throughout the thesis. The thesis found that the Australian disability income support system is disabling in nature and generates categories of disability on the basis of ableness; in effect, ableness became a condition for citizenship. The study acknowledged that, in reality, income support provision reflects only one aspect of the disabling nature of society which requires redressing. Although there are inherent tensions in any redistributive strategy, the Basic Income model potentially provides an alternative to the Australian disability income support system, given its grounding in social citizenship. The thesis findings have implications for academics, policy-makers and practitioners in terms of developing better ways to understand disability constructs in disability income support policy. The thesis also makes a contribution in terms of promoting income support policies based on the rights of all people, not just a few.
Abstract:
Alterations in cognitive function are characteristic of the aging process in humans and other animals. However, the nature of these age-related changes in cognition is complex and is likely to be influenced by interactions between genetic predispositions and environmental factors, resulting in dynamic fluctuations within and between individuals. These inter- and intra-individual fluctuations are evident in both so-called normal cognitive aging and at the onset of cognitive pathology. Mild Cognitive Impairment (MCI), thought to be a prodromal phase of dementia, represents perhaps the final opportunity to mitigate cognitive declines that may lead to terminal conditions such as dementia. The prognosis for people with MCI is mixed, with the evidence suggesting that many will remain stable within 10 years of diagnosis, many will improve, and many will transition to dementia. If the characteristics of people who do not progress to dementia from MCI can be identified and replicated in others, it may be possible to reduce or delay dementia onset, thus reducing a growing personal and public health burden. Furthermore, if MCI onset can be prevented or delayed, the burden of cognitive decline in aging populations worldwide may be reduced. A cognitive domain that is sensitive to the effects of advancing age, and declines in which have been shown to presage the onset of dementia in MCI patients, is executive function. Moreover, environmental factors such as diet and physical activity have been shown to affect performance on tests of executive function. For example, improvements in executive function have been demonstrated as a result of increased aerobic and anaerobic physical activity and, although the evidence is not as strong, findings from dietary interventions suggest certain nutrients may preserve or improve executive functions in old age. These encouraging findings have been demonstrated in older adults with MCI and their non-impaired peers. However, there are some gaps in the literature that need to be addressed. For example, little is known about the effect on cognition of an interaction between diet and physical activity. Both are important contributors to health and wellbeing, and a growing body of evidence attests to their importance in mental and cognitive health in aging individuals. Yet physical activity and diet are rarely considered together in the context of cognitive function. There is also little known about potential underlying biological mechanisms that might explain the physical activity/diet/cognition relationship. The first aim of this program of research was to examine the individual and interactive roles of physical activity and diet, specifically long chain polyunsaturated fatty acid (LCn3) consumption, as predictors of MCI status. The second aim was to examine executive function in MCI in the context of the individual and interactive effects of physical activity and LCn3. A third aim was to explore the role of immune and endocrine system biomarkers as possible mediators in the relationship between LCn3, physical activity and cognition. Study 1a was a cross-sectional analysis of MCI status as a function of the interaction between physical activity and erythrocyte proportions of LCn3. The marine-based LCn3s eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) have both received support in the literature as having cognitive benefits, although comparisons of the relative benefits of EPA or DHA, particularly in relation to the aetiology of MCI, are rare.
Furthermore, a limited amount of research has examined the cognitive benefits of physical activity in terms of MCI onset. No studies have examined the potential interactive benefits of physical activity and either EPA or DHA. Eighty-four male and female adults aged 65 to 87 years, 50 with MCI and 34 without, participated in Study 1a. A binary logistic regression was conducted with MCI status as the dependent variable, and the individual and interactive relationships between physical activity and either EPA or DHA as predictors. Physical activity was measured using a questionnaire, and specific physical activity categories were weighted according to the metabolic equivalents (METs) of each activity to create a physical activity intensity index (PAI). A significant relationship was identified between MCI outcome and the interaction between the PAI and EPA; participants with a higher PAI and higher erythrocyte proportions of EPA were more likely to be classified as non-MCI than their less active peers with less EPA. Study 1b was a randomised controlled trial using the participants from Study 1a who were identified with MCI. Given the importance of executive function as a determinant of progression to more severe forms of cognitive impairment and dementia, Study 1b aimed to examine the individual and interactive effects of physical activity and supplementation with either EPA or DHA on executive function in a sample of older adults with MCI. Fifty male and female participants were randomly allocated to supplementation groups to receive 6 months of supplementation with EPA, DHA, or linoleic acid (LA), a long chain polyunsaturated omega-6 fatty acid not known for its cognitive enhancing properties. Physical activity was measured using the PAI from Study 1a at baseline and follow-up. Executive function was measured using five tests thought to measure different executive function domains. Erythrocyte proportions of EPA and DHA were higher at follow-up; however, the PAI was not significantly different. There was also a significant improvement in three of the five executive function tests at follow-up. However, regression analyses revealed that none of the variance in executive function at follow-up was predicted by EPA, DHA, the PAI, the EPA by PAI interaction, or the DHA by PAI interaction. The absence of an effect may be due to a small sample resulting in limited power to find an effect, the lack of change in physical activity over time in terms of volume and/or intensity, or a combination of both reduced power and no change in physical activity. Several possible explanations for the absence of an effect in Study 1b were considered, and from this it was hypothesised that post-onset interventions with LCn3, either alone or in interaction with self-reported physical activity, may not be beneficial in MCI. Study 2a was therefore a cross-sectional study using cognitively unimpaired older adults to examine the individual and interactive effects of LCn3 and the PAI on executive function; executive function responses to the individual and interactive effects of physical activity and LCn3 were examined in a sample of older male and female adults without cognitive impairment (n = 50). A further aim of Study 2a was to operationalise executive function using principal components analysis (PCA) of several executive function tests.
This approach was used firstly as a data reduction technique to overcome the task impurity problem, and secondly to examine the executive function structure of the sample for evidence of de-differentiation. Two executive function components were identified as a result of the PCA (EF 1 and EF 2). However, EPA, DHA, the PAI, and the EPA by PAI and DHA by PAI interactions did not account for any variance in the executive function components in subsequent hierarchical multiple regressions. Study 2b was an exploratory correlational study designed to explore the possibility that immune and endocrine system biomarkers may act as mediators of the relationship between LCn3, the PAI, the interaction between LCn3 and the PAI, and executive functions. Insulin-like growth factor-1 (IGF-1), an endocrine system growth hormone, and interleukin-6 (IL-6), an immune system cytokine involved in the acute inflammatory response, have both been shown to affect cognition, including executive functions. Moreover, IGF-1 and IL-6 have been shown to be antithetical insofar as chronically increased IL-6 has been associated with reduced IGF-1 levels, a relationship that has been linked to age-related morbidity. Further, physical activity and LCn3 have been shown to modulate levels of both IGF-1 and IL-6. Thus, it is possible that the cognitive enhancing effects of LCn3, physical activity or their interaction are mediated by changes in the balance between IL-6 and IGF-1. Partial and non-parametric correlations were conducted in a subsample of participants from Study 2a (n = 13) to explore these relationships. Correlations of interest did not reach significance; however, the coefficients were quite large for several relationships, suggesting that studies with larger samples may be warranted. In summary, the current program of research found some evidence supporting an interaction between EPA, but not DHA, and higher energy expenditure via physical activity in differentiating between older adults with and without MCI. However, an RCT examining executive function in older adults with MCI found no support for increasing EPA or DHA while maintaining current levels of energy expenditure. Furthermore, a cross-sectional study examining executive function in older adults without MCI found no support for better executive function performance as a function of increased EPA or DHA consumption, greater energy expenditure via physical activity, or an interaction between physical activity and either EPA or DHA. Finally, an examination of endocrine and immune system biomarkers revealed promising relationships in terms of executive function in non-MCI older adults, particularly with respect to LCn3 and physical activity. Taken together, these findings demonstrate a potential benefit of increasing physical activity and LCn3 consumption, particularly EPA, in mitigating the risk of developing MCI. In contrast, no support was found for a benefit to executive function as a result of increased physical activity, LCn3 consumption or an interaction between physical activity and LCn3, in participants with and without MCI. These results are discussed with reference to previous findings in the literature, including possible limitations and opportunities for future research.
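As a rough sketch of the data-reduction step described for Study 2a, the code below standardises a set of executive function test scores and extracts two principal components with scikit-learn; the test names and simulated scores are hypothetical placeholders, not the study's battery or data.

```python
# A rough sketch of the PCA data-reduction step: standardise several executive
# function test scores and extract two principal components (EF 1 and EF 2).
# The column names and scores are simulated placeholders.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
scores = pd.DataFrame(
    rng.normal(size=(50, 5)),
    columns=["test_a", "test_b", "test_c", "test_d", "test_e"],  # hypothetical tests
)

z_scores = StandardScaler().fit_transform(scores)
pca = PCA(n_components=2)
component_scores = pca.fit_transform(z_scores)   # per-participant EF 1 and EF 2 scores

loadings = pd.DataFrame(pca.components_.T, index=scores.columns, columns=["EF 1", "EF 2"])
print(loadings.round(2))
print("variance explained:", pca.explained_variance_ratio_.round(2))
```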
Abstract:
Background The Achenbach Child Behaviour Checklist (CBCL/YSR) is a widely used screening tool for affective problems. Several studies report good association between the checklists and psychiatric diagnoses, although with varying degrees of agreement. Most are cross-sectional studies involving adolescents referred to mental health services. This paper aims to evaluate the performance of the Youth Self Report (YSR) empirical and DSM-oriented internalising scales in predicting later depressive disorders in young adults. Methods The sample comprised 2431 young adults from an Australian birth cohort study. The strength of association between the empirical and DSM-oriented scales assessed at 14 and 21 years and structured-interview-derived depression in young adulthood (18 to 22 years) was tested using odds ratios, ROC analyses and related diagnostic efficiency tests (sensitivity, specificity, positive and negative predictive values). Results Adolescents with internalising symptoms were twice (OR 2.3, 95% CI 1.7 to 3.1) as likely to be diagnosed with DSM-IV depression by age 21. Use of the DSM-oriented depressive scales did not improve the concordance between internalising behaviour and DSM-IV diagnosed depression at age 14 (ORs ranged from 1.9 to 2.5). Limitations There was some loss to follow-up over the 7-year gap between the two waves. Conclusion DSM-oriented scales perform no better than the standard internalising or anxious/depressed scales in identifying young adults with later DSM-IV depressive disorder.
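One common way an odds ratio and Wald-type 95% confidence interval such as the OR 2.3 (1.7 to 3.1) above can be computed is from a 2×2 table of screening status against later diagnosis; the sketch below uses hypothetical counts and is not necessarily how the study derived its estimates.

```python
# Illustrative only: an odds ratio with a Wald-type 95% CI computed from a 2x2
# table of adolescent internalising status against later DSM-IV depression.
# The counts are hypothetical; this is not necessarily the study's method.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b: exposed with/without outcome; c, d: unexposed with/without outcome."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

or_, lo, hi = odds_ratio_ci(a=60, b=240, c=180, d=1950)
print(f"OR = {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```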
Abstract:
Background The Achenbach problem behaviour scales (CBCL/YSR) are widely used. The DSM-oriented anxiety and depression scales were created to improve concordance between Achenbach's internalising scales and DSM-IV depression and anxiety. To date no study has examined the concurrent utility of the young adult (YASR) internalising scales, either the empirical or the newly developed DSM-oriented depressive or anxiety scales. Methods The sample comprised 2,551 young adults, aged 18–23 years, from an Australian cohort study. The empirical and DSM-oriented anxiety and depression scales were individually assessed against DSM-IV depression and anxiety diagnoses derived from structured interview. Odds ratios, ROC analyses and diagnostic efficiency tests (sensitivity, specificity, positive and negative predictive values) were used to report findings. Results The YASR empirical internalising scale predicted DSM-IV mood disorders (depression OR = 6.9, 95% CI 5.0–9.5; anxiety OR = 5.1, 95% CI 3.8–6.7) in the previous 12 months. The DSM-oriented depressive and anxiety scales did not appear to improve the concordance with DSM-IV diagnosed depression or anxiety. The internalising scales were much more effective at identifying those with comorbid depression and anxiety, with ORs between 10.1 and 21.7 depending on the internalising scale used. Conclusion DSM-oriented scales perform no better than the standard internalising scales in identifying young adults with DSM-IV mood or anxiety disorder.
Abstract:
Recent welfare reform in Australia has been constructed around the now-familiar principle of paid work and willingness to work as the fundamental marker of social citizenship. Beginning with the long-term unemployed in Australia in the mid 1990s, the scope of welfare reform has now extended to include people with a disability – a category of income support recipients that has been growing in Australia. From the national government’s point of view this growth is a financial concern, as it seeks to move as many people as possible into paid work to support the costs of an ageing population (DEWR, 2005). In doing so, the government has changed the meaning of disability in terms of eligibility for financial support from the state, and at the same time redefined the role of people with a disability with regard to work, and the role of the state with regard to the disabled. This has been a matter of some political contention in Australia.
Access to commercial destinations within the neighbourhood and walking among Australian older adults
Abstract:
BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations and none have investigated access to specific, objectively-measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65–84 years and living in the Perth metropolitan region from 2003–2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to and mix of commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40) were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail, financial services, and the mix of commercial destination types within the neighbourhood were all unrelated to walking. CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction, for example eating at a restaurant or church involvement, or provide opportunities for some incidental social contact, for example visiting the pharmacy or hairdresser, were the strongest predictors for walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.
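As a simplified illustration of the destination-access measure described above, the sketch below flags whether any destination of one type lies within 400 m of each residence using straight-line buffers and a spatial join in geopandas; the study used road-network service areas, so the Euclidean approximation, the toy coordinates and the CRS choice here are assumptions for illustration only.

```python
# A simplified sketch of a destination-access flag: does any destination of one
# type lie within 400 m of each residence? The study used road-network service
# areas; the straight-line buffers, toy coordinates and CRS below are
# assumptions for illustration only.
import geopandas as gpd
from shapely.geometry import Point

crs = "EPSG:28350"  # a metre-based projection covering Perth (assumed)
homes = gpd.GeoDataFrame({"home_id": [1, 2]},
                         geometry=[Point(0, 0), Point(1500, 300)], crs=crs)
food_retail = gpd.GeoDataFrame({"dest_id": [10, 11]},
                               geometry=[Point(350, 100), Point(2200, 250)], crs=crs)

buffers = homes.copy()
buffers["geometry"] = homes.geometry.buffer(400)   # 400 m straight-line buffer
joined = gpd.sjoin(buffers, food_retail, how="left", predicate="intersects")
homes["food_retail_400m"] = (
    joined.groupby("home_id")["dest_id"].count().ge(1).astype(int).values
)
print(homes[["home_id", "food_retail_400m"]])      # 1 = at least one destination within 400 m
```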
Abstract:
"The National Disability Insurance Scheme (NDIS) was launched on 1 July 2013. The NDIS Act 2013 is an historic piece of legislation that is the foundation for a national scheme which will deliver meaningful change for people with disabilities across Australia. The NDIS seeks to support the independence and social and economic participation of people with a disability, mainly by funding the provision of reasonable and necessary supports, including early intervention supports. The NDIS establishes three main criteria for access to the scheme - age, residence and disability. The National Disability Insurance Scheme Handbook written by Bill Madden, Janine McIlwraith and Ruanne Brell examines the NDIS from the viewpoint of a person seeking to access the NDIS and those advising or assisting them. The three key criteria are examined, along with the powers of the NDIS Chief Executive Officer and the scope for review of adverse decisions. The important area of interplay between the NDIS and compensation entitlements is carefully scrutinised. This handbook provides scheme users, carers, lawyers and health practitioners with an easy to understand guide to this watershed legal development."--Publisher website