848 results for goodwill impairment
Abstract:
The majority of businesses sold in Australia each year are small to medium enterprises. Administering contracts for the sale of a business presents very different challenges from, for example, contracts for the sale of goods alone or contracts for the sale of land. The subject matter comprises both real and personal, and tangible and intangible, property. Other considerations that do not arise in those more commonplace contracts include dealing with employees who are remaining as well as those departing, taking account of restraints of trade, and the fact that property passes differently for the different forms of property transferred under the same contract. In keeping with the format of the previous edition, the book is written with the busy practitioner in mind. It deals with the formation of business contracts, all aspects of disclosure, both contractual and statutory, and the role of agents, and gives detailed consideration to the different types of subject matter of small business contracts, including the lease of the premises, intellectual property, goodwill, licences, book debts, and plant and equipment. It provides up-to-date treatment of the income tax implications of the sale and of the impact of the latest Commonwealth legislation on dealing with employees of a business on sale. Consistent with the last edition, the book has chapters on time of the essence and completion, personal securities, restraint of trade clauses, special conditions, and remedies for breach by both parties and for misleading or deceptive conduct by the seller. In relation to personal securities, the current State and Territory based law on Bills of Sale and other chattel securities is the subject of commentary, and the proposed national reform agenda is also commented upon, although that legislation is not due until May 2010 at the earliest.
Abstract:
Irritable bowel syndrome (IBS) is a common chronic disorder with a prevalence ranging from 5 to 10% of the world's population. This condition is characterised by abdominal discomfort or pain, altered bowel habits, and often bloating and abdominal distension. IBS reduces quality of life to the same degree as major chronic diseases such as congestive heart failure and diabetes, and the economic burden on the health care system and society is high. Abnormalities have been reported in the neuroendocrine peptides/amines of the stomach and the small and large intestine in patients with IBS. These abnormalities would cause disturbances in digestion, gastrointestinal motility and visceral hypersensitivity, all of which have been reported in patients with IBS. They appear to contribute to symptom development and to play a central role in the pathogenesis of IBS. Neuroendocrine peptides/amines are potential tools in the treatment and diagnosis of IBS. In particular, the density of duodenal chromogranin A expressing cells appears to be a good histopathological marker for the diagnosis of IBS, with high sensitivity and specificity.
Abstract:
The world of disability is often neglected or taken for granted in able-bodied society. Beyond the argument that disability is a social construct (Linton, 1998, 2006; Longmore, 2003; Thompson, 1997), people with disability are affected in that they feel left out or that they do not belong in the larger community, while the able-bodied community is left with little knowledge of, or sensitivity towards, people with disability. These internal whirlpools contribute nothing to the community except larger gaps and greater differences between groups of people. Peace (2010) claims that disability is something imposed on a person on top of a physical impairment. Nord (2008) argues that environmental barriers and social attitudes are crucial aspects of a person's experience and can themselves disable a person. The study reported here highlights what home means for people with disability and their family members. The way a person with disability and family members without disability share the same home and nurture personal relationships with each other demands greater attention. This research sheds light on the intricate relationships that exist between family members, including the person with disability, and their built environment. These existential connections provide a holistic viewpoint and a glimpse into the lived experiences of homes for people with disability and their care-givers. Concepts of universal design or barrier-free design have not been successful (Connell and Sanford, 1999) in revealing in depth the nature of place-making for people with disability and their care-givers. Such studies fail to incorporate the holistic needs of individuals with disability and their family members in terms of their bodily, visceral, emotional, social, psycho-social, intuitive, spiritual and temporal needs, to name a few (Franz & Bitner, 2010).
This paper reports on some preliminary findings on phenomena of dwelling for people with different kinds of disability and their care-givers sharing the same home from an interior design perspective.
Abstract:
Whilst the debilitating fatigue experienced by patients suffering from Chronic Fatigue Syndrome (CFS) results in a marked subjective impairment in functioning, little research has investigated the impact of this disorder on quality of life. Forty-seven subjects with a confirmed diagnosis of CFS and 30 healthy controls were compared using the Sickness Impact Profile (SIP). A subgroup of subjects was interviewed regarding the impact CFS has had on their social and family relationships, work and recreational activities. Results from both the SIP and the interviews revealed that CFS subjects had significantly impaired quality of life, especially in areas of social functioning. These findings highlight the importance of addressing the social isolation and loss of role functioning experienced by CFS sufferers.
Abstract:
Background: Anecdotal evidence from the infrastructure and building sectors highlights issues of alcohol and other drug (AOD) use and its association with safety risk on construction sites. Currently, there is no clear evidence on the prevalence and risk of AOD use among Australian construction workers, and there is limited evidential guidance regarding how to address the issue effectively. Aims: The current research aims to scientifically evaluate the use of AODs within the Australian construction industry in order to reduce the potential resulting safety and performance impacts and engender a cultural change in the workforce. A nationally consistent and collaborative approach across the workforce will be adopted. Methods: A national assessment of the use of AODs was conducted in participating organisations across three states. The World Health Organisation's Alcohol Use Disorders Identification Test (AUDIT) was used to measure alcohol use. Illicit drug use, 'readiness to change', impediments to reducing impairment, the feasibility of proposed interventions, and employee attitudes and knowledge regarding AODs were also measured through a combination of survey items and interviews. Through an educative approach and consultation with employers, employees, union groups and leaders in applied AOD research, this assessment was used to inform and support cultural change management of AOD use in the industry. Results: Results (n=494) indicate that, as in the general population, a proportion of those sampled in the construction sector may be at risk of hazardous alcohol consumption. A total of 286 respondents (58%) scored above the cut-off cumulative score for risky or hazardous alcohol use. Other drug use was also identified as a major issue. Interview responses and input from all project partners are presented within a guiding principle framework for cultural change.
Conclusions: Results support the need for evidence-based, comprehensive and tailored responses in the workplace. This paper will discuss the final results in the context of facilitating cultural change in the construction industry.
Abstract:
Alterations in cognitive function are characteristic of the aging process in humans and other animals. However, the nature of these age-related changes in cognition is complex and is likely to be influenced by interactions between genetic predispositions and environmental factors, resulting in dynamic fluctuations within and between individuals. These inter- and intra-individual fluctuations are evident both in so-called normal cognitive aging and at the onset of cognitive pathology. Mild Cognitive Impairment (MCI), thought to be a prodromal phase of dementia, represents perhaps the final opportunity to mitigate cognitive declines that may lead to terminal conditions such as dementia. The prognosis for people with MCI is mixed, with the evidence suggesting that many will remain stable within 10 years of diagnosis, many will improve, and many will transition to dementia. If the characteristics of people who do not progress to dementia from MCI can be identified and replicated in others, it may be possible to reduce or delay dementia onset, thus reducing a growing personal and public health burden. Furthermore, if MCI onset can be prevented or delayed, the burden of cognitive decline in aging populations worldwide may be reduced. A cognitive domain that is sensitive to the effects of advancing age, and declines in which have been shown to presage the onset of dementia in MCI patients, is executive function. Moreover, environmental factors such as diet and physical activity have been shown to affect performance on tests of executive function. For example, improvements in executive function have been demonstrated as a result of increased aerobic and anaerobic physical activity and, although the evidence is not as strong, findings from dietary interventions suggest certain nutrients may preserve or improve executive functions in old age. These encouraging findings have been demonstrated in older adults with MCI and their non-impaired peers.
However, there are some gaps in the literature that need to be addressed. For example, little is known about the effect on cognition of an interaction between diet and physical activity. Both are important contributors to health and wellbeing, and a growing body of evidence attests to their importance in mental and cognitive health in aging individuals. Yet physical activity and diet are rarely considered together in the context of cognitive function. Little is also known about the potential underlying biological mechanisms that might explain the physical activity/diet/cognition relationship. The first aim of this program of research was to examine the individual and interactive roles of physical activity and diet, specifically long chain polyunsaturated fatty acid (LCn3) consumption, as predictors of MCI status. The second aim was to examine executive function in MCI in the context of the individual and interactive effects of physical activity and LCn3. A third aim was to explore the role of immune and endocrine system biomarkers as possible mediators in the relationship between LCn3, physical activity and cognition. Study 1a was a cross-sectional analysis of MCI status as a function of the interaction between physical activity and erythrocyte proportions of LCn3. The marine-based LCn3s eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) have both received support in the literature as having cognitive benefits, although comparisons of the relative benefits of EPA and DHA, particularly in relation to the aetiology of MCI, are rare. Furthermore, only a limited amount of research has examined the cognitive benefits of physical activity in terms of MCI onset. No studies have examined the potential interactive benefits of physical activity and either EPA or DHA. Eighty-four male and female adults aged 65 to 87 years, 50 with MCI and 34 without, participated in Study 1a.
A binary logistic regression was conducted with MCI status as the dependent variable, and the individual and interactive relationships between physical activity and either EPA or DHA as predictors. Physical activity was measured using a questionnaire, and specific physical activity categories were weighted according to the metabolic equivalents (METs) of each activity to create a physical activity intensity index (PAI). A significant relationship was identified between MCI outcome and the interaction between the PAI and EPA; participants with a higher PAI and higher erythrocyte proportions of EPA were more likely to be classified as non-MCI than their less active peers with less EPA. Study 1b was a randomised controlled trial using the participants from Study 1a who were identified with MCI. Given the importance of executive function as a determinant of progression to more severe forms of cognitive impairment and dementia, Study 1b aimed to examine the individual and interactive effects of physical activity and supplementation with either EPA or DHA on executive function in a sample of older adults with MCI. Fifty male and female participants were randomly allocated to supplementation groups to receive 6 months of supplementation with EPA, DHA, or linoleic acid (LA), a long chain polyunsaturated omega-6 fatty acid not known for cognitive enhancing properties. Physical activity was measured using the PAI from Study 1a at baseline and follow-up. Executive function was measured using five tests thought to measure different executive function domains. Erythrocyte proportions of EPA and DHA were higher at follow-up; however, the PAI was not significantly different. There was also a significant improvement in three of the five executive function tests at follow-up. However, regression analyses revealed that none of the variance in executive function at follow-up was predicted by EPA, DHA, the PAI, the EPA by PAI interaction, or the DHA by PAI interaction.
The absence of an effect may be due to a small sample resulting in limited power to find an effect, the lack of change in physical activity over time in terms of volume and/or intensity, or a combination of both. Study 2a was a cross-sectional study using cognitively unimpaired older adults to examine the individual and interactive effects of LCn3 and the PAI on executive function. Several possible explanations for the absence of an effect were identified. From this consideration of alternative explanations, it was hypothesised that post-onset interventions with LCn3, either alone or in interaction with self-reported physical activity, may not be beneficial in MCI. Thus, executive function responses to the individual and interactive effects of physical activity and LCn3 were examined in a sample of older male and female adults without cognitive impairment (n = 50). A further aim of Study 2a was to operationalise executive function using principal components analysis (PCA) of several executive function tests. This approach was used firstly as a data reduction technique to overcome the task impurity problem, and secondly to examine the executive function structure of the sample for evidence of de-differentiation. Two executive function components were identified as a result of the PCA (EF 1 and EF 2). However, EPA, DHA, the PAI, and the EPA by PAI and DHA by PAI interactions did not account for any variance in the executive function components in subsequent hierarchical multiple regressions. Study 2b was an exploratory correlational study designed to explore the possibility that immune and endocrine system biomarkers may act as mediators of the relationship between LCn3, the PAI, the interaction between LCn3 and the PAI, and executive functions.
Insulin-like growth factor-1 (IGF-1), an endocrine system growth hormone, and interleukin-6 (IL-6), an immune system cytokine involved in the acute inflammatory response, have both been shown to affect cognition, including executive functions. Moreover, IGF-1 and IL-6 have been shown to be antithetical insofar as chronically increased IL-6 has been associated with reduced IGF-1 levels, a relationship that has been linked to age-related morbidity. Further, physical activity and LCn3 have been shown to modulate levels of both IGF-1 and IL-6. Thus, it is possible that the cognitive enhancing effects of LCn3, physical activity or their interaction are mediated by changes in the balance between IL-6 and IGF-1. Partial and non-parametric correlations were conducted in a subsample of participants from Study 2a (n = 13) to explore these relationships. Correlations of interest did not reach significance; however, the coefficients were quite large for several relationships, suggesting that studies with larger samples may be warranted. In summary, the current program of research found some evidence supporting an interaction between EPA (but not DHA) and higher energy expenditure via physical activity in differentiating between older adults with and without MCI. However, an RCT examining executive function in older adults with MCI found no support for increasing EPA or DHA while maintaining current levels of energy expenditure. Furthermore, a cross-sectional study examining executive function in older adults without MCI found no support for better executive function performance as a function of increased EPA or DHA consumption, greater energy expenditure via physical activity, or an interaction between physical activity and either EPA or DHA. Finally, an examination of endocrine and immune system biomarkers revealed promising relationships in terms of executive function in non-MCI older adults, particularly with respect to LCn3 and physical activity.
Taken together, these findings demonstrate a potential benefit of increasing physical activity and LCn3 consumption, particularly EPA, in mitigating the risk of developing MCI. In contrast, no support was found for a benefit to executive function as a result of increased physical activity, LCn3 consumption or an interaction between physical activity and LCn3, in participants with and without MCI. These results are discussed with reference to previous findings in the literature including possible limitations and opportunities for future research.
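The PCA-based data reduction used in Study 2a to operationalise executive function can be sketched as follows. This is an illustrative outline only: the function name, the use of a correlation-matrix eigendecomposition, and the synthetic data are assumptions for demonstration, not details taken from the thesis.

```python
import numpy as np

def pca_components(scores, n_components=2):
    """Extract principal components from a participants-by-tests score matrix.

    Standardises each test, eigendecomposes the resulting correlation
    matrix, and returns component scores, loadings, and the proportion
    of variance explained by each retained component.
    """
    # Standardise each test (column) to mean 0, SD 1.
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0, ddof=1)
    # Covariance of standardised data equals the correlation matrix.
    corr = np.cov(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)  # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_components]
    loadings = eigvecs[:, order]
    component_scores = z @ loadings
    explained = eigvals[order] / eigvals.sum()
    return component_scores, loadings, explained

# Hypothetical example: 50 participants, 5 executive function tests.
rng = np.random.default_rng(0)
test_scores = rng.normal(size=(50, 5))
ef_scores, ef_loadings, ef_explained = pca_components(test_scores)
```

The two retained components here play the role of EF 1 and EF 2; in practice the number of components to keep would be decided from the eigenvalues or a scree plot rather than fixed in advance.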
Abstract:
There is increasing concern about the impact of employees’ alcohol and other drug (AOD) consumption on workplace safety, particularly within the construction industry. No known study has scientifically evaluated the relationship between the use of drugs and alcohol and safety impacts in construction, and there has been only limited adoption of nationally coordinated strategies, supported by employers and employees to render it socially unacceptable to arrive at a construction workplace with impaired judgment from AODs. This research aims to scientifically evaluate the use of AODs within the Australian construction industry in order to reduce the potential resulting safety and performance impacts and engender a cultural change in the workforce. Using the Alcohol Use Disorders Identification Test (AUDIT), the study will adopt both quantitative and qualitative methods to evaluate the extent of general AOD use in the industry. Results indicate that a proportion of the construction sector may be at risk of hazardous alcohol consumption. A total of 286 respondents (58%) scored above the cut-off score for risky alcohol use with 43 respondents (15%) scoring in the significantly ‘at risk’ category. Other drug use was also identified as a major issue that must be addressed. Results support the need for evidence-based, preventative educational initiatives that are tailored specifically to the construction industry.
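The AUDIT scoring used in this study can be illustrated with a short sketch. This is a minimal illustration assuming the conventional WHO format (10 items, each scored 0 to 4) and the commonly cited risk-zone cutoffs; the function name is hypothetical, and the abstract does not state exactly which cut-off was applied.

```python
def audit_risk_zone(item_scores):
    """Total an AUDIT questionnaire and classify risk.

    Assumes the standard WHO instrument: 10 items scored 0-4 each
    (total 0-40), with the commonly used zone cutoffs of 8, 16 and 20.
    """
    if len(item_scores) != 10 or any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("AUDIT has 10 items, each scored 0-4")
    total = sum(item_scores)
    if total >= 20:
        zone = "possible dependence"
    elif total >= 16:
        zone = "high risk"
    elif total >= 8:
        zone = "risky or hazardous"
    else:
        zone = "low risk"
    return total, zone

# Hypothetical respondent sitting exactly on the risky-use cut-off.
print(audit_risk_zone([2, 1, 1, 0, 1, 2, 1, 0, 0, 0]))  # (8, 'risky or hazardous')
```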
Abstract:
Aim: The objective is to establish the determinants of drink-driving and its association with traffic crashes in Ghana. Methods: A multivariable logistic regression was used to establish significant determinants of drink-driving, and a bivariate logistic regression to establish the association between drink-driving and road traffic crashes in Ghana. Results: In total, 2,736 motorists were randomly stopped for breath testing, of whom 8.7% tested positive for alcohol. Among all participants, 5.5% exceeded the legal BAC limit of 0.08%. Formal education is associated with a reduced likelihood of drink-driving: the propensity to drink-drive is 1.8 times higher among illiterate drivers than among drivers with basic education. Young adult drivers also recorded an elevated likelihood of driving under alcohol impairment compared with adult drivers. The odds of drink-driving among truck drivers are OR=1.81 (95% CI=1.16 to 2.82) and among two-wheeler riders OR=1.41 (95% CI=0.47 to 4.28) compared with car drivers. Contrary to general perception, commercial car drivers have a significantly reduced likelihood, 41% lower (OR=0.59, 95% CI=0.38 to 0.92), compared with private car drivers. Bivariate analysis showed a significant association between the proportion of drivers exceeding the legal BAC limit and road traffic fatalities, p<0.001. The model predicts that a 1% increase in the proportion of drivers exceeding the legal BAC limit is associated with a 4% increase in road traffic fatalities (95% CI=3% to 5%), and vice versa. Conclusion: A positive and significant association between roadside alcohol prevalence and road traffic fatality has been established.
Scaling up roadside breath testing, defining a standard drink and disseminating this to the populace, and formulating policies targeting the youth, such as increasing the minimum legal drinking age and reducing the legal BAC limit for young and novice drivers, might reduce drink-driving related crashes in Ghana.
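The odds ratios with Wald confidence intervals reported above can be computed from a 2x2 exposure table as in this sketch; the function name and the counts in the example are hypothetical, not figures from the Ghanaian study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf formula.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10 of 100 truck drivers vs 10 of 190 car drivers
# testing positive for alcohol.
print(odds_ratio_ci(10, 90, 10, 180))
```

A multivariable logistic regression, as used in the study, adjusts these odds ratios for the other predictors simultaneously; the unadjusted 2x2 calculation above is the bivariate building block.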
Abstract:
The article presents a study which investigated the reasons why advice related to the removal of mats or rugs by older people with visual impairments had a low rate of acceptance. The researchers speculated that this may have been due to older people's need to maintain a sense of control and autonomy and to arrange their environments in a way that they decided, or to a belief that the recommended modification would not reduce the risk of falling. A telephone survey of a subsample of the participants in the Visually Impaired Persons (VIP) Trial was conducted. All 30 interviewees had rugs or mats in their homes. Of the 30 participants, 20 had moved the rugs or mats as a result of the recommendations, and 10 had not.
Abstract:
Multiple sclerosis (MS) is a complex autoimmune disorder of the CNS with both genetic and environmental contributing factors. Clinical symptoms are broadly characterized by initial onset and progressive, debilitating neurological impairment. In this study, RNA from MS chronic active and MS acute lesions was extracted and compared with patient-matched normal white matter by fluorescent cDNA microarray hybridization analysis. This resulted in the identification of 139 genes that were differentially regulated in MS plaque tissue compared to normal tissue. Of these, 69 genes showed a common pattern of expression in the chronic active and acute plaque tissues investigated (P<0.0001, ρ=0.73, by Spearman's ρ analysis), while 70 transcripts were uniquely differentially expressed (≥1.5-fold) in either acute or chronic active tissues. These results included known markers of MS such as myelin basic protein (MBP) and glutathione S-transferase (GST) M1, nerve growth factors such as nerve injury-induced protein 1 (NINJ1), X-ray and excision DNA repair factors (XRCC9 and ERCC5), and X-linked genes such as the ribosomal protein RPS4X. Primers were then designed for seven array-selected genes, including transferrin (TF), superoxide dismutase 1 (SOD1), glutathione peroxidase 1 (GPX1), GSTP1, crystallin alpha-B (CRYAB), phosphomannomutase 1 (PMM1) and tubulin β-5 (TBB5), and real-time quantitative (Q)-PCR analysis was performed. The results of comparative Q-PCR analysis correlated significantly with those obtained by array analysis (r=0.75, P<0.01, by Pearson's bivariate correlation). Both chronic active and acute plaques shared the majority of factors identified, suggesting that quantitative rather than gross qualitative differences in gene expression pattern may define the progression from acute to chronic active plaques in MS.
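The ≥1.5-fold differential expression screen described above can be illustrated with a small sketch. The function, gene names and expression values here are hypothetical, and a real microarray pipeline involves normalisation, replicates and significance testing that this omits.

```python
import math

def differentially_expressed(plaque, normal, threshold=1.5):
    """Flag genes whose expression differs by at least `threshold`-fold
    between lesion tissue and patient-matched normal tissue.

    Inputs are dicts mapping gene name -> expression level (arbitrary
    units). Returns signed fold changes: positive = up-regulated in
    the lesion, negative = down-regulated.
    """
    hits = {}
    for gene, lesion_level in plaque.items():
        ratio = lesion_level / normal[gene]
        fold = ratio if ratio >= 1 else 1 / ratio  # symmetric fold change
        if fold >= threshold:
            hits[gene] = round(math.copysign(fold, ratio - 1), 2)
    return hits

# Hypothetical levels: MBP doubled, GSTM1 halved, ACTB unchanged.
print(differentially_expressed(
    {"MBP": 2.0, "GSTM1": 0.5, "ACTB": 1.1},
    {"MBP": 1.0, "GSTM1": 1.0, "ACTB": 1.0},
))
```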
Abstract:
Purpose: Over 40% of the permanent population of Norfolk Island possesses a unique genetic admixture dating to Pitcairn Island in the late 18th century, with descendants having varying degrees of combined Polynesian and European ancestry. We conducted a population-based study to determine the prevalence and causes of blindness and low vision on Norfolk Island. Methods: All permanent residents of Norfolk Island aged ≥ 15 years were invited to participate. Participants completed a structured questionnaire/interview and underwent a comprehensive ophthalmic examination including slit-lamp biomicroscopy. Results: We recruited 781 people aged ≥ 15 years, equal to 62% of the permanent population, 44% of whom could trace their ancestry to Pitcairn Island. No one was bilaterally blind. The prevalence of unilateral blindness (visual acuity [VA] < 6/60) in those aged ≥ 40 years was 1.5%. Blindness was more common in females (P=0.049) and less common in people with Pitcairn Island ancestry (P<0.001). The most common causes of unilateral blindness were age-related macular degeneration (AMD), amblyopia, and glaucoma. Five people had low vision (best-corrected VA < 6/18 in the better eye), with 4 (80%) due to AMD. People with Pitcairn Island ancestry had a lower prevalence of AMD (P<0.001) but a similar prevalence of glaucoma to those without Pitcairn Island ancestry. Conclusions: The prevalence of blindness and visual impairment in this isolated Australian territory is low, especially amongst those with Pitcairn Island ancestry. AMD was the most common cause of unilateral blindness and low vision. The distribution of chronic ocular diseases on Norfolk Island is similar to mainland Australian estimates.
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory and, on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status. Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented. Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred and twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated between the well-nourished and the malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use. None of these tools has been evaluated in PWP. In the sample described above, the use of the World Health Organisation (WHO) cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for under 65 years, ≤23.5 kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with a sensitivity (Sn) of 94.7% and a specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions. Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by a lack of social support. Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg) increased in both groups. The increase in protein intake was only significant in the SC group. The changes in intake did not differ between the groups.
There were no significant changes in any motor or non-motor symptoms, or in "off" times or dyskinesias, in either group. Aspects of quality of life, especially emotional well-being, improved over the 12 weeks. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease and for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition improved nutritional status and anthropometric indices, with no effect on Parkinson's disease symptoms and a positive effect on quality of life.
Resumo:
The appropriateness of applying drink driving legislation to motorcycle riding has been questioned, as there may be fundamental differences in the effects of alcohol on these two activities. For example, while the distributions of blood alcohol content (BAC) levels among fatally injured male drivers and riders are similar, a greater proportion of motorcycle fatalities involve levels in the lower (0 to .10% BAC) range. Several psychomotor and higher-order cognitive skills underpinning riding performance appear to be significantly influenced by low levels of alcohol. For example, at low levels (.02 to .046% BAC), riders show significant increases in reaction time to hazardous stimuli, inattention to the riding task, performance errors such as leaving the roadway, and a reduced ability to complete a timed course. It has been suggested that alcohol may redirect riders' focus from higher-order cognitive skills to more physical skills such as maintaining balance. As part of a research program to investigate the potential benefits of introducing a zero, or reduced, BAC for all riders in Queensland regardless of their licence status, the effects of low doses of alcohol on balance ability were investigated in a laboratory setting. The static balance of ten experienced riders was measured while they performed either no secondary task, a visual search task, or a cognitive (arithmetic) task following the administration of alcohol (0, 0.02 and 0.05% BAC). Subjective ratings of intoxication and balance impairment increased in a dose-dependent manner; however, objective measures of static balance were negatively affected only at the .05% BAC dose. Performing a concurrent secondary visual search task, but not a purely cognitive (arithmetic) task, improved postural stability across all BAC levels.
Finally, the .05% BAC dose was associated with impaired performance on the cognitive (arithmetic) task, but not the visual search task, when participants were balancing, but neither task was impaired by alcohol when participants were standing on the floor. Implications for road safety and future ‘drink riding’ policy considerations are discussed.
Resumo:
The use of mobile phones while driving is more prevalent among young drivers, a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision while they were engaged in a mobile phone conversation. The CARRS-Q Advanced Driving Simulator was used to test a sample of young drivers on various simulated driving tasks, including an event originating in the driver's peripheral vision in which a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator, each participant completed questionnaires on driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were 21 to 26 years old and split evenly by gender. Drivers' reaction times to the pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Two different model specifications were also tested to account for the structured heterogeneity arising from the repeated-measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best-fitting model and identified four significant variables influencing reaction times: phone condition, driver's age, license type (Provisional license holder or not), and self-reported frequency of handheld phone use while driving. Drivers' reaction times were more than 40% longer in the distracted conditions compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders.
A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated.
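In the AFT framework used in the study above, covariates act multiplicatively on the time scale rather than on the hazard: T = T0·exp(xβ), so exp(β) is a "time ratio". The sketch below illustrates that interpretation with made-up baseline Weibull parameters; the phone-condition coefficient is back-solved from the reported ~40% slowing, as the fitted values are not given in the abstract.

```python
import math

def aft_time_ratio(beta: float) -> float:
    """exp(beta): the factor by which the modelled time (here,
    reaction time) is stretched per one-unit covariate increase."""
    return math.exp(beta)

def weibull_median(scale: float, shape: float) -> float:
    """Median of a Weibull(scale, shape) baseline time distribution."""
    return scale * math.log(2) ** (1.0 / shape)

beta_phone = math.log(1.4)                       # back-solved from ~40% figure
baseline = weibull_median(scale=1.1, shape=2.5)  # hypothetical baseline (s)
distracted = baseline * aft_time_ratio(beta_phone)
print(f"time ratio {aft_time_ratio(beta_phone):.2f}")
print(f"median RT: {baseline:.2f}s baseline vs {distracted:.2f}s on phone")
```

One appeal of AFT over proportional-hazards models here is this direct reading: a coefficient translates straight into "reaction times are X% longer", which is the form the study's headline finding takes.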
Resumo:
BACKGROUND: The prevalence of protein-energy malnutrition in older adults is reported to be as high as 60% and is associated with poor health outcomes. Inadequate feeding assistance and mealtime interruptions may contribute to malnutrition and poor nutritional intake during hospitalisation. Despite such strategies being widely implemented in practice in the United Kingdom and increasingly in Australia, there have been few studies examining the impact of Protected Mealtimes and dedicated feeding assistant roles on the nutritional outcomes of elderly inpatients.

AIMS: The aim of this research was to implement and compare three system-level interventions designed to specifically address mealtime barriers and improve the energy intakes of medical inpatients aged ≥65 years. This research also aimed to evaluate the sustainability of any changes to mealtime routines six months post-intervention and to gain an understanding of staff perceptions of the post-intervention mealtime experience.

METHODS: Three mealtime assistance interventions were implemented in three medical wards at the Royal Brisbane and Women's Hospital:
AIN-only: additional assistant-in-nursing (AIN) with a dedicated nutrition role.
PM-only: multidisciplinary approach to meals, including Protected Mealtimes.
PM+AIN: combined intervention: AIN plus the multidisciplinary approach to meals.
An action research approach was used to carefully design and implement the three interventions in partnership with ward staff and managers. Significant time was spent in consultation with staff throughout the implementation period to facilitate ownership of the interventions and increase the likelihood of successful implementation. A pre-post design was used to compare the implementation and nutritional outcomes of each intervention with a pre-intervention group.
Using the same wards, eligible participants (medical inpatients aged ≥65 years) were recruited to the pre-intervention group between November 2007 and March 2008 and to the intervention groups between January and June 2009. The primary nutritional outcome was daily energy and protein intake, determined by visually estimating plate waste at each meal and mid-meal on Day 4 of admission. Energy and protein intakes were compared between the pre- and post-intervention groups. Data were collected on a range of covariates (demographics, nutritional status and known risk factors for poor food intake), which allowed for multivariate analysis of the impact of the interventions on nutritional intake. The provision of mealtime assistance to participants and the activities of ward staff (including mealtime interruptions) were observed in the pre-intervention and intervention groups, with staff observations repeated six months post-intervention. Focus groups were conducted with nursing and allied health staff in June 2009 to explore their attitudes and behaviours in response to the three mealtime interventions; these discussions were analysed using thematic analysis.

RESULTS: A total of 254 participants were recruited to the study (pre-intervention: n=115, AIN-only: n=58, PM-only: n=39, PM+AIN: n=42). Participants had a mean age of 80 years (SD 8); 40% (n=101) were malnourished on hospital admission, 50% (n=108) had anorexia and 38% (n=97) required some assistance at mealtimes. Occasions of mealtime assistance increased significantly in all interventions (p<0.01); however, no change was seen in mealtime interruptions. No significant difference was seen in mean total energy and protein intake between the pre-intervention and intervention groups.
However, when total kilojoule intake was compared with estimated requirements at the individual level, participants in the intervention groups were more likely to achieve an adequate energy intake (OR=3.4, p=0.01), with no difference between interventions (p=0.29). Despite small improvements in nutritional adequacy, the majority of participants in the intervention groups (76%, n=103) had energy intakes inadequate to meet their estimated requirements. Patients with cognitive impairment or feeding dependency appeared to gain substantial benefit from the mealtime assistance interventions. The increase in occasions of mealtime assistance by nursing staff during the intervention period was maintained six months post-intervention. Staff focus groups highlighted the importance of clearly designating and defining mealtime responsibilities in order to provide adequate mealtime care. While the purpose of the dedicated feeding assistant was to increase levels of mealtime assistance, staff indicated that responsibility for mealtime duties may have merely shifted from nursing staff to the assistant. Implementing the multidisciplinary interventions empowered nursing staff to "protect" the mealtime from external interruptions, but further work is required to empower nurses to prioritise mealtime activities within their own work schedules. Staff reported an increase in the profile of nutritional care on all wards, with additional non-nutritional benefits noted, including improved mobility and functional independence and better identification of swallowing difficulties.

IMPLICATIONS: The PhD research provides clinicians with practical strategies to immediately introduce change to deliver better mealtime care in the hospital setting and, as such, has initiated local and state-wide roll-out of mealtime assistance programs.
Improved nutritional intakes of elderly inpatients were observed; however, given the modest effect size and reducing lengths of hospital stay, better nutritional outcomes may be achieved by targeting the hospital-to-home transition period. Findings from this study suggest that mealtime assistance interventions for elderly inpatients with cognitive impairment and/or functional dependency show promise.
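The adequacy outcome above compares each patient's measured kilojoule intake against an individually estimated requirement rather than against a group mean. A minimal sketch of that classification follows; the 120 kJ/kg/day figure is a commonly cited ballpark for older inpatients and stands in for the thesis's actual estimation method, which the abstract does not specify.

```python
def energy_adequate(intake_kj: float, weight_kg: float,
                    req_kj_per_kg: float = 120.0) -> bool:
    """True if daily intake meets a weight-based estimated requirement
    (assumed here to be req_kj_per_kg kJ per kg body weight per day)."""
    return intake_kj >= weight_kg * req_kj_per_kg

# A 70 kg patient needs ~8400 kJ/day under this assumption.
print(energy_adequate(intake_kj=9000, weight_kg=70))  # True
print(energy_adequate(intake_kj=7500, weight_kg=70))  # False
```

Classifying at the individual level in this way is what lets two groups with similar mean intakes differ markedly in the proportion of patients reaching adequacy, as the study's OR=3.4 result illustrates.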