821 results for SEVERITY OF ILLNESS INDEX
Abstract:
Background: High levels of distress and need for self-care information among patients commencing chemotherapy suggest that current prechemotherapy education is suboptimal. We conducted a randomised, controlled trial of a prechemotherapy education intervention (ChemoEd) to assess its impact on patient distress, treatment-related concerns, and the prevalence, severity, and bother of six chemotherapy side-effects. Patients and methods: One hundred and ninety-two breast, gastrointestinal, and haematologic cancer patients were recruited before the trial closed prematurely (original target: 352). ChemoEd patients received a DVD, a question-prompt list, self-care information, an education consultation ≥24 h before first treatment (intervention 1), telephone follow-up 48 h after first treatment (intervention 2), and a face-to-face review immediately before second treatment (intervention 3). Patient outcomes were measured at baseline (T1: pre-education) and immediately preceding treatment cycles 1 (T2) and 3 (T3). Results: ChemoEd did not significantly reduce patient distress. However, significant decreases in sensory/psychological (P = 0.027) and procedural (P = 0.03) concerns, as well as in the prevalence, severity, and bother of vomiting (all P = 0.001), were observed at T3. In addition, subgroup analysis of patients with elevated distress at T1 indicated a significant decrease (P = 0.035) at T2, but not at T3 (P = 0.055), in ChemoEd patients. Conclusions: ChemoEd holds promise for improving patients' treatment-related concerns and some physical/psychological outcomes; however, further research on more diverse patient populations is required to ensure generalisability.
Abstract:
Since March 2010, legislation in Queensland has specified the type of restraint and the seating row for child passengers under 7 years, according to age. This study explored regional parents' child restraint practices and the influence of their health beliefs on those practices. A brief intercept interview was verbally administered to a convenience sample of parent-drivers (n = 123) in Toowoomba in February 2010, after the announcement of the changes to legislation but prior to enforcement. Parents who agreed to be followed up were then reinterviewed after enforcement began (May–June 2010). The Health Belief Model was used to gauge beliefs about susceptibility to crashing, the likelihood of children being injured in a crash, and the likely severity of injuries. Self-efficacy and perceptions about barriers to, and benefits of, using age-appropriate restraints with children were also assessed. Results: Very high levels of rear seating were reported for children (initial interview 91%; follow-up 100%). Dedicated child restraint use was 96.9% at the initial interview, though 11% of restraints were deemed inappropriate for the child's age. Self-reported restraint practices for children under 7 were used to categorise parental practices as 'Appropriate' (all children in an age-appropriate restraint and the rear seat) or 'Inappropriate' (≥1 child inappropriately restrained). Ninety-four percent of parents were aware of the legislation, but only around one third gave accurate descriptions of the requirements. Nevertheless, 89% of parents were deemed to have 'Appropriate' restraint practices. Parents with 'Inappropriate' practices were significantly more likely than those with 'Appropriate' practices to disagree that child restraints provide better protection for children in a crash than adult seatbelts. For self-efficacy, parents with 'Appropriate' practices were more likely than those with 'Inappropriate' practices to report being 'completely confident' about installing child restraints. The results suggest that efforts to increase appropriate restraint use should aim to better inform parents about the superior protection offered by child restraints compared with adult seatbelts.
Abstract:
Background Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs). Methods Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis. Findings Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa. Interpretation Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world. Funding Bill & Melinda Gates Foundation.
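The comorbidity adjustment described in the Methods (simulating individuals who may have several conditions at once so that combined health loss is not double-counted) can be illustrated with a minimal Monte Carlo sketch. The condition names, prevalences, and disability weights below are invented for illustration, not GBD inputs, and the multiplicative combination rule is one common way of combining disability weights; the actual GBD simulation framework is considerably more elaborate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical conditions with illustrative prevalences and disability
# weights (assumed values, not actual GBD estimates).
prevalence = np.array([0.10, 0.05, 0.02])  # e.g. back pain, depression, COPD
dw         = np.array([0.37, 0.65, 0.38])  # assumed disability weights

n = 1_000_000  # simulated individuals

# Draw independent condition status for each person and condition.
has = rng.random((n, len(prevalence))) < prevalence

# Multiplicative combination: combined DW = 1 - prod(1 - DW_i) over the
# conditions each simulated person actually has.
combined = 1.0 - np.prod(np.where(has, 1.0 - dw, 1.0), axis=1)

# The naive (unadjusted) YLD rate sums prevalence * DW; the simulated rate
# is lower because co-occurring conditions overlap in health loss.
naive = float(prevalence @ dw)
print(f"unadjusted YLD rate per person: {naive:.4f}")
print(f"comorbidity-adjusted rate:      {combined.mean():.4f}")
```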
Abstract:
Current diagnostic methods for assessing the severity of articular cartilage degenerative conditions, such as osteoarthritis, are inadequate. There is also a lack of techniques that can be used for real-time evaluation of the tissue during surgery to inform treatment decisions and eliminate subjectivity. This book, derived from Dr Afara's doctoral research, presents a scientific framework based on near-infrared (NIR) spectroscopy for non-destructive evaluation of articular cartilage health relative to its structural, functional, and mechanical properties. This development is a component of ongoing research on advanced endoscopic diagnostic techniques in the Articular Cartilage Biomechanics Research Laboratory of Professor Adekunle Oloyede at Queensland University of Technology (QUT), Brisbane, Australia.
Abstract:
This study tested the hypothesis that negative symptoms and quality of life in patients with functional psychoses are associated with the family environment. Fifty-seven first-admission patients with functional psychoses were assessed at hospital admission for severity of psychopathology and premorbid adjustment. Relatives residing with the patients rated the family environment on the Family Environment Scale at admission and one month after discharge. Patients made the same ratings after discharge. Six months later, patients were reassessed for severity of psychopathology, negative symptoms, and quality of life. Multiple regression analyses showed that higher levels of positive emotional expressiveness in the family predicted milder and fewer negative symptoms and better quality of life at follow-up. This prediction was statistically independent of initial severity of psychopathology and premorbid adjustment.
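A minimal sketch of the kind of multiple regression analysis the abstract describes: follow-up negative symptoms regressed on family emotional expressiveness while controlling for initial severity and premorbid adjustment. The variable names and values are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per patient (names and values illustrative).
df = pd.DataFrame({
    "neg_symptoms_6mo":  [12, 8, 15, 6, 10, 18, 7, 9, 14, 11],
    "family_expressive": [3.2, 4.1, 2.0, 4.8, 3.5, 1.9, 4.5, 3.9, 2.4, 3.0],
    "baseline_severity": [40, 35, 50, 30, 42, 55, 33, 38, 48, 41],
    "premorbid_adjust":  [2.1, 1.8, 2.9, 1.5, 2.2, 3.1, 1.7, 2.0, 2.7, 2.3],
})

# Follow-up negative symptoms regressed on family emotional expressiveness,
# controlling for initial severity and premorbid adjustment, mirroring the
# "statistically independent" claim in the abstract.
model = smf.ols(
    "neg_symptoms_6mo ~ family_expressive + baseline_severity + premorbid_adjust",
    data=df,
).fit()
print(model.summary())
```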
Abstract:
An increasing body of research highlights the involvement of illicit drugs in many road fatalities. Deterrence theory has been a core conceptual framework underpinning traffic enforcement, as well as interventions designed to reduce road fatalities. Essentially, the effectiveness of deterrence-based approaches is predicated on perceptions of the certainty, severity, and swiftness of apprehension. However, much less is known about how awareness of legal sanctions affects the effectiveness of deterrence mechanisms and whether promoting detection methods can increase the deterrent effect. Nevertheless, the implicit assumption is that individuals who are aware of the legal sanctions will be more deterred. This study explored how awareness of the testing method affects the effectiveness of deterrence-based interventions and intentions to drug drive again in the future. In total, 161 participants who reported drug driving in the previous six months took part. The results show that awareness of testing had a small effect on increasing perceptions of the certainty of apprehension and the severity of punishment. However, awareness was not a significant predictor of intentions to drug drive again in the future. Importantly, higher levels of drug use were a significant predictor of intentions to drug drive in the future. Whilst awareness does have a small effect on deterrence variables, the influence of level of drug use appears to reduce any deterrent effect.
Abstract:
Background It has been proposed that the feral horse foot is a benchmark model for foot health in horses. However, the foot health of feral horses has not been formally investigated. Objectives To investigate the foot health of Australian feral horses and determine whether foot health is affected by environmental factors, such as substrate properties and distance travelled. Methods Twenty adult feral horses from each of five populations (n = 100) were investigated. Populations were selected on the basis of substrate hardness and the amount of travel typical for the population. Feet were radiographed and photographed, and the digital images were surveyed by two experienced assessors blinded to each other's assessment and to the population of origin. Lamellar samples from 15 feet from three populations were examined histologically for evidence of laminitis. Results A total of 377 gross foot abnormalities were identified in the 100 left forefeet; only three of the feet surveyed had no detectable abnormalities. Each population had a comparable prevalence of foot abnormalities, although the type and severity of abnormality varied among populations. Across the three populations examined histologically, the prevalence of chronic laminitis ranged between 40% and 93%. Conclusions Foot health appeared to be affected by the environment inhabited by the horses. The observed chronic laminitis may be attributable to either nutritional or traumatic causes. Given the overwhelming evidence of suboptimal foot health, it may not be appropriate for the feral horse foot to serve as the benchmark model for equine foot health.
Abstract:
Dear Editor, We thank Dr Klek for his interest in our article and for giving us the opportunity to clarify our study and share our thoughts. Our study examined the prevalence of malnutrition in an acute tertiary hospital and tracked outcomes prospectively.1 There are a number of reasons why we chose the Subjective Global Assessment (SGA) to determine the nutritional status of patients. Firstly, we took the view that nutrition assessment tools should be used to determine nutritional status and to diagnose the presence and severity of malnutrition, whereas the purpose of nutrition screening tools is to identify individuals who are at risk of malnutrition. Nutritional assessment, rather than screening, should be used as the basis for planning and evaluating nutrition interventions for those diagnosed with malnutrition. Secondly, the SGA has been well accepted and validated as an assessment tool to diagnose the presence and severity of malnutrition in clinical practice.2, 3 It has been used in many studies as a valid prognostic indicator of a range of nutritional and clinical outcomes.4, 5, 6 In contrast, the Malnutrition Universal Screening Tool (MUST)7 and Nutrition Risk Screening 2002 (NRS 2002)8 have been established as screening rather than assessment tools.
Abstract:
Background: Demand for pre-hospital emergency care is increasing in Australia, as in many other countries. Using post-hoc criteria such as triage, diagnosis and admission status, some authors view a considerable number of these calls as "inappropriate". Yet calling an ambulance at the time of an emergency is rarely studied from the perspective of patients or their carers. This study interviewed patients about the decision, circumstances surrounding, and reasons for calling an ambulance in Queensland, Australia. Methods: A cross-sectional survey of patients attending a sample of eight public hospital emergency departments in Queensland was undertaken between March and May 2011. In total, 911 questionnaires were collected (response rate: 67%); 226 respondents (24.8%) had arrived by ambulance. Results: In 35.6% of ambulance arrivals, the decision to request an ambulance was made by the patient; in 25%, by a doctor; and in 20%, by a family member, friend or carer. Other callers included nurses, people at work or school, and passers-by. Reasons for requesting an ambulance included the urgency (87%) and severity (84%) of the condition. Other reasons included requiring special care (76%), getting higher priority at the emergency department (34%), not having a car (34%), and financial concerns (17%). The decision to request an ambulance varied significantly according to the time of illness onset (e.g. on the day, the week before) and location (e.g. at home, outside). Conclusion: The decision to call an ambulance is made mostly by non-medical professionals in a perceived emergency situation. Callers have different reasons but mainly take into account the patient's welfare and safety. Better understanding of these reasons will inform the direction and effectiveness of demand management strategies.
Abstract:
Context Patients with venous leg ulcers experience multiple symptoms, including pain, depression, and discomfort from lower leg inflammation and wound exudate. Some of these symptoms impair wound healing and decrease quality of life (QOL). The presence of co-occurring symptoms may have a negative effect on these outcomes. The identification of symptom clusters could potentially lead to improvements in symptom management and QOL. Objectives To identify the prevalence and severity of common symptoms and the occurrence of symptom clusters in patients with venous leg ulcers. Methods For this secondary analysis, data on sociodemographic characteristics, medical history, venous history, ulcer and lower limb clinical characteristics, symptoms, treatments, healing, and QOL were analyzed from a sample of 318 patients with venous leg ulcers who were recruited from hospital outpatient and community nursing clinics for leg ulcers. Exploratory factor analysis was used to identify symptom clusters. Results Almost two-thirds (64%) of the patients experienced four or more concurrent symptoms. The most frequent symptoms were sleep disturbance (80%), pain (74%), and lower limb swelling (67%). Sixty percent of patients reported three or more symptoms at a moderate-to-severe level of intensity (e.g., 78% reported disturbed sleep frequently or always; the mean pain severity score was 49 of 100, SD 26.5). Exploratory factor analysis identified two symptom clusters: pain, depression, sleep disturbance, and fatigue; and swelling, inflammation, exudate, and fatigue. Conclusion Two symptom clusters were identified in this sample of patients with venous leg ulcers. Further research is needed to verify these symptom clusters and to evaluate their effect on patient outcomes.
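A minimal sketch of the exploratory factor analysis used to identify the symptom clusters, here using the factor_analyzer package (an assumed tool choice; the abstract does not name any software) and simulated severity ratings rather than the study's data.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumes `pip install factor_analyzer`

rng = np.random.default_rng(0)
symptoms = ["pain", "depression", "sleep_disturbance", "fatigue",
            "swelling", "inflammation", "exudate"]

# Simulated severity ratings for 318 patients; two latent factors drive the
# correlations, loosely mimicking the clusters reported in the abstract.
f1 = rng.normal(size=318)  # affective/pain factor
f2 = rng.normal(size=318)  # wound/limb factor
loadings = {"pain": (0.8, 0.1), "depression": (0.7, 0.0),
            "sleep_disturbance": (0.6, 0.1), "fatigue": (0.5, 0.5),
            "swelling": (0.1, 0.8), "inflammation": (0.0, 0.7),
            "exudate": (0.1, 0.6)}
data = pd.DataFrame({
    s: 5 + 2 * (a * f1 + b * f2) + rng.normal(scale=1.0, size=318)
    for s, (a, b) in loadings.items()
})

# Two-factor EFA with oblique rotation, since symptom clusters may correlate.
fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
fa.fit(data)
print(pd.DataFrame(fa.loadings_, index=symptoms, columns=["factor1", "factor2"]))
```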
Abstract:
Climate change is leading to an increased frequency and severity of heat waves. Spells of several consecutive days of unusually high temperatures have led to increased mortality rates among the more vulnerable in the community. The problem is compounded by escalating energy costs and increasing peak electrical demand as people become more reliant on air conditioning. Domestic air conditioning is the primary determinant of peak power demand, which has been a major driver of higher electricity costs. This report presents the findings of multidisciplinary research that develops a national framework to evaluate the potential impacts of heat waves. It presents a technical, social and economic approach to adapting Australian residential buildings to ameliorate the impact of heat waves on the community and reduce the risk of adverse outcomes. Through the development of a methodology for estimating the impact of global warming on key weather parameters in 2030 and 2050, it is possible to re-evaluate the size and anticipated energy consumption of air conditioners in future years for the various climate zones in Australia. Over the coming decades it is likely that mainland Australia will require more cooling than heating. While in some parts the total electricity usage for heating and cooling may remain unchanged, there is an overall significant increase in peak electricity demand, which is likely to drive electricity prices further upward. Through monitoring groups of households in South Australia, New South Wales and Queensland, the impact of heat waves on both thermal comfort sensation and energy consumption for air conditioning was evaluated. The results show that households are likely to be able to tolerate slightly higher indoor temperatures during periods of high outside temperatures. The research identified that household electricity costs are likely to rise above current projections due to the impact of climate change; through a number of regulatory changes to both household design and air conditioners, this impact can be minimised. A number of retrofit and design measures are proposed that can readily reduce electricity usage for cooling at minimal cost to the household. The social research instruments used indicate that households are willing to change behaviour rather than spend money. Those on lower incomes and elderly individuals are least able to afford the use of air conditioning and should be a priority for interventions and assistance. Increasing community awareness of cost-effective strategies to manage comfort and health during heat waves is a high-priority recommended action. Overall, the research showed that a combined approach including behaviour change, dwelling modification and improved air conditioner selection can readily adapt Australian households to the impact of heat waves, reducing the risk of heat-related deaths and household energy costs.
Abstract:
Freshwater prawn (Macrobrachium rosenbergii) culture in the Western Hemisphere is primarily, if not entirely, derived from 36 individual prawns originally introduced to Hawaii from Malaysia in 1965 and 1966. Little information is available regarding genetic variation within and among cultured prawn stocks worldwide. The goal of the current study was to characterize genetic diversity in various prawn populations, with emphasis on those cultured in North America. Five microsatellite loci were screened to estimate genetic diversity in two wild (Myanmar and India-wild) and seven cultured (Hawaii-1, Hawaii-2, India-cultured, Israel, Kentucky, Mississippi and Texas) populations. Average allelic richness ranged from 3.96 (Israel) to 20.45 (Myanmar). Average expected heterozygosity ranged from 0.580 (Israel) to 0.935 (Myanmar). Many of the cultured populations exhibited reduced genetic diversity compared with the Myanmar and India-cultured populations. A significant deficiency of heterozygotes was detected in the India-cultured, Mississippi and Kentucky populations (overall Fis estimates of 0.053, 0.067 and 0.108, respectively), reflecting moderate levels of inbreeding. The overall estimate of the fixation index (Fst = 0.1569) revealed moderately high levels of differentiation among the populations. The outcomes of this study provide a baseline assessment of genetic diversity in some available strains that will be useful for the development of breeding programmes.
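The diversity statistics reported above follow standard formulas: expected heterozygosity at a locus is He = 1 − Σ p_i², and Wright's fixation index is Fst = (Ht − mean Hs) / Ht, where Ht is computed from pooled allele frequencies. Below is a minimal sketch with invented allele frequencies, not the study's genotype data, and assuming equal population sizes.

```python
import numpy as np

def expected_heterozygosity(freqs):
    """He = 1 - sum(p_i^2) for one locus in one population."""
    p = np.asarray(freqs, dtype=float)
    return 1.0 - np.sum(p ** 2)

# Hypothetical allele frequencies at one microsatellite locus for three
# populations (illustrative values only).
pops = {
    "Myanmar":  [0.25, 0.25, 0.20, 0.15, 0.15],
    "Israel":   [0.70, 0.20, 0.10, 0.00, 0.00],
    "Kentucky": [0.50, 0.30, 0.10, 0.10, 0.00],
}

# Within-population (subpopulation) heterozygosities.
hs = np.array([expected_heterozygosity(f) for f in pops.values()])
mean_hs = hs.mean()

# Total heterozygosity from pooled (averaged) allele frequencies,
# assuming equal population sizes.
pooled = np.mean([np.asarray(f, float) for f in pops.values()], axis=0)
ht = expected_heterozygosity(pooled)

fst = (ht - mean_hs) / ht  # Wright's fixation index
for name, h in zip(pops, hs):
    print(f"{name:9s} He = {h:.3f}")
print(f"Ht = {ht:.3f}, mean Hs = {mean_hs:.3f}, Fst = {fst:.3f}")
```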
Abstract:
BACKGROUND: Migraine is a chronic disabling neurovascular condition that may in part be caused by endothelial and cerebrovascular disruption induced by hyperhomocysteinaemia. We have previously provided evidence indicating that reduction of homocysteine by vitamin supplementation can reduce the occurrence of migraine in women. The current study examined the genotypic effects of methylenetetrahydrofolate reductase (MTHFR) and methionine synthase reductase (MTRR) gene variants on the occurrence of migraine in response to vitamin supplementation. METHODS: This was a 6-month randomized, double-blinded, placebo-controlled trial of daily vitamin B supplementation (B6, B9 and B12) on reduction of homocysteine and of the occurrence of migraine in 206 female patients diagnosed with migraine with aura. RESULTS: Vitamin supplementation significantly reduced homocysteine levels (P<0.001), severity of headache in migraine (P=0.017) and high migraine disability (P=0.022) in migraineurs compared with placebo (P>0.1). When the vitamin-treated group was stratified by genotype, C allele carriers of the MTHFR C677T variant showed a greater reduction in homocysteine levels (P<0.001), severity of pain in migraine (P=0.01) and percentage of high migraine disability (P=0.009) compared with those with the TT genotype. Similarly, A allele carriers of the MTRR A66G variant showed a greater reduction in homocysteine levels (P<0.001), severity of pain in migraine (P=0.002) and percentage of high migraine disability (P=0.006) compared with those with the GG genotype. Genotypic analysis of both genes combined indicated that the treatment effect modification of the MTRR variant was independent of the MTHFR variant. CONCLUSION: This study provides further evidence that vitamin supplementation is effective in reducing migraine, and that the MTHFR and MTRR gene variants act independently to influence treatment response in female migraineurs.
Abstract:
IL-17 is believed to be important for protection against extracellular pathogens, where clearance depends on neutrophil recruitment and local activation of epithelial cell defences. However, the role of IL-17 in protection against intracellular pathogens such as Chlamydia is less clear. We compared (i) the course of natural genital tract C. muridarum infection, (ii) the development of oviduct pathology and (iii) the development of vaccine-induced immunity against infection in wild-type (WT) BALB/c and IL-17 knockout (IL-17-/-) mice, to determine whether IL-17-mediated immunity is implicated in the development of infection-induced pathology and/or protection. Both the magnitude and the duration of genital infection were significantly reduced in IL-17-/- mice compared with BALB/c mice. Similarly, hydrosalpinx was greatly reduced in IL-17-/- mice, and this correlated with reduced neutrophil and macrophage infiltration of oviduct tissues. Matrix metalloproteinase (MMP) 9 and MMP2 were increased in WT oviducts compared with IL-17-/- animals at day 7 post-infection. In contrast, oviducts from IL-17-/- mice contained more MMP9 and MMP2 at day 21. Infection also elicited higher levels of Chlamydia-neutralizing antibody in the serum of IL-17-/- mice than of WT mice. Following intranasal immunization with C. muridarum Major Outer Membrane Protein (MOMP) and cholera toxin plus CpG adjuvants, significantly higher levels of chlamydial MOMP-specific IgG and IgA were found in the serum and vaginal washes of IL-17-/- mice. T cell proliferation and IFNγ production by splenocytes were greater in WT animals following in vitro re-stimulation; however, vaccination was only effective at reducing infection in WT, not IL-17-/-, mice. Intranasal or transcutaneous immunization protected WT but not IL-17-/- mice against hydrosalpinx development. Our data show that in the absence of IL-17, the severity of C. muridarum genital infection and the associated oviduct pathology are significantly attenuated; however, neither infection nor pathology can be reduced further by vaccination protocols that effectively protect WT mice.
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue for people with Parkinson's disease (PWP). The symptoms of Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory and, on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence to inform nutrition-related care for the management of malnutrition in PWP is lacking. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.
Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined, as poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male; median age 68.0 (42.0 – 78.0) years; median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0kg/m2, p<.05). There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that reported in other studies of PWP, and it is under-recognised. As poorer pre-operative nutritional status is associated with poorer outcomes in other types of surgery, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.
Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred and twenty-five PWP (74 male; median age 70.0 (35.0 – 92.0) years; median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0kg/m2; MAC 29.1 vs 25.5cm; waist circumference 95.5 vs 82.5cm; calf circumference 36.5 vs 32.5cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (2 vs 8, p<.05). The nutrition impact symptoms which differentiated the well-nourished from the malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in the community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use; none has been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5kg/m2), age-specific BMI cut-offs (≤18.5kg/m2 for those under 65 years, ≤23.5kg/m2 for those 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the full MNA were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with a sensitivity (Sn) of 94.7% and a specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.
Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living; a higher score in each of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and SGA-B, and regression analysis was used to identify which factors were predictive of malnutrition (SGA-B). Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by a lack of social support.
Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (18.3kJ/kg vs 3.8kJ/kg and 0.3g/kg vs 0.15g/kg); the increase in protein intake was significant only in the SC group, and the changes in intake did not differ between the groups. There were no significant changes in any motor or non-motor symptoms, or in "off" times or dyskinesias, in either group. Aspects of quality of life, especially emotional well-being, also improved over the 12 weeks.
This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease, as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on Parkinson's disease symptoms and a positive effect on quality of life.
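The screening-tool evaluation in Phase I reduces to computing sensitivity and specificity against the SGA classification as the gold standard. Below is a minimal sketch with hypothetical screening results, not the thesis data.

```python
import numpy as np

def sensitivity_specificity(screen_positive, gold_malnourished):
    """Sensitivity and specificity of a screening result against a
    gold-standard malnutrition classification (here, SGA-B)."""
    screen = np.asarray(screen_positive, dtype=bool)
    gold = np.asarray(gold_malnourished, dtype=bool)
    tp = np.sum(screen & gold)     # screened positive, truly malnourished
    fn = np.sum(~screen & gold)    # missed cases
    tn = np.sum(~screen & ~gold)   # correctly ruled out
    fp = np.sum(screen & ~gold)    # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: MNA-SF "at risk" flags for 20 patients versus the
# SGA classification (1 = malnourished/SGA-B). Values are illustrative.
mna_sf_at_risk = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0]
sga_b          = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]

sn, sp = sensitivity_specificity(mna_sf_at_risk, sga_b)
print(f"sensitivity = {sn:.1%}, specificity = {sp:.1%}")
```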