629 results for nutritional factors
in Queensland University of Technology - ePrints Archive
Abstract:
The authors used data collected from 1995 to 1999, from an ongoing cancer case–control study in greater Johannesburg, to estimate the importance of tobacco and alcohol consumption and other suspected risk factors with respect to cancer of the oesophagus (267 men and 138 women), lung (105 men and 41 women), oral cavity (87 men and 37 women), and larynx (51 men). Cancers not associated with tobacco or alcohol consumption were used as controls (804 men and 1370 women). Tobacco smoking was found to be the major risk factor for all of these cancers, with odds ratios ranging from 2.6 (95% CI 1.5–4.5) for oesophageal cancer in female ex-smokers to 50.9 (95% CI 12.6–204.6) for lung cancer in women, and 23.9 (95% CI 9.5–60.3) for lung cancer and 23.6 (95% CI 4.6–121.2) for laryngeal cancer in men who smoked 15 or more grams of tobacco a day. This is the first time an association between smoking and oral and laryngeal cancers has been shown in sub-Saharan Africa. Long-term residence in the Transkei region in the southeast of the country continues to be a risk factor for oesophageal cancer, especially in women (odds ratio=14.7, 95% CI 4.7–46.0), possibly due to nutritional factors. There was a slight increase in lung cancer (odds ratio=2.9, 95% CI 1.1–7.5) in men working in ‘potentially noxious’ industries. ‘Frequent’ alcohol consumption, on its own, was associated with a marginally elevated risk for oesophageal cancer (odds ratio=1.7, 95% CI 1.0–2.9, for women and odds ratio=1.8, 95% CI 1.2–2.8, for men). The risks for oesophageal cancer in relation to alcohol consumption increased significantly in male and female smokers (odds ratio=4.7, 95% CI 2.8–7.9 in males and odds ratio=4.8, 95% CI 3.2–6.1 in females). The above results are broadly in line with international findings.
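The odds ratios above come from case–control comparisons. As a minimal sketch of how a crude (unadjusted) odds ratio and its 95% Wald confidence interval are derived from a 2x2 exposure-by-outcome table — the study's published estimates are adjusted, so this illustrates the arithmetic only, with hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts, for illustration only (not the study's data):
print(odds_ratio_ci(a=60, b=20, c=300, d=500))   # -> roughly (5.0, 2.96, 8.46)
```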
Abstract:
Objective: To document change in the prevalence of obesity, diabetes and other cardiovascular disease (CVD) risk factors, and trends in dietary macronutrient intake, over an eight-year period in a rural Aboriginal community in central Australia. Design: Sequential cross-sectional community surveys in 1987, 1991 and 1995. Subjects: All adults (15 years and over) in the community were invited to participate. In 1987, 1991 and 1995, 335 (87% of eligible adults), 331 (76%) and 304 (68%), respectively, were surveyed. Main outcome measures: Body mass index and waist:hip ratio; blood glucose level and glucose tolerance; fasting total and high density lipoprotein (HDL) cholesterol and triglyceride levels; and apparent dietary intake (estimated by the store turnover method). Intervention: A community-based nutrition awareness and healthy lifestyle program, 1988-1990. Results: At the eight-year follow-up, the odds ratios (95% CIs) for CVD risk factors relative to baseline were: obesity, 1.84 (1.28-2.66); diabetes, 1.83 (1.11-3.03); hypercholesterolaemia, 0.29 (0.20-0.42); and dyslipidaemia (high triglyceride plus low HDL cholesterol level), 4.54 (2.84-7.29). In younger women (15-24 years), there was a trebling in obesity prevalence and a four- to fivefold increase in diabetes prevalence. Store turnover data suggested a relative reduction in the consumption of refined carbohydrates and saturated fats. Conclusion: Interventions targeting nutritional factors alone are unlikely to greatly alter trends towards increasing prevalences of obesity and diabetes. In communities where healthy food choices are limited, the role of regular physical activity in improving metabolic fitness may also need to be emphasised.
Abstract:
BACKGROUND: Malnutrition, and poor intake during hospitalisation, are common in older medical patients. Better understanding of patient-specific factors associated with poor intake may inform nutritional interventions. AIMS: To measure the proportion of older medical patients with inadequate nutritional intake, and identify patient-related factors associated with this outcome. METHODS: Prospective cohort study enrolling consecutive consenting medical inpatients aged 65 years or older. Primary outcome was energy intake less than resting energy expenditure estimated using weight-based equations. Energy intake was calculated for a single day using direct observation of plate waste. Explanatory variables included age, gender, number of co-morbidities, number of medications, diagnosis, usual residence, nutritional status, functional and cognitive impairment, depressive symptoms, poor appetite, poor dentition, and dysphagia. RESULTS: Of 134 participants (mean age 80 years, 51% female), only 41% met estimated resting energy requirements. Mean energy intake was 1220 kcal/day (SD 440), or 18.1 kcal/kg/day. Factors associated with inadequate energy intake in multivariate analysis were poor appetite, higher BMI, diagnosis of infection or cancer, delirium and need for assistance with feeding. CONCLUSIONS: Inadequate nutritional intake is common, and patient factors contributing to poor intake need to be considered in nutritional interventions.
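The adequacy outcome above hinges on comparing observed intake with a weight-based estimate of resting energy expenditure. The abstract does not give the exact equations, so the sketch below assumes a simple illustrative constant of 20 kcal/kg/day; both that constant and the weight value are assumptions, not the study's method:

```python
def meets_resting_requirement(intake_kcal, weight_kg, ree_kcal_per_kg=20.0):
    """Compare observed daily energy intake with a weight-based resting
    energy estimate. The 20 kcal/kg/day constant is an assumed, illustrative
    value; the study's actual equations are not given in the abstract."""
    ree_kcal = ree_kcal_per_kg * weight_kg
    return intake_kcal >= ree_kcal, ree_kcal

# Using the cohort means reported above: 1220 kcal/day at 18.1 kcal/kg/day
# implies a mean weight of roughly 1220 / 18.1 ~= 67 kg.
adequate, ree = meets_resting_requirement(intake_kcal=1220, weight_kg=67.4)
print(adequate, ree)   # False, 1348.0 -> intake below the assumed resting estimate
```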
Abstract:
Background and significance: Older adults with chronic diseases are at increasing risk of hospital admission and readmission. Approximately 75% of adults have at least one chronic condition, and the odds of developing a chronic condition increase with age. Chronic diseases consume about 70% of the total Australian health expenditure, and about 59% of hospital events for chronic conditions are potentially preventable. These figures have brought to light the importance of the management of chronic disease among the growing older population. Many studies have endeavoured to develop effective chronic disease management programs by applying social cognitive theory. However, few studies have focused on chronic disease self-management in older adults at high risk of hospital readmission. Moreover, although the majority of studies have covered wide and valuable outcome measures, there is scant evidence examining fundamental health outcomes such as nutritional status, functional status and health-related quality of life. Aim: The aim of this research was to test social cognitive theory in relation to self-efficacy in managing chronic disease and three health outcomes, namely nutritional status, functional status, and health-related quality of life, in older adults at high risk of hospital readmission. Methods: A cross-sectional study design was employed for this research. Three studies were undertaken. Study One examined nutritional status and the validation of a nutritional screening tool; Study Two explored the relationships between participants' characteristics, self-efficacy beliefs, and health outcomes based on the study's hypothesized model; Study Three tested a theoretical model based on social cognitive theory, examining potential mechanisms for the mediation effects of social support and self-efficacy beliefs. One hundred and fifty-seven patients aged 65 years and older with a medical admission and at least one risk factor for readmission were recruited. Data were collected from medical records on demographics and medical history, and from self-report questionnaires. The nutrition data were collected by two registered nurses. For Study One, a contingency table and the kappa statistic were used to determine the validity of the Malnutrition Screening Tool. In Study Two, standard multiple regression, hierarchical multiple regression and logistic regression were undertaken to determine the significant predictors for the three health outcome measures. For Study Three, a structural equation modelling approach was taken to test the hypothesized self-efficacy model. Results: The findings of Study One suggested that malnutrition continues to be a concern in older adults, with a prevalence of 20.6% according to the Subjective Global Assessment. Additionally, the findings confirmed that the Malnutrition Screening Tool is a valid nutritional screening tool for hospitalized older adults at risk of readmission when compared to the Subjective Global Assessment, with high sensitivity (94%) and specificity (89%) and substantial agreement between the two methods (k = .74, p < .001; 95% CI .62-.86). Data analysis for Study Two found that depressive symptoms and perceived social support were the two strongest predictors of self-efficacy in managing chronic disease in a hierarchical multiple regression.
Results of multivariable regression models suggested that advancing age, depressive symptoms and less tangible support were three important predictors of malnutrition. In terms of functional status, a standard regression model found that social support was the strongest predictor of the Instrumental Activities of Daily Living, followed by self-efficacy in managing chronic disease. The results of standard multiple regression revealed that the number of hospital readmission risk factors adversely affected the physical component score, while depressive symptoms and self-efficacy beliefs were two significant predictors of the mental component score. In Study Three, the structural equation modelling found that self-efficacy partially mediated the effect of health characteristics and depression on health-related quality of life. Health characteristics had strong direct effects on functional status and body mass index. The results also indicated that social support partially mediated the relationship between health characteristics and functional status. With regard to the joint effects of social support and self-efficacy, social support fully mediated the effect of health characteristics on self-efficacy, and self-efficacy partially mediated the effect of social support on functional status and health-related quality of life. The models fitted the data well, with relatively high variance explained, implying that the hypothesized constructs were highly relevant; hence the application of social cognitive theory in this context was supported. Conclusion: This thesis highlights the applicability of social cognitive theory to chronic disease self-management in older adults at risk of hospital readmission. Further studies are recommended to validate and extend the development of social cognitive theory in chronic disease self-management in older adults to improve their nutritional and functional status, and health-related quality of life.
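Study One's validation rests on sensitivity, specificity and Cohen's kappa computed from a 2x2 contingency table of screening tool versus reference standard. A self-contained sketch with hypothetical counts (not the thesis data):

```python
def screening_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa for a screening tool
    against a reference standard, from a 2x2 contingency table."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n
    # Chance agreement expected from the marginal totals
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

# Hypothetical counts, for illustration only (not the thesis data):
print(screening_agreement(tp=30, fp=15, fn=2, tn=110))
```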
Abstract:
In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well documented. Other risk factors, including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety, have also been suggested to contribute to poor nutritional status. In Parkinson’s disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition; however, the risks for malnutrition in this population are not well understood. The current study’s aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data about age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke’s Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with a mean age of 70.2±9.3 (range 35-92) years and a mean time since diagnosis of 7.3±5.9 (range 0-31) years. Mean body mass index (BMI) was 26.0±5.5 kg/m2. Of these, 15% (n=19) were malnourished (SGA-B). Multivariable logistic regression analysis revealed that older age (OR=1.16, CI=1.02-1.31), more depressive symptoms (OR=1.26, CI=1.07-1.48), lower levels of anxiety (OR=0.90, CI=0.82-0.99), and higher LDED per kg body weight (OR=1.57, CI=1.14-2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes. Proactively addressing these risk factors can help prevent declines in nutritional status. In the current study, older people with PD who had more depressive symptoms and took larger levodopa doses per kilogram of body weight were at increased malnutrition risk.
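The per-unit odds ratios reported above are exponentiated coefficients from a multivariable logistic regression. A hedged sketch of that workflow using statsmodels on simulated stand-in data (the variable names mirror the predictors named above, but the values are random, for illustration only):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

np.random.seed(0)
n = 125

# Simulated stand-in data; column names mirror the predictors named above.
df = pd.DataFrame({
    "malnourished": np.random.binomial(1, 0.15, n),
    "age": np.random.normal(70, 9, n),
    "bdi": np.random.normal(8, 5, n),            # depressive symptoms
    "anxiety": np.random.normal(35, 9, n),
    "lded_per_kg": np.random.normal(8, 3, n),    # levodopa dose per kg body weight
})

X = sm.add_constant(df[["age", "bdi", "anxiety", "lded_per_kg"]])
fit = sm.Logit(df["malnourished"], X).fit(disp=0)

# Exponentiated coefficients are the per-unit odds ratios reported above,
# e.g. OR = 1.16 per additional year of age.
summary = np.exp(pd.concat(
    [fit.params.rename("OR"),
     fit.conf_int().rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
print(summary)
```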
Abstract:
Background Undernutrition, weight loss and dehydration are major clinical issues for people with dementia in residential care, with excessive weight loss contributing to increased risk of frailty, immobility, illness and premature mortality. This paper discusses a nutritional knowledge and attitudes survey conducted as part of a larger project focused on improving nutritional intake of people with dementia within a residential care facility in Brisbane, Australia. Aims The specific aims of the survey were to identify (i) knowledge of the nutritional needs of aged care facility residents; (ii) mealtime practices; and (iii) attitudes towards mealtime practices and organisation. Methods A survey based on those used in other healthcare settings was completed by 76 staff members. The survey included questions about nutritional knowledge, opinions of the food service, frequency of feeding assistance provided and feeding assessment practices. Results Nutritional knowledge scores ranged from 1 to 9 out of a possible 10, with a mean score of 4.67. While 76% of respondents correctly identified risk factors associated with malnutrition in nursing home residents, only 38% of participants correctly identified the need for increased protein and energy in residents with pressure ulcers, and just 15% exhibited correct knowledge of fluid requirements. Further, while nutritional assessment was considered an important part of practice by 83% of respondents, just 53% indicated that they actually carried out such assessments. Identified barriers to promoting optimal nutrition included insufficient time to observe residents (56%); being unaware of residents' feeding issues (46%); poor knowledge of nutritional assessments (44%); and unappetising appearance of food served (57%). Conclusion An important step towards improving health and quality of life for residents of aged care facilities would be to enhance staff nutritional awareness and assessment skills. This should be carried out through increased attention to both preservice curricula and on-the-job training. Implications for practice The residential facility staff surveyed demonstrated low levels of nutrition knowledge, which reflects findings from the international literature. This has implications for the provision of responsive care to residents of these facilities and should be explored further.
Abstract:
Background The largest proportion of cancer patients are aged 65 years and over. Increasing age is also associated with nutritional risk and multi-morbidity, factors which complicate the cancer treatment decision-making process in older patients. Objectives To determine whether malnutrition risk and Body Mass Index (BMI) are associated with key oncogeriatric variables as potential predictors of chemotherapy outcomes in geriatric oncology patients with solid tumours. Methods In this longitudinal study, geriatric oncology patients (aged ≥65 years) received a Comprehensive Geriatric Assessment (CGA) for baseline data collection prior to the commencement of chemotherapy treatment. Malnutrition risk was assessed using the Malnutrition Screening Tool (MST) and BMI was calculated using anthropometric data. Nutritional risk was compared with other variables collected as part of the standard CGA. Associations were determined by chi-square tests and correlations. Results Over half of the 175 geriatric oncology patients were at risk of malnutrition (53.1%) according to the MST. BMI ranged from 15.5 to 50.9 kg/m2, with 35.4% of the cohort overweight when compared to geriatric cutoffs. Malnutrition risk was more prevalent in those who were underweight (70%), although many overweight participants also presented as at risk (34%). Malnutrition risk was associated with a diagnosis of colorectal or lung cancer (p=0.001), dependence in activities of daily living (p=0.015) and impaired cognition (p=0.049). Malnutrition risk was positively associated with vulnerability to intensive cancer therapy (rho=0.16, p=0.038). Larger BMI was associated with a greater number of multi-morbidities (rho=0.27, p=0.001). Conclusions Malnutrition risk is prevalent among geriatric patients undergoing chemotherapy, is more common in colorectal and lung cancer diagnoses, is associated with impaired functionality and cognition, and negatively influences the ability to complete planned intensive chemotherapy.
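The associations reported above rely on chi-square tests for categorical variables and rank correlations (rho) for ordinal ones. A brief illustrative sketch with made-up numbers, not the study data:

```python
from scipy.stats import chi2_contingency, spearmanr

# Hypothetical 2x2 table: malnutrition risk (rows) by diagnosis group (columns).
table = [[40, 53],    # at risk: colorectal/lung vs. other tumours
         [18, 64]]    # not at risk
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")

# Spearman's rho for ordinal associations, e.g. malnutrition risk score
# against a vulnerability rating (values are made up):
rho, p_rho = spearmanr([0, 1, 1, 2, 2, 3, 3, 4], [1, 1, 2, 2, 2, 3, 4, 4])
print(f"rho={rho:.2f}, p={p_rho:.3f}")
```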
Abstract:
Human immunodeficiency virus (HIV), which leads to acquired immune deficiency syndrome (AIDS), reduces immune function, resulting in opportunistic infections and, ultimately, death. Use of antiretroviral therapy (ART) increases the chances of survival, but raises concerns about fat redistribution (lipodystrophy), which may encompass subcutaneous fat loss (lipoatrophy) and/or fat accumulation (lipohypertrophy) in the same individual. This problem has been linked to antiretroviral drugs (ARVs), mainly those in the protease inhibitor (PI) class, as well as to older age and being female. A further concern is that the problem coexists with the metabolic syndrome, while nutritional status/body composition and the extent of lipodystrophy and the metabolic syndrome remain unclear in Uganda, where the use of ARVs is increasing. In line with the literature, the overall aim of the study was to assess the physical characteristics of HIV-infected patients using a comprehensive anthropometric protocol and to predict body composition based on these measurements and other standardised techniques. The other aim was to establish the existence of lipodystrophy, the metabolic syndrome, and associated risk factors. Thus, three studies were conducted on 211 (88 ART-naïve) HIV-infected, 15- to 49-year-old women, using a cross-sectional approach, together with a qualitative study of secondary information on patient HIV and medication status. In addition, face-to-face interviews were used to gather information concerning morphological experiences and lifestyle. Participants were on average 34.1±7.65 years old, had lived 4.63±4.78 years with HIV infection and had spent 2.8±1.9 years receiving ARVs. Only 8.1% of participants were receiving PIs, and 26% of those receiving ART had ever changed drug regimen, 15.5% of whom changed drugs due to lipodystrophy. Study 1 hypothesised that the mean nutritional status and predicted percent body fat values of study participants were within acceptable ranges; that these differed between participants receiving ARVs and HIV-infected ART-naïve participants; and that percent body fat estimated by anthropometric measures (BMI and skinfold thickness) and the BIA technique did not differ from that predicted by the deuterium oxide dilution technique. Using the Body Mass Index (BMI), 7.1% of patients were underweight (<18.5 kg/m2) and 46.4% were overweight/obese (≥25.0 kg/m2). Based on waist circumference (WC), approximately 40% of the cohort was characterized as centrally obese. The deuterium dilution technique showed no between-group difference in total body water (TBW), fat mass (FM) or fat-free mass (FFM); however, it was the only approach to detect a between-group difference in percent body fat (p = .045), albeit with a very small effect size (0.021). Older age (β = 0.430, se = 0.089, p < .001), time spent receiving ARVs (β = 0.972, se = 0.089, p = .006), time with the infection (β = 0.551, se = 0.089, p < .001) and receiving ARVs (β = 2.940, se = 1.441, p = .043) were independently associated with percent body fat. Older age was the greatest single predictor of body fat. Furthermore, BMI gave better information than weight alone could, in that mean percentage body fat per unit BMI (N = 192) was significantly higher in patients receiving treatment (1.11±0.31) than in the exposed group (0.99±0.38, p = .025).
For the assessment of obesity, percent fat measures did not greatly alter the accuracy of BMI as a measure for classifying individuals into the broad categories of underweight, normal and overweight. Briefly, Study 1 revealed that there were more overweight/obese participants than in the general Ugandan population, that the problem was associated with ART status, and that the broad BMI classification categories were maintained when compared with the gold standard technique. Study 2 hypothesised that the presence of lipodystrophy in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants. Results showed that 112 (53.1%) patients had experienced at least one morphological alteration, including lipohypertrophy (7.6%), lipoatrophy (10.9%), and mixed alterations (34.6%). The majority of these subjects (90%) were receiving ARVs; in fact, all patients receiving PIs reported lipodystrophy. Period spent receiving ARVs (t(209) = 6.739, p < .001), being on ART (χ2 = 94.482, p < .001), receiving PIs (Fisher's exact χ2 = 113.591, p < .001), recent T4 (CD4) count (t(207) = 3.694, p < .001), time with HIV (t(125) = 1.915, p = .045), as well as older age (t(209) = 2.013, p = .045) were independently associated with lipodystrophy. Receiving ARVs was the greatest predictor of lipodystrophy (p < .001). In other analyses, aside from the subscapular skinfold (p = .004), there were no differences in the remaining skinfold sites or the circumferences between participants with and without lipodystrophy. Similarly, there was no difference in waist:hip ratio (WHR) (p = .186) or waist:height ratio (WHtR) (p = .257) between participants with and without lipodystrophy. Further examination showed that none of the 4.1% of patients receiving stavudine (d4T) experienced lipoatrophy. However, 17.9% of patients receiving EFV, a non-nucleoside reverse transcriptase inhibitor (NNRTI), had lipoatrophy. Study 2 findings showed that the presence of lipodystrophy in participants receiving ARVs was in fact far higher than in HIV-infected ART-naïve participants. A final hypothesis was that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants. The data showed that many patients (69.2%) lived with at least one feature of the metabolic syndrome based on the International Diabetes Federation (IDF, 2006) definition. However, there was no single anthropometric predictor of the components of the syndrome; the best anthropometric predictor varied with the component. The metabolic syndrome was diagnosed in 15.2% of the subjects, lower than commonly reported in this population, and was similar between the medicated and the exposed groups (χ2(1) = 0.018, p = .893). Moreover, the syndrome was associated with older age (p = .031) and percent body fat (p = .012). In addition, participants with the syndrome were heavier according to BMI (p < .001), larger at the waist (p < .001) and abdomen (p < .001), and were at central obesity risk even when hip circumference (p < .001) and height (p < .001) were accounted for. In spite of these associations, the period with disease (p = .13), CD4 count (p = .836), and receiving ART (p = .442) or PIs (p = .678) were not associated with the metabolic syndrome. While the prevalence of the syndrome was highest amongst the older, larger and fatter participants, WC was the best predictor of the metabolic syndrome (p = .001).
Another novel finding was that participants with the metabolic syndrome had greater arm muscle circumference (AMC) (p < .001) and arm muscle area (AMA) (p < .001), with the former the more influential. Accordingly, the easiest and cheapest indicator for assessing risk in this study sample was WC, should routine laboratory services not be feasible. In addition, the final study illustrated that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants.
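The deuterium oxide dilution results above rest on converting measured total body water into body composition. A common approach, assumed here as an illustration rather than the thesis's exact procedure, divides TBW by the conventional ~73.2% hydration fraction of fat-free mass:

```python
def body_composition_from_tbw(weight_kg, tbw_kg, hydration=0.732):
    """Derive fat-free mass, fat mass and percent body fat from total body
    water measured by deuterium dilution, assuming the conventional ~73.2%
    hydration fraction of fat-free mass."""
    ffm_kg = tbw_kg / hydration          # fat-free mass
    fm_kg = weight_kg - ffm_kg           # fat mass
    percent_fat = 100.0 * fm_kg / weight_kg
    return ffm_kg, fm_kg, percent_fat

# Illustrative values only (not participant data):
print(body_composition_from_tbw(weight_kg=68.0, tbw_kg=34.0))
# -> roughly (46.4 kg FFM, 21.6 kg FM, 31.7% body fat)
```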
Abstract:
Objective To examine whether psychosocial factors mediate (explain) the association between socioeconomic position and takeaway food consumption. Design A cross-sectional postal survey conducted in 2009. Setting Participants reported their usual consumption of 22 takeaway food items, and these were grouped into “healthy” and “less healthy” indices based on each item's nutritional properties. Principal Components Analysis was used to derive three psychosocial scales measuring beliefs about the relationship between diet and health (α = 0.73) and perceptions of the value (α = 0.79) and pleasure (α = 0.61) of takeaway food. A nutrition knowledge index was also used. Socioeconomic position was measured by highest attained education level. Subjects Randomly selected adults (n = 1,500) aged between 25 and 64 years in Brisbane, Australia (response rate = 63.7%, N = 903). Results Compared with those with a bachelor degree or higher, participants with a diploma-level education were more likely to consume “healthy” takeaway food (p = 0.023), whereas the least educated (high school only) were more likely to consume “less healthy” choices (p = 0.002). The least educated were less likely to believe in a relationship between diet and health (p<0.001), and more likely to have lower nutritional knowledge than their highly educated counterparts (p<0.001). Education differences in beliefs about the relationship between diet and health partly and significantly mediated the association between education and “healthy” takeaway food consumption. Diet- and health-related beliefs and nutritional knowledge partly and significantly mediated the education differences in “less healthy” takeaway food consumption. Conclusions Interventions that target beliefs about the relationship between diet and health, and nutritional knowledge, may reduce socioeconomic differences in takeaway food consumption, particularly for “less healthy” options.
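Mediation here means the education effect on takeaway consumption runs partly through beliefs and knowledge. As a simplified product-of-coefficients sketch (the paper's exact mediation procedure is not specified in the abstract, and the data below are simulated for illustration only):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 900

# Simulated data: education (X), diet-health beliefs (M, the mediator),
# and a "less healthy" takeaway index (Y).
educ = rng.integers(0, 3, n).astype(float)    # 0 = high school, 1 = diploma, 2 = degree+
beliefs = 0.5 * educ + rng.normal(size=n)                     # a-path
takeaway = -0.3 * beliefs - 0.1 * educ + rng.normal(size=n)   # b-path and direct effect

a = sm.OLS(beliefs, sm.add_constant(educ)).fit().params[1]
fit_y = sm.OLS(takeaway, sm.add_constant(np.column_stack([educ, beliefs]))).fit()
direct, b = fit_y.params[1], fit_y.params[2]

# The mediated (indirect) effect is the product of the a- and b-paths.
print(f"indirect effect a*b = {a * b:.3f}, direct effect = {direct:.3f}")
```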
Abstract:
The advent of liver transplantation for end-stage liver disease (ESLD) in children has necessitated a major rethink of preoperative preparation and management, from simple palliative care to active, directed intervention. This is particularly evident in the approach to the nutritional care of these patients, with the historical understanding of the nutritional perturbations in ESLD described largely from a single pediatric liver transplant center. ESLD in children is a hypermetabolic process adversely affecting nutritional status and both metabolic and non-metabolic body compartments. A complex, dynamic process affects metabolic activity within the metabolically active body cell mass, as well as lipid oxidation during fasting and at rest, with other factors operating in conjunction with daily activities. We have proposed that immediately ingested nutrients are a more important source of energy in patients with ESLD than in healthy children, in whom energy may be stored in various body compartments.