183 results for Nutritional Intake
Abstract:
Lower energy and protein intakes are well documented in patients on texture-modified diets. In acute hospital settings, the provision of appropriate texture-modified foods that meet industry standards is essential for patient safety and nutrition outcomes. The texture-modified menu at an acute private hospital was evaluated against the hospital's own nutritional standards (NS) and the Australian National Standards (Dietitians Association of Australia and Speech Pathology Australia, 2007). The NS specify portion sizes and nutritional requirements for each menu. Texture B and C menus were analysed qualitatively and quantitatively over 9 days of a 6-day cyclic menu for breakfast (n=4), lunch (n=34) and dinner (n=34). Results indicated a lack of portion control, as specified by the NS, across all meals, including breakfast (65–140%), soup (55–115%), meat (45–165%), vegetables (55–185%) and desserts (30–300%). Dilution factors and portion sizes influenced the protein and energy availability of the Texture B and C menus. While the Texture B menu provided more energy, neither menu met the NS. Limited dessert options on the Texture C menu restricted its ability to meet the protein NS. A lack of portion control and incorrectly modified menu items can compromise protein and energy intakes. Strategies to correct serving sizes and to provide alternative protein sources were recommended. Suggestions included cost-effectively increasing the variety of foods to support protein and energy intake, and procuring standardised equipment and visual aids to assist food preparation and presentation in accordance with texture-modified guidelines and the NS.
Abstract:
Increasing sodium intake from 70 to 200 mmol/day elevates blood pressure in normotensive volunteers by 6/4 mmHg. Older people, people with reduced renal function on a low-sodium diet, and people with a family history of hypertension are more likely to show this effect. The rise in blood pressure was associated with a fall in plasma volume, suggesting that plasma volume changes do not initiate hypertension. In normotensive individuals, the most common abnormality in membrane sodium transport induced by an extra sodium load was an increased permeability of the red cell to sodium. Some normotensive individuals also had an increased level of a plasma inhibitor of Na-K ATPase; these individuals also appeared to show a rise in blood pressure. Sodium intake and blood pressure are related, but the relationship differs between individuals and is probably governed by the genetically inherited capacity of the systems involved in membrane sodium transport.
Abstract:
Introduction: Smoking status in outpatients with chronic obstructive pulmonary disease (COPD) has been associated with a low body mass index (BMI) and reduced mid-arm muscle circumference (Cochrane & Afolabi, 2004). Individuals with COPD identified as malnourished have also been found to be twice as likely to die within 1 year compared with non-malnourished patients (Collins et al., 2010). Although malnutrition is both preventable and treatable, it is not clear what influence current smoking status, another modifiable risk factor, has on malnutrition risk. The current study aimed to establish the influence of smoking status on malnutrition risk and 1-year mortality in outpatients with COPD. Methods: A prospective nutritional screening survey was carried out between July 2008 and May 2009 at a large teaching hospital (Southampton General Hospital) and a smaller community hospital within Hampshire (Lymington New Forest Hospital). In total, 424 outpatients with a diagnosis of COPD were routinely screened using the ‘Malnutrition Universal Screening Tool’, ‘MUST’ (Elia, 2003); 222 males, 202 females; mean (SD) age 73 (9.9) years; mean (SD) BMI 25.9 (6.4) kg m−2. Smoking status on the date of screening was obtained for 401 of the outpatients. Severity of COPD was assessed using the GOLD criteria, and social deprivation was determined using the Index of Multiple Deprivation (Noble et al., 2008). Results: The overall prevalence of malnutrition (medium + high risk) was 22%, with 32% of current smokers (who accounted for 19% of the total COPD population) at risk. In comparison, 19% of nonsmokers and ex-smokers were at risk of malnutrition [odds ratio, 1.965; 95% confidence interval (CI), 1.133–3.394; P = 0.015]. Smoking status remained an independent risk factor for malnutrition even after adjustment for age, social deprivation and disease severity (odds ratio, 2.048; 95% CI, 1.085–3.866; P = 0.027) using binary logistic regression.
After adjusting for age, disease severity, social deprivation and smoking status, malnutrition remained a significant predictor of 1-year mortality [odds ratio (medium + high risk versus low risk), 2.161; 95% CI, 1.021–4.573; P = 0.044], whereas smoking status did not (odds ratio for smokers versus ex-smokers + nonsmokers, 1.968; 95% CI, 0.788–4.913; P = 0.147). Discussion: This study highlights the potential importance of combining nutritional support with smoking cessation in order to treat malnutrition. The close association between smoking status and malnutrition risk in COPD suggests that smoking is an important consideration in the nutritional management of malnourished COPD outpatients. Conclusions: Smoking status in COPD outpatients is a significant independent risk factor for malnutrition and a weaker (nonsignificant) predictor of 1-year mortality. Malnutrition significantly predicted 1-year mortality. References: Cochrane, W.J. & Afolabi, O.A. (2004) Investigation into the nutritional status, dietary intake and smoking habits of patients with chronic obstructive pulmonary disease. J. Hum. Nutr. Diet. 17, 3–11. Collins, P.F., Stratton, R.J., Kurukulaaratchy, R., Warwick, H., Cawood, A.L. & Elia, M. (2010) ‘MUST’ predicts 1-year survival in outpatients with chronic obstructive pulmonary disease. Clin. Nutr. 5, 17. Elia, M. (Ed) (2003) The ‘MUST’ Report. BAPEN. http://www.bapen.org.uk (accessed on March 30 2011). Noble, M., McLennan, D., Wilkinson, K., Whitworth, A. & Barnes, H. (2008) The English Indices of Deprivation 2007. http://www.communities.gov.uk (accessed on March 30 2011).
Abstract:
Background Zambia is a sub-Saharan country with one of the highest prevalence rates of HIV, currently estimated at 14%. Poor nutritional status due to both protein-energy and micronutrient malnutrition has worsened this situation. In an attempt to address this combined problem, the government has instigated a number of strategies, including the provision of antiretroviral (ARV) treatment coupled with the promotion of good nutrition. High-energy protein supplement (HEPS) is particularly promoted; however, the impact of this food supplement on the nutritional status of people living with HIV/AIDS (PLHA) beyond weight gain has not been assessed. Techniques for the assessment of nutritional status using objective measures of body composition are not commonly available in Zambia. The aim of this study is therefore to assess the impact of a food supplement on nutritional status using a comprehensive anthropometric protocol, including measures of skinfold thickness and circumferences, plus the criterion deuterium dilution technique to assess total body water (TBW) and derive fat-free mass (FFM) and fat mass (FM). Methods/Design This community-based, controlled, longitudinal study aims to recruit 200 HIV-infected females commencing ARV treatment at two clinics in Lusaka, Zambia. Data will be collected at four time points: baseline, 4-month, 8-month and 12-month follow-up visits. Outcome measures to be assessed include body height and weight, body mass index (BMI), body composition, CD4, viral load and micronutrient status. Discussion This protocol describes a study that will provide a longitudinal assessment of the impact of a food supplement on the nutritional status of HIV-infected females initiating ARVs, using a range of anthropometric and body composition assessment techniques.
Abstract:
Objective Although several validated nutritional screening tools have been developed to “triage” inpatients for malnutrition diagnosis and intervention, there continues to be debate in the literature as to which tool or tools clinicians should use in practice. This study compared the accuracy of seven validated screening tools in older medical inpatients against two validated nutritional assessment methods. Methods This was a prospective cohort study of medical inpatients at least 65 y old. Malnutrition screening was conducted using seven tools recommended in evidence-based guidelines. Nutritional status was assessed by an accredited practicing dietitian using the Subjective Global Assessment (SGA) and the Mini-Nutritional Assessment (MNA). Energy intake was observed on a single day during the first week of hospitalization. Results In this sample of 134 participants (80 ± 8 y old, 50% women), there was fair agreement between the SGA and MNA (κ = 0.53), with the MNA identifying more “at-risk” patients and the SGA better identifying existing malnutrition. Most tools were accurate in identifying patients with malnutrition as determined by the SGA, in particular the Malnutrition Screening Tool and the Nutritional Risk Screening 2002. The MNA Short Form was most accurate at identifying nutritional risk according to the MNA. No tool accurately predicted patients with inadequate energy intake in the hospital. Conclusion Because all tools generally performed well, clinicians should consider choosing a screening tool that best aligns with their chosen nutritional assessment and is easiest to implement in practice. This study confirmed the importance of rescreening and monitoring food intake to allow the early identification and prevention of nutritional decline in patients with a poor intake during hospitalization.
Abstract:
This report documents the results of a qualitative study of young people experiencing disadvantage who are responsible for feeding themselves. The purpose of the study was to explore the knowledge, skills and behaviours they use in their day-to-day eating. The results were considered alongside those of an earlier study of Australian food experts in order to develop a definition of food literacy, identify its components and propose a model for its relationship with diet quality and chronic disease. The study also examined how the young people's relationship with food developed and how it related to the social determinants of health. This report will help practitioners working in food literacy better target their practice and investment.
Abstract:
Scope: We examined whether dietary supplementation with fish oil modulates inflammation, fibrosis and oxidative stress following obstructive renal injury. Methods and results: Three groups of Sprague-Dawley rats (n = 16 per group) were fed for 4 wk on normal rat chow (oleic acid), chow containing fish oil (33 g eicosapentaenoic acid and 26 g docosahexaenoic acid per kg diet), or chow containing safflower oil (60 g linoleic acid per kg diet). All diets contained 7% fat. After 4 wk, the rats were further subdivided into four smaller groups (n = 4 per group). Unilateral ureteral obstruction was induced in three groups (for 4, 7 and 14 days). The fourth group for each diet did not undergo surgery, and was sacrificed as controls at 14 days. When rats were sacrificed, plasma and portions of the kidneys were removed and frozen; other portions of kidney tissue were fixed and prepared for histology. Compared with normal chow and safflower oil, fish oil attenuated collagen deposition, macrophage infiltration, TGF-β expression, apoptosis, and tissue levels of arachidonic acid, MIP-1α, IL-1β, MCP-1 and leukotriene B4. Compared with normal chow, fish oil increased the expression of HO-1 protein in kidney tissue. Conclusions: Fish oil intake reduced inflammation, fibrosis and oxidative stress following obstructive renal injury.
Abstract:
Precise protein quantification and recommendation is essential in clinical dietetics, particularly in the management of individuals with chronic kidney disease, malnutrition, burns, wounds or pressure ulcers, and those in active sports. The Expedited 10g Protein Counter (EP-10) was developed to simplify the quantification of dietary protein for the assessment and recommendation of protein intake.1 Instead of using separate protein exchanges for different food groups to quantify an individual's dietary protein intake, each EP-10 exchange accounts for 3 g of protein from non-protein-rich foods plus 7 g from protein-rich foods (Table 1). The EP-10 was recently validated and published in the Journal of Renal Nutrition.1 This study demonstrated that using the EP-10 for dietary protein intake quantification had clinically acceptable validity and reliability compared with the conventional 7g protein exchange, while requiring less time.2 In clinical practice, the use of efficient, accurate and practical methods to facilitate assessment and treatment plans is important. The EP-10 can be easily implemented in nutrition assessment and recommendation for a patient in the clinical setting. This patient education tool was adapted from materials printed in the Journal of Renal Nutrition.1 The tool may be used as presented or adapted to assist patients to achieve their recommended daily protein intake.
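The arithmetic behind the EP-10 can be sketched in a few lines: each exchange stands for 10 g of protein (7 g from a protein-rich food plus 3 g from accompanying non-protein-rich foods), so counting exchanges and converting a prescription both reduce to multiplying or dividing by 10. This is a minimal illustrative sketch only; the function names are hypothetical and not part of the published tool.

```python
# Illustrative sketch of EP-10 arithmetic (hypothetical helper names,
# not from the published tool). One EP-10 exchange = 10 g protein:
# 7 g from protein-rich food + 3 g from non-protein-rich foods.

PROTEIN_PER_EXCHANGE_G = 10

def estimate_protein_g(exchanges: int) -> int:
    """Estimate daily protein intake (g) from the number of EP-10 exchanges counted."""
    return exchanges * PROTEIN_PER_EXCHANGE_G

def exchanges_for_target(target_protein_g: float) -> int:
    """Convert a daily protein prescription (g) to the nearest whole number of EP-10 exchanges."""
    return round(target_protein_g / PROTEIN_PER_EXCHANGE_G)

# e.g. a 60 g/day prescription corresponds to 6 EP-10 exchanges
print(exchanges_for_target(60))  # 6
print(estimate_protein_g(6))     # 60
```

This is why the tool is quicker than the conventional 7g exchange: the clinician tracks a single exchange unit rather than summing separate exchanges across food groups.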
Abstract:
Exercise could indirectly affect body weight by exerting changes on various components of appetite control, including nutrient and taste preferences, meal size and frequency, and the drive to eat. This review summarizes the evidence on how exercise affects appetite and eating behavior, and in particular answers the question, “Does exercise induce an increase in food intake to compensate for the increase in energy expenditure?” Evidence will be presented to demonstrate that there is no automatic increase in food intake in response to acute exercise and that the response to repeated exercise is variable. The review will also identify areas of further study required to explain this variability. One limitation of studies that assess the efficacy of exercise as a method of weight control is that only mean data are presented; the individual variability tends to be overlooked. Recent evidence highlights the importance of characterizing this individual variability by demonstrating exercise-induced changes in appetite. Individuals who experience lower-than-predicted reductions in body weight can be characterized by hedonic (e.g., pleasure) and homeostatic (e.g., hunger) features.