793 results for food based dietary guidelines
Abstract:
Objective: Images on food and dietary supplement packaging might lead people to infer (appropriately or inappropriately) certain health benefits of those products. Research on this issue largely involves direct questions, which could (a) elicit inferences that would not be made unprompted, and (b) fail to capture inferences made implicitly. In the present research, we used a novel memory-based method to explore whether packaging imagery elicits health inferences without prompting, and the extent to which these inferences are made implicitly. Method: In 3 experiments, participants saw fictional product packages accompanied by written claims. Some packages contained an image that implied a health-related function (e.g., a brain), and some contained no image. Participants studied these packages and claims, and their memory for seen and unseen claims was subsequently tested. Results: When a health image was featured on a package, participants often subsequently recognized health claims that—despite being implied by the image—were not truly presented. In Experiment 2, these recognition errors persisted despite an explicit warning against treating the images as informative. In Experiment 3, these findings were replicated in a large consumer sample from 5 European countries, and with a cued-recall test. Conclusion: These findings confirm that images can act as health claims by leading people to infer health benefits without prompting. These inferences appear often to be implicit, and could therefore be highly pervasive. The data underscore the importance of regulating imagery on product packaging; memory-based methods represent innovative ways to measure how leading (or misleading) specific images can be. (PsycINFO Database Record (c) 2016 APA, all rights reserved)
Abstract:
In the new health paradigm, the connotation of health has extended beyond the measures of morbidity and mortality to include wellness and quality of life. Comprehensive assessments of health go beyond traditional biological indicators to include measures of physical and mental health status, social role-functioning, and general health perceptions. To meet these challenges, tools for assessment and outcome evaluation are being designed to collect information about functioning and well-being from the individual's point of view. The purpose of this study was to profile the physical and mental health status of a sample of county government employees against U.S. population norms. A second purpose was to determine whether significant relationships existed between respondent characteristics, personal health practices, lifestyle, and other health variables, and to show how the tools and methods used in this investigation can guide program development and facilitate monitoring of health promotion initiatives. The SF-12 Health Survey (Ware, Kosinski, & Keller, 1995), a validated measure of health status, was administered to a convenience sample of 450 employees attending one of nine health fairs at an urban worksite. The instrument has been utilized nationally, which enabled a comparative analysis of the findings of this study against national results. Results demonstrated that several respondent characteristics and personal health practices were associated with a greater percentage of physical and/or mental scale scores that were significantly "worse" or significantly "better" than those of the general population. Respondent characteristics that were significantly related to the SF-12 physical and/or mental health scale scores were gender, age, education, ethnicity, and income status.
Personal health practices that were significantly related to SF-12 physical and/or mental scale scores were frequency of vigorous exercise, presence of chronic illness, being at one's prescribed height and weight, eating breakfast, and smoking and drinking status. This study illustrates the methods used to analyze and interpret SF-12 Health Survey data using norm-based interpretation guidelines, which are useful for program development and for collecting information on health at the community level.
Abstract:
Aim. The purpose of this study was to develop and evaluate a computer-based, dietary, and physical activity self-management program for people recently diagnosed with type 2 diabetes.
Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire, Diabetes Obstacles Questionnaire (DOQ)), measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups. Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control, mean age 59 (SD) years). After completion there was a significant between-group difference in the “knowledge and beliefs scale” of the DOQ. Two-thirds of the intervention group rated the program as either good or very good, 92% would recommend the program to others, and 96% agreed that the information within the program was clear and easy to understand.
Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management.
Abstract:
Vitamin A (VA) deficiency (VAD) is a major nutritional public health problem among children under 5 years old in the developing world, including Kenya. A community-based cross-sectional survey among 1,630 children (aged 6-23 mos) was undertaken in Western Kenya. A questionnaire was administered to collect demographic, socio-economic and dietary intake information. Prevalence of low retinol-binding protein (RBP) concentrations was assessed using Dried Blood Spot (DBS) methodology. Analysis of RBP was carried out using rapid enzyme immunoassay (EIA), and C-reactive protein (CRP) was measured using enzyme-linked immunosorbent assay (ELISA), to estimate VA and sub-clinical inflammation status, respectively. Values were adjusted for the influence of inflammation using CRP (CRP >5 mg/L), and the population prevalence of VAD (RBP <0.825 μmol/L, biologically equivalent to 0.70 μmol/L retinol) was estimated. Anthropometric data gave three indices—stunting, wasting and underweight—all of which took age and sex into consideration. The mean (geometric ± SD) concentration of RBP was adequate (1.56±0.79 μmol/L), but the inflammation-adjusted mean (±SE) prevalence of VAD was high (20.1±1.1%) in this population. The level of CRP was within the normal range (1.06±4.95 mg/L), whilst 18.4±0.9% of the children had subclinical inflammation (CRP >5 mg/L). Intake of VA capsules (VAC) by a child was a predictor of VAD, with children who had not taken VA during the year prior to the survey having a 30% increased risk of VAD (OR (CI): 1.3 (1.1-1.7); p=0.025). Additionally, the age of the child was a predictor, with older children (18-23 mos) having a 30% increased risk of VAD (OR (CI): 1.3 (1.1-1.9); p=0.035); the caretaker's knowledge of VA and nutrition was also a predictor, with children whose caretakers had poor knowledge having a 40% increased risk of VAD (OR (CI): 1.4 (1.0-1.9); p=0.027). A child's district of residence was also a significant predictor of VAD.
Prevalence of VAD in this sample of infants was high. Predictors of VAD included the child's VAC intake in the year before the survey, older age, poor caretaker knowledge of VA and nutrition, and district of residence. There is a need to improve caretakers' knowledge of nutrition and VA, to undertake targeted VAC distribution, particularly among children older than 1 year, and to use sustainable food-based interventions in areas with severe VAD.
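The odds ratios with 95% confidence intervals quoted above follow the standard Wald method for a 2x2 exposure-outcome table. As a minimal sketch, assuming hypothetical counts (not the study's data), the computation looks like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical example: 30/100 cases among unexposed children vs 24/120 among exposed
or_, lo, hi = odds_ratio_ci(30, 70, 24, 96)  # OR ≈ 1.71
```

The CI is computed on the log-odds scale because log(OR) is approximately normally distributed, then exponentiated back; a CI excluding 1.0 corresponds to p < 0.05, matching how the predictors above are reported.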
Abstract:
The health of people living with HIV and AIDS (PLWHA) is nutritionally challenged in many nations of the world. The scourge has reduced socio-economic progress globally, and more so in sub-Saharan Africa (SSA), where its impact has been compounded by poverty and food insecurity. Good nutrition with proper drug use improves the quality of life of those infected, but it is not known how PLWHA exposed to chronic malnutrition and food shortages in developing nations adjust their nutrition when using Anti-Retroviral Drugs (ARVs). This study assessed nutritional status, dietary practices, dietary management of common illnesses that hinder daily food intake, and use of ARVs with the food recommendations provided by health care givers. A descriptive case study design was used to sample 120 HIV-infected patients using a systematic sampling procedure. These patients sought health care from the AMREF clinic in the Kibera urban slum. Data were collected by anthropometric measurements, bio-chemical analysis, a semi-structured questionnaire and secondary data. The Statistical Package for Social Sciences (SPSS) and the Nutri-Survey software packages were used to analyze the data. Dietary intakes of micro-nutrients were inadequate for >70% of the patients when compared to the Recommended Daily Requirements. When Body Mass Indices (BMI) were used, only 6.7% of the respondents were underweight (BMI<18.5 kg/m2) and 9.2% were overweight (BMI>25 kg/m2); however, serum albumin test results (mean 3.34±0.06 g/dl) showed that 60.8% of the respondents were protein deficient, and this was confirmed by low dietary protein intakes. BMI was not related to dietary nutrient intakes, serum albumin or CD4 cell counts (p>0.05). There was no significant difference in BMI across categories of CD4 cell count (p>0.05), suggesting that the level of immunity did not affect weight gain with ARVs, as observed in many studies from developed countries.
Malnutrition was therefore evident among the 60.8% of cases identified by serum albumin tests, and food intake was inadequate for 68% of the patients, who ate once a day due to lack of food. National food and nutrition policy should incorporate food-security-boosting guidelines for poor people infected with HIV and using ARVs.
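The BMI screening used above applies the conventional cut-offs cited in the abstract (<18.5 kg/m2 underweight, >25 kg/m2 overweight). A minimal sketch of that classification, with illustrative values rather than the study's data:

```python
def bmi_category(weight_kg, height_m):
    """Classify nutritional status using the BMI cut-offs cited in
    the study: <18.5 underweight, >25 overweight, otherwise normal."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return bmi, "underweight"
    if bmi > 25:
        return bmi, "overweight"
    return bmi, "normal"

# Illustrative values (not study data)
bmi, status = bmi_category(50, 1.70)  # underweight
```

As the abstract notes, such a BMI screen flagged only 6.7% of respondents, whereas serum albumin flagged 60.8% as protein deficient, which is why BMI alone can understate malnutrition in this population.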
Abstract:
Dietary fiber has been classified according to its solubility in an attempt to relate physiological effects to chemical types of fiber. Soluble fibers (β-glucans, gums, wheat dextrin, psyllium, pectin, inulin) were considered to have benefits on serum lipids, while insoluble fibers (cellulose, lignin, hemicelluloses) were linked with laxation benefits. More important characteristics of fiber in terms of physiological benefits are viscosity and fermentability. Viscous fibers (pectins, β-glucans, gums, psyllium) are those that have gel-forming properties in the intestinal tract, and fermentable fibers (wheat dextrin, pectins, β-glucans, gums, inulin) are those that can be metabolized by colonic bacteria. Objective: To summarize the beneficial effects of dietary fiber, as a nutraceutical, in maintaining a healthy gastrointestinal system. Methods: Our study is a systematic review. Electronic databases, including PubMed and Medline, supplemented by relevant websites, were searched. We included randomized and non-randomized clinical trials and epidemiological studies (cohort and case-control). We excluded case series, case reports, and in vitro and animal studies. Results: The WHO, the U.S. Food and Drug Administration (FDA), the Heart Foundation and the Romanian Dietary Guidelines recommend that adults aim to consume approximately 25-30 g of fiber daily. Dietary fiber is found in the indigestible parts of cereals, fruits and vegetables. In countries where people do not eat enough dietary fiber, fiber supplements may be needed.
Evidence has been found that dietary fiber from whole foods or supplements may (1) reduce the risk of cardiovascular disease by improving serum lipids and reducing serum total and low-density lipoprotein (LDL) cholesterol concentrations; (2) decrease the glycemic index of foods, which leads to an improved glycemic response and a positive impact on diabetes; (3) protect against the development of obesity by increasing concentrations of the satiety hormone leptin; (4) reduce the risk of developing colorectal cancer by normalizing bowel movements, improving the integrity of the epithelial layer of the intestines, increasing resistance against pathogenic colonization, and exerting favorable effects on the gut microbiome, which acts as a second genome; (5) have a positive impact on the endocrine system through gastrointestinal polypeptide hormonal regulation of digestion; (6) have a prebiotic effect through short-chain fatty acid (SCFA) production: butyrate is the preferred energy source for colonic epithelial cells, promotes normal cell differentiation and proliferation, helps regulate sodium and water absorption, and can enhance absorption of calcium and other minerals (although all prebiotics are fiber, not all fiber is prebiotic; the term generally refers to the ability of a fiber to increase the growth of bifidobacteria and lactobacilli, which are beneficial to human health); and (7) play a role in improving immune function via SCFA production, by increasing T helper cells, macrophages and neutrophils, and increasing the cytotoxic activity of natural killer cells. Conclusion: Fiber consumption is associated with high nutritional value and antioxidant status of the diet, enhancing its effects on human health. Fibers with prebiotic properties can also be recommended as part of fiber intake. Given the variability of fiber's effects in the body, it is important to consume fiber from a variety of sources.
Increasing fiber consumption for health promotion and disease prevention is a critical public health goal.
Abstract:
Background: Cardiovascular diseases (CVD) are a contemporary public health issue. Clinical research shows that certain interventions are effective in their treatment and prevention: educational nutrition interventions prioritizing minimally processed plant foods (MPPF). These interventions promote the adoption of dietary patterns characterized by ad libitum consumption of a wide variety of plant foods (e.g. legumes, whole grains, fruits, vegetables) and by a reduced consumption of animal foods (e.g. meats, eggs and dairy products) and ultra-processed foods (e.g. high in sugar, salt or fat, and low in fiber). Objectives: Using a concurrent embedded mixed-methods design, we evaluated the effects of an educational intervention program aimed at increasing MPPF consumption among adults at risk of CVD, and explored the determinants of the observed behavioral changes. Methods: Various physiological and anthropometric parameters were measured pre- and post-program (n = 72) and analyzed with a paired-samples t-test or a Wilcoxon signed-rank test. In addition, 10 semi-structured interviews were conducted post-program, supported by an interview guide based on the Food Choice Process Model. Full transcripts were coded using thematic analysis. Results: After 12 weeks, weight (-10.5 lb, 95% CI: 9.0-12.0), waist circumference (-7.4 cm, 95% CI: 6.5-8.4), diastolic blood pressure (-3.2 mmHg, 95% CI: 0.1-6.3), total cholesterol (-0.87 mmol/L, 95% CI: 0.57-1.17), LDL cholesterol (-0.84 mmol/L, 95% CI: 0.55-1.13) and glycated hemoglobin (-1.32%, 95% CI: -0.17-2.80) improved significantly.
Thematic analysis of the qualitative data reveals that the program, by stimulating values of health, ethics and integrity, fosters the transformation of food choices toward a pattern more focused on minimally processed plant foods during a key period of the life course (i.e. pre-retirement). Other determinants that may favor the adoption of such a diet were identified, including the substantial benefits observable in the short term, the absence of restriction on the quantity of minimally processed plant foods consumed, and the development of planning skills for acquiring and preparing food. Conclusion: An intervention prioritizing minimally processed plant foods improves the cardiometabolic profile of pre-retirement individuals because of its intrinsic characteristics, but also because it modifies the values involved in food choices.
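The pre/post outcomes above were compared with a paired-samples t-test (or its non-parametric Wilcoxon counterpart when normality fails). As a minimal pure-Python sketch of the parametric case, with illustrative data rather than the study's measurements:

```python
import math

def paired_t(before, after):
    """Mean change and paired-samples t statistic for pre/post
    measurements on the same participants (df = n - 1)."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)  # standard error of the mean difference
    return mean, mean / se

# Illustrative pre/post values (not study data)
mean_change, t_stat = paired_t([5, 6, 7, 8], [1, 3, 2, 4])
```

The test is applied to within-participant differences, which is what makes the pre-post design more powerful than comparing two independent groups; the p-value would come from the t distribution with n - 1 degrees of freedom.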
Abstract:
At Flinders University and the Queensland University of Technology, biofuels research interests cover a broad range of activities. Both institutions are seeking to overcome the twin evils of "peak oil" (Hubbert 1949 & 1956) and "global warming" (IPCC 2007, Stern 2006, Alison 2010) through the development of Generation 1, 2 and 3 (Gen-1, 2 & 3) biofuels (Clarke 2008, Clarke 2010). This includes the development of parallel Chemical Biorefinery, value-added, co-product chemical technologies, which can underpin the commercial viability of the biofuel industry. Whilst there is a focused effort to develop Gen-2 & 3 biofuels, thus avoiding the socially unacceptable use of food-based Gen-1 biofuels, it must also be recognized that, as yet, no country in the world has produced sustainable Gen-2 & 3 biofuel on a commercial basis. For example, in 2008 the United States used 38 billion litres (3.5% of total fuel use) of Gen-1 biofuel; in 2009/2010 this will be 47.5 billion litres (4.5% of fuel use), and by 2018 this has been estimated to rise to 96 billion litres (9% of total US fuel use). Brazil in 2008 produced 24.5 billion litres of ethanol, representing 37.3% of the world's ethanol use for fuel, and Europe in 2008 produced 11.7 billion litres of biofuel (primarily as biodiesel). Compare this to Australia's miserly biofuel production in 2008/2009 of 180 million litres of ethanol and 75 million litres of biodiesel, which is 0.4% of our fuel consumption! (Clarke, Graiver and Habibie 2010) To assist in the development of better biofuel technologies in the Asian developing regions, the Australian Government recently awarded the Materials & BioEnergy Group from Flinders University, in partnership with the Queensland University of Technology, an Australian Leadership Award (ALA) Biofuel Fellowship program to train scientists from Indonesia and India in all facets of advanced biofuel technology.
Abstract:
Vitamin D, along with calcium, may help decrease the risk of falls and fractures in older adults. Sunlight and other sources of ultraviolet radiation are not recommended because they increase the risk of skin cancers and sun-induced eye disorders. Rather, vitamin D and calcium needs should be met through foods and dietary supplements. As a preventive measure to reduce the risk of falls and fractures, it is recommended that older adults meet the 2005 Dietary Guidelines and consume 1000 IU of vitamin D, preferably as vitamin D3.
Abstract:
Purpose: Heart failure (HF) is the leading cause of hospitalization and a significant burden on the health care system in Australia. To reduce hospitalizations, multidisciplinary approaches and enhanced self-management programs have been strongly advocated for HF patients globally. HF patients who can effectively manage their symptoms and adhere to complex medication regimens experience fewer hospitalizations. Research indicates that information technologies (IT) have a significant role in supporting patients' self-management skills. The iPad offers user-friendly interfaces, yet to date an application for HF patient education had not been developed. This project aimed to develop an HF iPad teaching application that would be engaging, interactive, simple to follow, and usable by patients, carers and health care workers in both hospital and community settings. Methods: The design for the development and evaluation of the application consisted of two action research cycles. Each cycle included three phases of testing and feedback from three groups comprising the IT team, HF experts and patients. All patient education materials in the application were derived from national and international evidence-based practice guidelines and patient self-care recommendations. Results: The iPad application has animated anatomy and physiology that simply and clearly teaches the concepts of the normal heart and the heart in failure. Patient avatars throughout the application can be changed to reflect the sex and culture of the patient. A voice-over presents a script developed by the heart failure expert panel. Additional engagement features include points of interaction throughout the application with touch-screen responses, and the ability for patients to enter their weight, with the data secured and transferred to the clinic nurse and/or research data set.
The application has been used independently, for instance at home, with headphones in a clinic waiting room, or, most commonly, to aid a nurse-led HF consultation. Conclusion: This project utilized the iPad as an educational tool to standardize HF education from nurses who are not always heart failure specialists. Furthermore, a study is currently ongoing to evaluate the effectiveness of this tool on patient outcomes and to develop several specifically designed cultural adaptations [Hispanic (USA), Aboriginal (Australia), and Maori (New Zealand)].
Abstract:
Background & aim: This paper describes nutrition care practices in acute care hospitals across Australia and New Zealand. Methods: A survey on nutrition care practices in Australian and New Zealand hospitals was completed by directors of the dietetics departments of 56 hospitals that participated in the Australasian Nutrition Care Day Survey 2010. Results: Overall, 370 wards representing various specialities participated in the study. Nutrition risk screening was conducted in 64% (n=234) of the wards. Seventy-nine percent (n=185) of these wards reported using the Malnutrition Screening Tool, 16% (n=37) the Malnutrition Universal Screening Tool, and 5% (n=12) local tools. Nutrition risk rescreening was conducted in 14% (n=53) of the wards. More than half the wards referred patients at nutrition risk to dietitians and commenced a nutrition intervention protocol. Feeding assistance was provided in 89% of the wards. “Protected” meal times were implemented in 5% of the wards. Conclusion: A large number of acute care hospital wards in Australia and New Zealand do not comply with evidence-based practice guidelines for the nutritional management of malnourished patients. This study also provides recommendations for practice.
Abstract:
Background Acute respiratory illness, a leading cause of cough in children, accounts for a substantial proportion of childhood morbidity and mortality worldwide. In some children acute cough progresses to chronic cough (>4 weeks duration), impacting on morbidity and decreasing quality of life. Despite the importance of chronic cough as a cause of substantial childhood morbidity and associated economic, family and social costs, data on the prevalence, predictors, aetiology and natural history of the symptom are scarce. This study aims to comprehensively describe the epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children presenting to a tertiary paediatric emergency department. Methods/design A prospective cohort study of children aged <15 years attending the Royal Children's Hospital Emergency Department, Brisbane, for a respiratory illness that includes parent-reported cough (wet or dry) as a symptom. The primary objective is to determine the prevalence and predictors of chronic cough (>=4 weeks duration) after presentation with acute respiratory illness. Demographic, epidemiological, risk factor, microbiological and clinical data are completed at enrolment. Subjects complete daily cough diaries and weekly follow-up contacts for 28 (±3) days to ascertain cough persistence. Children who continue to cough for 28 days post enrolment are referred to a paediatric respiratory physician for review. The primary analysis will be the proportion of children with persistent cough at day 28 (±3). Multivariate analyses will be performed to evaluate variables independently associated with chronic cough at day 28 (±3). Discussion Our protocol will be the first to comprehensively describe the natural history, epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children.
The results will contribute to studies leading to the development of evidence-based clinical guidelines to improve the early detection and management of chronic cough in children during and after acute respiratory illness.
Abstract:
Purpose The use of intravascular devices is associated with a number of potential complications, and despite a number of evidence-based clinical guidelines in this area, discrepancies in nursing practice persist. This study aims to examine nursing practice in a cancer care setting to identify current practice and areas for improvement relative to the best available evidence. Methods A point prevalence survey was undertaken in a tertiary cancer care centre in Queensland, Australia. On a randomly selected day, four nurses assessed intravascular-device-related nursing practices and collected data using a standardized survey tool. Results 58 inpatients (100%) were assessed. Forty-eight (83%) had a device in situ, comprising 14 Peripheral Intravenous Catheters (29.2%), 14 Peripherally Inserted Central Catheters (29.2%), 14 Hickman catheters (29.2%) and six Port-a-Caths (12.4%). Suboptimal outcomes were observed, including local site complications, incorrect or inadequate documentation, lack of flushing orders, and unclean or non-intact dressings. Conclusions This study highlighted a number of intravascular-device-related nursing practice discrepancies compared with current hospital policy. Education and other implementation strategies can be applied to improve nursing practice. Following education strategies, it will be valuable to repeat this survey on a regular basis to provide feedback to nursing staff and implement strategies to improve practice. More research is required to provide evidence for clinical practice with regard to intravascular-device-related consumables, flushing techniques and protocols.
Abstract:
Background Prevention strategies are critical to reduce infection rates in total joint arthroplasty (TJA), but evidence-based consensus guidelines on prevention of surgical site infection (SSI) remain heterogeneous and do not necessarily represent this particular patient population. Questions/Purposes What infection prevention measures are recommended by consensus evidence-based guidelines for prevention of periprosthetic joint infection? How do these recommendations compare with expert consensus on infection prevention strategies from orthopedic surgeons at the largest international tertiary referral centers for TJA? Patients and Methods A review of consensus guidelines was undertaken as described by Merollini et al. Four clinical guidelines met the inclusion criteria: those of the Centers for Disease Control and Prevention, the British Orthopedic Association, the National Institute of Clinical Excellence, and the National Health and Medical Research Council (NHMRC). Twenty-eight recommendations from these guidelines were used to create an evidence-based survey of infection prevention strategies that was administered to 28 orthopedic surgeons who are members of the International Society of Orthopedic Centers. The results from the existing consensus guidelines and expert opinion were then compared. Results Strategies recommended in the guidelines, such as prophylactic antibiotics, preoperative skin preparation of patients and staff, and sterile surgical attire, were considered critically or significantly important by the surveyed surgeons. Additional strategies such as ultraclean air/laminar flow, antibiotic cement, wound irrigation, and preoperative blood glucose control were also considered highly important by surveyed surgeons, but were not recommended or not uniformly addressed in existing guidelines on SSI prevention. Conclusion Current evidence-based guidelines are incomplete, and the evidence should be updated specifically to address the needs of patients undergoing TJA.
Abstract:
Tribolium castaneum Herbst (Coleoptera: Tenebrionidae) is a common stored grain pest for which a wide range of suitable resources has been recorded. These beetles are facultatively fungivorous, and their resource range may extend to fungi associated with non-grain resources (e.g. cotton seed) and other decaying plant matter. Little is known about the role of fungi in resource location by these beetles in the field. We therefore conducted a series of experiments in laboratory arenas, glasshouse cages and the field to determine how beetles respond to grain resources relative to cotton seed (together with its lint stubble and associated fungal flora). Results from tests conducted in relatively small arenas and cages in the laboratory and glasshouse reveal that the responses of T. castaneum adults to food resources were twice as strong when walking as when flying (as measured by the proportion of released beetles that were trapped). A clear preference for linted cotton seeds was also evident in walking T. castaneum, especially in small-scale arenas in the laboratory, where at least 60% of beetles released preferred linted cotton seeds over wheat and sorghum. Similarly, in cages (1 m3) they responded five times more strongly to linted cotton seed than to conventional grain resources. However, this pattern was not consistent with results from field trapping over 20 m, where the beetles showed no particular preference for any of the resources tested. Our results suggest a focus on walking beetles in trapping studies for population estimation and, for developing effective food-based trapping lures, the potential use of active volatiles from the fungi associated with linted cotton seed. © 2012 Elsevier Ltd.