771 results for Dietary habits
Abstract:
Three native freshwater crayfish Cherax species are farmed in Australia: redclaw (Cherax quadricarinatus), marron (C. tenuimanus) and yabby (C. destructor). A lack of appropriate data on the specific nutrient requirements of each species, however, has constrained the development of species-specific formulated diets; the current use of over-formulated feeds or expensive marine shrimp feeds therefore limits profitability. A number of studies have investigated nutritional requirements in redclaw, focusing on replacing expensive fish meal in formulated feeds with less expensive, non-protein substitutes, including plant-based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates such as cellulose as nutrient sources in their diet. To date, studies have been limited to C. quadricarinatus and C. destructor, and no studies have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) containing 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant, optimal levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different. Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activity in all three species were significantly higher in animals fed the RD (Table 1). These results indicate that test animals of all species utilize starch better than soluble cellulose in their diet. Inclusion of 20% soluble cellulose did not appear to have any significant negative effect on growth rate, although survival was affected in C. quadricarinatus but not in C. tenuimanus or C. destructor.
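Note: the abstract does not define how SGR was computed; the formula conventionally used in crustacean growth trials (an assumption here, not taken from the source) is

$$\mathrm{SGR}\ (\%\ \mathrm{day}^{-1}) = \frac{\ln W_{\mathrm{final}} - \ln W_{\mathrm{initial}}}{t} \times 100,$$

where W is wet body weight and t is the trial duration in days.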
Abstract:
The current study evaluated the effect of soluble dietary cellulose on growth, survival and digestive enzyme activity in three endemic Australian freshwater crayfish species (redclaw: Cherax quadricarinatus; marron: C. tenuimanus; yabby: C. destructor). Separate individual feeding trials were conducted for late-stage juveniles of each species in an automated recirculating freshwater culture system. Animals were fed either a test diet (TD) containing 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch, over a 12-week period. Redclaw fed the RD showed significantly higher (p<0.05) specific growth rates (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different. Expressed cellulase activity levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD (p<0.05). Amylase and protease activity in all three species were significantly higher in animals fed the RD (p<0.05). These results indicate that test animals of all three species appear to utilize starch more efficiently than soluble cellulose in their diet. The inclusion of 20% soluble cellulose in diets did not appear, however, to have a significant negative effect on growth rates.
Abstract:
Background & aims: Excess adiposity (overweight) is one of numerous risk factors for cardiometabolic disease. Most risk reduction strategies for overweight rely on weight loss through dietary energy restriction. However, since the evidence base for long-term successful weight loss interventions is scant, it is important to identify strategies for risk reduction independent of weight loss. The aim of this study was to compare the effects of isoenergetic substitution of dietary saturated fat (SFA) with monounsaturated fat (MUFA), via macadamia nuts, on coronary risk compared with usual diet in overweight adults. Methods: A randomised controlled trial design, maintaining usual energy intake but manipulating dietary lipid profile, in a group of 64 (54 female, 10 male) overweight (BMI > 25), otherwise healthy subjects. For the intervention group, energy intakes of usual (baseline) diets were calculated from multiple 3-day diet diaries, and SFA was replaced with MUFA (target: 50%E from fat as MUFA) by altering dietary SFA sources and adding macadamia nuts to the diet. Both control and intervention groups received advice on national guidelines for physical activity and adhered to the same protocol for diet diary record keeping and trial consultations. Anthropometric and clinical measures were taken at baseline and at 10 weeks. Results: A significant increase in brachial artery flow-mediated dilation (p < 0.05) was seen in the monounsaturated diet group at week 10 compared to baseline. This corresponded to significant decreases in waist circumference, total cholesterol (p < 0.05), plasma leptin and ICAM-1 (p < 0.01). Conclusions: In patient subgroups where adherence to dietary energy restriction is poor, isoenergetic interventions may improve endothelial function and other coronary risk factors without changes in body weight. This trial was registered with the Australia New Zealand Clinical Trial Registry (ACTRN12607000106437).
Abstract:
BACKGROUND/OBJECTIVE: To investigate the extent of baseline psychosocial characterisation of subjects in published dietary randomised controlled trials (RCTs) for weight loss. SUBJECTS/METHODS: Adequately sized (n ≥ 10) RCTs comprising at least one diet-alone arm for weight loss were included in this systematic review. More specifically, trials included overweight (body mass index ≥ 25 kg/m2) adults, were of duration ≥ 8 weeks and had body weight as the primary outcome. Exclusion criteria included specific psychological intervention (for example, Cognitive Behaviour Therapy (CBT)), use of web-based tools, use of supplements, liquid diets, replacement meals and very-low-calorie diets. Physical activity intervention was restricted to general exercise only (not supervised or prescribed, for example, to a set VO2 maximum level). RESULTS: Of 176 weight-loss RCTs published during 2008–2010, 15 met the selection criteria and were assessed for reported psychological characterisation of subjects. All studies reported standard characterisation of clinical and biochemical characteristics of subjects. Eleven studies reported no psychological attributes of subjects (three of these did exclude those taking psychoactive medication). Three studies collected data on particular aspects of psychology related to specific research objectives (figure scale rating, satiety and quality of life). Only one study provided a comprehensive background on psychological attributes of subjects. CONCLUSION: Better characterisation in behaviour-change interventions will reduce potential confounding and enhance the generalisability of such studies.
Abstract:
Background and aims: The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods: An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results: 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion: In this pilot study, whilst dietary intake improved, it remained inadequate to meet participants' estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.
Abstract:
Background & aims: Depression has a complex association with cardiometabolic risk, both directly as an independent factor and indirectly through mediating effects on other risk factors such as BMI, diet, physical activity and smoking. Since changes to many cardiometabolic risk factors involve behaviour change, the rise in depression prevalence as a major global health issue may present further challenges to the long-term behaviour change needed to reduce such risk. This study investigated associations between depression scores and participation in a community-based weight management intervention trial. Methods: A group of 64 overweight (BMI > 27), otherwise healthy adults were recruited and randomised to follow either their usual diet, or an isocaloric diet in which saturated fat was replaced with monounsaturated fat (MUFA), to a target of 50% of total fat, by adding macadamia nuts to the diet. Subjects were assessed for depressive symptoms at baseline and at ten weeks using the Beck Depression Inventory (BDI-II). Both control and intervention groups received advice on National Guidelines for Physical Activity and adhered to the same protocol for food diary completion and trial consultations. Anthropometric and clinical measurements (cholesterol, inflammatory mediators) were also taken at baseline and 10 weeks. Results: During the recruitment phase, pre-existing diagnosed major depression was one of a range of reasons for initial exclusion of volunteers from the trial. Amongst enrolled participants, there was a significant correlation (R = −0.38, p < 0.05) between BDI-II scores at baseline and duration of participation in the trial. Subjects with a baseline BDI-II score ≥10 (moderate to severe depressive symptoms) were more likely to drop out of the trial before week 10 (p < 0.001). BDI-II scores in the intervention (MUFA) diet group decreased, but increased in the control group, over the 10-week period. Univariate analysis of variance confirmed these observations (adjusted R2 = 0.257, p = 0.01). Body weight remained static over the 10-week period in the intervention group, corresponding to a relative increase in the control group (adjusted R2 = 0.097, p = 0.064). Conclusions: Depression symptoms have the potential to affect enrolment in, and adherence to, diet-based risk reduction interventions, and may consequently influence the generalisability of such trials. Depression scores may therefore be useful for characterising, screening and allocating subjects to appropriate treatment pathways.
Abstract:
Background & Aims: Access to sufficient amounts of safe and culturally-acceptable foods is a fundamental human right. Food security exists when all people, at all times, have physical, social and economic access to sufficient, safe and nutritious food to meet their dietary needs and food preferences for an active and healthy life. Food insecurity therefore occurs when the availability of, or access to, sufficient amounts of nutritionally-adequate, culturally-appropriate and safe foods, or the ability to acquire such foods in socially-acceptable ways, is limited. Food insecurity may result in significant adverse effects for the individual, and these outcomes may vary between adults and children. Among adults, food insecurity may be associated with overweight or obesity, poorer self-rated general health, depression, increased health-care utilisation and dietary intakes less consistent with national recommendations. Among children, food insecurity may result in poorer self- or parent-reported general health, behavioural problems, lower levels of academic achievement and poor social outcomes. The majority of research investigating the potential correlates of food insecurity has been undertaken in the United States (US), where regular national screening for food insecurity is undertaken using a comprehensive multi-item measure. In Australia, screening for food insecurity takes place on a three-yearly basis via a crude, single-item measure included in the National Health Survey (NHS). This measure has been shown to underestimate the prevalence of food insecurity by 5%. From 1995 to 2004, the prevalence of food insecurity among the Australian population remained stable at 5%. Due to the perceived low prevalence of this issue, screening for food insecurity was not undertaken in the most recent NHS. Furthermore, there are few Australian studies investigating the potential determinants of food insecurity and none investigating potential outcomes among adults and children. This study aimed to examine these issues by a) investigating the prevalence of food insecurity among households residing in disadvantaged urban areas, comparing prevalence rates estimated by the more comprehensive 18-item and 6-item United States Department of Agriculture (USDA) Food Security Survey Module (FSSM) to those estimated by the current single-item measure used for surveillance in Australia, and b) investigating the potential determinants and outcomes of food insecurity. Methods: A comprehensive literature review was undertaken to investigate the potential determinants and consequences of food insecurity in developed countries. This was followed by a cross-sectional study in which 1000 households from the most disadvantaged 5% of Brisbane areas were sampled and data collected via a mail-based survey (final response rate = 53%, n = 505). Data were collected on food security status, sociodemographic characteristics (household income, education, age, gender, employment status, housing tenure and living arrangements), fruit and vegetable intakes, meat and take-away consumption, presence of depressive symptoms, presence of chronic disease and body mass index (BMI) among adults. Among children, data pertaining to BMI, parent-reported general health, days away from school and activities, and behavioural problems were collected.
Rasch analysis was used to investigate the psychometric properties of the 18-, 10- and 6-item adaptations of the USDA-FSSM, and McNemar's test was used to investigate the difference in the prevalence of food insecurity as measured by these three adaptations compared with the current single-item measure used in Australia. Chi-square tests and logistic regression were used to investigate differences in dietary and health outcomes among adults and health and behavioural outcomes among children. Results were adjusted for equivalised household income and, where necessary, for Indigenous status, education and family type. Results: Overall, 25% of households in these urbanised, disadvantaged areas reported experiencing food insecurity; this increased to 34% when only households with children were analysed. The current reliance on a single-item measure to screen for food insecurity may underestimate the true burden among the Australian population, as this measure was shown to significantly underestimate the prevalence of food insecurity by five percentage points. Internationally, major potential determinants of food insecurity included poverty and indicators of poverty, such as low income, unemployment and lower levels of education. Ethnicity, age, transportation, and cooking and financial skills were also found to be potential determinants of food insecurity. Among Australian adults in disadvantaged urban areas, food insecurity was associated with a three-fold increase in the likelihood of poorer self-rated general health and a two-to-five-fold increase in the risk of depression. Furthermore, adults from food-insecure households were two to three times more likely to have seen a general practitioner and/or been admitted to hospital within the previous six months, compared with their food-secure counterparts. Weight status and intakes of fruits, vegetables and meat were not associated with food insecurity. Among Australian households with children, those in the lowest income tertile were over 16 times more likely to experience food insecurity compared with those in the highest tertile. After adjustment for equivalised household income, children from food-insecure households were three times more likely to have missed days away from school or other activities. Furthermore, children from food-insecure households displayed a two-fold increase in atypical emotions and behavioural difficulties. Conclusions: Food insecurity is an important public health issue and may contribute to the burden on the health care system through its associations with depression and increased health care utilisation among adults, and behavioural and emotional problems among children. Current efforts to monitor food insecurity in Australia do not occur frequently and use a tool that may underestimate the prevalence of food insecurity. Efforts should be made to improve the regularity of screening for food insecurity via the use of a more accurate screening measure. Most current strategies that aim to alleviate food insecurity do not sufficiently address the issue of insufficient financial resources for acquiring food, a factor which is an important determinant of food insecurity. Programs to address this issue should be developed in collaboration with groups at higher risk of developing food insecurity and should incorporate strategies to address the issue of low income as a barrier to food acquisition.
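Note: the equivalence scale used to derive equivalised household income is not stated in this abstract. One widely used convention (an assumption here, not taken from the source) is the OECD-modified scale,

$$\text{equivalised income} = \frac{\text{household income}}{1 + 0.5\,(n_{\text{members aged 14+}} - 1) + 0.3\,n_{\text{children under 14}}},$$

so, for example, a household of two adults and two young children on $60,000 would have an equivalised income of $60,000 / 2.1 ≈ $28,600.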
Abstract:
Several studies published in the last few decades have demonstrated a low price elasticity for residential water use. In particular, it has been shown that there is a quantity of water demanded that remains constant regardless of prices and other economic factors. In this research, we characterise residential water demand based on a Stone-Geary utility function. This specification is not only consistent with consumer theory but also explicitly models a minimum level of consumption that does not depend on prices or income, described as the minimum threshold or nondiscretionary water use. Additionally, the Stone-Geary framework is used to model this subsistence level of water consumption as a function of the temporal evolution of consumer habits and the stock of physical capital. The main aim of this study is to analyse the impact of water-saving habits and water-efficient technologies on residential water demand, while additionally focusing attention on nondiscretionary uses. This is informed by an empirical application using data from a survey conducted among residents of Brisbane City Council, Australia. The results will be especially useful in the design of water tariffs and other water-saving policies.
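For reference, the textbook Stone-Geary specification on which this approach rests (the paper's exact empirical form is not reproduced here) is

$$U(x) = \sum_i \beta_i \ln(x_i - \gamma_i), \qquad \sum_i \beta_i = 1,\ x_i \ge \gamma_i,$$

which, under the budget constraint, yields the linear expenditure system demands

$$x_i = \gamma_i + \frac{\beta_i}{p_i}\Big(m - \sum_j p_j\gamma_j\Big),$$

where p denotes prices and m income. For water, the subsistence parameter γ_w is the nondiscretionary consumption that does not respond to price or income; in the framework described above it is modelled as a function of water-use habits and the household's stock of physical (water-using) capital.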
Abstract:
Objective: To evaluate responses to self-administered brief questions regarding consumption of vegetables and fruit by comparison with blood levels of serum carotenoids and red-cell folate. Design: A cross-sectional study in which participants reported their usual intake of fruit and vegetables in servings per day, and serum levels of five carotenoids (α-carotene, β-carotene, β-cryptoxanthin, lutein/zeaxanthin and lycopene) and red-cell folate were measured. Serum carotenoid levels were determined by high-performance liquid chromatography, and red-cell folate by an automated immunoassay system. Setting and subjects: Between October and December 2000, a sample of 1598 adults aged 25 years and over, drawn from six randomly selected urban centres in Queensland, Australia, was examined as part of a national study conducted to determine the prevalence of diabetes and associated cardiovascular risk factors. Results: Statistically significant (P<0.01) associations with vegetable and fruit intake (categorised into groups: ≤1 serving, 2–3 servings and ≥4 servings per day) were observed for α-carotene, β-carotene, β-cryptoxanthin, lutein/zeaxanthin and red-cell folate. The mean levels of these carotenoids and of red-cell folate increased with increasing frequency of reported servings of vegetables and fruit, both before and after adjusting for potential confounding factors. A significant association with lycopene was observed only for vegetable intake before adjusting for confounders. Conclusions: These data indicate that brief questions may be a simple and valuable tool for monitoring vegetable and fruit intake in this population.
Abstract:
Low circulating folate concentrations lead to elevations of plasma homocysteine. Even mild elevations of plasma homocysteine are associated with significantly increased risk of cardiovascular disease (CVD). Available evidence suggests that poor nutrition contributes to excessive premature CVD mortality in Australian Aboriginal people. The aim of the present study was to examine the effect of a nutrition intervention program conducted in an Aboriginal community on plasma homocysteine concentrations in a community-based cohort. From 1989, a health and nutrition project was developed, implemented and evaluated with the people of a remote Aboriginal community. Plasma homocysteine concentrations were measured in a community-based cohort of 14 men and 21 women screened at baseline, 6 months and 12 months. From baseline to 6 months there was a fall in mean plasma homocysteine of over 2 μmol/L (P = 0.006) but no further change thereafter (P = 0.433). These changes were associated with a significant increase in red cell folate concentration from baseline to 6 months (P < 0.001) and a further increase from 6 to 12 months (P < 0.001). In multiple regression analysis, change in homocysteine concentration from baseline to 6 months was predicted by change in red cell folate (P = 0.002) and baseline homocysteine (P < 0.001) concentrations, but not by age, gender or baseline red cell folate concentration. We conclude that modest improvements in dietary quality among populations with poor nutrition (and limited disposable income) can lead to reductions in CVD risk.
Abstract:
This paper reports a comparison of the practicality, acceptability and face validity of five dietary intake methods in two remote Australian Aboriginal communities: weighed dietary intake, 24‐hour recall, ‘store‐turnover’, diet history and food frequency methods. The methods used to measure individual dietary intake were poorly accepted by the communities. Quantitative data were obtained only from the first three methods. The 24‐hour recall method tended to produce higher nutrient intakes than the weighed intake method and certain foods appeared to be selectively recalled according to perceived nutritional desirability. The ‘store‐turnover’ method was most acceptable to the communities and had less potential for bias than the other methods. It was also relatively objective, non‐intrusive, rapid, easy and inexpensive. However, food distribution patterns within the communities could not be assessed by this method. Nevertheless, other similarly isolated communities may benefit by use of the ‘store‐turnover’ method.
Abstract:
Apparent per capita food and nutrient intake in six remote Australian Aboriginal communities using the ‘store-turnover’ method is described. The method is based on the analysis of community-store food invoices. The face validity of the method supports the notion that, under the unique circumstances of remote Aboriginal communities, the turnover of foodstuffs from the community store is a useful measure of apparent dietary intake for the community as a whole. In all Aboriginal communities studied, the apparent intake of energy, sugars and fat was excessive, while the apparent intake of dietary fibre and several nutrients, including folic acid, was low. White sugar, flour, bread and meat provided in excess of 50 per cent of the apparent total energy intake. Of the apparent high fat intake, fatty meats contributed nearly 40 per cent in northern coastal communities and over 60 per cent in central desert communities. Sixty per cent of the apparent high intake of sugars was derived from sugar per se in both regions. Compared with national Australian apparent consumption data, intakes of sugar, white flour and sweetened carbonated beverages were much higher in Aboriginal communities, and intakes of wholemeal bread, fruit and vegetables were much lower. Results of the store-turnover method have important implications for community-based nutrition intervention programs.
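As a rough illustration of the store-turnover calculation described above, the sketch below estimates apparent per capita daily nutrient intake from community-store invoices; the invoice items, nutrient values, population figure and variable names are illustrative assumptions, not data from the published studies.

# Hypothetical sketch of the 'store-turnover' calculation: apparent per capita
# daily nutrient intake estimated from community-store food invoices.
# All item names, nutrient values and figures below are illustrative assumptions.

invoices = [("white sugar", 850.0), ("white flour", 1200.0), ("bread", 400.0)]  # kg purchased over the period

composition_per_kg = {  # assumed nutrient content per kg of food
    "white sugar": {"energy_kJ": 16000, "fat_g": 0.0, "fibre_g": 0.0},
    "white flour": {"energy_kJ": 14500, "fat_g": 14.0, "fibre_g": 34.0},
    "bread":       {"energy_kJ": 10500, "fat_g": 28.0, "fibre_g": 25.0},
}

population = 350   # usual number of residents served by the community store
period_days = 90   # period covered by the invoices

# Sum each nutrient over all foods turned over by the store during the period.
totals = {"energy_kJ": 0.0, "fat_g": 0.0, "fibre_g": 0.0}
for food, kg in invoices:
    for nutrient, per_kg in composition_per_kg[food].items():
        totals[nutrient] += per_kg * kg

# Express the totals as apparent per capita intake per day for the whole community.
apparent_intake = {nutrient: total / (population * period_days) for nutrient, total in totals.items()}
print(apparent_intake)

As the preceding abstracts note, this yields community-level apparent intake only; it cannot describe how food is distributed among individuals within the community.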
Abstract:
This paper summarises the development and testing of the 'store-turnover' method, a non-invasive dietary survey methodology for quantitative measurement of food and nutrient intake in remote, centralised Aboriginal communities. It then describes the use of the method in planning, implementation and evaluation of a community-based nutrition intervention project in a small Aboriginal community in the Northern Territory. During this project marked improvements in both the dietary intake of the community and biological indicators of nutritional health (including vitamin status and the degree and prevalence of several risk factors for non-communicable disease) were measured in the community over a 12-month period following the development of intervention strategies with the community. Although these specific strategies are presented, emphasis is directed towards the process involved, particularly the evaluation procedures used to monitor all stages of the project with the community.
Abstract:
Cognitive impairment and physical disability are common in Parkinson’s disease (PD). As a result, diet can be difficult to measure. This study aimed to evaluate the use of a photographic dietary record (PhDR) in people with PD. During a 12-week nutrition intervention study, 19 individuals with PD kept 3-day PhDRs on three occasions using point-and-shoot digital cameras. Details on food items present in the PhDRs, and on those not photographed, were collected retrospectively during an interview. Following the first use of the PhDR method, the photographer completed a questionnaire (n=18). In addition, the quality of the PhDRs was evaluated at each time point. The person with PD was the sole photographer in 56% of cases, with the remainder photographed by the carer or by a combination of the person with PD and the carer. The camera was rated as easy to use by 89%, keeping a PhDR was considered acceptable by 94%, and none would rather have used a “pen and paper” method. Eighty-three percent felt confident to use the camera again to record intake. Of the photos captured (n=730), 89% were of adequate quality (items visible, in focus), while only 21% could be used alone (without interview information) to assess intake. Over the study, 22% of eating/drinking occasions were not photographed. PhDRs were considered an easy and acceptable method of measuring intake among individuals with PD and their carers. The majority of PhDRs were of adequate quality; however, in order to quantify intake, the interview was necessary to obtain sufficient detail and capture missing items.
Abstract:
Access to dietetic care is important in chronic disease management, and innovative technologies assist in this purpose. Photographic dietary records (PhDRs) using mobile phones or cameras are valid and convenient for patients. Innovations in providing dietary interventions via telephone and computer can also inform dietetic practice. Three studies are presented. A mobile phone method was validated by comparing energy intake (EI) with a weighed food record and a measure of energy expenditure (EE) obtained using the doubly labelled water technique in 10 adults with type 2 diabetes. The level of agreement between mean (±SD) energy intake from the mobile phone method (8.2±1.7 MJ) and the weighed record (8.5±1.6 MJ) was high (p=0.392); however, the EI/EE ratio for both methods indicated similar levels of under-reporting (0.69 and 0.72). All subjects preferred using the mobile phone over the weighed record. Nineteen individuals with Parkinson's disease kept 3-day PhDRs on three occasions, using point-and-shoot digital cameras, over a 12-week period. The camera was rated as easy to use by 89%, keeping a PhDR was considered acceptable by 94%, and none would rather have used a “pen and paper” method. Eighty-three percent felt confident to use the camera again to record intake. An interactive, automated telephone system designed to coach people with type 2 diabetes to adopt and maintain diabetes self-care behaviours, including nutrition, showed trends towards improvements in total fat, saturated fat and vegetable intake in the intervention group compared with control participants over 6 months. Innovative technologies are acceptable to patients with chronic conditions and can be incorporated into dietetic care.
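For context, the under-reporting ratios reported above imply a measured energy expenditure (a simple back-calculation from the stated means; EE itself is not reported in the abstract) of roughly

$$\mathrm{EE} \approx \frac{\mathrm{EI}}{\mathrm{EI/EE}} = \frac{8.2}{0.69} \approx 11.9\ \mathrm{MJ/day} \quad\text{and}\quad \frac{8.5}{0.72} \approx 11.8\ \mathrm{MJ/day},$$

i.e. both self-report methods captured only about 70% of the energy expenditure measured by doubly labelled water.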