79 results for DIETARY RESTRICTION
Abstract:
Scope: We examined whether dietary supplementation with fish oil modulates inflammation, fibrosis and oxidative stress following obstructive renal injury. Methods and results: Three groups of Sprague-Dawley rats (n = 16 per group) were fed for 4 wk on normal rat chow (oleic acid), chow containing fish oil (33 g eicosapentaenoic acid and 26 g docosahexaenoic acid per kg diet), or chow containing safflower oil (60 g linoleic acid per kg diet). All diets contained 7% fat. After 4 wk, the rats were further subdivided into four smaller groups (n = 4 per group). Unilateral ureteral obstruction was induced in three groups (for 4, 7 and 14 days). The fourth group for each diet did not undergo surgery, and was sacrificed as controls at 14 days. When rats were sacrificed, plasma and portions of the kidneys were removed and frozen; other portions of kidney tissue were fixed and prepared for histology. Compared with normal chow and safflower oil, fish oil attenuated collagen deposition, macrophage infiltration, TGF-beta expression, apoptosis, and tissue levels of arachidonic acid, MIP-1 alpha, IL-1 beta, MCP-1 and leukotriene B(4). Compared with normal chow, fish oil increased the expression of HO-1 protein in kidney tissue. Conclusions: Fish oil intake reduced inflammation, fibrosis and oxidative stress following obstructive renal injury.
Abstract:
Objective: Food insecurity is the limited or uncertain availability or access to nutritionally-adequate, culturally-appropriate and safe foods. Food insecurity may result in inadequate dietary intakes, overweight or obesity and the development of chronic disease. Internationally, few studies have focused on the range of potential health outcomes related to food insecurity among adults residing in disadvantaged locations, and no such Australian studies exist. The objective of this study was to investigate associations between food insecurity, socio-demographic and health factors and dietary intakes among adults residing in disadvantaged urban areas. Design: Data were collected by mail survey (n = 505, 53% response rate), which ascertained information about food security status, demographic characteristics (such as age, gender, household income, education), fruit and vegetable intakes, take-away and meat consumption, general health, depression and chronic disease. Setting: Disadvantaged suburbs of Brisbane city, Australia, 2009. Subjects: Individuals aged ≥ 20 years. Results: Approximately one-in-four households (25%) were food insecure. Food insecurity was associated with lower household income, poorer general health, increased healthcare utilisation and depression. These associations remained after adjustment for age, gender and household income. Conclusion: Food insecurity is prevalent in urbanised disadvantaged areas in developed countries such as Australia. Low-income households are at high risk of experiencing food insecurity. Food insecurity may result in significant health burdens among the population, and this may be concentrated in socioeconomically-disadvantaged suburbs.
Abstract:
Introduction: Food insecurity is the limited/uncertain availability, access to or ability to acquire nutritionally-adequate, culturally-relevant and safe foods. Adults suffering from food insecurity are at risk of inadequate nutrient intakes or, paradoxically, overweight/obesity and the development of chronic disease. Despite the global financial crisis and rising costs of living, there are few studies investigating the potential dietary consequences of food insecurity among the Australian population. This study examined whether food insecurity was associated with weight status and poorer intakes of fruits, vegetables and takeaway foods among adults residing in socioeconomically-disadvantaged urbanised areas. Methods: In this cross-sectional study, a random sample of residents (n = 1000) was selected from the most disadvantaged suburbs of Brisbane city (response rate 51%). Data were collected by postal questionnaire, which ascertained socio-demographic information, household food security status, height, weight, fruit and vegetable intakes and takeaway consumption. Data were analysed using chi-square and logistic regression. Results: The overall prevalence of food insecurity was 31%. Food insecurity was not associated with weight status among men or women. Associations between food security status and potential dietary consequences differed for men and women. Among women, food security was not associated with fruit, vegetable or takeaway consumption. Contrastingly, among men food security was associated with vegetable intakes and consumption of takeaway food: men reporting food insecurity had lower intakes of vegetables and were more likely to consume takeaway foods compared with those who were food secure. Conclusion: Food security is an important public health issue in Australia and has potential dietary consequences that may adversely affect the health of food-insecure groups, most notably men residing in food-insecure households.
Abstract:
Purpose: Food insecurity is the limited/uncertain availability or ability to acquire nutritionally-adequate, culturally-relevant and safe foods. Adults suffering from food insecurity are at risk of inadequate nutrient intakes or, paradoxically, overweight/obesity and the development of chronic disease. Despite the global financial crisis and rising costs of living, few studies have investigated the potential dietary and health consequences of food insecurity among the Australian population. This study examined whether food insecurity was associated with health behaviours and dietary intakes among adults residing in socioeconomically-disadvantaged urbanised areas. Methods: In this cross-sectional study, a random sample of residents (n = 1000) was selected from the most disadvantaged suburbs of Brisbane city (response rate 51%). Data were collected by postal questionnaire, which ascertained socio-demographic information, household food security, height, weight, frequency of healthcare utilisation, presence of chronic disease and intakes of fruit, vegetables and take-away. Data were analysed using logistic regression. Results/Findings: The prevalence of food insecurity was 25%. Those reporting food insecurity were two-to-three times more likely to have seen a general practitioner or been hospitalised within the previous 6 months. Furthermore, food insecurity was associated with a three-to-six-fold increase in the likelihood of experiencing depression. Food insecurity was associated with higher intakes of some take-away foods; however, it was not significantly associated with weight status or intakes of fruits or vegetables among this disadvantaged sample. Conclusion: Food insecurity has potential adverse health consequences that may result in significant health burdens among the population, and this may be concentrated in socioeconomically-disadvantaged suburbs.
Abstract:
Photographic records of dietary intake (PhDRs) are an innovative method for dietary assessment and may alleviate the burden of recording intake compared to traditional methods. While the performance of PhDRs has been evaluated, no investigation into the application of this method had occurred within dietetic practice. This study examined the attitudes of dietitians towards the use of PhDRs in the provision of nutrition care. A web-based survey on the practices and beliefs with regards to technology use among Dietitians Association of Australia members was conducted in August 2011. Of the 87 dietitians who responded, 86% assessed the intakes of clients as part of individualised medical nutrition therapy, with the diet history the most common method used. The majority (91%) of dietitians surveyed believed that a PhDR would be of use in their current practice to estimate intake. Information contained in the PhDR would primarily be used to obtain a qualitative evaluation of diet (84%) or to supplement an existing assessment method (69%), as opposed to deriving an absolute measure of nutrient intake (31%). Most (87%) indicated that a PhDR would also be beneficial in both the delivery of the intervention and to evaluate and monitor goals and outcomes, while only 46% felt that a PhDR would assist in determining the nutrition diagnosis. This survey highlights the potential for the use of PhDRs within practice. Future endeavours lie in establishing resources which support the inclusion of PhDRs within the nutrition care process.
Abstract:
Background & aims: One aim of the Australasian Nutrition Care Day Survey was to determine the nutritional status and dietary intake of acute care hospital patients. Methods: Dietitians from 56 hospitals in Australia and New Zealand completed a 24-h survey of nutritional status and dietary intake of adult hospitalised patients. Nutritional risk was evaluated using the Malnutrition Screening Tool. Participants ‘at risk’ underwent nutritional assessment using Subjective Global Assessment. Based on the International Classification of Diseases (Australian modification), participants were also deemed malnourished if their body mass index was <18.5 kg/m2. Dietitians recorded participants’ dietary intake at each main meal and snacks as 0%, 25%, 50%, 75%, or 100% of that offered. Results: 3122 patients (mean age: 64.6 ± 18 years) participated in the study. Forty-one percent of the participants were “at risk” of malnutrition. Overall malnutrition prevalence was 32%. Fifty-five percent of malnourished participants and 35% of well-nourished participants consumed ≤50% of the food during the 24-h audit. “Not hungry” was the most common reason for not consuming everything offered during the audit. Conclusion: Malnutrition and sub-optimal food intake are prevalent in acute care patients across hospitals in Australia and New Zealand and warrant appropriate interventions.
Abstract:
A routine activity for a sports dietitian is to estimate energy and nutrient intake from an athlete's self-reported food intake. Decisions made by the dietitian when coding a food record are a source of variability in the data. The aim of the present study was to determine the variability in estimation of the daily energy and key nutrient intakes of elite athletes, when experienced coders analyzed the same food record using the same database and software package. Seven-day food records from a dietary survey of athletes in the 1996 Australian Olympic team were randomly selected to provide 13 sets of records, each set representing the self-reported food intake of an endurance, team, weight restricted, and sprint/power athlete. Each set was coded by 3-5 members of Sports Dietitians Australia, making a total of 52 athletes, 53 dietitians, and 1456 athlete-days of data. We estimated within- and between- athlete and dietitian variances for each dietary nutrient using mixed modeling, and we combined the variances to express variability as a coefficient of variation (typical variation as a percent of the mean). Variability in the mean of 7-day estimates of a nutrient was 2- to 3-fold less than that of a single day. The variability contributed by the coder was less than the true athlete variability for a 1-day record but was of similar magnitude for a 7-day record. The most variable nutrients (e.g., vitamin C, vitamin A, cholesterol) had approximately 3-fold more variability than least variable nutrients (e.g., energy, carbohydrate, magnesium). These athlete and coder variabilities need to be taken into account in dietary assessment of athletes for counseling and research.
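The variance-combination step described above can be sketched as follows. This is a minimal illustration, not the study's model: the mean intake and the three variance components (between-athlete, within-athlete day-to-day, and coder) are hypothetical values, and the mixed-model estimation itself is omitted.

```python
import math

# Hypothetical variance components (as SDs, kcal/day) for daily energy intake,
# as might be estimated from a mixed model. Values are illustrative only.
mean_intake = 3000.0        # kcal/day
sd_between_athlete = 450.0  # true between-athlete variation
sd_within_athlete = 600.0   # day-to-day (within-athlete) variation
sd_coder = 250.0            # variation contributed by the coder

def total_cv(mean, sd_within, sd_between, sd_coder, n_days=1):
    """Combine independent variance components and express the result as a
    coefficient of variation (% of the mean). Averaging over n_days shrinks
    only the day-to-day component."""
    var = sd_between**2 + sd_within**2 / n_days + sd_coder**2
    return 100 * math.sqrt(var) / mean

cv_1day = total_cv(mean_intake, sd_within_athlete, sd_between_athlete, sd_coder, n_days=1)
cv_7day = total_cv(mean_intake, sd_within_athlete, sd_between_athlete, sd_coder, n_days=7)
print(f"CV of a 1-day estimate: {cv_1day:.1f}%")
print(f"CV of a 7-day mean:    {cv_7day:.1f}%")
```

Because only the within-athlete term is divided by the number of days, a 7-day mean is substantially less variable than a single day, consistent with the 2- to 3-fold reduction the abstract reports.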
Abstract:
International research on prisoners demonstrates poor health outcomes, including chronic disease, with the overall burden to the community high. Prisoners are predominantly male and young. In Australia, the average incarceration length is 3 years, sufficient to impact long term health, including nutrition. Food in prisons is highly controlled, yet gaps exist in policy. In most Western countries prisons promote healthy foods, often incongruent with prisoner expectations or wants. Few studies have been conducted on dietary intakes during incarceration in relation to food policy. In this study detailed diet histories were collected on 120/945 men (mean age = 32 years) in a high-secure prison. Intakes were verified via individual purchase records, mealtime observations, and audits of food preparation, purchasing and holdings. Physical measurements (including fasting bloods) were taken and medical records reviewed. Results showed the standard food provided was consistent with current dietary guidelines, although limited in menu choice. Diet histories revealed self-funded foods contributing 1–63% of energy (mean = 30%), 0–83% of sugar (mean = 38%), 1–77% of saturated fats (mean = 31%) and 1–59% of sodium (mean = 23%). High levels of modification to the food provided were found, using minimal cooking amenities and inclusion of self-funded foods and/or foods retained from previous meals. Medical records and physical measurements confirmed markers of chronic disease. This study highlights the need to establish clear guidelines on all food available in prisons if chronic disease risk reduction is a goal. This study has also supported evidence-based food and nutrition policy including menu choice, food quality, quantity and safety as well as type of and access to self-funded foods.
Abstract:
Diet Induced Thermogenesis (DIT) is the energy expended consequent to meal consumption, and reflects the energy required for the processing and digestion of food consumed throughout each day. Although DIT is the total energy expended across a day in digestive processes to a number of meals, most studies measure thermogenesis in response to a single meal (Meal Induced Thermogenesis: MIT) as a representation of an individual’s thermogenic response to acute food ingestion. As a component of energy expenditure, DIT may have a contributing role in weight gain and weight loss. While the evidence is inconsistent, research has tended to reveal a suppressed MIT response in obese compared to lean individuals, which identifies individuals with an efficient storage of food energy, hence a greater tendency for weight gain. Appetite is another factor regulating body weight through its influence on energy intake. Preliminary research has shown a potential link between MIT and postprandial appetite as both are responses to food ingestion and have a similar response dependent upon the macronutrient content of food. There is a growing interest in understanding how both MIT and appetite are modified with changes in diet, activity levels and body size. However, the findings from MIT research have been highly inconsistent, potentially due to the vastly divergent protocols used for its measurement. Therefore, the main theme of this thesis was firstly, to address some of the methodological issues associated with measuring MIT. Additionally this thesis aimed to measure postprandial appetite simultaneously to MIT to test for any relationships between these meal-induced variables and to assess changes that occur in MIT and postprandial appetite during periods of energy restriction (ER) and following weight loss. Two separate studies were conducted to achieve these aims. 
Based on the increasing prevalence of obesity, it is important to develop accurate methodologies for measuring the components potentially contributing to its development and to understand the variability within these variables. Therefore, the aim of Study One was to establish a protocol for measuring the thermogenic response to a single test meal (MIT), as a representation of DIT across a day. This was done by determining the reproducibility of MIT with a continuous measurement protocol and determining the effect of measurement duration. The benefit of a fixed resting metabolic rate (RMR), which is a single measure of RMR used to calculate each subsequent measure of MIT, compared to separate baseline RMRs, which are separate measures of RMR measured immediately prior to each MIT test meal to calculate each measure of MIT, was also assessed to determine the method with greater reproducibility. Subsidiary aims were to measure postprandial appetite simultaneously to MIT, to determine its reproducibility between days and to assess potential relationships between these two variables. Ten healthy individuals (5 males, 5 females, age = 30.2 ± 7.6 years, BMI = 22.3 ± 1.9 kg/m2, %Fat Mass = 27.6 ± 5.9%) undertook three testing sessions within a 1-4 week time period. During the first visit, participants had their body composition measured using DXA for descriptive purposes, then had an initial 30-minute measure of RMR to familiarise them with the testing and to be used as a fixed baseline for calculating MIT. During the second and third testing sessions, MIT was measured. Measures of RMR and MIT were undertaken using a metabolic cart with a ventilated hood to measure energy expenditure via indirect calorimetry with participants in a semi-reclined position. 
The procedure on each MIT test day was: 1) a baseline RMR measured for 30 minutes, 2) a 15-minute break in the measure to consume a standard 576 kcal breakfast (54.3% CHO, 14.3% PRO, 31.4% FAT), comprising muesli, milk, toast, butter, jam and juice, and 3) six hours of measuring MIT with two ten-minute breaks at 3 and 4.5 hours for participants to visit the bathroom. On the MIT test days, pre and post breakfast then at 45-minute intervals, participants rated their subjective appetite, alertness and comfort on visual analogue scales (VAS). Prior to each test, participants were required to be fasted for 12 hours and to have undertaken no high intensity physical activity for the previous 48 hours. Despite no significant group changes in the MIT response between days, individual variability was high, with an average between-day CV of 33%, which was not significantly improved by the use of a fixed RMR (31%). The 95% limits of agreement, which ranged from 9.9% of energy intake (%EI) to -10.7%EI with the baseline RMRs and from 9.6%EI to -12.4%EI with the fixed RMR, indicated very large changes relative to the size of the average MIT response (MIT 1: 8.4%EI, 13.3%EI; MIT 2: 8.8%EI, 14.7%EI; baseline and fixed RMRs respectively). After just three hours, the between-day CV with the baseline RMR was 26%, which may indicate an enhanced MIT reproducibility with shorter measurement durations. On average, 76, 89, and 96% of the six-hour MIT response was completed within three, four and five hours, respectively. Strong correlations were found between MIT at each of these time points and the total six-hour MIT (range for correlations r = 0.990 to 0.998; P < 0.01). The proportion of the six-hour MIT completed at 3, 4 and 5 hours was reproducible (between-day CVs ≤ 8.5%). This indicated that shorter durations could be used on repeated occasions, with a similar percentage of the total response completed.
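The 95% limits-of-agreement figures above come from a standard Bland–Altman calculation on paired between-day measurements. A minimal sketch follows; the paired MIT values (in %EI) are hypothetical, not the study's data.

```python
import statistics

# Hypothetical paired MIT measurements (% of energy intake) from two test
# days in ten participants. Illustrative values only.
mit_day1 = [8.1, 9.5, 7.2, 10.4, 6.8, 8.9, 7.7, 9.1, 8.4, 7.9]
mit_day2 = [9.0, 8.2, 8.8, 9.6, 7.5, 7.8, 8.3, 10.2, 7.1, 8.6]

# Between-day differences, their mean (bias) and sample SD
diffs = [a - b for a, b in zip(mit_day1, mit_day2)]
bias = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)

# 95% limits of agreement: bias +/- 1.96 * SD of the differences
loa_upper = bias + 1.96 * sd_diff
loa_lower = bias - 1.96 * sd_diff
print(f"bias = {bias:.2f} %EI; 95% LoA = [{loa_lower:.2f}, {loa_upper:.2f}] %EI")
```

When the resulting limits are wide relative to the mean MIT response itself (roughly 8–15 %EI in the study), individual between-day agreement is poor even if the group mean is unchanged, which is the point the thesis draws.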
There was a lack of strong evidence of any relationship between the magnitude of the MIT response and subjective postprandial appetite. Given that a six-hour protocol places a considerable burden on participants, these results suggest that a post-meal measurement period of only three hours is sufficient to produce valid information on the metabolic response to a meal. However, while there was no mean change in MIT between test days, individual variability was large. Further research is required to better understand which factors best explain the between-day variability in this physiological measure. With such a high prevalence of obesity, dieting has become a necessity to reduce body weight. However, during periods of ER, metabolic and appetite adaptations can occur which may impede weight loss. Understanding how metabolic and appetite factors change during ER and weight loss is important for designing optimal weight loss protocols. The purpose of Study Two was to measure the changes in the MIT response and subjective postprandial appetite during either continuous (CONT) or intermittent (INT) ER and following post-diet energy balance (post-diet EB). Thirty-six obese male participants were randomly assigned to either the CONT (Age = 38.6 ± 7.0 years, weight = 109.8 ± 9.2 kg, % fat mass = 38.2 ± 5.2%) or INT diet groups (Age = 39.1 ± 9.1 years, weight = 107.1 ± 12.5 kg, % fat mass = 39.6 ± 6.8%). The study was divided into three phases: a four-week baseline (BL) phase where participants were provided with a diet to maintain body weight, an ER phase lasting either 16 (CONT) or 30 (INT) weeks, where participants were provided with a diet which supplied 67% of their energy balance requirements to induce weight loss, and an eight-week post-diet EB phase, providing a diet to maintain body weight post weight loss. The INT ER phase was delivered as eight, two-week blocks of ER interspersed with two-week blocks designed to achieve weight maintenance.
Energy requirements for each phase were predicted based on measured RMR, and adjusted throughout the study to account for changes in RMR. All participants completed MIT and appetite tests during BL and the ER phase. Nine CONT and 15 INT participants completed the post-diet EB MIT and 14 INT and 15 CONT participants completed the post-diet EB appetite tests. The MIT test day protocol was as follows: 1) a baseline RMR measured for 30 minutes, 2) a 15-minute break in the measure to consume a standard breakfast meal (874 kcal, 53.3% CHO, 14.5% PRO, 32.2% FAT), and 3) three hours of measuring MIT. MIT was calculated as the energy expenditure above the pre-meal RMR. Appetite test days were undertaken on a separate day using the same 576 kcal breakfast used in Study One. VAS were used to assess appetite pre and post breakfast, at one hour post breakfast then a further three times at 45-minute intervals. Appetite ratings were calculated for hunger and fullness as both the intra-meal change in appetite and the AUC. The three-hour MIT response at BL, ER and post-diet EB respectively were 5.4 ± 1.4%EI, 5.1 ± 1.3%EI and 5.0 ± 0.8%EI for the CONT group and 4.4 ± 1.0%EI, 4.7 ± 1.0%EI and 4.8 ± 0.8%EI for the INT group. Compared to BL, neither group had significant changes in their MIT response during ER or post-diet EB. There were no significant time by group interactions (p = 0.17) indicating a similar response to ER and post-diet EB in both groups. Contrary to what was hypothesised, there was a significant increase in postprandial AUC fullness in response to ER in both groups (p < 0.05). However, there were no significant changes in any of the other postprandial hunger or fullness variables. Despite no changes in MIT in both the CONT or INT group in response to ER or post-diet EB and only a minor increase in postprandial AUC fullness, the individual changes in MIT and postprandial appetite in response to ER were large. 
However, those with the greatest MIT changes did not have the greatest changes in postprandial appetite. This study shows that postprandial appetite and MIT are unlikely to be altered during ER and are unlikely to hinder weight loss. Additionally, there were no changes in MIT in response to weight loss, indicating that body weight did not influence the magnitude of the MIT response. There were large individual changes in both variables; however, further research is required to determine whether these changes were real compensatory changes to ER or simply between-day variation. Overall, the results of this thesis add to the current literature by showing the large variability of continuous MIT measurements, which makes it difficult to compare MIT between groups and in response to diet interventions. This thesis was able to provide evidence to suggest that shorter measures may provide equally valid information about the total MIT response and can therefore be utilised in future research in order to reduce the burden of long measurement durations. This thesis indicates that MIT and postprandial subjective appetite are most likely independent of each other. This thesis also shows that, on average, energy restriction was not associated with compensatory changes in MIT and postprandial appetite that would have impeded weight loss. However, the large inter-individual variability supports the need to examine individual responses in more detail.
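The thesis defines MIT as the energy expenditure above the pre-meal RMR, expressed as a percentage of the meal's energy content (%EI). That calculation can be sketched with a trapezoidal area above the baseline; the expenditure curve, RMR value and sampling times below are hypothetical, not the study's data.

```python
# Hypothetical post-meal energy-expenditure curve sampled by indirect
# calorimetry. Times in minutes after the meal; expenditure in kcal/min.
times = [0, 30, 60, 90, 120, 150, 180]
ee = [1.20, 1.45, 1.50, 1.42, 1.35, 1.30, 1.26]
rmr = 1.20        # pre-meal RMR (kcal/min), illustrative
meal_kcal = 874.0  # test-meal energy, as in the Study Two breakfast

def trapezoid_auc(xs, ys):
    """Area under a sampled curve via the trapezoidal rule."""
    return sum((xs[i + 1] - xs[i]) * (ys[i] + ys[i + 1]) / 2
               for i in range(len(xs) - 1))

# MIT = area between the post-meal curve and the pre-meal RMR baseline,
# then expressed as a percentage of the energy in the meal (%EI).
mit_kcal = trapezoid_auc(times, ee) - rmr * (times[-1] - times[0])
mit_pct_ei = 100 * mit_kcal / meal_kcal
print(f"MIT over 3 h: {mit_kcal:.1f} kcal = {mit_pct_ei:.1f}%EI")
```

Expressing MIT relative to the meal's energy content is what allows responses to the 576 kcal and 874 kcal breakfasts to be compared on one %EI scale.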
Abstract:
Three native freshwater crayfish Cherax species are farmed in Australia, namely: Redclaw (Cherax quadricarinatus), Marron (C. tenuimanus), and Yabby (C. destructor). Lack of appropriate data on the specific nutrient requirements of each species, however, has constrained the development of species-specific formulated diets; hence, the current use of over-formulated feeds or expensive marine shrimp feeds limits profitability. A number of studies have investigated nutritional requirements in redclaw, focusing on replacing expensive fish meal in formulated feeds with non-protein, less expensive substitutes including plant-based ingredients. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests their potential ability to utilize complex carbohydrates like cellulose as nutrient sources in their diet. To date, studies have been limited to only C. quadricarinatus and C. destructor, and no studies have compared the relative ability of each species to utilize soluble cellulose in their diets. Individual feeding trials of late-juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) that contained 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant and optimum levels for each species. Animals were fed at 3% of their body weight twice daily and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested, measured and midgut gland extracts assayed for alpha-amylase, total protease and cellulase activity levels. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different (p>0.05).
Cellulase expression levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activity in all three species were significantly higher in the animals fed the RD (Table 1). These results indicate that test animals of all species can utilize starch better than soluble cellulose in their diet, and that inclusion of 20% soluble cellulose does not appear to have any significant negative effect on growth rate, although survival was reduced in C. quadricarinatus but not in C. tenuimanus or C. destructor.
Abstract:
The current study evaluated the effect of soluble dietary cellulose on growth, survival and digestive enzyme activity in three endemic Australian freshwater crayfish species (redclaw: Cherax quadricarinatus, marron: C. tenuimanus, yabby: C. destructor). Separate individual feeding trials were conducted for late-stage juveniles from each species in an automated recirculating freshwater culture system. Animals were fed either a test diet (TD) that contained 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch, over a 12-week period. Redclaw fed the RD showed significantly higher (p<0.05) specific growth rates (SGR) compared with animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different. Expressed cellulase activity levels in redclaw were not significantly different between diets. Marron and yabby showed significantly higher cellulase activity when fed the RD (p<0.05). Amylase and protease activity in all three species were significantly higher in the animals fed the RD (p<0.05). These results indicate that test animals of all three species appear to utilize starch more efficiently than soluble dietary cellulose. The inclusion of 20% soluble cellulose in diets did not appear, however, to have a significant negative effect on growth rates.
Abstract:
BACKGROUND/OBJECTIVE: To investigate the extent of baseline psychosocial characterisation of subjects in published dietary randomised controlled trials (RCTs) for weight loss. SUBJECTS/METHODS: Adequately sized (n ≥ 10) RCTs comprising ≥1 diet-alone arm for weight loss were included in this systematic review. More specifically, trials included overweight (body mass index ≥25 kg/m2) adults, were of duration ≥8 weeks and had body weight as the primary outcome. Exclusion criteria included specific psychological intervention (for example, Cognitive Behaviour Therapy (CBT)), use of web-based tools, use of supplements, liquid diets, replacement meals and very-low calorie diets. Physical activity intervention was restricted to general exercise only (not supervised or prescribed, for example, to a VO2 maximum level). RESULTS: Of 176 weight-loss RCTs published during 2008–2010, 15 met selection criteria and were assessed for reported psychological characterisation of subjects. All studies reported standard characterisation of clinical and biochemical characteristics of subjects. Eleven studies reported no psychological attributes of subjects (three of these did exclude those taking psychoactive medication). Three studies collected data on particular aspects of psychology related to specific research objectives (figure scale rating, satiety and quality-of-life). Only one study provided a comprehensive background on psychological attributes of subjects. CONCLUSION: Better characterisation in behaviour-change interventions will reduce potential confounding and enhance generalisability of such studies.