Abstract:
Aim. To assess the relationships between dietary factors and colorectal cancer risk. Methods. We reviewed the systematic reviews published on the topic in the last ten years. Results. For fruits and vegetables, some studies1 were significant for heterogeneity and others2 were not. In the study by Aune et al.,3 only fruits were significant, although all the studies reported protective RRs between 0.90 and 0.94. For folate, only the case-control group of studies in the review by Sanjoaquin et al.4 was significant for heterogeneity (p = 0.01); all the studies showed a protective effect, with RRs between 0.75 and 0.95 for both dietary and total folate. For fiber, the study by Park et al.5 had a non-significant heterogeneity p of 0.14 and an RR of 0.84. For vitamin B6, the study by Larsson et al.6 had a significant heterogeneity p, with an RR of 0.90. For dietary fat, both Alexander7 and Liu8 concluded that there is insufficient evidence that dietary fat is an independent causative risk factor. For meat, only one of three studies, by Norat et al.,9 achieved a significant heterogeneity p; all the studies reported RRs between 1.14 and 1.35, implicating meat in increasing the risk of colorectal cancer. Conclusions. We recommend fruit and vegetable consumption as protective against colorectal cancer; meat consumption increases the risk of colorectal cancer. *Please refer to the dissertation for references/footnotes.
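The pooled RRs and "p heterogeneity" values quoted above are the usual outputs of an inverse-variance meta-analysis. The sketch below shows one common way such numbers are produced (fixed-effect pooling of log-RRs plus Cochran's Q); the cited reviews may have used different models, and the study estimates here are invented for illustration only.

```python
import math
from scipy.stats import chi2

# Invented study-level relative risks with 95% CIs, for illustration only.
studies = [
    (0.90, 0.82, 0.99),
    (0.93, 0.85, 1.02),
    (0.94, 0.80, 1.10),
]

weights, log_rrs = [], []
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log-RR recovered from the CI
    weights.append(1.0 / se ** 2)                    # inverse-variance weight
    log_rrs.append(math.log(rr))

pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_rr = math.exp(pooled_log_rr)

# Cochran's Q and the heterogeneity p-value quoted as "p heterogeneity" above
q = sum(w * (x - pooled_log_rr) ** 2 for w, x in zip(weights, log_rrs))
p_het = chi2.sf(q, len(studies) - 1)

print(f"pooled RR = {pooled_rr:.2f}, Q = {q:.2f}, p heterogeneity = {p_het:.2f}")
```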
Abstract:
The global aim of this thesis was to study the effect of dietary carbohydrates on growth performance, digestion and the intestinal barrier in rabbits weaned at 25 days of age. In addition, the best period for determining faecal digestibility after weaning at this age was studied. The first experiment focused on the effect of neutral detergent soluble fibre (NDSF) on gut barrier function, digestion, intestinal microbiota and growth performance in rabbits in the post-weaning period. Three isonutritive diets, varying only in the level of soluble fibre, were formulated: starting from a control diet (AH) containing 103 g NDSF/kg dry matter with alfalfa as the main fibre source, half of the alfalfa was replaced by a mixture of beet and apple pulp (75:25) in the B-AP diet and by a mixture of oat hulls and soybean protein concentrate (88:12) in the OH diet, resulting in 131 and 79 g NDSF/kg dry matter, respectively. Rabbits, weaned at 25 days of age, were fed the experimental diets up to 35 days of age, at which point they were slaughtered to determine the apparent ileal digestibility (AID) of dry matter (DM), crude protein (CP) and starch, mucosal morphology, sucrase activity, characterisation of lamina propria lymphocytes, and the intestinal microbiota. To assess mucosal morphology, 19 suckling 35-d-old rabbits were also used. For the mortality study, a further 118 rabbits per treatment were fed the experimental diets for the two weeks after weaning and thereafter received a commercial diet until 60 days of age. Rabbits were medicated through the drinking water during the whole experimental period (100 ppm of apramycin sulphate and 120 ppm of tylosin tartrate). The level of soluble fibre improved all the parameters used to characterise the condition of the intestinal barrier. Villous height of the jejunal mucosa increased with dietary soluble fibre (P=0.001). Rabbits fed the highest level of soluble fibre (B-AP diet) showed the highest villous height/crypt depth ratio (8.14; P=0.001), sucrase specific activity (8671 μmol glucose/g protein; P=0.019) and ileal starch digestibility (96.8%; P=0.002), and ileal starch flow decreased as dietary soluble fibre increased (1.2 vs. 0.5 g/d; P=0.001). The opposite effects were observed in rabbits fed lower levels of soluble fibre (AH and OH diets: villous height/crypt depth ratio of 4.70 and sucrase activity of 5848 μmol glucose/g protein, on average), and the lowest ileal starch digestibility was found in animals fed the OH diet (93.2%). Suckling rabbits of the same age showed a lower villous height/crypt depth ratio (6.70) than the B-AP group, but a higher ratio than the AH and OH groups. Lower levels of soluble fibre tended (P=0.074) to increase the cellular immune response (CD8+ lymphocytes). Diet affected IL-2 production (CD25+, P=0.029; CD5+CD25+, P=0.057), with no clear relationship to the level of soluble fibre. Intestinal microbiota biodiversity was not affected by diet (P≥0.38). Animals fed the B-AP and AH diets had a lower caecal frequency of detection of Campylobacter spp. (20.3 vs. 37.8%; P=0.074) and Clostridium perfringens (4.3 vs. 17.6%; P=0.047) than the OH group. Moreover, mortality decreased from 14.4% (OH diet) to 5.1% (B-AP diet) as the soluble fibre content of the diet increased.
Between 32 and 35 days of age, the faecal apparent digestibility of dry matter (DM), gross energy (GE), crude protein (CP), neutral detergent fibre (NDF), acid detergent fibre (ADF) and starch was determined (14 rabbits/diet). This group, plus another nine rabbits per diet, was used to determine the weight of the stomach and caecum and their contents, caecal fermentation traits and the similarity rate (SR) of the intestinal microbiota. In addition, growth performance (35 rabbits/diet) was studied over the whole fattening period, with animals consuming the experimental diets from weaning to 35 days of age and a commercial diet thereafter until 60 days of age. Increasing levels of NDSF improved faecal dry matter and energy digestibility (P<0.001). NDSF inclusion linearly increased the weight of the caecal content (P=0.001) and of the total gastrointestinal tract (P=0.008), and in the days before slaughter daily feed intake decreased linearly with increasing soluble fibre (P=0.040). Stomach pH also decreased linearly with increasing levels of NDSF (P≤0.041). No relationship was found between NDSF level and caecal pH or the concentration and molar proportions of VFA. Diet appeared to influence the similarity rate of the microbiota, with an effect even greater than that of the doe; these effects were greater in the ileum than in the caecum. Feed efficiency increased linearly with soluble fibre, by around 12% between extreme diets in the two weeks post-weaning (25-39 d) and by 3% over the whole fattening period. Average daily feed intake in the two weeks after weaning and over the whole fattening period tended (P≤0.079) to increase with higher levels of NDSF, although there was no effect on daily weight gain (P≥0.15). In conclusion, an increase of soluble fibre in the feed appears to be beneficial for animal health, since it improves mucosal integrity and reduces the frequency of detection of potential pathogens such as C. perfringens and Campylobacter spp. According to these results, the soluble fibre content should be taken into account when formulating rabbit feeds for the post-weaning period. The objective of the second experiment was to determine the effect of the source of starch on digestion, the intestinal microbiota and growth performance in rabbits weaned at 25 days of age. To this end, three isonutritive diets were formulated with different main sources of starch: raw wheat, boiled wheat, and a combination of boiled wheat and boiled rice. Two groups of 99 and 193 rabbits were weaned at 25 days of age. The first group was used to determine growth performance, following the same protocol as in the previous experiment. The second group was used to determine faecal digestibility from 32 to 35 d, apparent ileal digestibility (AID) at 35 d, jejunal mucosa morphology, caecal fermentation traits and the characterisation of the intestinal microbiota. Two additional groups of 384 (medicated) and 177 (non-medicated) rabbits were used to study the effect of antibiotic supplementation of the drinking water on mortality. Heat processing of wheat slightly improved the ileal digestibility of starch (P=0.020) but did not modify the flow of starch to the caecum; an increased frequency of detection of Campylobacter spp. and Ruminococcus spp. was observed in the caecum (P≤0.023), with no changes at the ileal level. Heat processing of wheat did not modify growth performance, mortality, ileal or faecal digestibility, or mucosal morphology.
Partially replacing boiled wheat with boiled rice in the diet impaired ileal starch digestibility (P=0.020) and increased the ileal flow of this nutrient to the caecum (P=0.007). However, it did not affect mortality, although changes in the ileal and caecal microbiota were detected: the frequency of detection of Campylobacter spp. (ileum and caecum), Helicobacter spp. (ileum) and Ruminococcus spp. (caecum) decreased, and that of Bacteroides spp. (caecum) increased (P≤0.046). The inclusion of boiled rice did not alter growth performance, mortality, mucosal morphology, or the ileal or faecal digestibility of nutrients other than starch. Medication of the rabbits reduced the frequency of detection of most of the bacteria studied (P≤0.048), especially Campylobacter spp., Clostridium perfringens and Propionibacterium spp., with a greater effect at the ileal than at the caecal level, and was associated with a strong reduction in mortality (P<0.001). In conclusion, the results of this experiment suggest that the source of starch affects the intestinal microbiota but does not appear to influence animal health. Regarding heat processing, the use of boiled wheat or boiled rice does not seem to improve the results obtained with raw wheat, although further experiments would be needed to confirm this point. The last experiment focused on a methodological aspect. Considering that weaned rabbits have a digestive pattern different from that of older animals because of their digestive immaturity, the objective was to determine the best procedure for faecal digestibility determination in young rabbits in the post-weaning period. Fifteen rabbits from 5 different litters were weaned at 25 days of age and fed a commercial feed. Feed intake and faeces excretion were recorded daily from 25 to 40 days of age to determine dry matter digestibility (DMd). Litter affected daily DM intake and excretion (P=0.013 and 0.014, respectively) and tended to affect DMd (P=0.061). Age affected all of these factors (P<0.001): DM excretion increased more markedly than intake during the first week, but the two evolved in parallel during the second. Daily feed intake correlated more strongly with same-day faeces excretion than with next-day excretion, so same-day values were used to determine daily DMd. DMd decreased linearly from weaning to 32 d of age (2.17±0.25 percentage units per day) and remained constant from 32 to 40 d (69.4±0.47%). The average standard deviation of DMd decreased by 54% when the length of the collection period was increased from 2 to 6 d. Based on these results, it is not advisable to start digestibility trials before 32 days of age, and the number of animals required to detect a significant difference among means will depend on the faeces collection period.
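The dry matter digestibility (DMd) values above follow the standard apparent-digestibility calculation: intake minus faecal output, expressed as a proportion of intake. A minimal sketch with made-up daily figures (not data from the thesis):

```python
def apparent_digestibility(intake_g: float, faecal_output_g: float) -> float:
    """Apparent digestibility: % of the ingested dry matter that is not excreted."""
    return 100.0 * (intake_g - faecal_output_g) / intake_g

# Hypothetical daily dry-matter figures for one rabbit over a 4-day collection period (grams).
intake = [55.0, 60.0, 64.0, 70.0]
excreted = [17.0, 18.5, 20.0, 21.5]

# Digestibility is computed on the totals for the collection period,
# pairing same-day intake with same-day excretion as described above.
dmd = apparent_digestibility(sum(intake), sum(excreted))
print(f"DMd over the collection period: {dmd:.1f}%")
```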
Abstract:
Objectives: To estimate the efficacy of dietary advice to lower blood total cholesterol concentration in free-living subjects and to investigate the efficacy of different dietary recommendations.
Abstract:
Diet and physical activity patterns have been implicated as major factors in the increasing prevalence of childhood and adolescent obesity. It is estimated that between 16 and 33 percent of children and adolescents in the United States are overweight (CDC, 2000). Moreover, the CDC estimates that less than 50% of adolescents are physically active on a regular basis (CDC, 2003). Interventions must be focused on modifying these behaviors. Facilitating the understanding of proper nutrition and the need for physical activity among adolescents is the first step in preventing overweight and obesity and delaying the development of chronic diseases later in life (Dwyer, 2000). The purpose of this study was to compare the outcomes of students receiving one of two forms of education (both emphasizing diet and physical activity), to determine whether a computer-based intervention (CBI) program using an interactive, animated CD-ROM would elicit a greater behavior change than a traditional didactic intervention (TDI) program. A convenience sample of 254 high school students aged 14-19 participated in the 6-month program. A pre-test/post-test design was used, with follow-up measures taken at three months post-intervention. No change was noted in total fat, saturated fat, fruit/vegetable, or fiber intake for any of the groups. There was also no change in perceived self-efficacy or perceived social support. Results did, however, indicate an increase in nutrition knowledge for both intervention groups (p<0.001). In addition, the CBI group demonstrated more positive and sustained behavior changes throughout the course of the study. These changes included a decrease in BMI (p(pre/post)<0.001, p(post/follow-up)<0.001), number of meals skipped (p(pre/post)<0.001), and soda consumption (p(pre/post)=0.003, p(post/follow-up)=0.03), and an increase in nutrition knowledge (p(pre/post)<0.001, p(pre/follow-up)<0.001), physical activity (p(pre/post)<0.05, p(pre/follow-up)<0.01), frequency of label reading (p(pre/follow-up)<0.01) and dairy consumption (p(pre/post)=0.03). The TDI group did show positive gains in some areas post-intervention; however, a return to baseline behavior was shown at follow-up. Findings of this study suggest that, compared to traditional didactic teaching, computer-based nutrition and health education has greater potential to elicit change in knowledge and behavior as well as to promote maintenance of the behavior change over time.
Abstract:
Dyslipidemia is a major public health problem, and therefore, it is important to develop dietary strategies to diminish the prevalence of this disorder. It was recently reported that diet may play an important role in triggering insulin resistance by interacting with genetic variants at the CAPN10 gene locus in patients with metabolic syndrome. Nonetheless, it remains unknown whether genetic variants of genes involved in the development of type 2 diabetes are associated with variations in high-density lipoprotein cholesterol (HDL-C). The study used a single-center, prospective, cohort design. Here, we assessed the effect of four variants of the CAPN10 gene on HDL-C levels in response to a soy protein and soluble fiber dietary portfolio in subjects with dyslipidemia. In 31 Mexican dyslipidemic individuals, we analyzed four CAPN10 gene variants (rs5030952, rs2975762, rs3792267, and rs2975760) associated with type 2 diabetes. Subjects with the GG genotype of the rs2975762 variant of the CAPN10 gene were better responders to dietary intervention, showing increased HDL-C concentrations from the first month of treatment. HDL-C concentrations in participants with the wild type genotype increased by 17.0%, whereas the HDL-C concentration in subjects with the variant genotypes increased by only 3.22% (p = 0.03); the low-density lipoprotein cholesterol levels of GG carriers tended to decrease (-12.6%). These results indicate that Mexican dyslipidemic carriers of the rs2975762-GG genotype are better responders to this dietary intervention.
Abstract:
Background Takeaway consumption has been increasing and may contribute to socioeconomic inequalities in overweight/obesity and chronic disease. This study examined socioeconomic differences in takeaway consumption patterns, and their contribution to inequalities in dietary intake. Method Cross-sectional dietary intake data were obtained from adults aged 25-64 years in the Australian National Nutrition Survey (n = 7319, 61% response rate). Twenty-four-hour dietary recalls ascertained intakes of takeaway food, nutrients, and fruit and vegetables. Education was used as the socioeconomic indicator. Data were analysed using logistic regression and general linear models. Results Thirty-two percent (n = 2327) consumed takeaway foods in the 24-hour period. Lower-educated participants were less likely than their higher-educated counterparts to have consumed takeaway foods overall (OR 0.64; 95% CI 0.52, 0.80). Among those consuming takeaway foods, the lowest-educated group was more likely to have consumed “less healthy” takeaway choices (OR 2.55; 95% CI 1.73, 3.77) and less likely to have consumed “healthy” choices (OR 0.52; 95% CI 0.36, 0.75). Takeaway foods made a greater contribution to energy, total fat, saturated fat and fibre intakes among lower-educated than higher-educated groups. A lower likelihood of fruit and vegetable intake was observed among “less healthy” takeaway consumers, whereas a greater likelihood was found among “healthy” takeaway consumers. Conclusions The total amount and the types of takeaway foods consumed may contribute to socioeconomic inequalities in intakes of energy and of total and saturated fat. However, takeaway consumption is unlikely to be a factor contributing to the lower fruit and vegetable intakes of socioeconomically disadvantaged groups.
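The odds ratios and 95% confidence intervals reported above are typically the exponentiated coefficients of a logistic regression model. A minimal sketch of that back-transformation, using an invented coefficient and standard error rather than values from this survey:

```python
import math

def odds_ratio_with_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with an approximate 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for 'lower education' versus the reference group.
or_, lo, hi = odds_ratio_with_ci(beta=-0.45, se=0.11)
print(f"OR {or_:.2f}; 95% CI {lo:.2f}, {hi:.2f}")
```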
Abstract:
Aim: To examine the amount of money spent on food by household income, and to ascertain whether food expenditure mediates the relationship between household income and the purchase of staple foods consistent with Australian dietary guideline recommendations. Methods: In face-to-face interviews (n = 1003, 66.4% response rate), households in Brisbane, Australia were asked about their purchasing choices for a range of staple foods, including grocery items, fruits and vegetables. For each participant, information was obtained about their total weekly household food expenditure, along with their sociodemographic and household characteristics. Results: Household income was significantly associated with food expenditure; participants residing in higher-income households spent more money on food per household member than those from lower-income households. Lower-income households were less likely to make food purchasing choices of dietary staples that were consistent with recommendations. However, food expenditure did not attenuate the relationship between household income and the purchase of staple foods consistent with dietary guideline recommendations. Conclusions: The findings suggest that food expenditure may not contribute to income inequalities in purchasing staple foods consistent with dietary guideline recommendations; instead, other material or psychosocial factors not considered in the current study may be more important determinants of these inequalities. Further research should examine whether expenditure on non-staple items and takeaway foods is a larger contributor to socioeconomic inequalities in dietary behaviour.
Abstract:
This review examined socioeconomic inequalities in intakes of dietary factors associated with weight gain and overweight/obesity among adults in Europe. Literature searches of studies published between 1990 and 2007 examining socioeconomic position (SEP) and the consumption of energy, fat, fibre, fruit, vegetables, energy-rich drinks and meal patterns were conducted. Forty-seven articles met the inclusion criteria. The direction of associations between SEP and energy intake was inconsistent. Approximately half the associations examined between SEP and fat intake showed higher total fat intakes among socioeconomically disadvantaged groups. There was some evidence that these groups consume a diet lower in fibre. The most consistent evidence of dietary inequalities was for fruit and vegetable consumption: lower socioeconomic groups were less likely to consume fruit and vegetables. Differences in energy, fat and fibre intakes (when found) were small to moderate in magnitude; however, differences were moderate to large for fruit and vegetable intakes. Socioeconomic inequalities in the consumption of energy-rich drinks and in meal patterns were relatively under-studied compared with other dietary factors. There were no regional or gender differences in the direction and magnitude of the inequalities in the dietary factors examined. The findings suggest that dietary behaviours may contribute to socioeconomic inequalities in overweight/obesity in Europe. However, there is only consistent evidence that fruit and vegetables may make an important contribution to inequalities in weight status across European regions.
Abstract:
Nutrition interventions, in the form of both self-management education and individualised diet therapy, are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes to typical intake to facilitate recording. This thesis investigated the use of information and communication technologies (ICT) to overcome limitations of current approaches to the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), a mobile phone photo/voice application for assessing nutrient intake in a free-living environment among older adults with T2DM. Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM. The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean BMI (±SD) = 34.2±7.0 kg/m2). The treatment effect comprised reductions in total fat of 1.4% and saturated fat of 0.9% of energy intake, in body weight of 0.7 kg, and in waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared with the control group. The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in interpreting the results owing to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reported dietary intake in this instance. This study provided an example of the methodological challenges of measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes. Study 2: The development and trial of the NuDAM recording protocol. The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared with the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam.
Modifications to the method were made to allow clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4. Study 3: The development and trial of the NuDAM analysis protocol. Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced the mean (±SD) group error between estimates compared with no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. Findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record. Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared with estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs. 274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake. Study 4: Evaluation of the NuDAM. The accuracy and inter-rater reliability of the NuDAM in assessing energy and macronutrient intake were evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with EI:TEE ratios of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR, compared with three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures across energy and the macronutrients, except fat (r=0.24).
High agreement was observed between dietitians for estimates of energy and macronutrient intake derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods. This research program explored two novel approaches that utilised distinct technologies to aid the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record. The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
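The plausibility check described above rests on the ratio of reported energy intake to measured total energy expenditure; in energy balance the ratio should be close to 1, and low values suggest under-reporting. A minimal sketch of that screening step, with an assumed cutoff and invented subject values rather than anything from this study:

```python
# Reported energy intake (EI) and doubly-labelled-water total energy expenditure (TEE), MJ/day.
# The subject values and the 0.80 cutoff are hypothetical, for illustration only.
subjects = {
    "S01": {"EI": 7.5, "TEE": 11.8},
    "S02": {"EI": 10.2, "TEE": 11.0},
    "S03": {"EI": 8.9, "TEE": 10.4},
}

CUTOFF = 0.80  # assumed lower bound for a plausible EI:TEE ratio

for sid, v in subjects.items():
    ratio = v["EI"] / v["TEE"]
    flag = "implausible (likely under-reporting)" if ratio < CUTOFF else "plausible"
    print(f"{sid}: EI:TEE = {ratio:.2f} -> {flag}")
```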
Abstract:
Traditional treatments for weight management have focussed on prescribed dietary restriction or regular exercise, or a combination of both. However, recidivism for such prescribed treatments remains high, particularly among the overweight and obese. The aim of this thesis was to investigate voluntary dietary changes in the presence of prescribed mixed-mode exercise conducted over 16 weeks. With the implementation of a single lifestyle change (exercise), it was postulated that the onerous burden of concomitant dietary and exercise compliance would be reduced, leading to voluntary lifestyle changes in areas such as diet. In addition, the failure of exercise as a single weight loss treatment has been attributed to compensatory energy intakes, although much of the evidence comes from acute exercise studies, necessitating investigation of compensatory intakes during a long-term exercise intervention. Following 16 weeks of moderate-intensity exercise, 30 overweight and obese (BMI ≥ 25.00 kg/m2) men and women showed small but statistically significant decreases in mean dietary fat intakes, without compensatory increases in other macronutrient or total energy intakes. Indeed, total energy intakes were significantly lower for men and women following the exercise intervention, owing to the decreases in dietary fat intakes. There was a risk that acceptance of the statistical validity of the small changes in dietary fat intakes may have constituted a Type I error, with false rejection of the null hypothesis. Oro-sensory perceptions of changes in fat loads were therefore investigated to determine whether the measured dietary fat changes were detectable by the human palate. The ability to detect small changes in dietary fat provides sensory feedback for self-initiated dietary changes, but lean and overweight participants were unable to distinguish changes to fat loads of a similar magnitude to those measured in the exercise intervention study. Accuracy of the dietary measurement instrument was improved, with the effects of random error (day-to-day variability) minimised by the use of a statistically validated 8-day, multiple-pass, 24-hour dietary recall instrument. However, systematic error (under-reporting) may have masked the magnitude of dietary change, particularly the reduction in dietary fat intakes. A purported biomarker, plasma apolipoprotein A-IV (apoA-IV), was subsequently investigated to monitor systematic error in self-reported dietary intakes. Changes in plasma apoA-IV concentrations correlated directly with increases and decreases in dietary fat intake, suggesting that this objective marker may be a useful tool for improving the accuracy of dietary measurement in overweight and obese populations, who are susceptible to dietary under-reporting.
Abstract:
A routine activity for a sports dietitian is to estimate energy and nutrient intake from an athlete's self-reported food intake. Decisions made by the dietitian when coding a food record are a source of variability in the data. The aim of the present study was to determine the variability in estimates of the daily energy and key nutrient intakes of elite athletes when experienced coders analyzed the same food record using the same database and software package. Seven-day food records from a dietary survey of athletes in the 1996 Australian Olympic team were randomly selected to provide 13 sets of records, each set representing the self-reported food intake of an endurance, team, weight-restricted, and sprint/power athlete. Each set was coded by 3-5 members of Sports Dietitians Australia, making a total of 52 athletes, 53 dietitians, and 1456 athlete-days of data. We estimated within- and between-athlete and dietitian variances for each dietary nutrient using mixed modeling, and we combined the variances to express variability as a coefficient of variation (typical variation as a percent of the mean). Variability in the mean of 7-day estimates of a nutrient was 2- to 3-fold less than that of a single day. The variability contributed by the coder was less than the true athlete variability for a 1-day record but was of similar magnitude for a 7-day record. The most variable nutrients (e.g., vitamin C, vitamin A, cholesterol) had approximately 3-fold more variability than the least variable nutrients (e.g., energy, carbohydrate, magnesium). These athlete and coder variabilities need to be taken into account in the dietary assessment of athletes for counseling and research.
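A minimal sketch of how variance components like these can be combined into a coefficient of variation, and of why coder variability matters relatively more for a 7-day mean (day-to-day noise shrinks with averaging, while a per-record coder effect does not). The variance values below are invented for illustration and are not taken from the study:

```python
import math

# Invented variance components for one nutrient (e.g. energy, MJ/day); not study data.
between_athlete_var = 0.5   # true differences between athletes
day_to_day_var = 8.0        # within-athlete day-to-day variation
coder_var = 0.6             # variation introduced by the dietitian coding the record
mean_intake = 12.0          # MJ/day

def cv_percent(total_variance: float, mean: float) -> float:
    """Coefficient of variation: typical variation as a percent of the mean."""
    return 100.0 * math.sqrt(total_variance) / mean

# Single-day estimate: every component contributes in full.
cv_1day = cv_percent(between_athlete_var + day_to_day_var + coder_var, mean_intake)

# Mean of a 7-day record: day-to-day noise is averaged over 7 days, but the same
# coder codes the whole record, so the coder component does not shrink.
cv_7day = cv_percent(between_athlete_var + day_to_day_var / 7 + coder_var, mean_intake)

print(f"CV of a 1-day estimate: {cv_1day:.1f}%   CV of a 7-day mean: {cv_7day:.1f}%")
```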
Abstract:
Three native freshwater crayfish Cherax species are farmed in Australia, namely redclaw (Cherax quadricarinatus), marron (C. tenuimanus) and yabby (C. destructor). The lack of appropriate data on the specific nutrient requirements of each species, however, has constrained the development of species-specific formulated diets, and the current use of over-formulated feeds or expensive marine shrimp feeds limits profitability. Confirmation that freshwater crayfish possess endogenous cellulase genes suggests a potential ability to utilize complex carbohydrates such as cellulose as nutrient sources in their diets. To date, studies have been limited to C. quadricarinatus and C. destructor, and none have compared the relative ability of each species to utilize soluble cellulose in the diet. Individual feeding trials of late juveniles of each species were conducted separately in an automated recirculating culture system over 12-week cycles. Animals were fed either a test diet (TD) containing 20% soluble cellulose or a reference diet (RD) substituted with the same amount of corn starch. Water temperature, conductivity and pH were maintained at constant, optimal levels for each species. Animals were fed at 3% of their body weight twice daily, and wet body weight was recorded bi-weekly. At the end of the experiment, all animals were harvested and measured, and midgut gland extracts were assayed for alpha-amylase, total protease and cellulase activity. After the trial period, redclaw fed the RD showed a significantly higher (p<0.05) specific growth rate (SGR) than animals fed the TD, while the SGRs of marron and yabby fed the two diets were not significantly different (p>0.05). Cellulase expression levels in redclaw were not significantly different between diets; marron and yabby showed significantly higher cellulase activity when fed the RD. Amylase and protease activities in all three species were significantly higher in animals fed the RD (Table 1). These results indicate that all three species utilize starch better than dietary soluble cellulose, and that inclusion of 20% soluble cellulose in the diet does not appear to have a significant negative effect on growth rate; however, survival was impacted in C. quadricarinatus but not in C. tenuimanus or C. destructor.
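The specific growth rate (SGR) compared between diets above is conventionally calculated from the natural logarithm of body weight over the trial period. A minimal sketch with made-up weights, not data from these trials:

```python
import math

def specific_growth_rate(initial_weight_g: float, final_weight_g: float, days: int) -> float:
    """Specific growth rate (SGR) as % body weight per day."""
    return 100.0 * (math.log(final_weight_g) - math.log(initial_weight_g)) / days

# Hypothetical example: a juvenile crayfish growing from 5 g to 20 g over a 12-week (84-day) trial.
print(f"SGR = {specific_growth_rate(5.0, 20.0, 84):.2f} %/day")  # ~1.65 %/day
```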
Abstract:
Background & aims: Excess adiposity (overweight) is one of numerous risk factors for cardiometabolic disease. Most risk-reduction strategies for overweight rely on weight loss through dietary energy restriction. However, since the evidence base for long-term successful weight loss interventions is scant, it is important to identify strategies for risk reduction independent of weight loss. The aim of this study was to compare the effects of isoenergetic substitution of dietary saturated fat (SFA) with monounsaturated fat (MUFA), via macadamia nuts, on coronary risk compared with usual diet in overweight adults. Methods: A randomised controlled trial design was used, maintaining usual energy intake but manipulating the dietary lipid profile, in a group of 64 (54 female, 10 male) overweight (BMI > 25), otherwise healthy subjects. For the intervention group, energy intakes of usual (baseline) diets were calculated from multiple 3-day diet diaries, and SFA was replaced with MUFA (target: 50%E from fat as MUFA) by altering dietary SFA sources and adding macadamia nuts to the diet. Both control and intervention groups received advice on national guidelines for physical activity and adhered to the same protocol for diet diary record keeping and trial consultations. Anthropometric and clinical measures were taken at baseline and at 10 weeks. Results: A significant increase in brachial artery flow-mediated dilation (p < 0.05) was seen in the monounsaturated diet group at week 10 compared with baseline. This corresponded to significant decreases in waist circumference and total cholesterol (p < 0.05), and in plasma leptin and ICAM-1 (p < 0.01). Conclusions: In patient subgroups where adherence to dietary energy reduction is poor, isoenergetic interventions may improve endothelial function and other coronary risk factors without changes in body weight. This trial was registered with the Australia New Zealand Clinical Trial Registry (ACTRN12607000106437).
Abstract:
Background & aims Depression has a complex association with cardiometabolic risk, both directly as an independent factor and indirectly through mediating effects on other risk factors such as BMI, diet, physical activity, and smoking. Since changes to many cardiometabolic risk factors involve behaviour change, the rise in depression prevalence as a major global health issue may present further challenges to the long-term behaviour change needed to reduce such risk. This study investigated associations between depression scores and participation in a community-based weight management intervention trial. Methods A group of 64 overweight (BMI > 27), otherwise healthy adults were recruited and randomised to follow either their usual diet or an isocaloric diet in which saturated fat was replaced with monounsaturated fat (MUFA), to a target of 50% of total fat, by adding macadamia nuts to the diet. Subjects were assessed for depressive symptoms at baseline and at ten weeks using the Beck Depression Inventory (BDI-II). Both control and intervention groups received advice on National Guidelines for Physical Activity and adhered to the same protocol for food diary completion and trial consultations. Anthropometric and clinical measurements (cholesterol, inflammatory mediators) were also taken at baseline and 10 weeks. Results During the recruitment phase, pre-existing diagnosed major depression was one of a range of reasons for initial exclusion of volunteers from the trial. Amongst enrolled participants, there was a significant correlation (R = −0.38, p < 0.05) between BDI-II scores at baseline and duration of participation in the trial. Subjects with a baseline BDI-II score ≥10 (moderate to severe depression symptoms) were more likely to drop out of the trial before week 10 (p < 0.001). BDI-II scores in the intervention (MUFA) diet group decreased, but increased in the control group, over the 10-week period. Univariate analysis of variance confirmed these observations (adjusted R2 = 0.257, p = 0.01). Body weight remained static over the 10-week period in the intervention group, corresponding to a relative increase in the control group (adjusted R2 = 0.097, p = 0.064). Conclusions Depression symptoms have the potential to affect enrolment in and adherence to diet-based risk reduction interventions, and may consequently influence the generalisability of such trials. Depression scores may therefore be useful for characterising, screening and allocating subjects to appropriate treatment pathways.
Abstract:
Apparent per capita food and nutrient intake in six remote Australian Aboriginal communities using the ‘store-turnover’ method is described. The method is based on the analysis of community-store food invoices. The face validity of the method supports the notion that, under the unique circumstances of remote Aboriginal communities, the turnover of foodstuffs from the community store is a useful measure of apparent dietary intake for the community as a whole. In all Aboriginal communities studied, the apparent intake of energy, sugars and fat was excessive, while the apparent intake of dietary fibre and several nutrients, including folic acid, was low. White sugar, flour, bread and meat provided in excess of 50 per cent of the apparent total energy intake. Of the apparent high fat intake, fatty meats contributed nearly 40 per cent in northern coastal communities and over 60 per cent in central desert communities. Sixty per cent of the apparent high intake of sugars was derived from sugar per se in both regions. Compared with national Australian apparent consumption data, intakes of sugar, white flour and sweetened carbonated beverages were much higher in Aboriginal communities, and intakes of wholemeal bread, fruit and vegetables were much lower. Results of the store-turnover method have important implications for community-based nutrition intervention programs.
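As an illustration of the store-turnover idea described above, apparent per-capita daily intake is obtained by converting invoiced food quantities to nutrients with a food-composition table and dividing by the community population and the length of the invoice period. All item names, composition values, population and period in this sketch are hypothetical:

```python
# Hypothetical food-composition values, per 100 g of food.
food_composition = {
    "white sugar": {"energy_kJ": 1700, "fat_g": 0.0, "fibre_g": 0.0},
    "white flour": {"energy_kJ": 1470, "fat_g": 1.4, "fibre_g": 3.0},
    "fatty meat":  {"energy_kJ": 1300, "fat_g": 25.0, "fibre_g": 0.0},
}

# Hypothetical grams of each item sold by the community store over the invoice period.
invoices_g = {
    "white sugar": 600_000,
    "white flour": 900_000,
    "fatty meat":  450_000,
}

population = 250   # assumed community population
days = 28          # assumed length of the invoice period

# Total nutrients turned over by the store during the period.
totals = {"energy_kJ": 0.0, "fat_g": 0.0, "fibre_g": 0.0}
for item, grams in invoices_g.items():
    for nutrient, per_100g in food_composition[item].items():
        totals[nutrient] += grams / 100.0 * per_100g

# Apparent per-capita daily intake for the community as a whole.
per_capita_per_day = {k: v / (population * days) for k, v in totals.items()}
print(per_capita_per_day)
```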