981 results for DIETARY FIBER
Abstract:
Purpose. To determine whether self-efficacy (SE) changes predicted total fat (TF) and total fiber (TFB) intake, and to examine the relationship between SE changes and the two dietary outcomes. Design. This is a secondary analysis using baseline and first follow-up (FFU) data from NULIFE, a randomized trial. Setting. Nutrition classes were taught in the Texas Medical Center in Houston, Texas. Participants. 79 premenopausal African American women aged 25-45, with an 85% response rate at FFU. Method. Dietary intake was assessed with the Arizona Food Frequency Questionnaire and SE with the Self-Efficacy for Dietary Change Questionnaire. Analyses were done in Stata version 9, using linear and logistic regression with adjustment for confounders. Results. Linear regression showed that SE changes for eating fruits and vegetables predicted total fiber intake in the control group in both the univariate (P = 0.001) and multivariate (P = 0.01) models, while SE for eating fruits and vegetables at first follow-up predicted total fiber intake in the intervention group in both models (P < 0.001). Logistic regression of low-fat SE changes against a total fat intake of 30% or less of calories showed an adjusted OR of 0.22 (95% CI 0.03-1.48; P = 0.12) in the intervention group. Logistic regression of SE changes for fruits and vegetables against a total fiber intake of 10 g or more showed an adjusted OR of 6.25 (95% CI 0.53-72.78; P = 0.14) in the control group. Conclusion. SE for eating fruits and vegetables at first follow-up predicted the intervention group's TFB intake, and intervention women who increased their SE for eating a low-fat diet were more likely to achieve the study goal of 30% or less of calories from TF. SE changes for eating fruits and vegetables predicted the control group's TFB intake, and control women who increased their SE for eating fruits and vegetables were more likely to achieve the study goal of 10 g or more of TFB.
Limitations include the use of self-report measures, the small sample size, and possible control-group contamination.
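The adjusted odds ratios reported above are obtained by exponentiating logistic-regression coefficients. A minimal sketch of that conversion, using hypothetical coefficient and standard-error values rather than the study's data:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its
    standard error into an odds ratio with a 95% confidence
    interval: OR = exp(beta), CI = exp(beta +/- z * se)."""
    or_ = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return or_, lo, hi

# Illustrative only: a coefficient chosen so that OR = 0.22,
# with an assumed SE of 0.5 (not values from the study).
print(odds_ratio_ci(math.log(0.22), 0.5))
```

A CI that spans 1.0, as in both ORs reported above, is why neither association reached statistical significance despite the large point estimates.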
Abstract:
Introduction. Food frequency questionnaires (FFQs) are used to study the association between dietary intake and disease. An instructional video may offer a low-cost, practical method of dietary assessment training for participants, thereby reducing recall bias in FFQs. There is little evidence in the literature on the effect of instructional videos on FFQ-reported intake. Objective. This analysis compared the reported energy and macronutrient intake of two groups randomized either to watch an instructional video before completing an FFQ or to view the same video after completing the same FFQ. Methods. In the parent study, a diverse group of students, faculty and staff from Houston Community College were randomized to two groups, stratified by ethnicity, and completed an FFQ. The "video before" group watched an instructional video about completing the FFQ prior to answering it; the "video after" group watched the video after completing the FFQ. The two groups were compared on mean daily energy (kcal/day), fat (g/day), protein (g/day), carbohydrate (g/day) and fiber (g/day) intakes using descriptive statistics and one-way ANOVA. Demographic, height and weight information was collected. Dietary intakes were adjusted for total energy intake before the comparative analysis, and BMI and age were ruled out as potential confounders. Results. There were no significant differences between the two groups in mean daily intakes of energy, total fat, protein, carbohydrate or fiber. However, a pattern of higher energy intake and lower fiber intake was reported in the group that viewed the video before completing the FFQ compared with those who viewed it after. Discussion. Analysis of the differences in reported energy and macronutrient intake showed an overall pattern, albeit not statistically significant, of higher intake in the video-before versus the video-after group.
Further research on instructional videos for dietary assessment should address the validity of reported dietary intakes in those randomized to watch an instructional video before reporting diet compared with a control group that does not view a video.
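The group comparison described above rests on a one-way ANOVA. A minimal, self-contained sketch of the F statistic that test computes (illustrative numbers, not the study's intake data):

```python
def one_way_anova_F(*groups):
    """One-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)   # grand mean
    k = len(groups)                          # number of groups
    n = len(all_vals)                        # total observations
    # Sum of squares between groups (weighted by group size)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Sum of squares within groups (deviation from each group mean)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two small illustrative samples; with two groups, F equals t^2
# from the unpaired t-test on the same data.
print(one_way_anova_F([1, 2, 3], [2, 3, 4]))
```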
Abstract:
Aim. To assess the relationships between dietary factors and colorectal cancer risk. Methods. We examined all systematic reviews published on the topic in the last ten years. Results. For fruits and vegetables, some studies1 were significant for heterogeneity and others2 were not. In the study by Aune et al.,3 only fruits were significant, although all the studies reported protective RRs between 0.90 and 0.94. For folate, among the case-control studies only the study by Sanjoaquin et al.4 was significant (P for heterogeneity = 0.01), and all showed a protective effect, with RRs between 0.75 and 0.95 for dietary as well as total folate. For fiber, in the study by Park et al.5 P was non-significant at 0.14 and the RR was 0.84. For vitamin B6, the study by Larsson et al.6 reported a significant P with an RR of 0.90. For dietary fat, both Alexander7 and Liu8 concluded that there is insufficient evidence that dietary fat is an independent causative risk factor. For meat, only one study of three, by Norat et al.,9 achieved a significant P for heterogeneity; all the studies reported RRs between 1.14 and 1.35, implicating meat in increasing the risk of colorectal cancer. Conclusions. Fruit and vegetable consumption appears protective against colorectal cancer, while meat consumption increases the risk. *Please refer to the dissertation for references/footnotes.
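The relative risks summarized above are typically combined in such reviews by inverse-variance weighting on the log scale. A minimal sketch of that fixed-effect pooling step, with illustrative RRs and standard errors rather than values from the reviewed studies:

```python
import math

def pool_rr(rrs, ses):
    """Fixed-effect inverse-variance pooling of relative risks:
    log-RRs are weighted by 1/SE^2, then exponentiated back."""
    logs = [math.log(r) for r in rrs]
    w = [1.0 / s ** 2 for s in ses]          # inverse-variance weights
    pooled_log = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    se_pooled = math.sqrt(1.0 / sum(w))      # SE of the pooled log-RR
    return math.exp(pooled_log), se_pooled

# Illustrative: two protective RRs with assumed log-scale SEs of 0.1.
print(pool_rr([0.90, 0.94], [0.1, 0.1]))
```

The pooled estimate always lands between the individual RRs, with a smaller standard error than any single study.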
Abstract:
The overall objectives of this thesis were to study the effect of dietary carbohydrates on growth performance, the intestinal barrier and digestion in rabbits weaned at 25 days of age. In addition, the best period for determining faecal digestibility after weaning at this age was studied. The first experiment examined the effect of neutral detergent soluble fibre (NDSF) on the intestinal barrier, digestion, intestinal microbiota and growth performance of young rabbits in the post-weaning phase. Three isonutritive diets were designed in which the only source of variation was the level of soluble fibre. Starting from a control diet (AH) with 103 g NDSF/kg dry matter and alfalfa as the main fibre source, half of this alfalfa was replaced by a mixture of beet pulp and apple pulp (75:25) in the B-AP diet and by a mixture of oat hulls and soybean protein concentrate (88:12) in the OH diet, giving 131 and 79 g NDSF/kg dry matter, respectively. Rabbits were weaned at 25 days and fed the experimental diets until 35 days of age, when they were slaughtered to determine the apparent ileal digestibility (AID) of dry matter (DM), crude protein (CP) and starch, mucosal morphology and enzymatic activity in the jejunum, the mucosa-associated lymphoid tissue, and the intestinal microbiota. An additional 19 suckling 35-day-old animals were used for the determination of mucosal morphology. For the mortality study, 118 additional animals per treatment received the experimental diets during the two weeks post-weaning and thereafter a commercial diet until 60 days of age. Throughout the experiment, the animals received medication in the drinking water (100 ppm apramycin sulphate and 120 ppm tylosin tartrate).
The level of soluble fibre improved the parameters used to characterise the state of the intestinal barrier. Rabbits fed the highest level of NDSF showed greater villus height (P = 0.001), a higher villus height/crypt depth ratio (8.14; P = 0.001), greater disaccharidase activity (8,671 μmol glucose/g protein; P = 0.019) and greater ileal digestibility (96.8%; P = 0.002), with ileal starch flow decreasing as the level of soluble fibre in the diet increased (1.2 vs. 0.5 g/d; P = 0.001). Suckling animals at 35 days of age showed a villus height/crypt depth ratio lower than that observed in animals fed the B-AP diet (6.70), but higher than on the AH and OH diets. Lower NDSF levels tended (P = 0.074) to increase the cellular immune response (CD8+ lymphocytes). Diet also affected IL-2 production (CD25+, P = 0.029; CD5+CD25+, P = 0.057), but without a clear relationship to the level of soluble fibre. The diversity of the intestinal microbiota was not affected by diet (P ≥ 0.38). Animals fed the B-AP and AH diets showed a reduced frequency of detection of Clostridium perfringens both in the ileum (P = 0.062) and in the caecum (4.3 vs. 17.6%; P = 0.047) compared with the OH diet. Moreover, the mortality rate (118 rabbits/diet) decreased from 14.4% on the OH diet to 5.1% on the B-AP diet. Between 32 and 35 days of age, the apparent faecal digestibility (14/diet) of dry matter (DM), gross energy (GE), crude protein (CP), neutral detergent fibre (NDF), acid detergent fibre (ADF) and starch was determined.
This group, together with another nine animals per treatment, was used to determine the weight of the stomach and caecum, the caecal concentration of volatile fatty acids (VFA) and ammonia (NH3), and the similarity rates of the intestinal microbiota. Growth performance (35 animals/treatment) was also studied over the whole fattening period, with rabbits receiving the experimental diets from weaning to 35 days and thereafter a commercial diet until 60 days of age. Increasing NDSF levels improved the faecal digestibility of dry matter and energy (P < 0.001). NDSF inclusion linearly increased the weight of the caecal contents (P = 0.001) and of the whole digestive tract (P = 0.008), and in the days before slaughter it linearly decreased average daily feed intake (P = 0.040). A linear decrease (P ≤ 0.041) in stomach pH was also observed. No relationship was found between NDSF level and pH or the concentration and molar proportions of VFA. Diet appeared to have an effect on microbiota similarity rates even greater than that of the doe, and the effects were greater at the caecal than at the ileal level. Feed efficiency increased linearly with higher NDSF levels, by 12% between extreme diets after weaning (25-39 d) and by 3% over the whole fattening period. Average daily feed intake during the post-weaning phase and over the whole fattening period tended to increase (P ≤ 0.079) with higher NDSF levels, but no effect on average daily gain was observed (P ≥ 0.15). In conclusion, increasing the level of soluble fibre in the diet appears to benefit animal health, as it improves mucosal integrity and reduces the frequency of detection of potential pathogens such as C. perfringens and Campylobacter spp.
According to these results, soluble fibre content should be taken into account when formulating diets for rabbits in the post-weaning phase. The objective of the second experiment was to determine the effect of starch source on digestion, intestinal microbiota and growth performance in rabbits weaned at 25 days of age. Three isonutritive diets were formulated in which the main starch sources were varied: raw wheat, boiled wheat, and a combination of boiled wheat and boiled rice. Two groups of 99 and 193 animals were weaned at 25 days of age. The first was used to determine growth performance following the same protocol as in the previous experiment. The second was used to determine faecal digestibility from 32 to 35 d, apparent ileal digestibility (AID) at 35 d, intestinal mucosal morphology and caecal fermentation traits, as well as to characterise the intestinal microbiota. Two additional groups of 384 (medicated) and 177 (non-medicated) animals were used to study the effect of antibiotic supplementation of the drinking water on mortality. Heat processing of the wheat slightly improved the ileal digestibility of starch (P = 0.020) but did not modify the final flow of starch reaching the caecum; a higher frequency of detection of Campylobacter spp. and Ruminococcus spp. was observed in the caecum (P ≤ 0.023), with no changes at the ileal level. Heat processing of wheat did not affect growth performance, mortality, ileal or faecal digestibility, or mucosal morphology. Partial substitution of boiled wheat with boiled rice impaired the ileal digestibility of starch (P = 0.020) and increased the ileal flow of this nutrient to the caecum (P = 0.007).
However, it did not affect mortality, even though changes in the microbiota were detected at both the ileal and caecal levels, with a lower frequency of detection of Campylobacter spp. (ileum and caecum), Helicobacter spp. (ileum) and Ruminococcus spp. (caecum) and a higher frequency of Bacteroides spp. (caecum) (P ≤ 0.046). The use of boiled rice in post-weaning diets had no effect on growth performance, mortality, ileal or faecal digestibility (except for starch), or mucosal morphology. Antibiotic supplementation reduced the frequency of detection of most of the bacteria studied (P ≤ 0.048), especially Campylobacter spp., Clostridium perfringens and Propionibacterium spp., with a greater effect at the ileal than at the caecal level, which was associated with a significant reduction (P < 0.001) in mortality. In conclusion, the results of this experiment indicate that the starch source affects the intestinal microbiota but does not influence animal health. Regarding processing, the use of boiled wheat together with boiled rice did not improve on the results obtained with raw wheat, although further experiments would be needed to confirm this point. The last experiment focused on a methodological aspect. Given that weaned rabbits show a digestive pattern different from that of adult animals as a result of their digestive immaturity, the objective was to determine the best procedure for measuring faecal digestibility in young rabbits in the post-weaning phase. For this purpose, 15 animals per treatment from three different litters were weaned at 25 days and given a commercial growing-fattening diet. Average daily feed intake and daily faeces excretion were recorded from 25 to 40 days of age to determine DM digestibility.
Litter affected average daily feed intake and faeces excretion (P = 0.013 and 0.014, respectively), with a tendency (P = 0.061) for digestibility. Age affected all these factors (P < 0.001); excretion increased more markedly than dry matter intake in the first week post-weaning, after which both increased in parallel from the second week. Daily feed intake was more strongly correlated with faeces excretion on the same day than on the following day, so same-day values were used to determine DM digestibility (DMd). DMd decreased linearly until 32 days of age (2.17 ± 0.25 percentage units per day) and remained constant from 32 to 40 days (69.4 ± 0.47%). In addition, the standard deviation of DMd was reduced by 54% when the collection period was extended from 2 to 6 days. Based on these results, it can be concluded that digestibility trials should not start before 32 days of age and that the number of animals needed to detect significant differences between treatments will depend on the faeces collection period. ABSTRACT The global aim of this thesis was to study the effect of dietary carbohydrates on growth performance, digestion and the intestinal barrier in 25-d weaned rabbits. The best period for determining faecal digestibility after weaning was also studied. The first experiment focused on the effect of neutral detergent soluble fibre (NDSF) on gut barrier function, digestion, intestinal microbiota and growth performance in rabbits in the post-weaning period.
Three isonutritive diets, varying only in the level of soluble fibre, were formulated as follows: in a control diet (AH) containing 103 g of neutral detergent soluble fibre/kg of dry matter, with alfalfa as the main fibre source, half of the alfalfa was replaced by a mixture of beet and apple pulp (75:25) in the B-AP diet and by a mixture of oat hulls and soybean protein concentrate (88:12) in the OH diet, resulting in 131 and 79 g of NDSF/kg of dry matter, respectively. Rabbits, weaned at 25 days of age, were fed the experimental diets up to 35 days of age, at which point they were slaughtered to determine the apparent ileal digestibility (AID) of dry matter (DM), crude protein (CP) and starch, mucosal morphology, sucrase activity, the characterization of lamina propria lymphocytes, and the intestinal microbiota. To assess mucosal morphology, 19 suckling 35-d-old rabbits were also used. For the mortality study, 118 additional rabbits per treatment were fed the experimental diets for a two-week period and thereafter received a commercial diet until 60 days of age. Rabbits were water-medicated during the whole experimental period (100 ppm apramycin sulphate and 120 ppm tylosin tartrate). The level of soluble fibre improved all the parameters used to characterize the condition of the intestinal barrier. Villous height of the jejunal mucosa increased with dietary soluble fibre (P = 0.001). Rabbits fed the highest level of soluble fibre (B-AP diet) showed the highest villous height/crypt depth ratio (8.14; P = 0.001), sucrase specific activity (8,671 μmol glucose/g protein; P = 0.019), and the greatest ileal starch digestibility (96.8%; P = 0.002). The opposite effects were observed in rabbits fed lower levels of soluble fibre (AH and OH diets; a ratio of 4.70 and 5,848 μmol glucose/g protein on average, respectively). The lowest ileal starch digestibility was observed in animals fed the OH diet (93.2%).
Suckling rabbits of the same age showed a lower villous height/crypt depth ratio (6.70) than the B-AP diet group, but this ratio was higher than in the AH and OH diet groups. Lower levels of soluble fibre tended (P = 0.074) to increase the cellular immune response (CD8+ lymphocytes). Diet affected IL-2 production (CD25+, P = 0.029; CD5+CD25+, P = 0.057), with no clear relationship between soluble fibre and IL-2. The biodiversity of the intestinal microbiota was not affected by diet (P ≥ 0.38). Animals fed the B-AP and AH diets had a reduced caecal frequency of detection of Campylobacter spp. (20.3 vs. 37.8%; P = 0.074) and Clostridium perfringens (4.3 vs. 17.6%; P = 0.047) compared with the OH diet group. Moreover, the mortality rate decreased from 14.4% (OH diet) to 5.1% (B-AP diet) with the increased presence of soluble fibre in the diet. Between 32 and 35 days of age, the apparent faecal digestibility of dry matter (DM), gross energy (GE), crude protein (CP), neutral detergent fibre (NDF), acid detergent fibre (ADF) and starch was determined (14 rabbits/diet). This group, plus another nine rabbits per diet, was used to determine the weight of the stomach and caecum and their contents, caecal fermentation traits, and the similarity rate (SR) of the intestinal microbiota. In addition, growth performance (35 rabbits/diet) was studied over the whole fattening period, with animals consuming the experimental diets from weaning up to 35 days of age and a commercial diet thereafter until 60 days of age. Increasing levels of NDSF improved faecal dry matter and energy digestibility (P < 0.001). NDSF inclusion linearly increased the weight of the caecal contents (P = 0.001) and of the total gastrointestinal tract (P = 0.008), and in the days before slaughter a linear decrease in daily feed intake was observed with the highest level of soluble fibre. Stomach pH decreased linearly with increasing levels of NDSF (P ≤ 0.041).
No relationship was found between NDSF level and pH or the concentration and molar proportions of VFA. Diet appeared to influence the similarity rate of the microbiota, with an effect even greater than that of the doe; these effects were greater at the caecal than at the ileal level. Feed efficiency increased linearly with higher levels of soluble fibre, by around 12% between extreme diets in the two weeks post-weaning (25-39 d) and by 3% over the whole fattening period. Average daily feed intake during the two weeks after weaning and over the whole fattening period tended (P ≤ 0.079) to increase with the highest levels of NDSF, although there was no effect on daily weight gain (P ≥ 0.15). In conclusion, an increase of soluble fibre in the feed seems to be beneficial for animal health, as it improves mucosal integrity and reduces the frequency of detection of potential pathogens such as C. perfringens and Campylobacter spp. According to these results, the level of soluble fibre should be taken into account in rabbit feed formulation for the post-weaning period. The objective of the second experiment was to determine the effect of the source of starch on digestion, intestinal microbiota and growth performance in twenty-five-day-old weaned rabbits. To this end, three isonutritive diets were formulated with different sources of starch: raw wheat, boiled wheat, and a combination of boiled wheat and boiled rice. Two groups of 99 and 193 rabbits were weaned at 25 days of age. The first group was used for growth performance determination, following the same protocol as in the previous experiment. The second group was used to determine faecal digestibility from 32 to 35 d, apparent ileal digestibility (AID) at 35 d, jejunal mucosal morphology, caecal fermentation traits, and the characterization of the intestinal microbiota. For mortality, two additional groups of 384 (medicated) and 177 (non-medicated) rabbits were used to study the effect of antibiotic supplementation of the drinking water.
Heat processing of the wheat slightly improved the ileal digestibility of starch (P = 0.020) but did not modify the flow of starch to the caecum. An increase in the frequency of detection of Campylobacter spp. and Ruminococcus spp. was observed in the caecum (P ≤ 0.023), with no changes at the ileal level. Heat processing of wheat did not modify growth performance, mortality, ileal or faecal digestibility, or mucosal morphology. Partial substitution of boiled wheat with boiled rice in the diet impaired ileal starch digestibility (P = 0.020) and increased the ileal flow of this nutrient to the caecum (P = 0.007). However, it did not affect the mortality rate, although changes in the ileal and caecal microbiota were detected: the frequency of detection of Campylobacter spp. (in both ileum and caecum), Helicobacter spp. (ileum) and Ruminococcus spp. (caecum) decreased, while that of Bacteroides spp. (caecum) increased (P ≤ 0.046). Boiled rice supplementation did not alter growth performance, mortality, ileal or faecal digestibility of nutrients other than starch, or mucosal morphology. Medication of the rabbits reduced the ileal frequency of detection of most of the bacteria studied (P ≤ 0.048), especially Campylobacter spp., Clostridium perfringens and Propionibacterium spp., with a greater effect at the ileal than at the caecal level, and was associated with a strong reduction in the mortality rate (P < 0.001). In conclusion, the results of this experiment suggest that the source of starch affects the intestinal microbiota but does not seem to influence animal health. Regarding heat processing, the use of boiled wheat or boiled rice does not seem to improve the results obtained with raw wheat, although more experiments would be needed to confirm this point. The last experiment focused on a methodological aspect.
Given that weaned rabbits have a digestive pattern different from that of older animals due to their digestive immaturity, the objective was to determine the best procedure for faecal digestibility determination in young rabbits in the post-weaning period. Fifteen rabbits from 5 different litters were weaned at 25 days of age and fed a commercial feed. Feed intake and faeces excretion were recorded daily from 25 to 40 days of age to determine dry matter digestibility (DMd). Litter affected daily DM intake and excretion (P = 0.013 and 0.014, respectively) and tended to affect DMd (P = 0.061). Age affected all these factors (P < 0.001): intake increased more slowly than dry matter excretion during the first week, but they evolved similarly in the second week. Daily feed intake correlated more strongly with faeces excretion on the same day than on the following day, so same-day values were used to determine daily DMd. DMd decreased linearly from weaning to 32 d of age (2.17 ± 0.25 percentage units per day), whereas from 32 to 40 d it remained constant (69.4 ± 0.47%). On the other hand, the average standard deviation of DMd decreased by 54% when the length of the collection period increased from 2 to 6 d. From these results, it can be concluded that it is not advisable to start digestibility trials before 32 days of age and that the number of animals required to detect significant differences among means will depend on the collection period.
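Apparent dry matter digestibility, as measured throughout these experiments, is simply the fraction of ingested DM not recovered in the faeces. A minimal sketch with hypothetical daily values (not the thesis data), also showing how a multi-day collection window pools intake and excretion before the ratio is taken:

```python
def dm_digestibility(intake_g, excreted_g):
    """Apparent dry matter digestibility (%): the share of
    ingested DM that is not recovered in the faeces."""
    return 100.0 * (intake_g - excreted_g) / intake_g

def window_dmd(daily_intake, daily_excreta):
    """DMd over a multi-day collection window: intake and excreta
    are summed first, which smooths day-to-day variation and is
    why longer collection periods shrink the SD of DMd."""
    return dm_digestibility(sum(daily_intake), sum(daily_excreta))

# Illustrative: 100 g DM eaten, 30.6 g DM excreted -> 69.4% DMd,
# matching the plateau value reported for 32-40 d of age.
print(dm_digestibility(100.0, 30.6))
```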
Abstract:
This research studied the effects of additional fiber in the rearing phase diets on egg production, gastrointestinal tract (GIT) traits, and body measurements of brown egg-laying hens fed diets varying in energy concentration from 17 to 46 wk of age. The experiment was completely randomized with 10 treatments arranged as a 5 × 2 factorial with 5 rearing phase diets and 2 laying phase diets. During the rearing phase, treatments consisted of a control diet based on cereals and soybean meal and 4 additional diets with a combination of 2 fiber sources (cereal straw and sugar beet pulp, SBP) at 2 levels (2 and 4%). During the laying phase, diets differed in energy content (2,650 vs. 2,750 kcal AMEn/kg) but had the same amino acid content per unit of energy. The rearing diet did not affect any production trait except egg production, which was lower in birds fed SBP than in birds fed straw (91.6 and 94.1%, respectively; P < 0.05). Laying hens fed the high energy diet had lower feed intake (P < 0.001), better feed conversion (P < 0.01), and greater BW gain (P < 0.05) than hens fed the low energy diet, but egg production and egg weight were not affected. At 46 wk of age, none of the GIT traits was affected by previous dietary treatment. At this age, hen BW was positively related to body length (r = 0.500; P < 0.01), tarsus length (r = 0.758; P < 0.001), and body mass index (r = 0.762; P < 0.001), but no effects of type of diet on these traits were detected. In summary, the inclusion of up to 4% of a fiber source in the rearing diets did not affect GIT development of the hens, but SBP reduced egg production. An increase in the energy content of the laying phase diet reduced ADFI and improved feed efficiency but did not affect any of the other traits studied.
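The body-measurement relationships above are Pearson correlation coefficients. A minimal sketch of how r is computed (illustrative data, not the hen measurements):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length
    samples: covariance divided by the product of the
    (unnormalized) standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfectly linear relationship gives r = 1.0.
print(pearson_r([1, 2, 3], [2, 4, 6]))
```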
Abstract:
Diet and physical activity patterns have been implicated as major factors in the increasing prevalence of childhood and adolescent obesity. It is estimated that between 16 and 33 percent of children and adolescents in the United States are overweight (CDC, 2000). Moreover, the CDC estimates that less than 50% of adolescents are physically active on a regular basis (CDC, 2003). Interventions must focus on modifying these behaviors. Facilitating the understanding of proper nutrition and the need for physical activity among adolescents is the first step in preventing overweight and obesity and delaying the development of chronic diseases later in life (Dwyer, 2000). The purpose of this study was to compare the outcomes of students receiving one of two forms of education (both emphasizing diet and physical activity), to determine whether a computer-based intervention (CBI) program using an interactive, animated CD-ROM would elicit a greater behavior change than a traditional didactic intervention (TDI) program. A convenience sample of 254 high school students aged 14-19 participated in the 6-month program. A pre-test/post-test design was used, with follow-up measures taken at three months post-intervention. No change was noted in total fat, saturated fat, fruit/vegetable, or fiber intake for any of the groups. There was also no change in perceived self-efficacy or perceived social support. Results did, however, indicate an increase in nutrition knowledge for both intervention groups (p < 0.001). In addition, the CBI group demonstrated more positive and sustained behavior changes throughout the course of the study.
These changes included a decrease in BMI (p pre/post < 0.001, p post/follow-up < 0.001), number of meals skipped (p pre/post < 0.001), and soda consumption (p pre/post = 0.003, p post/follow-up = 0.03), and an increase in nutrition knowledge (p pre/post < 0.001, p pre/follow-up < 0.001), physical activity (p pre/post < 0.05, p pre/follow-up < 0.01), frequency of label reading (p pre/follow-up < 0.01), and dairy consumption (p pre/post = 0.03). The TDI group did show positive gains in some areas post-intervention; however, a return to baseline behavior was seen at follow-up. Findings of this study suggest that, compared with traditional didactic teaching, computer-based nutrition and health education has greater potential to elicit change in knowledge and behavior and to promote maintenance of the behavior change over time.
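The pre/post p-values above come from comparing the same students measured twice, for which a paired test is the natural choice. A minimal sketch of the paired t statistic (hypothetical scores, not the study's data; the study's exact test is not specified in the abstract):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post measurements on the same
    subjects: mean of the within-subject differences divided by
    its standard error."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Illustrative: three subjects whose scores rise from pre to post.
print(paired_t([1, 2, 3], [2, 4, 5]))
```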
Abstract:
Objective. The prevalence of smoking among Aboriginal Canadians is higher than among non-Aboriginal Canadians, a behavior that also tends to alter dietary patterns. Compared with the general Canadian population, maternal smoking rates are almost twice as high. The aim of this study was to compare the dietary adequacy of Inuvialuit women of childbearing age who smoke versus those who do not. Research methods & procedures. A cross-sectional study in which participants completed a culturally specific quantitative food frequency questionnaire. Non-parametric analysis was used to compare mean nutrient intake, dietary inadequacy, and differences in nutrient density between smokers and non-smokers. Multiple logistic regression analyses were performed for key nutrient inadequacies and smoking status. Data were collected from three communities in the Beaufort Delta region of the Northwest Territories, Canada, from randomly selected Inuvialuit women of childbearing age (19-44 years). Results. Of 92 participants, 75% reported being smokers. There were no significant differences in age, BMI, marital status, education, number of people in the household working and/or self-employed, or physical activity between smokers and non-smokers. Non-parametric analysis showed no differences in nutrient intake between smokers and non-smokers. Logistic regression, however, revealed a positive association between smoking and inadequacies of vitamin C (OR = 2.91; 95% CI, 1.17-5.25), iron (OR = 3.16; 95% CI, 1.27-5.90), and zinc (OR = 2.78; 95% CI, 1.12-4.94). A high percentage of women (>60%), regardless of smoking status, did not meet the dietary recommendations for fiber, vitamin D, vitamin E, and potassium. Conclusions. This study provides evidence of inadequate dietary intake among Inuvialuit women of childbearing age regardless of smoking behavior.
Abstract:
There is increasing interest in the role the environment plays in shaping the dietary behavior of youth, particularly in the context of obesity prevention. An overview of environmental factors associated with obesity-related dietary behaviors among youth is needed to inform the development of interventions. A systematic review of observational studies on environmental correlates of energy, fat, fruit/vegetable, snack/fast food and soft drink intakes in children (4–12 years) and adolescents (13–18 years) was conducted. The results were summarized using the analysis grid for environments linked to obesity. The 58 papers reviewed mostly focused on sociocultural and economic environmental factors at the household level. The most consistent associations were found between parental intake and children's fat and fruit/vegetable intakes, between parent and sibling intake and adolescents' energy and fat intakes, and between parental education and adolescents' fruit/vegetable intake. A less consistent but positive association was found for availability and accessibility with children's fruit/vegetable intake. Environmental factors are predominantly studied at the household level and focus on sociocultural and economic aspects. The most consistent associations were found for parental influences (parental intake and education). More studies examining environmental factors using longitudinal study designs and validated measures are needed to provide solid evidence to inform interventions.
Abstract:
Vitamin D deficiency and insufficiency are now seen as a contemporary health problem in Australia, with possible widespread health effects not limited to bone health [1]. Despite this, the Vitamin D status (measured as serum 25-hydroxyvitamin D (25(OH)D)) of ambulatory adults has been overlooked in this country. Serum 25(OH)D status is especially important in this group, as studies have shown a link between Vitamin D and fall risk in older adults [2]. Limited data also exist on the contributions of sun exposure via ultraviolet radiation and of dietary intake to serum 25(OH)D status in this population. The aims of this project were to: assess the serum 25(OH)D status of a group of older ambulatory adults in south-east Queensland; assess the association between serum 25(OH)D status and functional measures as possible indicators of fall risk; obtain data on the sources of Vitamin D in this population and assess whether intake was related to serum 25(OH)D status; and describe sun protection and sun exposure behaviors in this group and investigate whether these were related to serum 25(OH)D status. These data address key gaps identified in the literature regarding the Vitamin D status of this population group in Australia. A representative convenience sample of participants (N=47) over 55 years of age was recruited for this cross-sectional, exploratory study, which was undertaken in December 2007 in south-east Queensland (Brisbane and the Sunshine Coast). Participants completed a sun exposure questionnaire and a Calcium and Vitamin D food frequency questionnaire. Timed Up and Go and handgrip dynamometry tests were used to examine functional capacity. Serum 25(OH)D status and blood measures of Calcium, Phosphorus and Albumin were determined through blood tests.
The mean and median serum 25(OH)D for all participants in this study were 85.8 nmol/L (standard deviation 29.7 nmol/L) and 81.0 nmol/L (range 22-158 nmol/L), respectively. Analysis at the bivariate level revealed a statistically significant relationship between serum 25(OH)D status and location, with participants living on the Sunshine Coast having a mean serum 25(OH)D status 21.3 nmol/L higher than participants living in Brisbane (p=0.014). While at the descriptive level there was an apparent trend towards higher serum 25(OH)D with greater outdoor exposure, no statistically significant associations were observed between serum 25(OH)D status and the measures of outdoor exposure, sun protection behaviors, or phenotypic characteristics. Intake of both Calcium and Vitamin D was low in this sample: sixty-eight percent (68%) of participants did not meet the Estimated Average Requirement (EAR) for Calcium (median = 771.0 mg; range = 218.0-2616.0 mg), while eighty-seven percent (87%) did not meet the Adequate Intake (AI) for Vitamin D (median = 4.46 ug; range = 0.13-30.0 ug). This raises the question of how realistic it is to meet the new Adequate Intake for Vitamin D when there is such a low level of Vitamin D fortification in this country. However, participants meeting the AI for Vitamin D had a significantly higher serum 25(OH)D status than those not meeting it (p=0.036), suggesting that meeting the AI for Vitamin D may play a significant role in determining Vitamin D status in this population. When the data were stratified by categories of outdoor exposure time, dietary Vitamin D intake appeared to become a more important determinant of serum 25(OH)D status in participants with lower outdoor exposure. While a trend towards higher Timed Up and Go scores in participants with higher 25(OH)D status was seen, this was significant only for females (p=0.014).
Handgrip strength showed no statistically significant association with serum 25(OH)D status. The high serum 25(OH)D status in our sample almost certainly explains the limited relationship between functional measures and serum 25(OH)D. However, the observation of an association between slower Timed Up and Go speeds and lower serum 25(OH)D levels, even with a small sample size, is notable, as slower Timed Up and Go speeds have been associated with increased fall risk in older adults [3]. Multivariable regression analysis revealed location as the only significant determinant of serum 25(OH)D status (p=0.014), with trends (p > 0.1) towards higher serum 25(OH)D for participants who met the AI for Vitamin D and who rated themselves as having a higher health status. The results of this exploratory study show that 93.6% of participants had adequate 25(OH)D status, possibly because measurements were taken in the summer season and because of the convenience nature of the sample. However, many participants did not meet their dietary Calcium and Vitamin D requirements, which may indicate inadequate intake of these nutrients in older Australians and a higher risk of osteoporosis. The relationship between serum 25(OH)D and functional measures in this population also requires further study, especially in older adults displaying Vitamin D insufficiency or deficiency.
Abstract:
This report reviews the selection, design, and installation of fiber reinforced polymer (FRP) systems for strengthening reinforced concrete or pre-stressed concrete bridges and other structures. The report draws on knowledge gained from worldwide experimental research, analytical work, and field applications of FRP systems used to strengthen concrete structures. Information on material properties and on design and installation methods of FRP systems used as external reinforcement is presented. This information can be used to select an FRP system for increasing the strength and stiffness of reinforced concrete beams, increasing the ductility of columns, and other applications. Based on the available research, design considerations and concepts are covered in this report; in the next stage of the project, these will be further developed as design tools. It is important to note, however, that the design concepts proposed in the literature have in many cases not been thoroughly developed and proven. Therefore, a considerable amount of research will be required before these design concepts can be developed into practical design tools, which is a major goal of the current research project. The durability and long-term performance of FRP materials have been the subject of much research, which is still ongoing. Long-term field data are not currently available, and it is still difficult to accurately predict the service life of FRP strengthening systems. The report also briefly addresses environmental degradation and long-term durability issues. A general overview of using FRP bars as primary reinforcement of concrete structures is presented in Chapter 8. Chapter 9 presents a summary of the strengthening techniques identified in this initial stage of the research project and the issues that require careful consideration prior to their practical implementation.
Abstract:
Worldwide interest is being generated in the use of fibre reinforced polymer composites (FRP) in the rehabilitation of reinforced concrete structures. As a replacement for traditional steel plates or external post-tensioning in strengthening applications, FRP plates, with their high strength-to-weight ratio and good resistance to corrosion, represent an ideal class of material for external retrofitting. Within the last ten years, many design guidelines have been published to provide guidance for the selection, design and installation of FRP systems for external strengthening of concrete structures. Use of these guidelines requires an understanding of a number of issues pertaining to the distinct properties and structural failure modes specific to these materials. A research initiative funded by the CRC for Construction Innovation was undertaken (primarily at RMIT) to develop a decision support tool and a user-friendly guide for the use of fibre reinforced polymer composites in the rehabilitation of concrete structures. The user guidelines presented in this report were developed after industry consultation and a comprehensive review of state-of-the-art technology. The scope of the guide was developed mainly from the outcomes of two workshops with the Queensland Department of Main Roads (QDMR). The document covers material properties, recommended construction requirements, design philosophy, flexural, shear and torsional strengthening of beams, and strengthening of columns. In developing this document, the guidelines published in FIB Bulletin 14 (2002) by Task Group 9.3 of the International Federation for Structural Concrete (FIB) and in the American Concrete Institute Committee 440 report (2002) were consulted, in conjunction with the provisions of the Austroads Bridge Design Code (1992) and the Australian Concrete Structures code AS3600 (2002). Finally, the user guide presents design examples covering typical strengthening scenarios.
Abstract:
Background: Takeaway food consumption has been increasing and may contribute to socioeconomic inequalities in overweight/obesity and chronic disease. This study examined socioeconomic differences in takeaway consumption patterns and their contributions to inequalities in dietary intake. Method: Cross-sectional dietary intake data were obtained from adults aged 25 to 64 years in the Australian National Nutrition Survey (n = 7319; 61% response rate). Twenty-four-hour dietary recalls ascertained intakes of takeaway food, nutrients, and fruit and vegetables. Education was used as the socioeconomic indicator. Data were analysed using logistic regression and general linear models. Results: Thirty-two percent (n = 2327) consumed takeaway foods in the 24-hour period. Lower-educated participants were less likely than their higher-educated counterparts to have consumed takeaway foods overall (OR 0.64; 95% CI 0.52, 0.80). Among those consuming takeaway foods, the lowest-educated group was more likely to have consumed "less healthy" takeaway choices (OR 2.55; 95% CI 1.73, 3.77) and less likely to have consumed "healthy" choices (OR 0.52; 95% CI 0.36, 0.75). Takeaway foods made a greater contribution to energy, total fat, saturated fat, and fibre intakes among lower- than among higher-educated groups. A lower likelihood of fruit and vegetable intake was observed among "less healthy" takeaway consumers, whereas a greater likelihood was found among "healthy" takeaway consumers. Conclusions: The total amount and the types of takeaway foods consumed may contribute to socioeconomic inequalities in intakes of energy and of total and saturated fats. However, takeaway consumption is unlikely to be a factor contributing to the lower fruit and vegetable intakes of socioeconomically disadvantaged groups.