853 results for Concentrate intake


Relevance:

20.00%

Publisher:

Abstract:

The way in which metabolic fuels are utilised can alter the expression of behaviour in the interests of regulating energy balance and fuel availability. This is consistent with the notion that the regulation of appetite is a psychobiological process, in which physiological mediators act as drivers of behaviour. The glycogenostatic theory suggests that glycogen availability is central in eliciting negative feedback signals to restore energy homeostasis. Due to its limited storage capacity, carbohydrate availability is tightly regulated and its restoration is a high metabolic priority following depletion. It has been proposed that such depletion may act as a biological cue to stimulate compensatory energy intake in an effort to restore availability. Due to the increased energy demand, aerobic exercise may act as a biological cue to trigger compensatory eating as a result of perturbations to muscle and liver glycogen stores. However, studies manipulating glycogen availability over short-term periods (1-3 days) using exercise, diet or both have often produced equivocal findings. There is limited but growing evidence to suggest that carbohydrate balance is involved in the short-term regulation of food intake, with a negative carbohydrate balance having been shown to predict greater ad libitum feeding. Furthermore, a negative carbohydrate balance has been shown to be predictive of weight gain. However, further research is needed to support these findings as the current research in this area is limited. In addition, the specific neural or hormonal signal through which carbohydrate availability could regulate energy intake is at present unknown. Identification of this signal or pathway is imperative if a causal relationship is to be established. Without this, the possibility remains that the associations found between carbohydrate balance and food intake are incidental.
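
The carbohydrate-balance argument above rests on a simple accounting identity: balance is carbohydrate intake minus carbohydrate oxidation over the period of interest, and a negative balance means glycogen stores were drawn down. The minimal sketch below illustrates that calculation; the function name and the example figures are illustrative assumptions, not values from any study summarised here.

```python
def carbohydrate_balance(cho_intake_g, cho_oxidation_g):
    """Daily carbohydrate balance (g): intake minus oxidation.

    A negative value indicates that more carbohydrate was oxidised than was
    eaten, i.e. glycogen stores were drawn down over the day.
    """
    return cho_intake_g - cho_oxidation_g


# Illustrative values only: an exercise day on which oxidation exceeds intake.
intake_g = 300.0      # carbohydrate eaten (g/day), e.g. from a diet record
oxidation_g = 380.0   # carbohydrate oxidised (g/day), e.g. from indirect calorimetry
print(f"Carbohydrate balance: {carbohydrate_balance(intake_g, oxidation_g):+.0f} g/day")
```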

Relevance:

20.00%

Publisher:

Abstract:

Cutaneous cholecalciferol synthesis has not been considered in making recommendations for vitamin D intake. Our objective was to model the effects of sun exposure, vitamin D intake, and skin reflectance (pigmentation) on serum 25-hydroxyvitamin D (25[OH]D) in young adults with a wide range of skin reflectance and sun exposure. Four cohorts of participants (n = 72 total) were studied for 7-8 wk in the fall, winter, spring, and summer in Davis, CA [38.5° N, 121.7° W, Elev. 49 ft (15 m)]. Skin reflectance was measured using a spectrophotometer, vitamin D intake using food records, and sun exposure using polysulfone dosimeter badges. A multiple regression model (R² = 0.55; P < 0.0001) was developed and used to predict the serum 25(OH)D concentration for participants with low [median for African ancestry (AA)] and high [median for European ancestry (EA)] skin reflectance and with low [20th percentile, ~20 min/d, ~18% body surface area (BSA) exposed] and high (80th percentile, ~90 min/d, ~35% BSA exposed) sun exposure, assuming an intake of 200 IU/d (5 µg/d). Predicted serum 25(OH)D concentrations for AA individuals with low and high sun exposure in the winter were 24 and 42 nmol/L and in the summer were 40 and 60 nmol/L. Corresponding values for EA individuals were 35 and 60 nmol/L in the winter and in the summer were 58 and 85 nmol/L. To achieve 25(OH)D ≥75 nmol/L, we estimate that EA individuals with high sun exposure need 1300 IU/d vitamin D intake in the winter and AA individuals with low sun exposure need 2100-3100 IU/d year-round.
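
The abstract reports a multiple regression model (R² = 0.55) but not its coefficients. Purely to illustrate how such a fitted model would be used to generate predicted 25(OH)D values like those quoted above, the sketch below shows a linear prediction with placeholder coefficients and inputs; every number in it is a hypothetical assumption, not a value from the study.

```python
# Sketch of a linear prediction of serum 25(OH)D from sun exposure, skin
# reflectance and vitamin D intake. Coefficients below are placeholders only;
# the abstract reports the model fit (R^2 = 0.55) but not the coefficients.

def predict_25ohd(intercept, b_sun, b_reflectance, b_intake,
                  sun_exposure, skin_reflectance, intake_iu):
    """25(OH)D (nmol/L) = b0 + b1*sun + b2*reflectance + b3*intake."""
    return (intercept
            + b_sun * sun_exposure
            + b_reflectance * skin_reflectance
            + b_intake * intake_iu)


# Hypothetical coefficients and inputs, purely to show the mechanics.
example = predict_25ohd(intercept=10.0, b_sun=0.2, b_reflectance=0.5, b_intake=0.01,
                        sun_exposure=90,       # sun exposure (min/d)
                        skin_reflectance=60,   # spectrophotometer reflectance (%)
                        intake_iu=200)         # vitamin D intake (IU/d)
print(f"Predicted 25(OH)D: {example:.0f} nmol/L")
```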

Relevance:

20.00%

Publisher:

Abstract:

The purpose was to determine intake of phytoestrogens in a sample of older Australian women, and to investigate associated lifestyle factors. Subjects were an age-stratified sample of 511 women aged 40-80 y, randomly selected from the electoral roll and participating in the Longitudinal Assessment of Ageing in Women at the Royal Brisbane and Women’s Hospital. A cross-sectional study was conducted to assess isoflavone and lignan intake over the past month from food and supplements using a 112-item phytoestrogen frequency questionnaire. Data were also collected on nutrient intakes, physical activity, smoking, alcohol, non-prescription supplements, hormone therapy, education and occupation. Logistic regression was used to evaluate associations between demographic and lifestyle variables and soy/linseed consumption while controlling for age. Isoflavone intakes were significantly higher in the younger compared to older age groups (p<0.001); there were no age-related differences in lignan intake. Forty-five percent of women consumed at least one serve of a soy and/or linseed item and were defined as soy/linseed consumers. Median (range) intakes by consumers for isoflavones and lignans (3.9 (0-172) mg/d and 2.4 (0.1-33) mg/d) were higher than intakes by non-consumers (0.004 (0-2.6) mg/d and 1.57 (0.44-4.7) mg/d), respectively (p<0.001). Consumers had higher intakes of dietary fibre (p=0.003), energy (p=0.04) and polyunsaturated fat (p=0.004), and higher levels of physical activity (p=0.006), socio-economic position (p<0.001), education (p<0.001) and supplement use (p<0.001). Women who consumed soy or linseed foods differed in lifestyle and demographic characteristics, suggesting these factors should be considered when investigating associations with chronic disease outcomes.
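
The age-adjusted association analysis described above (logistic regression of soy/linseed consumer status on demographic and lifestyle variables, controlling for age) follows a standard pattern, sketched below. The data frame and column names are assumptions for illustration; the study's actual variable coding is not given in the abstract.

```python
# Sketch of an age-adjusted logistic regression of consumer status (0/1) on a
# single lifestyle variable. Data frame and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf


def age_adjusted_association(df: pd.DataFrame, exposure: str):
    """Fit consumer ~ exposure + age and return the exposure coefficient and p-value."""
    model = smf.logit(f"consumer ~ {exposure} + age", data=df).fit(disp=False)
    return model.params[exposure], model.pvalues[exposure]


# Example usage with a hypothetical data frame:
# df = pd.DataFrame({"consumer": ..., "age": ..., "physical_activity": ...})
# coef, p = age_adjusted_association(df, "physical_activity")
```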

Relevance:

20.00%

Publisher:

Abstract:

Oral intake of ascorbic acid is essential for optimum health in human beings. Continuous ambulatory peritoneal dialysis (CAPD) patients have an increased need for ascorbic acid, because of increased loss through dialysate, reduced intake owing to nausea and loss of appetite, and increased oxidative stress. However, optimum intake is still controversial. We studied 50 clinically stable patients to determine the relationship between oral ascorbic acid intake and serum ascorbic acid (SAA) level. Total oral intake ranged from 28 mg daily to 412 mg daily. Only one patient had an oral intake of ascorbic acid below 60 mg per day. The SAA levels ranged from 1 mg/L to 36.17 mg/L. Although a strong correlation existed between intake and SAA (p < 0.001, R² = 0.47), the variation in SAA at any given intake level was wide. Of the studied patients, 62% had an SAA < 8.7 mg/L, 40% had an SAA < 5.1 mg/L (below the level in a healthy population), and 12% had a level below 2 mg/L (scorbutic). None of the patients demonstrated clinical manifestations of scurvy. Our results show that, in CAPD patients, ascorbic acid deficiency can be reliably detected only with SAA measurements, and oral intake may influence SAA level. To maintain ascorbic acid in the normal range for healthy adults, daily oral intake needs to be increased above the U.S. recommended dietary allowance to 80-140 mg.
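
Two calculations are implicit in the findings above: a simple regression of serum ascorbic acid (SAA) on oral intake, and classification of SAA against the thresholds quoted (8.7, 5.1 and 2 mg/L). A minimal sketch of both is given below; the arrays are assumed to be supplied by the user, and no patient data are reproduced.

```python
# Sketch of the intake-SAA regression and the threshold classification used above.
import numpy as np
from scipy import stats


def intake_saa_regression(intake_mg_d, saa_mg_l):
    """Slope, intercept and R^2 for SAA (mg/L) regressed on oral intake (mg/d)."""
    result = stats.linregress(intake_mg_d, saa_mg_l)
    return result.slope, result.intercept, result.rvalue ** 2


def classify_saa(saa_mg_l):
    """Label each SAA value against the cut-offs cited in the abstract."""
    saa = np.asarray(saa_mg_l, dtype=float)
    return np.where(saa < 2.0, "scorbutic",
           np.where(saa < 5.1, "below healthy-population level",
           np.where(saa < 8.7, "low", "adequate")))
```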

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to explore the process and analyse the implementation of constructability improvement, and the resulting innovation, during the planning and design of a sea water intake structure for a fertilizer plant project.
Design/methodology/approach – The research approach is a case study at project level. The constructability improvement process was investigated using constructability implementation checklists, direct observation, analysis of documented lessons learned and interviews with key personnel.
Findings – The case study shows that implementing constructability during the planning and design stage of this sea water intake structure increased project performance, improving the schedule by 5 months (14.21%) and reducing the project cost by 15.35%.
Research limitations/implications – The case study was limited to three previous sea water intake projects used as references and one new-method sea water intake structure at a fertilizer plant project.
Practical implications – A constructability improvement checklist, drawing on theory and lessons learned for this specific construction project, was documented.
Originality/value – The findings support the existing literature on constructability and provide specific lessons learned, documented by the company, for three previous projects and one construction project using the new innovative method.

Relevance:

20.00%

Publisher:

Abstract:

Background Alcoholism imposes a tremendous social and economic burden. There are relatively few pharmacological treatments for alcoholism, with only moderate efficacy, and there is considerable interest in identifying additional therapeutic options. Alcohol exposure alters SK-type potassium channel (SK) function in limbic brain regions. Thus, positive SK modulators such as chlorzoxazone (CZX), a US Food and Drug Administration–approved centrally acting myorelaxant, might enhance SK function and decrease neuronal activity, resulting in reduced alcohol intake. Methods We examined whether CZX reduced alcohol consumption under two-bottle choice (20% alcohol and water) in rats with intermittent access to alcohol (IAA) or continuous access to alcohol (CAA). In addition, we used ex vivo electrophysiology to determine whether SK inhibition and activation can alter firing of nucleus accumbens (NAcb) core medium spiny neurons. Results Chlorzoxazone significantly and dose-dependently decreased alcohol but not water intake in IAA rats, with no effects in CAA rats. Chlorzoxazone also reduced alcohol preference in IAA but not CAA rats and reduced the tendency for rapid initial alcohol consumption in IAA rats. Chlorzoxazone reduction of IAA drinking was not explained by locomotor effects. Finally, NAcb core neurons ex vivo showed enhanced firing, reduced SK regulation of firing, and greater CZX inhibition of firing in IAA versus CAA rats. Conclusions The potent CZX-induced reduction of excessive IAA alcohol intake, with no effect on the more moderate intake in CAA rats, might reflect the greater CZX reduction in IAA NAcb core firing observed ex vivo. Thus, CZX could represent a novel and immediately accessible pharmacotherapeutic intervention for human alcoholism. Key Words: Alcohol intake; intermittent; neuro-adaptation; nucleus accumbens; SK potassium channel

Relevance:

20.00%

Publisher:

Abstract:

Lower fruit and vegetable intake among socioeconomically disadvantaged groups has been well documented, and may be a consequence of a higher consumption of take-out foods. This study examined whether, and to what extent, take-out food consumption mediated (explained) the association between socioeconomic position and fruit and vegetable intake. A cross-sectional postal survey was conducted among 1500 randomly selected adults aged 25–64 years in Brisbane, Australia in 2009 (response rate = 63.7%, N = 903). A food frequency questionnaire assessed usual daily servings of fruits and vegetables (0 to 6), overall take-out consumption (times/week) and the consumption of 22 specific take-out items (never to ≥once/day). These specific take-out items were grouped into “less healthy” and “healthy” choices and indices were created for each type of choice (0 to 100). Socioeconomic position was ascertained by education. The analyses were performed using linear regression, and a bootstrap re-sampling approach estimated the statistical significance of the mediated effects. Mean daily serves of fruits and vegetables were 1.89 (SD 1.05) and 2.47 (SD 1.12), respectively. The least educated group consumed fewer serves of fruit (B = –0.39, p<0.001) and vegetables (B = –0.43, p<0.001) compared with the highest educated group. The consumption of “less healthy” take-out food partly explained (mediated) education differences in fruit and vegetable intake; however, no mediating effects were observed for overall and “healthy” take-out consumption. Regular consumption of “less healthy” take-out items may contribute to socioeconomic differences in fruit and vegetable intake, possibly by displacing these foods.
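
The mediation analysis described above combines linear regression with bootstrap re-sampling of the mediated (indirect) effect. A minimal product-of-coefficients sketch is shown below, with education as the exposure, a "less healthy" take-out index as the mediator, and fruit and vegetable serves as the outcome; the column names and the data frame are assumptions, and the published analysis may have used different software and covariates.

```python
# Sketch of a bootstrap test of a mediated (indirect) effect:
# X = education, M = "less healthy" take-out index, Y = fruit/vegetable serves.
# Column names are assumed for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf


def bootstrap_indirect_effect(df: pd.DataFrame, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_boot):
        sample = df.sample(len(df), replace=True,
                           random_state=int(rng.integers(1 << 31)))
        a = smf.ols("takeout ~ education", data=sample).fit().params["education"]
        b = smf.ols("fruit_veg ~ takeout + education", data=sample).fit().params["takeout"]
        estimates.append(a * b)  # product-of-coefficients indirect effect
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return float(np.mean(estimates)), (lo, hi)  # point estimate and 95% bootstrap CI
```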

Relevance:

20.00%

Publisher:

Abstract:

In humans the presence of negative affect is thought to promote food intake, but there is widespread variability. Susceptibility to negative affect-induced eating may depend on trait eating behaviours, notably ‘emotional eating’, ‘restrained eating’ and ‘disinhibited eating’, but the evidence is not consistent. In the present study, 30 non-obese, non-dieting women were given access to palatable food whilst in a state of negative or neutral affect, induced by a validated autobiographical recall technique. As predicted, food intake was higher in the presence of negative affect; however, this effect was moderated by the pattern of eating behaviour traits and enhanced wanting for the test food. Specifically, the High Restraint-High Disinhibition subtype in combination with higher scores on emotional eating and food wanting was able to predict negative-affect intake (adjusted R² = .61). In the absence of stress, individuals who are both restrained and vulnerable to disinhibited eating are particularly susceptible to negative affect-induced food intake via stimulation of food wanting. Identification of traits that predispose individuals to overconsume and a more detailed understanding of the specific behaviours driving such overconsumption may help to optimise strategies to prevent weight gain.

Relevance:

20.00%

Publisher:

Abstract:

The idea of body weight regulation implies that a biological mechanism exerts control over energy expenditure and food intake. This is a central tenet of energy homeostasis. However, the source and identity of the controlling mechanism have not been identified, although it is often presumed to be some long-acting signal related to body fat, such as leptin. Using a comprehensive experimental platform, we have investigated the relationship between biological and behavioural variables in two separate studies over a 12-week intervention period in obese adults (total n 92). All variables have been measured objectively and with a similar degree of scientific control and precision, including anthropometric factors, body composition, RMR and accumulative energy consumed at individual meals across the whole day. Results showed that meal size and daily energy intake (EI) were significantly correlated with fat-free mass (FFM, P values 0·02–0·05) but not with fat mass (FM) or BMI (P values 0·11–0·45) (study 1, n 58). In study 2 (n 34), FFM (but not FM or BMI) predicted meal size and daily EI under two distinct dietary conditions (high-fat and low-fat). These data appear to indicate that, under these circumstances, some signal associated with lean mass (but not FM) exerts a determining effect over self-selected food consumption. This signal may be postulated to interact with a separate class of signals generated by FM. This finding may have implications for investigations of the molecular control of food intake and body weight and for the management of obesity.
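
The key comparison above is correlational: daily energy intake against fat-free mass, fat mass and BMI. The brief sketch below computes those Pearson correlations; the arrays are assumed to be supplied by the user and no study data are reproduced.

```python
# Sketch of the correlational comparison described above.
from scipy import stats


def intake_body_composition_correlations(energy_intake, ffm, fm, bmi):
    """Return {predictor: (Pearson r, p-value)} for daily energy intake."""
    return {
        "FFM": stats.pearsonr(energy_intake, ffm),
        "FM": stats.pearsonr(energy_intake, fm),
        "BMI": stats.pearsonr(energy_intake, bmi),
    }
```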

Relevance:

20.00%

Publisher:

Abstract:

Precise protein quantification is essential in clinical dietetics, particularly in the management of renal, burn and malnourished patients. The EP-10 was developed to expedite the estimation of dietary protein for nutritional assessment and recommendation. The main objective of this study was to compare the validity and efficacy of the EP-10 with the American Dietetic Association’s “Exchange List for Meal Planning” (ADA-7g) in quantifying dietary protein intake, against computerised nutrient analysis (CNA). Protein intake from 197 food records kept by healthy adult subjects in Singapore was determined three times using three different methods – (1) EP-10, (2) ADA-7g and (3) CNA using the SERVE program (Version 4.0). Assessments using the EP-10 and ADA-7g were performed by two assessors in a blind crossover manner while a third assessor performed the CNA. All assessors were blind to each other’s results. Time taken to assess a subsample (n=165) using the EP-10 and ADA-7g was also recorded. Mean difference in protein intake quantification when compared to the CNA was statistically non-significant for the EP-10 (1.4 ± 16.3 g, P = .239) and statistically significant for the ADA-7g (-2.2 ± 15.6 g, P = .046). Both the EP-10 and ADA-7g had clinically acceptable agreement with the CNA as determined via Bland-Altman plots, although it was found that the EP-10 had a tendency to overestimate protein intakes above 150 g. The EP-10 required significantly less time for protein intake quantification than the ADA-7g (mean time of 65 ± 36 seconds vs. 111 ± 40 seconds, P < .001). The EP-10 and ADA-7g are valid clinical tools for protein intake quantification in an Asian context, with the EP-10 being more time efficient. However, a dietician’s discretion is needed when the EP-10 is used on protein intakes above 150 g.
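
Agreement in the study above was judged with Bland-Altman plots: the mean difference between a quick method (EP-10 or ADA-7g) and the CNA reference, with 95% limits of agreement. A minimal sketch of that calculation follows; the paired arrays of protein estimates are assumed to be supplied by the user.

```python
# Sketch of a Bland-Altman agreement calculation between a quick protein
# estimation method and computerised nutrient analysis (CNA).
import numpy as np


def bland_altman(method_estimates_g, reference_estimates_g):
    """Return mean bias and 95% limits of agreement for paired protein estimates (g)."""
    method = np.asarray(method_estimates_g, dtype=float)
    reference = np.asarray(reference_estimates_g, dtype=float)
    diff = method - reference          # per-record difference (g protein)
    bias = diff.mean()                 # mean difference (bias)
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)
```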

Relevance:

20.00%

Publisher:

Abstract:

Nutrition interventions in the form of both self-management education and individualised diet therapy are considered essential for the long-term management of type 2 diabetes mellitus (T2DM). The measurement of diet is essential to inform, support and evaluate nutrition interventions in the management of T2DM. Barriers inherent within health care settings and systems limit ongoing access to personnel and resources, while traditional prospective methods of assessing diet are burdensome for the individual and often result in changes in typical intake to facilitate recording. This thesis investigated the inclusion of information and communication technologies (ICT) to overcome limitations to current approaches in the nutritional management of T2DM, in particular the development, trial and evaluation of the Nutricam dietary assessment method (NuDAM), consisting of a mobile phone photo/voice application to assess nutrient intake in a free-living environment with older adults with T2DM.

Study 1: Effectiveness of an automated telephone system in promoting change in dietary intake among adults with T2DM

The effectiveness of an automated telephone system, Telephone-Linked Care (TLC) Diabetes, designed to deliver self-management education, was evaluated in terms of promoting dietary change in adults with T2DM and sub-optimal glycaemic control. In this secondary data analysis, independent of the larger randomised controlled trial, complete data were available for 95 adults (59 male; mean age (±SD) = 56.8±8.1 years; mean (±SD) BMI = 34.2±7.0 kg/m2). The treatment effect showed a reduction in total fat of 1.4% and saturated fat of 0.9% of energy intake, body weight of 0.7 kg and waist circumference of 2.0 cm. In addition, a significant increase in the nutrition self-efficacy score of 1.3 (p<0.05) was observed in the TLC group compared to the control group. The modest trends observed in this study indicate that the TLC Diabetes system does support the adoption of positive nutrition behaviours as a result of diabetes self-management education; however, caution must be applied in the interpretation of results due to the inherent limitations of the dietary assessment method used. The decision to use a closed-list FFQ with known bias may have influenced the accuracy of reporting dietary intake in this instance. This study provided an example of the methodological challenges experienced with measuring changes in absolute diet using an FFQ, and reaffirmed the need for novel prospective assessment methods capable of capturing natural variance in usual intakes.

Study 2: The development and trial of the NuDAM recording protocol

The feasibility of the Nutricam mobile phone photo/voice dietary record was evaluated in 10 adults with T2DM (6 male; age = 64.7±3.8 years; BMI = 33.9±7.0 kg/m2). Intake was recorded over a 3-day period using both Nutricam and a written estimated food record (EFR). Compared to the EFR, the Nutricam device was found to be acceptable among subjects; however, energy intake was under-recorded using Nutricam (-0.6±0.8 MJ/day; p<0.05). Beverages and snacks were the items most frequently not recorded using Nutricam; however, forgotten meals contributed to the greatest difference in energy intake between records. In addition, the quality of dietary data recorded using Nutricam was unacceptable for just under one-third of entries. It was concluded that an additional mechanism was necessary to complement dietary information collected via Nutricam.
Modifications to the method were made to allow for clarification of Nutricam entries and probing for forgotten foods during a brief phone call to the subject the following morning. The revised recording protocol was evaluated in Study 4.

Study 3: The development and trial of the NuDAM analysis protocol

Part A explored the effect of the type of portion size estimation aid (PSEA) on the error associated with quantifying four portions of 15 single food items contained in photographs. Seventeen dietetic students (1 male; age = 24.7±9.1 years; BMI = 21.1±1.9 kg/m2) estimated all food portions on two occasions: without aids and with aids (food models or reference food photographs). Overall, the use of a PSEA significantly reduced the mean (±SD) group error between estimates compared to no aid (-2.5±11.5% vs. 19.0±28.8%; p<0.05). The type of PSEA (i.e. food models vs. reference food photographs) did not have a notable effect on the group estimation error (-6.7±14.9% vs. 1.4±5.9%, respectively; p=0.321). This exploratory study provided evidence that the use of aids in general, rather than their type, was more effective in reducing estimation error. The findings guided the development of the Dietary Estimation and Assessment Tool (DEAT) for use in the analysis of the Nutricam dietary record.

Part B evaluated the effect of the DEAT on the error associated with the quantification of two 3-day Nutricam dietary records in a sample of 29 dietetic students (2 males; age = 23.3±5.1 years; BMI = 20.6±1.9 kg/m2). Subjects were randomised into two groups: Group A and Group B. For Record 1, the use of the DEAT (Group A) resulted in a smaller error compared to estimations made without the tool (Group B) (17.7±15.8%/day vs. 34.0±22.6%/day, respectively; p=0.331). In comparison, all subjects used the DEAT to estimate Record 2, with the resultant error similar between Groups A and B (21.2±19.2%/day vs. 25.8±13.6%/day, respectively; p=0.377). In general, the moderate estimation error associated with quantifying food items did not translate into clinically significant differences in the nutrient profile of the Nutricam dietary records; only amorphous foods were notably over-estimated in energy content without the use of the DEAT (57 kJ/day vs. 274 kJ/day; p<0.001). A large proportion (89.6%) of the group found the DEAT helpful when quantifying food items contained in the Nutricam dietary records. The use of the DEAT reduced quantification error, minimising any potential effect on the estimation of energy and macronutrient intake.

Study 4: Evaluation of the NuDAM

The accuracy and inter-rater reliability of the NuDAM to assess energy and macronutrient intake were evaluated in a sample of 10 adults (6 males; age = 61.2±6.9 years; BMI = 31.0±4.5 kg/m2). Intake recorded using both the NuDAM and a weighed food record (WFR) was coded by three dietitians and compared with an objective measure of total energy expenditure (TEE) obtained using the doubly labelled water technique. At the group level, energy intake (EI) was under-reported to a similar extent using both methods, with an EI:TEE ratio of 0.76±0.20 for the NuDAM and 0.76±0.17 for the WFR. At the individual level, four subjects reported implausible levels of energy intake using the WFR method, compared to three using the NuDAM. Overall, moderate to high correlation coefficients (r=0.57-0.85) were found between the two dietary measures across energy and macronutrients, except fat (r=0.24).
High agreement was observed between dietitians for estimates of energy and macronutrient intake derived from both the NuDAM (ICC=0.77-0.99; p<0.001) and the WFR (ICC=0.82-0.99; p<0.001). All subjects preferred using the NuDAM over the WFR to record intake and were willing to use the novel method again over longer recording periods.

This research program explored two novel approaches which utilised distinct technologies to aid the nutritional management of adults with T2DM. In particular, this thesis makes a significant contribution to the evidence base surrounding the use of PhRs through the development, trial and evaluation of a novel mobile phone photo/voice dietary record. The NuDAM is an extremely promising advancement in the nutritional management of individuals with diabetes and other chronic conditions. Future applications lie in integrating the NuDAM with other technologies to facilitate practice across the remaining stages of the nutrition care process.
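
The plausibility check described in Study 4 compares reported energy intake (EI) with total energy expenditure (TEE) from doubly labelled water as the ratio EI:TEE. The sketch below computes that ratio for each subject and flags implausibly low reporters; the 0.70 cut-off used here is an illustrative assumption, as the abstract does not state the criterion applied.

```python
# Sketch of the EI:TEE plausibility check described above. The cut-off for
# flagging implausible reporting is an illustrative assumption only.
import numpy as np


def ei_tee_ratios(energy_intake_mj, tee_mj, implausible_below=0.70):
    """Return each subject's EI:TEE ratio and a flag for implausibly low reporting."""
    ei = np.asarray(energy_intake_mj, dtype=float)
    tee = np.asarray(tee_mj, dtype=float)
    ratio = ei / tee
    return ratio, ratio < implausible_below
```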