62 results for Fluid intake
Abstract:
Background/Objectives: Reduced food intake, appetite loss and altered ghrelin and PYY(3-36) secretion have been suggested to play a role in the loss of body weight commonly observed after gastrectomy. The objective of this study was to investigate the circulating concentrations of ghrelin and PYY(3-36) and their relationships with food intake, appetite and resting energy expenditure (REE) after gastrectomy plus vagotomy. Subjects/Methods: Seven patients with total gastrectomy (TG), 14 with partial gastrectomy (PG) and 10 healthy controls were studied. Habitual food intake and REE were assessed; fasting and postprandial plasma total ghrelin and PYY(3-36) concentrations and appetite ratings were determined after ingestion of a liquid test meal. Results: In contrast to the PG and control groups, fasting ghrelin correlated with REE and a higher energy intake was observed in the TG group. Fasting plasma ghrelin concentrations were lower in TG than in controls, and no ghrelin response to the meal was observed in either PG or TG. Fasting plasma PYY(3-36) concentrations did not differ among the groups. There was an early and exaggerated postprandial rise in PYY(3-36) levels in both the PG and TG groups, but not in controls. No effect of ghrelin or PYY(3-36) concentrations was observed on hunger, prospective consumption or fullness ratings. Conclusions: Total ghrelin and PYY(3-36) do not seem to be involved in appetite or energy intake regulation after gastrectomy plus vagotomy. Ghrelin secreted by sources other than the stomach is likely to play a role in the long-term regulation of body weight after TG. European Journal of Clinical Nutrition (2010) 64, 845-852; doi: 10.1038/ejcn.2010.88; published online 19 May 2010
Abstract:
Adult rats submitted to perinatal salt overload presented functional disturbances of the renin-angiotensin system (RAS). The RAS contributes to renal development and to renal damage in the 5/6 nephrectomy model. The aim of the present study was to analyze the renal structure and function of offspring from dams that received a high-salt intake during pregnancy and lactation. We also evaluated the influence of the perinatal high-salt intake on the evolution of 5/6 nephrectomy in adult rats. A total of 111 sixty-day-old rat pups from dams that received saline or water during pregnancy and lactation were submitted to 5/6 nephrectomy (nephrectomized) or to a sham operation (sham). The animals were killed 120 days after surgery, and the kidneys were removed for immunohistochemical and histological analysis. Systolic blood pressure (SBP), albuminuria, and glomerular filtration rate (GFR) were evaluated. Increased SBP and albuminuria and decreased GFR were observed before surgery in the rats from dams submitted to high-sodium intake. However, there was no difference in these parameters between the groups after the 5/6 nephrectomy. The scores for tubulointerstitial lesions and glomerulosclerosis were higher in the rats from the sham saline group than in age-matched control rats, but there was no difference in the histological findings between the groups of nephrectomized rats. In conclusion, our data showed that high-salt intake during pregnancy and lactation in rats leads to structural changes in the kidney of adult offspring. However, the progression of the renal lesions after 5/6 nephrectomy was similar in both groups.
Abstract:
Salt iodination and excessive iodine intake among schoolchildren. The objective of the present study was to evaluate the urinary excretion of iodine and relate it to the amount present in salt for human consumption. The study involved 145 children from two schools: one rural and one urban. We performed anthropometric measurements and collected a urine sample and a kitchen salt sample from each child. In the rural school, 3.38% of the children had iodine deficiency. However, most urinary iodine values were above 300 µg/L (62.03%), and 59.49% of the kitchen salt samples contained 20 to 60 mg of iodine per kilogram of salt. In the urban school, 3.03% of the children had urinary iodine excretion below 100 µg/L and 90.91% had urinary iodine values exceeding 300 µg/L; 84.85% of the kitchen salt samples contained 20 to 60 mg of iodine per kilogram of salt. Iodine deficiency is controlled in this population, but there is a high prevalence of excess urinary iodine.
Abstract:
There has been no comparison of fluoride (F) intake by pre-school children receiving the more traditional sources of systemic F. The aim of this study was to estimate the dietary F intake by children receiving F from artificially fluoridated water (AFW-Brazil, 0.6-0.8 mg F/L), naturally fluoridated water (NFW-Brazil, 0.6-0.9 mg F/L), fluoridated salt (FS-Peru, 180-200 mg F/kg), and fluoridated milk (FM-Peru, 0.25 mg F). Children (n = 21-26) aged 4-6 years participated in each community. A non-fluoridated community (NoF) was evaluated as the control population. Dietary F intake was monitored by the "duplicate plate" method, with different constituents (water, other beverages, and solids). F was analyzed with an ion-selective electrode. Data were tested by Kruskal-Wallis and Dunn's tests (p < 0.05). Mean (+/- SD) F intake (mg/kg b.w./day) was 0.04 +/- 0.01(b), 0.06 +/- 0.02(a,b), 0.05 +/- 0.02(a,b), 0.06 +/- 0.01(a), and 0.01 +/- 0.00(c) for AFW/NFW/FS/FM/NoF, respectively. The main dietary contributors were water for AFW and NFW, and solids for FS, FM and NoF. The results indicate that dietary F intake must be considered before a systemic method of fluoridation is implemented.
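For orientation only: the per-body-weight figures above correspond to the total fluoride recovered from the duplicate-plate fractions divided by the child's body weight. A minimal worked form of that calculation is sketched below; the symbols are generic labels, not taken from the paper.

```latex
% Daily dietary fluoride dose normalized to body weight (illustrative form):
% C = F concentration and V (or m) = amount of each duplicate-plate fraction, BW = body weight (kg).
\mathrm{F\;intake}\;\left(\mathrm{mg\,kg^{-1}\,day^{-1}}\right)
  = \frac{C_{\mathrm{water}}\,V_{\mathrm{water}}
        + C_{\mathrm{beverages}}\,V_{\mathrm{beverages}}
        + C_{\mathrm{solids}}\,m_{\mathrm{solids}}}{BW}
```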
Abstract:
This study evaluated the kinetics of fluoride in plasma, on the femur surface and in the whole femur of rats after chronic exposure to different water fluoride levels was interrupted. Four groups of Wistar rats received drinking water containing 0, 5, 15 or 50 µg F/ml for 60 days (n = 50/group). The animals were euthanized immediately after exposure to fluoride or after 7, 30, 90 or 180 days (n = 10/subgroup). Plasma and femurs were collected. Fluoride on the femur surface, in the whole femur and in plasma was analyzed with an electrode. Data were analyzed using ANOVA and Tukey's test (p < 0.05). The increase in plasma fluoride levels was significant only for the 50 µg F/ml group at 0 and 7 days. Regarding bone surface and whole bone, for most groups, significant increases in fluoride concentrations were observed with the increase in water fluoride concentrations at each time of euthanasia. For fluoride doses up to 15 µg F/ml, femur surface fluoride levels were reestablished 180 days after the exposure was discontinued, which was not the case for the whole femur or for higher fluoride doses. We found different fluoride kinetics in plasma, on the femur surface and in the whole femur of rats after chronic exposure to fluoride was interrupted. Copyright 2008 Prous Science, S.A.U. or its licensors. All rights reserved.
Abstract:
The study investigated whether chronic ethanol (ETH) intake and subsequent ETH exposure of cell cultures affect osteoblast differentiation, by evaluating key parameters of in vitro osteogenesis. Rats were treated with 5-20% (0.85-3.43 mM) ETH, increasing by 5% per week for a period of 4 weeks (habituation), after which the 20% level was maintained for 15 days (chronic intake). Bone-marrow stem cells from control (CONT) or ETH-treated rats were cultured in osteogenic medium that was either supplemented (ETH) or not supplemented (CONT) with 1.3 mM ethanol. Thus, four groups relating to rat treatment/culture supplementation were evaluated: (1) CONT/CONT, (2) ETH/CONT, (3) CONT/ETH and (4) ETH/ETH. Cell morphology, proliferation and viability, total protein content, alkaline phosphatase (ALP) activity and bone-like nodule formation were evaluated. Chronic ethanol intake significantly reduced both food and liquid consumption and body weight gain. No difference was seen in cell morphology among treatments. Cell number was affected at 7 and 10 days as follows: CONT/CONT = CONT/ETH < ETH/CONT = ETH/ETH. Doubling time between 3 and 10 days was greater in groups of CONT animals: ETH/ETH = ETH/CONT < CONT/ETH = CONT/CONT. Cell viability and ALP activity were not affected by either animal treatment or culture exposure to ethanol. At day 21, the total protein content was affected as follows: ETH/ETH = CONT/ETH < ETH/CONT = CONT/CONT. Bone-like nodule formation was affected as follows: ETH/ETH < CONT/ETH < ETH/CONT < CONT/CONT. These results show that chronic ethanol intake, followed by the exposure of osteoblasts to ethanol, inhibited the differentiation of osteoblasts, as indicated by an increased proliferation rate and reduced bone-like nodule formation. Copyright (C) 2007 John Wiley & Sons, Ltd.
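As a point of reference only (not from the paper): a doubling time such as the one cited for the 3-to-10-day interval is conventionally obtained from cell counts at the two time points, as sketched below.

```latex
% Population doubling time between counts N_1 at day t_1 and N_2 at day t_2:
DT = \frac{(t_2 - t_1)\,\ln 2}{\ln\left(N_2 / N_1\right)}
```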
Abstract:
Objective: To estimate the prevalence of inadequate nutrient intake among adolescents and its association with socio-economic variables and nutritional status. Design: Cross-sectional study with a population-based sample. Setting: The usual nutrient intake distribution was estimated using the Iowa State University method. The Estimated Average Requirement cut-off point method was used to determine the proportion of adolescents with inadequate intake of each nutrient, according to sex, income, parental educational level and nutritional status. Subjects: Twenty-four-hour dietary recalls were applied to 525 male and female Brazilian adolescents aged 14-18 years. Results: The highest prevalence of inadequate nutrient intake was observed for vitamin E (99% in both sexes). For male and female adolescents, respectively, the prevalence of inadequate intake was: Mg, 89% and 84%; vitamin A, 78% and 71%; vitamin C, 79% and 53%; and vitamin B6, 21% and 33%. The prevalence of inadequate intake of niacin, thiamin, riboflavin, Se, Cu and vitamin B12 was <15%. Individuals in the lower income and lower parental educational level strata had the highest risk of inadequate intake of P, riboflavin and vitamins A, B6 and B12. Compared with non-overweight individuals, overweight individuals had a higher risk of inadequate intake of Mg, vitamin A, P, thiamin and riboflavin. Conclusions: The present study found a high prevalence of inadequate intake of nutrients that are recognised as being protective against chronic diseases. Adolescents in the lower income and lower parental educational level strata were less likely to have their nutrient intake requirements met.
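As a rough illustration of the EAR cut-point method named above: once the usual-intake distribution has been estimated, the prevalence of inadequacy is simply the share of the group whose usual intake falls below the nutrient's Estimated Average Requirement. The sketch below uses made-up numbers; the intakes and EAR are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical usual intakes of a nutrient (mg/day), i.e. after the within-person
# day-to-day variance has been removed (e.g. with the Iowa State University method).
usual_intake = np.array([28.0, 45.0, 60.0, 15.0, 90.0, 38.0, 52.0, 70.0])

ear = 63.0  # hypothetical Estimated Average Requirement for this age/sex group (mg/day)

# EAR cut-point method: prevalence of inadequacy = proportion of usual intakes below the EAR.
prevalence_inadequate = np.mean(usual_intake < ear)
print(f"Estimated prevalence of inadequate intake: {prevalence_inadequate:.0%}")
```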
Abstract:
Objective: We evaluated the relation between overweight and calcium intake in adults living in the municipality of Sao Paulo, Brazil. Methods: This was a cross-sectional population-based study on a sample of 1459 adults obtained by multistage cluster sampling. Dietary intake was measured by the 24-h recall method. Poisson and linear regression analyses were performed to evaluate the relation between overweight and quartiles of calcium intake adjusted for energy. Results: The prevalence of overweight was 43.1% and the average adjusted calcium intake was 448.6 mg. In the linear regression analyses, the regression coefficient for adjusted calcium was significant and negative (P = 0.019, beta 1 = -0.0001). When evaluated by quartiles, the prevalence ratio for overweight in the first quartile of calcium intake was 1.24 (95% confidence interval 1.00-1.54) and that in the second quartile was 1.24 (95% confidence interval 1.03-1.49). Conclusion: In the present study, calcium intake showed a significant negative association with body mass index. (C) 2008 Elsevier Inc. All rights reserved.
Abstract:
To determine whether changes in dietary intake predict weight loss, we studied 80 overweight adults who attended a nutritional counseling program during 6 months of follow-up at a primary health care center in Brazil. Habitual diet was assessed using a validated food frequency questionnaire at baseline and after 6 months. The mean age (+/- SD) of the participants was 46.5 +/- 9.5 years, and their mean body mass index was 29 +/- 3 kg/m2 at baseline. After 6 months, the mean changes in body weight and fruit/vegetable intake were -1.4 +/- 3 kg and 109 +/- 320 g daily, respectively. Using multiple linear regression models adjusted for age, sex, changes in walking time, and total energy intake, an increased intake of dietary fiber from fruits/vegetables was associated with greater weight loss (beta 1 [95% confidence interval (CI)] = -0.180 [-0.269, -0.091]) after 6 months of follow-up. Similar results were observed for increased intake of vegetables (beta 1 [95% CI] = -0.00497 [-0.008, -0.002]) and fruits (beta 1 [95% CI] = -0.00290 [-0.005, -0.001]) as predictors of weight loss. An increase of 100 g/d of vegetables and fruits represented a body weight loss of 500 and 300 g after 6 months, respectively (P < .05). Our findings support the relevance of increased intakes of fruits and vegetables, which may help avoid weight gain in overweight adults. (C) 2008 Elsevier Inc. All rights reserved.
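The 500 g and 300 g figures follow directly from the regression coefficients quoted above, assuming (as the magnitudes indicate) that the coefficients are expressed in kg of weight change per g/d of intake:

```latex
% Vegetables: beta_1 = -0.00497 kg per (g/d), so an extra 100 g/d gives
100 \times (-0.00497\ \mathrm{kg}) \approx -0.50\ \mathrm{kg} \approx -500\ \mathrm{g}
% Fruits: beta_1 = -0.00290 kg per (g/d), so an extra 100 g/d gives
100 \times (-0.00290\ \mathrm{kg}) \approx -0.29\ \mathrm{kg} \approx -300\ \mathrm{g}
```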
Abstract:
Background: Dietary calcium intake has been described as a negative contributor to adiposity. In adolescents, this relationship is not well established. The objectives of the present study were to compare the calcium intake of normal-weight and obese adolescents and to evaluate its relationship with adiposity and insulin resistance. Methods: A cross-sectional analysis of 96 post-pubertal adolescents: 47 normal weight and 49 obese, mean age 16.6 (SD +/- 1.3) years. Body composition was assessed by dual-energy X-ray absorptiometry. Dietary intake was evaluated using a 3-day dietary record. The biochemical evaluation comprised measurements of serum lipids, lipoproteins, glucose and insulin. Insulin resistance was calculated using the Homeostasis Model Assessment of Insulin Resistance (HOMA-IR). Results: The mean calcium intake, adjusted for energy, was lower in obese adolescents, 585.2 (+/- 249.9) mg, than in normal-weight adolescents, 692.1 (+/- 199.5) mg. Only 4% of adolescents had an adequate intake of calcium. Calcium intake was inversely associated with trunk fat, insulin and HOMA-IR in the obese group. The quartile analysis of calcium intake provided evidence that girls in the highest quartile had decreased adiposity and insulin resistance. Conclusions: This study showed a negative relationship between calcium intake and both body fat and insulin resistance, mainly in obese girls, demonstrating the importance of an increased dietary calcium intake.
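For reference, the HOMA-IR index cited above is conventionally computed from fasting measurements as:

```latex
\mathrm{HOMA\text{-}IR} =
  \frac{\text{fasting insulin}\;(\mu\mathrm{U/mL}) \times \text{fasting glucose}\;(\mathrm{mmol/L})}{22.5}
```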
Abstract:
Objective Underreporting of energy intake is prevalent in food surveys, but there is controversy about which dietary assessment method provides greater underreporting rates. Our objective is to compare validity of self-reported energy intake obtained by three dietary assessment methods with total energy expenditure (TEE) obtained by doubly labeled water (DLW) among Brazilian women. Design We used a cross-sectional study. Subjects/setting Sixty-five females aged 18 to 57 years (28 normal-weight, 10 over-weight, and 27 obese) were recruited from two universities to participate. Main outcome measures TEE determined by DLW, energy intake estimated by three 24-hour recalls, 3-day food record, and a food frequency questionnaire (FFQ). Statistical analyses performed Regression and analysis of variance with repeated measures compared TEE and energy intake values, and energy intake-to-TEE ratios and energy intake-TEE values between dietary assessment methods. Bland and Altman plots were provided for each method. chi(2) test compared proportion of underreporters between the methods. Results Mean TEE was 2,622 kcal (standard deviation [SD] =490 kcal), while mean energy intake was 2,078 kcal (SD=430 kcal) for the diet recalls; 2,044 kcal (SD=479 kcal) for the food record and 1,984 kcal (SD=832 kcal) for the FFQ (all energy intake values significantly differed from TEE; P<0.0001). Bland and Altman plots indicated great dispersion, negative mean differences between measurements, and wide limits of agreement. Obese subjects underreported more than normal-weight subjects in the diet recalls and in the food records, but not in the FFQ. Years of education, income and ethnicity were associated with reporting accuracy. Conclusions The FFQ produced greater under- and overestimation of energy intake. Underreporting of energy intake is a serious and prevalent error in dietary self-reports provided by Brazilian women, as has been described in studies conducted in developed countries.
Abstract:
The aim of the present study was to determine whether under-reporting rates vary between dietary pattern clusters. Subjects were sixty-five Brazilian women. During 3 weeks, anthropometric data were collected, total energy expenditure (TEE) was determined by the doubly labelled water method and diet was measured. Energy intake (EI) and the daily frequency of consumption per 1000 kJ of twenty-two food groups were obtained from a FFQ. These frequencies were entered into a cluster analysis procedure in order to obtain dietary patterns. Under-reporters were defined as those who did not lose more than 1 kg of body weight during the study and presented EI:TEE less than 0.82. Three dietary pattern clusters were identified and named according to their most recurrent food groups: sweet foods (SW), starchy foods (ST) and healthy (H). Subjects from the healthy cluster had the lowest mean EI:TEE (SW = 0.86, ST = 0.71 and H = 0.58; P = 0.003) and EI - TEE (SW = -0.49 MJ, ST = -3.20 MJ and H = -5.09 MJ; P = 0.008). The proportion of under-reporters was 45.2% (95% CI 35.5, 55.0) in the SW cluster, 58.3% (95% CI 48.6, 68.0) in the ST cluster and 70.0% (95% CI 61.0, 79.0) in the H cluster (P = 0.34). Thus, in Brazilian women, under-reporting of EI is not uniformly distributed among dietary pattern clusters and tends to be more severe among subjects from the healthy cluster. This cluster is more consistent with both dietary guidelines and what lay individuals usually consider 'healthy eating'.
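The under-reporter definition above combines a weight-stability check with the EI:TEE cut-off of 0.82; a minimal sketch of that classification is shown below. The function name and the example numbers are hypothetical, purely for illustration.

```python
# Under-reporter classification following the criteria stated in the abstract:
# the subject did not lose more than 1 kg during the study AND reported EI:TEE < 0.82.
def is_under_reporter(energy_intake_mj: float, tee_mj: float, weight_change_kg: float) -> bool:
    lost_more_than_1kg = weight_change_kg < -1.0  # e.g. -1.5 means 1.5 kg lost
    return (not lost_more_than_1kg) and (energy_intake_mj / tee_mj < 0.82)

# Hypothetical subject: reported 7.0 MJ/day, measured TEE of 10.5 MJ/day, weight change -0.3 kg.
print(is_under_reporter(7.0, 10.5, -0.3))  # True: EI:TEE ~ 0.67 < 0.82 and weight essentially stable
```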
Abstract:
Four rumen-fistulated Holstein heifers (134 +/- 1 kg initial BW) were used in a 4 x 4 Latin square design to determine the effects of delaying daily feed delivery time on intake, ruminal fermentation, behavior, and stress response. Each 3-wk experimental period was preceded by 1 wk in which all animals were fed at 0800 h. Feed bunks were cleaned at 0745 h and feed was offered at 0800 h (T0, no delay), 0900 h (T1), 1000 h (T2), and 1100 h (T3) from d 1 to 21, with measurements taken during wk 1 and 3. Heifers were able to see each other at all times. Concentrate and barley straw were offered in separate compartments of the feed bunks, once daily and for ad libitum intake. Ruminal pH and saliva cortisol concentrations were measured at 0, 4, 8, and 12 h postfeeding on d 3 and 17 of each experimental period. Fecal glucocorticoid metabolites were measured on d 17. Increasing the delay in daily feed delivery time resulted in a quadratic response in concentrate DMI (low in T1 and T2; P = 0.002), whereas straw DMI was greatest in T1 and T3 (cubic, P = 0.03). Treatments affected the distribution of DMI within the day, with a linear decrease observed between 0800 and 1200 h but a linear increase during nighttime (2000 to 0800 h), whereas T1 and T2 had reduced DMI between 1200 and 1600 h (quadratic, P = 0.04). Water consumption (L/d) was not affected but decreased linearly when expressed as liters per kilogram of DMI (P = 0.01). Meal length was greatest and eating rate slowest in T1 and T2 (quadratic, P <= 0.001). Size of the first meal after feed delivery was reduced in T1 on d 1 (cubic, P = 0.05) and decreased linearly on d 2 (P = 0.01) after the change. Concentrate eating and drinking time (shortest in T1) and straw eating time (longest in T1) followed a cubic trend (P = 0.02). Time spent lying down was shortest, and time spent ruminating in a standing position longest, in T1 and T2. Delaying feeding time resulted in a greater daily maximum salivary cortisol concentration (quadratic, P = 0.04), which was greatest at 0 h in T1 and at 12 h after feeding in T2 (P < 0.05). Daily mean fecal glucocorticoid metabolites were greatest in T1 and T3 (cubic, P = 0.04). Ruminal pH showed a treatment effect in wk 1 because of increased values in T1 and T3 (cubic, P = 0.01). Delaying feed delivery time was not detrimental to rumen function, because a stress response was triggered that led to reduced concentrate intake, eating rate, and size of the first meal, and to increased straw intake. However, the increased salivary cortisol suggests that animal welfare was compromised.
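The linear, quadratic and cubic P values reported above come from orthogonal polynomial contrasts over the four equally spaced delay treatments (T0-T3). A minimal sketch of those standard contrast coefficients is given below; the treatment means are hypothetical, and the significance test against the model's error term is omitted.

```python
import numpy as np

# Standard orthogonal polynomial contrast coefficients for 4 equally spaced treatment levels (T0-T3).
linear    = np.array([-3, -1,  1,  3])
quadratic = np.array([ 1, -1, -1,  1])
cubic     = np.array([-1,  3, -3,  1])

# Hypothetical treatment means for concentrate DMI (kg/d), in the order T0, T1, T2, T3.
means = np.array([4.8, 4.2, 4.3, 4.7])

# Each contrast estimate is the dot product of its coefficients with the treatment means.
for name, coef in (("linear", linear), ("quadratic", quadratic), ("cubic", cubic)):
    print(name, float(coef @ means))
```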
Abstract:
Objective: This study investigated the effect of diets with different sodium contents on rat adipose tissue carbohydrate metabolism and insulin sensitivity. Methods and Procedures: Male Wistar rats were fed normal- (0.5% Na+; NS), high- (3.12% Na+; HS), or low-sodium (0.06% Na+; LS) diets for 3, 6, and 9 weeks after weaning. Blood pressure (BP) was measured using a computerized tail-cuff system. An intravenous insulin tolerance test (ivITT) was performed in fasted animals. At the end of each period, rats were killed and blood samples were collected for glucose and insulin determinations. The white adipose tissue (WAT) from abdominal and inguinal subcutaneous (SC) and periepididymal (PE) depots was weighed and processed for adipocyte isolation and measurement of the in vitro rates of insulin-stimulated 2-deoxy-D-[3H]-glucose uptake (2DGU) and conversion of [U-14C]-glucose into 14CO2. Results: After 6 weeks, the HS diet significantly increased BP, SC and PE WAT masses, PE adipocyte size, and plasma insulin concentration. The dietary sodium content did not influence whole-body insulin sensitivity. A higher half-maximal effective insulin concentration (EC50) in the dose-response curve of 2DGU and an increase in the insulin-stimulated glucose oxidation rate were observed in isolated PE adipocytes from HS rats. Discussion: The chronic salt overload enhanced adipocyte insulin sensitivity for glucose uptake and insulin-induced glucose metabolism, contributing to adipocyte hypertrophy and an increased mass of several adipose depots, particularly the PE fat pad.
Abstract:
Dental erosion is a type of tooth wear caused by non-bacterial acids or by chelation. There is evidence of a significant increase in the prevalence of dental wear in deciduous and permanent teeth as a consequence of frequent intake of acidic foods and drinks, or due to gastric acid that may reach the oral cavity following reflux or vomiting episodes. The presence of acids is a prerequisite for dental erosion, but erosive wear is complex and depends on the interaction of biological, chemical and behavioral factors. Even though erosion may be defined or described as an isolated process, in clinical situations other wear phenomena are expected to occur concomitantly, such as abrasive wear (which occurs, e.g., due to tooth brushing or mastication). In order to control dental tissue loss due to erosive wear, it is crucial to take into account its multifactorial nature, which predisposes some individuals to the condition.