710 results for dietary preference
Abstract:
Early experiences are of potential importance in shaping long-term behavior. This study examined the relative influence of prenatal and/or early postnatal experience of chemosensory stimuli on subsequent olfactory and dietary preferences of cats as newborns, at 9-10 weeks, and at 6 months. Cats were exposed to vanillin or 4-ethylguaiacol via their mother's diet either prenatally, postnatally, perinatally (prenatal and postnatal), or experienced no exposure to the stimuli (control). Newborns were given a two-choice olfactory test between the familiar "odor" and no odor; 9-10 week olds were tested for their preference between two food treats, one flavored with the familiar stimulus and the other unflavored; at 6 months, cats were given a choice of two bowls of food, one flavored with the familiar stimulus and the other unflavored. At all ages, cats preferred the familiar, and avoided the unfamiliar, stimulus. Perinatal exposure exerted the strongest influence on preference. Prenatal exposure influenced preference at all ages and postnatal exposure exerted a stronger effect as the cat aged. We conclude that long-term chemosensory and dietary preferences of cats are influenced by prenatal and early (nursing) postnatal experience, supporting a natural and biologically relevant mechanism for the safe transmission of diet from mother to young. © The Author 2012. Published by Oxford University Press. All rights reserved.
Abstract:
The tropical marine sponge Acanthella cavernosa (Dendy) converts potassium [14C] cyanide to axisonitrile-3 (1); this precursor is also used for the synthesis of axisothiocyanate-3 (2), suggesting that isocyanides are precursors to isothiocyanates in A. cavernosa. Likewise, potassium [14C] thiocyanate is used for the synthesis of axisothiocyanate-3; unexpectedly, this precursor also labelled axisonitrile-3. These results demonstrate either an interconversion between cyanide and thiocyanate prior to secondary metabolite formation or that the secondary metabolites can themselves be interconverted. Specimens of the dorid nudibranch Phyllidiella pustulosa, preadapted to a diet of A. cavernosa, fed on 14C-labelled sponges and were subsequently found to contain the radioactive terpenes (1) and (2). Specimens of P. pustulosa that had not expressed a dietary preference for A. cavernosa in the field did not generally feed in aquarium tests with 14C-labelled sponges and therefore provided non-radioactive extracts. Since control experiments demonstrated the inability of P. pustulosa to synthesise the metabolites de novo, we conclude that P. pustulosa acquires secondary metabolites by dietary transfer from A. cavernosa.
Abstract:
A shearing quotient (SQ) is a way of quantitatively representing the Phase I shearing edges on a molar tooth. Ordinary or phylogenetic least squares regression is fit to data on log molar length (independent variable) and log sum of measured shearing crests (dependent variable). The derived linear equation is used to generate an 'expected' shearing crest length from molar length of included individuals or taxa. Following conversion of all variables to real space, the expected value is subtracted from the observed value for each individual or taxon. The result is then divided by the expected value and multiplied by 100. SQs have long been the metric of choice for assessing dietary adaptations in fossil primates. Not all studies using SQ have used the same tooth position or crests, nor have all computed regression equations using the same approach. Here we focus on re-analyzing the data of one recent study to investigate the magnitude of effects of variation in 1) shearing crest inclusion, and 2) details of the regression setup. We assess the significance of these effects by the degree to which they improve or degrade the association between computed SQs and diet categories. Though altering regression parameters for SQ calculation has a visible effect on plots, numerous iterations of statistical analyses vary surprisingly little in the success of the resulting variables for assigning taxa to dietary preference. This is promising for the comparability of patterns (if not casewise values) in SQ between studies. We suggest that differences in apparent dietary fidelity of recent studies are attributable principally to tooth position examined.
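The SQ arithmetic described above can be sketched in a few lines. The Python fragment below uses hypothetical molar measurements and only the ordinary least squares variant; it illustrates the calculation, not the study's actual regression fit or data.

import numpy as np

# Hypothetical values: lower molar length and summed Phase I shearing crest length (mm)
molar_length = np.array([5.1, 6.3, 7.0, 8.2, 9.5])
shear_sum = np.array([6.0, 7.1, 8.4, 9.0, 11.2])

# 1) Ordinary least squares fit of log crest sum on log molar length
slope, intercept = np.polyfit(np.log(molar_length), np.log(shear_sum), 1)

# 2) Expected crest length from molar length, back-transformed to real space
expected = np.exp(intercept + slope * np.log(molar_length))

# 3) Shearing quotient: percentage deviation of observed from expected
sq = 100.0 * (shear_sum - expected) / expected
print(np.round(sq, 2))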
Abstract:
Many insect species vary in their degree of foraging specialisation, with many bee species considered polyphagic (polylectic). Wild, non-managed bee species vary in their conservation status, and species-specific biological traits such as foraging specialisation may play an important role in determining variance in population declines. Current agri-environment schemes (AESs) prescribe the introduction of flower seed mixes for agricultural systems to aid the conservation of wild bees. However, the extent to which flower combinations adequately meet bee foraging requirements is poorly known. We quantitatively assessed pollen use and selectivity using two statistical approaches, Bailey's Intervals and Compositional Analysis, in an exemplar species, a purportedly polylectic and rare bee, Colletes floralis, across 7 sites through detailed analysis of bee scopal pollen loads and flower abundance. The two approaches showed good congruence, but Compositional Analysis was more robust to small sample sizes. We advocate its use for the quantitative determination of foraging behaviour and dietary preference. Although C. floralis is polylectic, it showed a clear dietary preference for plants within the family Apiaceae. Where Apiaceae was uncommon, the species exploited alternative resources. Plant families such as the Apiaceae could be included, or have their proportion increased, in AES seed mixes to aid the management of C. floralis and potentially other wild solitary bees of conservation concern. © 2011 The Authors. Insect Conservation and Diversity © 2011 The Royal Entomological Society.
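As a rough illustration of the use-versus-availability log-ratio comparison at the core of Compositional Analysis, the Python sketch below uses invented pollen and floral proportions; it omits the significance testing that a full analysis of the C. floralis data would require.

import numpy as np

families = ["Apiaceae", "Asteraceae", "Fabaceae", "Rosaceae"]
# Invented proportions, one row per site: pollen-load composition (use) and floral availability
used = np.array([[0.70, 0.15, 0.10, 0.05],
                 [0.55, 0.20, 0.15, 0.10],
                 [0.60, 0.25, 0.05, 0.10]])
available = np.array([[0.20, 0.40, 0.25, 0.15],
                      [0.25, 0.35, 0.20, 0.20],
                      [0.30, 0.30, 0.25, 0.15]])

ref = -1  # last family serves as the reference category
log_ratio_diff = np.log(used / used[:, [ref]]) - np.log(available / available[:, [ref]])

# Mean log-ratio difference per family; larger values indicate stronger selection
for name, value in zip(families, log_ratio_diff.mean(axis=0)):
    print(f"{name}: {value:+.2f}")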
Abstract:
The human fetus learns about its chemosensory environment, and this influences its behavior at birth and during the nursing period. This study examined whether prenatal experience could influence behavior much later in life. The dietary preference of two groups of children (8- to 9-year-olds) was examined. Mothers of one group had consumed garlic during pregnancy; mothers of the control group had not. Children received two tests, 1 month apart, of a meal containing two portions of potato gratin, one flavored with garlic. The total amount of potato eaten, and the percentage of it that was garlic flavored, were calculated and examined separately by ANOVA with factors of prenatal exposure, the child's sex, and trial. Children prenatally exposed to garlic ate significantly more garlic-flavored potato and a significantly greater overall amount of potato on trial 2, compared to controls. The results demonstrate that prenatal experience may affect behavior well into childhood. © 2012 Wiley Periodicals, Inc.
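A factorial ANOVA of the kind described (factors: prenatal garlic exposure, child sex, trial) might be set up as in the Python sketch below; the data are fabricated for illustration and interactions are omitted for brevity, so it shows the analysis type rather than the study's exact model.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Fabricated example data: percentage of garlic-flavored potato eaten
df = pd.DataFrame({
    "garlic_pct": [55, 48, 60, 35, 30, 40, 62, 58, 33, 29],
    "exposed": ["yes", "yes", "yes", "no", "no", "no", "yes", "yes", "no", "no"],
    "sex": ["f", "m", "f", "m", "f", "m", "m", "f", "f", "m"],
    "trial": [1, 1, 2, 1, 2, 2, 2, 1, 1, 2],
})

# Main-effects ANOVA on the fabricated data
model = ols("garlic_pct ~ C(exposed) + C(sex) + C(trial)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))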
Abstract:
This exploratory study evaluated biophysical, cultural and socio-economic factors affecting crop production and land utilisation in the Nkonkobe Municipality, South Africa. The study sought to establish what farmers in the area perceive as serious threats to crop production, the drivers of land abandonment, and how best current agricultural production could be intensified. The farmers' perspectives were assessed through interviews using semi-structured and open-ended questionnaires. The results revealed declining crop productivity and increasing land abandonment in the Municipality. The biophysical drivers of land abandonment were low and erratic rainfall and land degradation, while the socio-economic drivers were labour shortages due to old age and youth movement to cities, lack of farming equipment, and security concerns. The most abandoned crops were maize, sorghum and wheat. This trend was attributed to the labour intensiveness of cereal production and a shift in dietary preference to purchased rice. These findings should be factored into any programmes that seek to increase land utilisation and crop productivity in the Municipality.
Abstract:
A key challenge for humanity is how a future global population of 9 billion can all be fed healthily and sustainably. Here, we review how competition for land is influenced by other drivers and pressures, examine land-use change over the past 20 years and consider future changes over the next 40 years. Competition for land, in itself, is not a driver affecting food and farming in the future, but is an emergent property of other drivers and pressures. Modelling studies suggest that future policy decisions in the agriculture, forestry, energy and conservation sectors could have profound effects, with different demands for land to supply multiple ecosystem services usually intensifying competition for land in the future. In addition to policies addressing agriculture and food production, further policies addressing the primary drivers of competition for land (population growth, dietary preference, protected areas, forest policy) could have significant impacts in reducing competition for land. Technologies for increasing per-area productivity of agricultural land will also be necessary. Key uncertainties in our projections of competition for land in the future relate predominantly to uncertainties in the drivers and pressures within the scenarios, in the models and data used in the projections and in the policy interventions assumed to affect the drivers and pressures in the future.
Abstract:
INTRODUCTION: The differential associations of beer, wine, and spirit consumption with cardiovascular risk found in observational studies may be confounded by diet. We described and compared dietary intake and diet quality according to alcoholic beverage preference in European elderly.
METHODS: From the Consortium on Health and Ageing: Network of Cohorts in Europe and the United States (CHANCES), seven European cohorts were included, i.e. four sub-cohorts from EPIC-Elderly, the SENECA Study, the Zutphen Elderly Study, and the Rotterdam Study. Harmonized data of 29,423 elderly participants from 14 European countries were analyzed. Baseline data on consumption of beer, wine, and spirits, and dietary intake were collected with questionnaires. Diet quality was assessed using the Healthy Diet Indicator (HDI). Intakes and scores across categories of alcoholic beverage preference (beer, wine, spirit, no preference, non-consumers) were adjusted for age, sex, socio-economic status, self-reported prevalent diseases, and lifestyle factors. Cohort-specific mean intakes and scores were calculated as well as weighted means combining all cohorts (a minimal pooling sketch follows this abstract).
RESULTS: In 5 of 7 cohorts, persons with a wine preference formed the largest group. After multivariate adjustment, persons with a wine preference tended to have a higher HDI score and intake of healthy foods in most cohorts, but differences were small. The weighted estimates of all cohorts combined revealed that non-consumers had the highest fruit and vegetable intake, followed by wine consumers. Non-consumers and persons with no specific preference had higher HDI scores, and spirit consumers the lowest. However, overall diet quality as measured by HDI did not differ greatly across alcoholic beverage preference categories.
DISCUSSION: This study using harmonized data from ~30,000 elderly from 14 European countries showed that, after multivariate adjustment, dietary habits and diet quality did not differ greatly according to alcoholic beverage preference.
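A minimal sketch of the pooling step mentioned in the METHODS (cohort-specific adjusted means combined into a weighted mean) is given below in Python; the HDI scores and the sample-size weighting scheme are illustrative assumptions, not CHANCES estimates.

import numpy as np

# Illustrative adjusted mean HDI scores for seven cohorts and their sample sizes
cohort_hdi_mean = np.array([52.1, 49.8, 55.0, 50.6, 53.3, 48.9, 51.7])
cohort_n = np.array([6200, 4100, 3800, 5100, 2900, 3500, 3823])  # sums to 29,423

# Sample-size weighted mean across cohorts
print(round(np.average(cohort_hdi_mean, weights=cohort_n), 2))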
Abstract:
Objectives Shift workers are prone to obesity and associated co-morbidities such as diabetes and cardiovascular disease. Sleep restriction associated with shift work results in dramatic endocrine and metabolic effects that predispose shift workers to these adverse health consequences. While sleep restriction has been associated with increased caloric intake, food preference may also play a key role in weight gain associated with shift work. This study examined the impact of an overnight simulated night shift on food preference. Methods Sixteen participants [mean age 20.1, standard deviation (SD) 1.4 years; 8 women] underwent a simulated night shift and a control condition in counterbalanced order. On the following morning, participants were provided an opportunity for breakfast that included high- and low-fat food options (mean 64.8% and 6.4% fat, respectively). Results Participants ate significantly more high-fat breakfast items after the simulated night shift than after the control condition [control 167.3 g, standard error of the mean (SEM) 28.7, versus night shift 211.4 g (SEM 35.6); P=0.012]. The preference for high-fat food was apparent among the majority of individuals following the simulated night shift (81%), but not for the control condition (31%). The shift work and control conditions did not differ, however, in the total amount of food or calories consumed. Conclusions A simulated night shift leads to a preference for high-fat food during a subsequent breakfast opportunity. These results suggest that food choice may contribute to weight-related chronic health problems commonly seen among night shift workers.
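The counterbalanced within-subject comparison reported above could plausibly be analyzed with a paired t-test, as in the Python sketch below; the abstract does not state the exact test used, and the gram values are invented, so this is illustrative only.

import numpy as np
from scipy import stats

# Invented per-participant high-fat breakfast intake (grams) under each condition
control_g = np.array([150, 120, 180, 200, 90, 160, 175, 140])
night_shift_g = np.array([190, 170, 210, 250, 130, 200, 220, 180])

t_stat, p_value = stats.ttest_rel(night_shift_g, control_g)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")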
Abstract:
Objective We examined whether exposure to a greater number of fruits, vegetables, and noncore foods (ie, nutrient poor and high in saturated fats, added sugars, or added salt) at age 14 months was related to children's preference for and intake of these foods, as well as maternal-reported food fussiness and measured child weight status, at age 3.7 years. Methods This study reports secondary analyses of longitudinal data from mothers and children (n=340) participating in the NOURISH randomized controlled trial. Exposure was quantified as the number of food items (n=55) tried by a child from specified lists at age 14 months. At age 3.7 years, food preferences, intake patterns, and fussiness (also assessed at age 14 months) were assessed using maternal-completed, established questionnaires. Child weight and length/height were measured by study staff at both age points. Multivariable linear regression models were tested to predict food preferences, intake patterns, fussy eating, and body mass index z score at age 3.7 years, adjusting for a range of maternal and child covariates. Results Having tried a greater number of vegetables, fruits, and noncore foods at age 14 months predicted corresponding preferences and higher intakes at age 3.7 years but did not predict child body mass index z score. Adjusting for fussiness at age 14 months, having tried more vegetables at age 14 months was associated with lower fussiness at age 3.7 years. Conclusions These prospective analyses support the hypothesis that early taste and texture experiences influence subsequent food preferences and acceptance. These findings indicate that introducing a variety of fruits and vegetables, and limiting noncore food exposure, from an early age are important strategies to improve later diet quality.
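A multivariable linear regression of the type described (exposure count at age 14 months predicting an outcome at age 3.7 years, adjusting for covariates) could be specified as in the Python sketch below; the variable names and simulated data are assumptions for illustration, not NOURISH measures.

import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "veg_tried_14m": rng.integers(0, 20, n),   # number of vegetables tried at 14 months
    "fussiness_14m": rng.normal(3.0, 0.5, n),  # baseline fussiness covariate
    "maternal_age": rng.normal(30, 5, n),      # example maternal covariate
})
# Simulated outcome: fussiness score at 3.7 years
df["fussiness_3y"] = (3.2 - 0.03 * df["veg_tried_14m"]
                      + 0.4 * df["fussiness_14m"] + rng.normal(0, 0.4, n))

model = ols("fussiness_3y ~ veg_tried_14m + fussiness_14m + maternal_age", data=df).fit()
print(model.params.round(3))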
Abstract:
Objective: To evaluate dietary intake outcomes up to 3.5 years after the NOURISH early feeding intervention (concealed allocation, assessor-masked RCT). Methods: 698 first-time mothers with healthy term infants were allocated to receive anticipatory guidance on protective feeding practices or usual care. Outcomes were assessed at 2, 3.7 and 5 years of age (3.5 years post-intervention). Dietary intake was assessed by 24-hour recall and the Child Dietary Questionnaire. Mothers completed a food preference questionnaire and the Children's Eating Behaviour Questionnaire. Linear mixed models assessed group, time and time × group effects. Results: There were no group or time × group effects for fruit, vegetable, discretionary food and non-milk sweetened beverage intake. Intervention children showed a higher preference for fruits (74.6% vs 69.0% liked, P<0.001), a higher Child Dietary Questionnaire score for fruit and vegetables (15.3 vs 14.5, target >18, P=0.03), lower food responsiveness (2.3 vs 2.4, of a maximum of 5, P=0.04) and higher satiety responsiveness (3.1 vs 3.0, of a maximum of 5, P=0.04). Conclusions: Compared to usual care, an early feeding intervention providing anticipatory guidance regarding positive feeding practices led to small improvements in child dietary score, food preferences and eating behaviours up to 5 years of age, but not in dietary intake measured by 24-hour recall.
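A linear mixed model with group, time and time × group effects and a child-level random intercept, loosely mirroring the analysis described above, might look like the Python sketch below; the long-format data are simulated, not NOURISH trial data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
children = np.repeat(np.arange(120), 3)        # 120 children, 3 assessment ages each
age = np.tile([2.0, 3.7, 5.0], 120)
group = np.repeat(rng.integers(0, 2, 120), 3)  # 0 = usual care, 1 = intervention
score = (14 + 0.3 * age + 0.5 * group
         + np.repeat(rng.normal(0, 1, 120), 3)  # child-level random intercept
         + rng.normal(0, 0.8, 360))             # residual noise

df = pd.DataFrame({"child": children, "age": age, "group": group, "score": score})
result = smf.mixedlm("score ~ group * age", data=df, groups=df["child"]).fit()
print(result.summary())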
Abstract:
Among the environmental factors that can affect food intake is the extent of dietary variety available in the environment. Numerous studies have demonstrated that variety in a meal can increase the amount of food consumed in humans, rats, and other species. A physiological mechanism that has been demonstrated to affect food intake is the gut peptide cholecystokinin (CCK), which is released from the upper small intestine during the ingestion of food. Peripherally administered CCK has a robust inhibitory effect on the intake of a single-food meal. Thus, dietary variety and CCK both affect meal size, with dietary variety increasing intake and CCK decreasing intake. This raises the question of how dietary variety and CCK might interact to affect meal size. Previous studies of CCK's effects have focused on situations in which only one food was available for consumption. However, in an animal's natural environment it would frequently occur that the animal would come across a number of foods either simultaneously or in quick succession, thus providing the animal access to a variety of foods during a meal. Accordingly, the effect of CCK on food intake in single-food and multiple-food meals was examined. It was found that food intake was greater in multiple-food than in single-food meals, provided that foods in the multiple-food meal were presented either simultaneously or in increasing order of preference. When foods in the multiple-food meal were presented in decreasing order of preference, intake was similar to that observed in single-food meals. In addition, it was found that CCK inhibited food intake in a dose-dependent manner, and that its effects on food intake were similar regardless of meal type. Therefore, the inhibitory effects of CCK were not diminished when a variety of foods were available for consumption. Furthermore, the finding that CCK did not differentially affect the intake of the two types of meals does not provide support for the recent-foods hypothesis, which postulates that CCK decreases food intake by reducing the palatability of only recently consumed foods. However, it is consistent with the all-foods hypothesis, which predicts that CCK reduces food intake by decreasing the palatability of all foods. The 600 µg/kg dose of the CCK-A antagonist lorglumide significantly antagonized the inhibitory effect of exogenous CCK on food intake, and the magnitude of this effect was similar for both types of meal. These results suggest that exogenous CCK inhibits food intake through the activation of CCK-A receptors. However, when administered by itself, the 600 µg/kg dose of lorglumide did not increase food intake in either single-food or multiple-food meals, suggesting that peripheral endogenous CCK may not play a major role in the control of food intake.
Abstract:
The effect of dietary sodium restriction on perceived intensity of and preference for the taste of salt was evaluated in 76 adults, 25-49 years of age, with diastolic blood pressure of 79-90 mmHg. Participants were volunteers from clinical Hypertension Prevention Trials (HPT) at the University of California, Davis and the University of Minnesota, Minneapolis. Participants followed one of four HPT diets: 1600 mg Na+/day (NA, n=15), 1600 mg Na+ plus 3200 mg K+/day (NK, n=15), 1600 mg Na+/day plus energy restriction to achieve weight loss (NW, n=13), and weight loss only (WT, n=13). All participants attended regularly scheduled nutrition intervention meetings designed to help them achieve the HPT dietary goals. A fifth, no-intervention group consisted of 20 no-diet-change controls (CN). Sodium, potassium and energy intakes were monitored by analysis of single 24-hour food records and corresponding overnight urine specimens, obtained at baseline and after 12 and 24 weeks of intervention. Hedonic responses to sodium chloride in a prepared cream of green bean soup were assessed by two methods: 1) scaling of like/dislike for an NaCl concentration series on 10-cm graphic line scales, and 2) ad libitum mixing of unsalted and salted soups to the maximum level of liking. Salt content of the mixes was analyzed by sodium ion-selective electrode. The concentration series was also rated for perceived saltiness intensity on similar graphic line scales. Tests were conducted at baseline and after approximately 1, 3, 6, 8, 10, 13 and 24 weeks of intervention. Reduction in sodium intake and excretion in NA, NK and NW participants was accompanied by a shift in preference toward less saltiness in soup. The pattern of hedonic responses changed over time: scores for high NaCl concentrations decreased progressively while scores for low concentrations increased. Hedonic maxima shifted from a concentration of 0.55% added NaCl at the onset to 0.1-0.2% at week 24. During the same period, the preferred concentration of ad libitum mixes declined 50%. These shifts occurred independently of changes in saltiness intensity ratings, potassium or energy intakes, and were consistent across the two participating study sites. Like/dislike and ad libitum responses were similar after 13 and 24 weeks of diet, as were measures of sodium intake and excretion. These findings suggest that after three months of sodium restriction, preference for salt had readjusted to a lower level, reflective of the lower sodium intake. The mechanisms underlying the change in preference are unclear, but may include sensory, contextual, physiological and behavioral effects. In contrast, few changes were noted within the WT and CN groups. The pattern of hedonic responses varied little in controls, while the WT group showed increased liking for mid-range NaCl concentrations. Small but significant fluctuations in ad libitum mix concentration occurred in both of these groups, but the differences appeared to be random rather than systematic. The results of this study indicate that preference for the taste of salt declines progressively toward a new baseline following reductions in sodium intake. These alterations may enhance maintenance of low-sodium diets for the treatment and prevention of hypertension. Further investigation is needed to establish the degree to which long-term compliance is contingent upon variation in salt taste preference.
Abstract:
BACKGROUND: Older hospital patients are considered to be at risk of malnutrition due to insufficient dietary intake. OBJECTIVE: To determine whether taste enhancement, using ingredients naturally high in umami compounds, increases preference for and consumption of a meal by older hospital patients. METHODS: 31 patients (65–92 years) on elderly care wards in a UK NHS Trust hospital took part in a single-blinded preference and consumption study. They tasted two meats (control and enhanced, presented in balanced order) and stated their preference. At lunch, control and enhanced cottage pie and gravy were served concurrently; patients were asked to consume ad libitum and intake was measured. RESULTS: The taste-enhanced meat was significantly preferred (P = 0.001). Although mean consumption was higher for the enhanced than for the control meal (137 g versus 119 g), with more energy (103 kcal versus 82 kcal) and protein (4.6 g versus 3.4 g) consumed, the differences were not significant. CONCLUSIONS: Natural ingredients rich in umami taste compounds can successfully be used to increase preference for meat-based meals by older hospital patients. Larger trials are needed to determine whether such increases in preference can significantly increase consumption.