941 results for food based dietary guidelines


Relevance: 40.00%

Abstract:

In 2004 and 2005 we collected samples of phytoplankton, zooplankton, and macroinvertebrates in a small artificial pond in Budapest (Hungary). We set up a simulation model predicting the abundances of the cyclopoids, Eudiaptomus zachariasi, and Ischnura pumilio from temperature and the previous day's population abundance alone. Phytoplankton abundance was simulated from temperature together with the abundances of the three groups above. We then ran the model with temperature series from internationally accepted climate change scenarios and discussed the differing outcomes. A comparative assessment of the alternative climate change scenarios was also carried out with statistical methods.
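The daily-update model described above can be sketched as a simple discrete-time simulation. This is a minimal illustration, not the study's fitted model: the Gaussian temperature response, growth rate, and carrying capacity below are invented assumptions.

```python
import math

def step_abundance(n_today, temp_c, r_max=0.3, t_opt=20.0, t_width=8.0, k=1000.0):
    """One-day update: growth peaks at t_opt with a Gaussian temperature
    response and is damped logistically toward carrying capacity k.
    All coefficients are illustrative, not fitted values from the study."""
    r = r_max * math.exp(-((temp_c - t_opt) / t_width) ** 2)
    return n_today + r * n_today * (1.0 - n_today / k)

def simulate(n0, daily_temps, **kwargs):
    """Run the daily update over a temperature series (e.g. a climate scenario)."""
    series = [n0]
    for t in daily_temps:
        series.append(step_abundance(series[-1], t, **kwargs))
    return series
```

Feeding the same update rule temperature series drawn from different climate change scenarios, as the study does, then lets the resulting abundance trajectories be compared.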

Relevance: 40.00%

Abstract:

For frail elders, consuming adequate nutrients is a challenge. Individuals participating in home-delivered meal programs achieve higher daily intakes of key nutrients than non-participants. However, many elders require special diets and modifications in food texture, features generally not provided by such programs. This study compared a pilot program providing individual homemaker-prepared meals based on special needs with the traditional home-delivered meal program. Eight participants in the model group and 19 participants in the control group completed the 180-day study. The high drop-out rate in the model group was due to scheduling difficulties. No significant differences existed between the two groups with respect to weight gain/maintenance, macronutrient intake, intake of most micronutrients, and customer satisfaction. Homemaker-prepared meals are a viable option for individuals needing dietary adjustments.

Relevance: 40.00%

Abstract:

Diet and physical activity patterns have been implicated as major factors in the increasing prevalence of childhood and adolescent obesity. It is estimated that between 16 and 33 percent of children and adolescents in the United States are overweight (CDC, 2000). Moreover, the CDC estimates that less than 50% of adolescents are physically active on a regular basis (CDC, 2003). Interventions must therefore be designed to modify these behaviors. Helping adolescents understand proper nutrition and the need for physical activity is the first step in preventing overweight and obesity and delaying the development of chronic diseases later in life (Dwyer, 2000). The purpose of this study was to compare the outcomes of students receiving one of two forms of education (both emphasizing diet and physical activity), to determine whether a computer-based intervention (CBI) program using an interactive, animated CD-ROM would elicit greater behavior change than a traditional didactic intervention (TDI) program. A convenience sample of 254 high school students aged 14-19 participated in the 6-month program. A pre-test/post-test design was used, with follow-up measures taken three months post-intervention.

No change was noted in total fat, saturated fat, fruit/vegetable, or fiber intake for any of the groups. There was also no change in perceived self-efficacy or perceived social support. Results did, however, indicate an increase in nutrition knowledge for both intervention groups (p < 0.001). In addition, the CBI group demonstrated more positive and sustained behavior changes throughout the course of the study. These changes included a decrease in BMI (p(pre/post) < 0.001, p(post/follow-up) < 0.001), number of meals skipped (p(pre/post) < 0.001), and soda consumption (p(pre/post) = 0.003, p(post/follow-up) = 0.03), and an increase in nutrition knowledge (p(pre/post) < 0.001, p(pre/follow-up) < 0.001), physical activity (p(pre/post) < 0.05, p(pre/follow-up) < 0.01), frequency of label reading (p(pre/follow-up) < 0.01) and dairy consumption (p(pre/post) = 0.03). The TDI group did show positive gains in some areas post-intervention; however, behavior returned to baseline at follow-up. The findings suggest that, compared with traditional didactic teaching, computer-based nutrition and health education has greater potential to elicit change in knowledge and behavior and to promote maintenance of behavior change over time.
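The pre/post comparisons reported above are the kind of contrast a paired test captures. A minimal sketch, assuming paired measurements on the same students; the data below are invented, and the abstract does not specify the study's actual statistical procedure.

```python
import math

def paired_t(pre, post):
    """t statistic for paired samples (post minus pre)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Invented example: four students measured before and after an intervention.
t_stat = paired_t([1.0, 2.0, 3.0, 4.0], [2.0, 3.0, 4.0, 6.0])
```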

Relevance: 40.00%

Abstract:

This study examined the relationships among dietary intake, substance use, and socioeconomic and acculturation-related factors among Latinas in Miami-Dade County. Substance abuse is rising among Latinas. A fuller understanding of this problem is needed given the growth of the Hispanic population and the role of women in Latin society. A better understanding of the relationship between substance use and dietary intake can guide nutrition interventions to reduce negative substance-related health consequences. A purposeful sample of 320 Latina mother/daughter dyads was recruited and interviewed face-to-face as part of the Latino Women's Study. Dietary intake was collected via a 24-hour recall and examined by (1) nutrient intake, (2) dietary patterns using cluster analysis, (3) quality of diet using the Healthy Eating Index (HEI) and (4) the Dietary Reference Intakes to determine nutrient adequacy. Substance use was measured with the Drug Use Frequency and the Healthy and Daily Living Form. Acculturation was measured with the Cultural Identity Scale. Three dietary patterns emerged based on the number of servings from the food groups established in MyPyramid. None was associated with substance use. Latinas who reported using cannabis, cocaine, sedatives without prescription and/or more than five alcoholic drinks on an occasion at least once a month during the previous twelve months had significantly lower HEI scores (64 vs. 60; F = 7.8, p = .005) and consumed fewer fruits (F = 16, p < .001) than non-users. Latinas classified as mothers who reported consuming cannabis 1-7 times a week had significantly lower HEI scores (F = 4.23, p = .015, η2 = .027) than daughters with the same frequency of substance use. One dimension of acculturation, greater familiarity with Latin culture, was associated with good dietary quality (β = .142, p = .012) regardless of the type of substance used or income level. There was a high prevalence of inadequate folic acid intake (50-75%) regardless of substance use. Substance users consumed significantly more energy (1,798 vs. 1,615; p = .027) than non-users. Although effect sizes were small, the associations between dietary intake and substance use among Latinas deserve further exploration, acknowledging the combined association with acculturation.
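Adequacy assessments against the Dietary Reference Intakes, as in point (4) above, are commonly done with the EAR cut-point method. A minimal sketch: the folate EAR of 320 µg DFE/day for adult women is the standard DRI value, but the intakes below are invented.

```python
FOLATE_EAR_UG_DFE = 320.0  # Estimated Average Requirement, adult women (µg DFE/day)

def prevalence_of_inadequacy(usual_intakes, ear):
    """EAR cut-point method: the fraction of individuals whose usual
    intake falls below the Estimated Average Requirement."""
    return sum(1 for x in usual_intakes if x < ear) / len(usual_intakes)
```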

Relevance: 40.00%

Abstract:

Stable isotope analysis has emerged as one of the primary means for examining the structure and dynamics of food webs, and numerous analytical approaches are now commonly used in the field. Techniques range from simple, qualitative inferences based on the isotopic niche, to Bayesian mixing models that can be used to characterize food-web structure at multiple hierarchical levels. We provide a comprehensive review of these techniques, and thus a single reference source to help identify the most useful approaches to apply to a given data set. We structure the review around four general questions: (1) what is the trophic position of an organism in a food web?; (2) which resource pools support consumers?; (3) what additional information does relative position of consumers in isotopic space reveal about food-web structure?; and (4) what is the degree of trophic variability at the intrapopulation level? For each general question, we detail different approaches that have been applied, discussing the strengths and weaknesses of each. We conclude with a set of suggestions that transcend individual analytical approaches, and provide guidance for future applications in the field.
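As an example of question (1), trophic position is commonly estimated from nitrogen isotope ratios as TP = λ + (δ¹⁵N_consumer − δ¹⁵N_base) / Δn, where λ is the trophic level of the baseline organism and Δn is the per-level trophic enrichment (3.4‰ is a widely used default). A minimal sketch:

```python
def trophic_position(d15n_consumer, d15n_base, lambda_base=2.0, tef=3.4):
    """Estimate trophic position from nitrogen stable-isotope ratios (per mil).

    lambda_base: trophic level of the baseline organism (2 for a primary consumer);
    tef: trophic enrichment factor per level (3.4 per mil is a common default)."""
    return lambda_base + (d15n_consumer - d15n_base) / tef
```

For instance, a consumer 6.8‰ enriched over a primary-consumer baseline sits two trophic levels above it, at trophic position 4.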

Relevance: 40.00%

Abstract:

Component-Based Software Engineering (CBSE) and Service-Oriented Architecture (SOA) have become popular approaches to software development in recent years. During the life cycle of a software system, several components and services can be developed, evolved and replaced. In production environments, the replacement of core components, such as databases, is often a risky and delicate operation in which several factors and stakeholders must be considered. A Service Level Agreement (SLA), according to the official ITIL v3 glossary, is "an agreement between an IT service provider and a customer", consisting of a set of measurable constraints that the service provider must guarantee to its customers. In practical terms, an SLA is a document that a service provider delivers to its consumers specifying minimum quality of service (QoS) metrics. This work assesses and improves the use of SLAs to guide the transitioning of databases in production environments. In particular, we propose SLA-based guidelines and a process to support migrations from a relational database management system (RDBMS) to a NoSQL one. The study is validated by case studies.
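The SLA-as-measurable-constraints idea can be sketched as follows. The metric names and bounds below are illustrative, not taken from the work, but they show how observed QoS could be checked against the agreement while a migration is in progress.

```python
# Hypothetical SLA: each QoS metric with its agreed bound.
SLA = {
    "read_latency_ms":  {"max": 50.0},
    "write_latency_ms": {"max": 100.0},
    "availability_pct": {"min": 99.9},
}

def sla_violations(observed, sla=SLA):
    """Return the list of metrics that break their agreed bounds."""
    broken = []
    for metric, bounds in sla.items():
        value = observed.get(metric)
        if value is None:
            broken.append(metric)  # unmeasured metrics count as violations
        elif "max" in bounds and value > bounds["max"]:
            broken.append(metric)
        elif "min" in bounds and value < bounds["min"]:
            broken.append(metric)
    return broken
```

A migration step could be rolled back, for instance, whenever `sla_violations` on the target database's observed metrics is non-empty.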

Relevance: 40.00%

Abstract:

Background: Many breast cancer survivors continue to have a broad range of physical and psychosocial problems after breast cancer treatment. As cancer centres move toward earlier discharge of stable breast cancer survivors to primary care follow-up, it is important that comprehensive evidence-based breast cancer survivorship care is implemented to effectively address these needs. Research suggests primary care providers are willing to provide breast cancer survivorship care, but many lack the knowledge and confidence to provide evidence-based care. Purpose: The overall purpose of this thesis was to determine the challenges, strengths and opportunities related to implementation of comprehensive evidence-based breast cancer survivorship guidelines by primary care physicians and nurse practitioners in southeastern Ontario. Methods: This mixed-methods research was conducted in three phases: (1) synthesis and appraisal of clinical practice guidelines relevant to provision of breast cancer survivorship care within the primary care practice setting; (2) a brief quantitative survey of primary care providers to determine actual practices related to provision of evidence-based breast cancer survivorship care; and (3) individual interviews with primary care providers about the challenges, strengths and opportunities related to provision of comprehensive evidence-based breast cancer survivorship care. Results and Conclusions: In the first phase, a comprehensive clinical practice framework was created to guide provision of breast cancer survivorship care, consisting of a one-page checklist outlining breast cancer survivorship issues relevant to primary care, a three-page summary of key recommendations, and a one-page list of guideline sources.

The second phase identified several knowledge and practice gaps: guideline implementation rates were highest for recommendations related to the prevention and surveillance aspects of survivorship care and lowest for screening for and management of long-term effects. The third phase identified three major challenges to providing breast cancer survivorship care: inconsistent educational preparation, provider anxieties, and primary care burden; and three major strengths or opportunities to facilitate implementation of survivorship care guidelines: tools and technology, empowering survivors, and optimizing nursing roles. A better understanding of these challenges, strengths and opportunities will inform the development of targeted knowledge translation interventions to provide support and education to primary care providers.

Relevance: 40.00%

Abstract:

There is an established relationship between salt intake and the risk of high blood pressure (BP). High blood pressure (hypertension) is a risk factor for cardiovascular disease (CVD), and scientific evidence shows that a high salt intake can contribute to the development of elevated blood pressure. The Scientific Advisory Committee on Nutrition (SACN) recommends reducing the population's average salt intake to no more than 6 g per day. This figure has been adopted by the UK government as the recommended maximum salt intake for adults and children aged 11 years and over. Following publication of the SACN report in 2003, the government began a programme of reformulation work with the food industry aimed at reducing the salt content of processed food products. Voluntary salt reduction targets were first set in 2006, and subsequently in 2009, 2011 and 2014, for a range of food categories that contribute most to the population's salt intake. Population-representative urinary sodium data were collected in England in 2005-06, 2008 (UK), 2011 and 2014. In the latest survey assessment, the estimated salt intake of adults aged 19 to 64 years in England was assessed from the 24-hour urinary sodium excretion of 689 adults, selected to be representative of this section of the population. Estimated salt intake was calculated using the equation 17.1 mmol of sodium = 1 g of salt, and assumes all sodium was derived from salt. The data were validated as representing daily intake by checking the completeness of the urine collections using the para-aminobenzoic acid (PABA) method. Urine samples were collected over five months (May to September) in 2014, concurrently with a similar survey in Scotland. This report presents the results of the latest survey assessment (2014) and a new analysis of the trend in estimated salt intake over time.

The trend analysis is based on urinary sodium excretion data from this survey and previous sodium surveys (including data from the National Diet and Nutrition Survey Rolling Programme (NDNS RP) Years 1 to 5) carried out in England over the last ten years, between 2005-06 and 2014. These data have been adjusted to account for biases resulting from differences between surveys in the laboratory analytical methods used for sodium. The analysis provides a revised assessment of the trend in estimated salt intake over time. The trend analysis in this report supersedes the trend analysis published in the report of the 2011 England urinary sodium survey.
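The stated conversion (17.1 mmol of sodium = 1 g of salt, assuming all sodium derives from salt) amounts to a one-line calculation:

```python
def salt_g_from_sodium_mmol(sodium_mmol):
    """Estimated daily salt intake (g) from 24-hour urinary sodium excretion
    (mmol), using 17.1 mmol sodium = 1 g salt and assuming all sodium
    derives from salt."""
    return sodium_mmol / 17.1
```

For example, a 24-hour excretion of 171 mmol of sodium corresponds to an estimated intake of 10 g of salt.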

Relevance: 40.00%

Abstract:

Food safety has always been a social issue that draws great public attention. With the rapid development of wireless communication technologies and intelligent devices, more and more Internet of Things (IoT) systems are applied in the food safety tracking field. However, the connection between things and the information system is usually established by pre-storing information about things in RFID tags, which is inapplicable to on-field food safety detection. Therefore, considering that pesticide residues are among the most severe threats to food safety, a new portable, high-sensitivity, low-power, on-field organophosphorus (OP) compound detection system is proposed in this thesis to realize on-field food safety detection. The system is designed around an optical detection method using a customized photo-detection sensor. A Micro Controller Unit (MCU) and a Bluetooth Low Energy (BLE) module are used to quantize and transmit the detection result. An Android application (app) is also developed for the system to process and display detection results and to control the detection process. In addition, a quartz sample container and a black system box were designed and built for the system demonstration. Several optimizations were made in wireless communication, circuit layout, the Android app and industrial design to realize mobility, low power consumption and intelligence.
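How the quantization step maps the photodetector's reading to a concentration is not detailed in the abstract. A hedged sketch, assuming a linear calibration against standards of known concentration; all values below are invented, not measurements from the thesis.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical standards: ADC counts read at known OP concentrations (ppm).
adc_counts     = [100.0, 220.0, 340.0, 460.0]
concentrations = [0.0,   1.0,   2.0,   3.0]
slope, intercept = fit_line(adc_counts, concentrations)

def concentration_from_adc(counts):
    """Estimated concentration (ppm) from a digitized photodetector reading."""
    return slope * counts + intercept
```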

Relevance: 40.00%

Abstract:

Seafood fraud, the misrepresentation of seafood products, has been discovered all around the world in forms such as false labeling, species substitution, short-weighting or over-glazing, intended to hide the true identity, origin or weight of the products. Given the value of seafood products such as canned tuna, swordfish or grouper, the main commercial fraud is the replacement of valuable species with other species of little or no value. A similar situation occurs with shelled shrimp or shellfish that are reduced to pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe, M. & Primrose, S. 2004). As a result of increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, limitations that are amplified when fish, crustaceans or shellfish are commercially processed. Many fish species show a similar texture; thus the certification of fish products is particularly important when fish have undergone procedures which affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000).

In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses including DNA barcoding technology (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), consisting of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by P. Hebert et al. (2003) locates the bio-identification system useful for taxonomic identification of species within the mitochondrial COI (cytochrome c oxidase subunit I) gene (Lo Brutto et al., 2007). The COI region used for genetic identification - the DNA barcode - is short enough to allow, with current technology, the sequence (the pairs of nucleotide bases) to be decoded in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species from one another (Biondo et al., 2016). This technique has already been employed to address the demand to assess the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in processed seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010). Current research concerns the use of genetic markers to identify not only the species and/or varieties of fish, but also molecular characters able to trace origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
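The species-assignment step of DNA barcoding can be sketched as scoring a query COI fragment against reference barcodes by per-site identity. The sequences and assignments below are toy examples; real identifications use curated reference libraries and alignment tools (e.g. BOLD or BLAST).

```python
def identity(a, b):
    """Fraction of matching positions between two equal-length, aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(x == y for x, y in zip(a, b)) / len(a)

def best_match(query, references):
    """Return (species, identity) of the closest reference barcode."""
    return max(((sp, identity(query, seq)) for sp, seq in references.items()),
               key=lambda pair: pair[1])
```

A product labeled as one species but whose barcode is closest to another reference, with high identity, is a candidate case of substitution.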

Relevance: 40.00%

Abstract:

It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health is reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins, or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards generating knowledge to support more accurate human risk assessment and risk management. For this purpose, a physiologically based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment.

On the other hand, the application of predictive mathematical models to estimate mycotoxins' combined effects from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project that explored the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial for improving data quality and will contribute to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
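The additivity-only paradigm mentioned above is commonly operationalized as a hazard index: the sum of each toxin's exposure divided by its health-based guidance value (e.g. a TDI). A minimal sketch; the toxin names and numbers below are invented for illustration, not actual assessments.

```python
def hazard_index(exposures_ng_kg_bw, tdis_ng_kg_bw):
    """Sum of hazard quotients (exposure / TDI), both in ng/kg bw/day.
    An index above 1 flags potential concern under the additivity assumption."""
    return sum(exposures_ng_kg_bw[t] / tdis_ng_kg_bw[t] for t in exposures_ng_kg_bw)
```

Note that this captures only dose additivity; synergistic or antagonistic interactions between mycotoxins, the presentation's main concern, are exactly what such an index cannot represent.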