902 results for FOODS
Abstract:
The means to detect the irradiation of food have been investigated for many years. In recent times radiolytic products, termed 2-alkylcyclobutanones (2-CBs), have been identified as excellent markers of irradiation in lipid-containing foods. An ELISA test was developed that was capable of detecting a number of these compounds in irradiated chicken meat. A polyclonal antiserum was raised against a 2-CB containing a terminal carboxyl group conjugated to a carrier protein. This antiserum was highly specific for cyclobutanones with C-10 and C-12 side chains. During assay validation the limit of detection of the assay was calculated to be 0.064 pg of 2-CB per gram of fat, and within- and between-assay variation ranged from 6.7% to 18%. During experimental studies, chicken meat samples irradiated at doses ranging from 2.5 to 10 kGy were assayed and correctly identified as treated. Quantitative comparison between the ELISA and GC-MS revealed a good correlation (r² = 0.913) between the two methodologies for the concentrations of 2-CB detected in irradiated samples.
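As an illustrative note (not the authors' code), the sketch below computes the method-comparison statistic quoted above: the coefficient of determination r² between paired concentrations measured by the two techniques. The sample values are invented for demonstration.

```python
import numpy as np

# Hypothetical paired 2-CB concentrations for the same samples (units arbitrary)
elisa = np.array([0.21, 0.45, 0.88, 1.30, 2.10])  # ELISA measurements
gcms = np.array([0.19, 0.50, 0.81, 1.42, 2.00])   # paired GC-MS measurements

# Pearson correlation coefficient between the two methods, then squared
r = np.corrcoef(elisa, gcms)[0, 1]
print(f"r^2 = {r**2:.3f}")  # the paper reports r^2 = 0.913 on its real data
```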
Abstract:
After demonstrating the lack of effectiveness of standard antibiotics against the acquired antibiotic resistance of Bacillus cereus (NCTC 10989), Escherichia coli (NCTC 1186), and Staphylococcus aureus (ATCC 12715), we showed that the following natural substances were antibacterial against these resistant pathogens: cinnamon oil, oregano oil, thyme oil, carvacrol, (S)-perillaldehyde, 2,4-dihydroxybenzoic acid (beta-resorcylic acid), and 3,4-dihydroxyphenethylamine (dopamine). Exposure of the three pathogens to a dilution series of the test compounds showed that oregano oil was the most active substance. The oils and pure compounds exhibited exceptional activity against B. cereus vegetative cells, with oregano oil being active at nanogram-per-milliliter levels. In contrast, activities against B. cereus spores were very low. Activities of the test compounds were in the following approximate order: oregano oil > thyme oil ≈ carvacrol > cinnamon oil > perillaldehyde > dopamine > beta-resorcylic acid. The order of susceptibilities of the pathogens to inactivation was as follows: B. cereus (vegetative) ≫ S. aureus ≈ E. coli ≫ B. cereus (spores). Some of the test substances may be effective against antibiotic-resistant bacteria in foods and feeds.
Abstract:
Regulatory authorities, the food industry and the consumer demand reliable determination of chemical contaminants present in foods. A relatively new analytical technique that addresses this need is an immunobiosensor based on surface plasmon resonance (SPR) measurements. Although a range of tests have been developed to measure residues in milk, meat, animal bile and honey, a considerable problem has been encountered with both serum and plasma samples. The high degree of non-specific binding of some sample components can lead to loss of assay robustness, increased rates of false positives and general loss of assay sensitivity. In this paper we describe a straightforward precipitation technique to remove interfering substances from serum samples to be analysed for veterinary anthelmintics by SPR. This technique enabled development of an assay to detect a wide range of benzimidazole residues in serum samples by immunobiosensor. The limit of quantification was below 5 ng/ml and coefficients of variation were about 2%.
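A generic sketch (not the authors' procedure) of the two validation statistics quoted above: the coefficient of variation of replicate measurements, and a blank-based limit of quantification, here taken as the mean blank response plus ten standard deviations, a common convention. All readings below are invented.

```python
import numpy as np

# Hypothetical replicate readings of one serum sample (ng/ml)
replicates = np.array([4.9, 5.1, 5.0, 4.8, 5.2])
cv = replicates.std(ddof=1) / replicates.mean() * 100
print(f"CV = {cv:.1f}%")  # the assay above reports CVs of about 2%

# Hypothetical blank (negative serum) responses; LOQ = mean + 10 SD convention
blanks = np.array([0.10, 0.12, 0.09, 0.11, 0.10])
loq = blanks.mean() + 10 * blanks.std(ddof=1)
print(f"LOQ (blank + 10 SD) = {loq:.2f} ng/ml")
```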
Abstract:
Studies of individual nutrients or foods have revealed much about dietary influences on bone. Multiple food or nutrient approaches, such as dietary pattern analysis, could offer further insight, but research is limited and largely confined to older adults. We examined the relationship between dietary patterns, obtained by a posteriori and a priori methods, and bone mineral status (BMS; a collective term for bone mineral content (BMC) and bone mineral density (BMD)) in young adults (20-25 years; n = 489). Diet was assessed by 7 d diet history, and BMD and BMC were determined at the lumbar spine and femoral neck (FN). A posteriori dietary patterns were derived using principal component analysis (PCA) and three a priori dietary quality scores were applied (dietary diversity score (DDS), nutritional risk score and Mediterranean diet score). For the PCA-derived dietary patterns, women in the top compared to the bottom fifth of the 'Nuts and Meat' pattern had greater FN BMD by 0.074 g/cm² (P=0.049) and FN BMC by 0.40 g (P=0.034) after adjustment for confounders. Similarly, men in the top compared to the bottom fifth of the 'Refined' pattern had lower FN BMC by 0.41 g (P=0.049). For the a priori DDS, women in the top compared to the bottom third had lower FN BMD by 0.05 g/cm² after adjustments (P=0.052), but no other relationships with BMS were identified. In conclusion, adherence to a 'Nuts and Meat' dietary pattern may be associated with greater BMS in young women, and a 'Refined' dietary pattern may be detrimental to bone health in young men.
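A minimal sketch of the a posteriori approach described above, on simulated stand-in data (this is not the study's code): dietary patterns are extracted from a participants-by-food-groups intake matrix with PCA, each participant is scored on a pattern, and the scores are split into fifths for comparison of bone outcomes.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Simulated intake matrix: 489 participants x 30 food groups (g/d)
rng = np.random.default_rng(0)
intakes = pd.DataFrame(rng.gamma(2.0, 50.0, size=(489, 30)),
                       columns=[f"food_group_{i}" for i in range(30)])

# Standardise intakes, then extract components; each component is a dietary
# pattern, and its loadings show which food groups characterise it
pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(intakes))

# Score participants on the first pattern and divide into fifths (1 = bottom)
fifths = pd.qcut(scores[:, 0], 5, labels=False) + 1
print(pd.Series(fifths).value_counts().sort_index())
```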
Abstract:
Evidence is accumulating that vitamin D may be protective against carcinogenesis, although exceptions have been observed for some digestive tract neoplasms. The aim of the present study was to explore the association between dietary vitamin D and related nutrients and the risk of oesophageal adenocarcinoma and its precursor conditions, Barrett's oesophagus and reflux oesophagitis. In an all-Ireland case-control study conducted between March 2002 and July 2005, 218 oesophageal adenocarcinoma patients, 212 Barrett's oesophagus patients, 208 reflux oesophagitis patients and 252 population-based controls completed a 101-item FFQ, and provided lifestyle and demographic information. Multiple logistic regression analysis was applied to examine the association between dietary intake and disease risk. Oesophageal adenocarcinoma risk was significantly greater for individuals with the highest compared with the lowest tertile of vitamin D intake (OR 1.99, 95% CI 1.03, 3.86; P for trend = 0.02). The direct association could not be attributed to a particular vitamin D food source. Vitamin D intake was unrelated to Barrett's oesophagus and reflux oesophagitis risk. No significant associations were observed for Ca or dairy intake and oesophageal adenocarcinoma, Barrett's oesophagus or reflux oesophagitis development. High vitamin D intake may increase oesophageal adenocarcinoma risk but is not related to reflux oesophagitis and Barrett's oesophagus. Ca and dairy product intake did not influence the development of these oesophageal lesions. These findings suggest that there may be population subgroups at an increased risk of oesophageal adenocarcinoma if advice to improve vitamin D intake from foods is implemented. Limited work has been conducted in this area, and further research is required.
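A hedged sketch of the tertile analysis described above, on simulated data (not the study's): case/control status is regressed on indicator variables for the middle and top tertiles of intake, and exponentiated coefficients give odds ratios relative to the bottom tertile.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 470
vit_d = rng.normal(3.0, 1.0, n)  # hypothetical daily vitamin D intake (ug/d)
tertile = pd.qcut(vit_d, 3, labels=["T1", "T2", "T3"])
# Simulated outcome with a higher case probability in the top tertile
case = rng.binomial(1, np.where(tertile == "T3", 0.60, 0.45))

# Indicator coding: T1 (bottom tertile) is the reference category
X = sm.add_constant(pd.get_dummies(tertile, drop_first=True).astype(float))
fit = sm.Logit(case, X).fit(disp=0)
print(np.exp(fit.params))      # odds ratios vs T1 (cf. OR 1.99 in the paper)
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```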
Abstract:
In the UK, vitamin B-12 deficiency occurs in approximately 20% of adults aged >65 years. This incidence is significantly higher than that among the general population. The reported incidence invariably depends on the criteria of deficiency used; indeed, estimates rise to 24% and 46% among free-living and institutionalised elderly people, respectively, when methylmalonic acid is used as a marker of vitamin B-12 status. The incidence of, and the criteria for diagnosis of, deficiency have drawn much attention recently in the wake of the implementation of folic acid fortification of flour in the USA. This fortification strategy has proved extremely successful in increasing folic acid intakes preconceptionally, thereby reducing the incidence of neural-tube defects among babies born in the USA since 1998. However, in successfully delivering additional folic acid to pregnant women, fortification also increases the folic acid consumption of everyone who consumes products containing flour, including the elderly. It is argued that consuming additional folic acid (as 'synthetic' pteroylglutamic acid) from fortified foods increases the risk of 'masking' megaloblastic anaemia caused by vitamin B-12 deficiency. Thus, a number of issues arise for discussion. Are clinicians forced to rely on megaloblastic anaemia as the only sign of possible vitamin B-12 deficiency? Is serum vitamin B-12 alone adequate to confirm vitamin B-12 deficiency, or should other diagnostic markers be used routinely in clinical practice? Is the level of intake of folic acid among the elderly (post-fortification) likely to be so high as to cure or 'mask' the anaemia associated with vitamin B-12 deficiency?
Abstract:
Committees worldwide have set almost identical folate recommendations for the prevention of the first occurrence of neural tube defects (NTDs). We evaluate these recommendations by reviewing the results of intervention studies that examined the response of red blood cell folate to altered folate intake. Three options are suggested to achieve the extra 400 µg folic acid/d being recommended by the official committees: increased intake of folate-rich foods, dietary folic acid supplementation, and folic acid fortification of food. A significant increase in foods naturally rich in folates was shown to be a relatively ineffective means of increasing red blood cell folate status in women compared with equivalent intakes of folic acid-fortified food, presumably because the synthetic form of the vitamin is more stable and more bioavailable. Although folic acid supplements are highly effective in optimizing folate status, supplementation is not an effective strategy for the primary prevention of NTDs because of poor compliance. Thus, food fortification is seen by many as the only option likely to succeed. Mandatory folic acid fortification of grain products was introduced recently in the United States at a level projected to provide an additional mean intake of 100 µg folic acid/d, but some feel that this policy does not go far enough. A recent clinical trial predicted that the additional intake of folic acid in the United States will reduce NTDs by >20%, whereas 200 µg/d would be highly protective and is the dose also shown to be optimal in lowering plasma homocysteine, with possible benefits in preventing cardiovascular disease. Thus, an amount lower than the current target of an extra 400 µg/d may be sufficient to increase red blood cell folate to concentrations associated with the lowest risk of NTDs, but further investigation is warranted to establish the optimal amount.
Abstract:
Background: Mandatory fortification of grain products with folic acid was introduced recently in the United States, a policy expected to result in a mean additional intake of 100 µg/d. One way of predicting the effectiveness of this measure is to determine the effect of removing a similar amount of folic acid as fortified food from the diets of young women who had been electively exposed to chronic fortification.
Objective: The objective was to examine the effect on folate status of foods fortified with low amounts of folic acid.
Design: We investigated the changes in dietary intakes and in red blood cell and serum concentrations of folate in response to removing folic acid-fortified foods for 12 wk from the diets of women who reportedly consumed such foods at least once weekly (consumers).
Results: Consumers (n = 21) had higher total folate intakes (P = 0.002) and red blood cell folate concentrations (P = 0.023) than nonconsumers (women who consumed folic acid-fortified foods less than once weekly; n = 30). Of greater interest, a 12-wk intervention involving the exclusion of these foods resulted in a decrease in folate intake of 78 ± 56 µg/d (P < 0.001), which was reflected in a significant reduction in red blood cell folate concentrations (P < 0.05).
Conclusions: Cessation of eating folic acid-fortified foods resulted in removing 78 µg folic acid/d from the diet. Over 12 wk this resulted in a lowering of red blood cell folate concentrations by 111 nmol/L (49 µg/L). This magnitude of change in folate status in women can be anticipated as a result of the new US fortification legislation and is predicted to have a significant, although not optimal, effect in preventing neural tube defects.
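As a quick check of the unit conversion quoted in the conclusions (ours, not the authors'), the fall of 111 nmol/L can be re-expressed in mass units using the molar mass of folic acid, approximately 441.4 g/mol:

```python
# Molar mass of folic acid: ~441.4 g/mol, i.e. 441.4 ng per nmol
MOLAR_MASS_NG_PER_NMOL = 441.4

fall_nmol_per_l = 111.0                                            # reported fall
fall_ug_per_l = fall_nmol_per_l * MOLAR_MASS_NG_PER_NMOL / 1000.0  # ng/L -> ug/L
print(f"{fall_ug_per_l:.0f} ug/L")                                 # ~49 ug/L, as quoted
```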
Abstract:
Background Recommendations by the UK Department of Health suggest that protection from neural tube defects (NTD) can be achieved through intakes of an extra 400 µg daily of folate/folic acid as natural food, foods fortified with folic acid, or supplements. The assumption is that all three routes of intervention would have equal effects on folate status.
Methods We assessed the effectiveness of these suggested routes of intervention in optimising folate status. 62 women were recruited from the University staff and students to take part in a 3-month intervention study. Participants were randomly assigned to one of the following five groups: folic acid supplement (400 µg/day; I); folic-acid-fortified foods (an additional 400 µg/day; II); dietary folate (an additional 400 µg/day; III); dietary advice (IV); and control (V). Responses to intervention were assessed as changes in red-cell folate between preintervention and postintervention values.
Findings 41 women completed the intervention study. Red-cell folate concentrations increased significantly over the 3 months only in the groups taking folic acid supplements (group I) or food fortified with folic acid (group II) (p<0.01 for both groups). By contrast, although aggressive intervention with dietary folate (group III) or dietary advice (group IV) significantly increased intake of food folate (p<0.001 and p<0.05, respectively), there was no significant change in folate status.
Interpretation We have shown that compared with supplements and fortified food, consumption of extra folate as natural food folate is relatively ineffective at increasing folate status. We believe that advice to women to consume folate-rich foods as a means to optimise folate status is misleading.
Abstract:
Rice has been demonstrated to be one of the major contributors to inorganic arsenic (i-As) intake in humans. However, little is known about rice products as an additional source of i-As exposure. In this study, misos, syrups and amazake (a fermented sweet rice drink) produced from rice, barley and millet were analysed for total arsenic (t-As), and a subset of samples was also analysed for As speciation. Rice-based products displayed a higher i-As content than those derived from barley and millet. Most of the t-As in the rice products studied was inorganic (63-83%), the remainder being dimethylarsinic acid. Those who regularly consume rice drinks and condiments, such as the Japanese population and those who follow health-conscious diets based on Japanese cuisine, could reach up to 23% of the World Health Organization's Provisional Tolerable Daily Intake of i-As by consuming these kinds of products alone. This study provides a wide appreciation of how i-As derived from rice-based products enters the human diet and how this may be of concern to populations who are already exposed to high levels of i-As through consumption of foods such as rice and seaweed.
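The percentage-of-PTDI figure rests on simple exposure arithmetic, sketched below with invented portion sizes and i-As concentrations (the paper's actual values are not reproduced here). The PTDI of roughly 2.1 µg/kg body weight per day used in the i-As literature of this period, and the 60 kg body weight, are assumptions.

```python
# Assumed reference values (not taken from the paper)
PTDI_UG_PER_KG_BW = 2.1   # PTDI for inorganic As, ug/kg bw/day (assumed)
BODY_WEIGHT_KG = 60.0     # assumed adult body weight

# Hypothetical daily consumption (g) and i-As concentration (ug/g) per product
products = {
    "rice syrup": (30.0, 0.30),
    "rice miso": (20.0, 0.10),
    "amazake": (150.0, 0.05),
}

intake_ug = sum(grams * conc for grams, conc in products.values())
fraction = intake_ug / (PTDI_UG_PER_KG_BW * BODY_WEIGHT_KG)
print(f"daily i-As intake: {intake_ug:.1f} ug = {fraction:.0%} of the PTDI")
```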
Abstract:
Selenium (Se) is an essential micronutrient for many organisms, including plants, animals and humans. As plants are the main source of dietary Se, plant Se metabolism is important for the Se nutrition of humans and other animals. However, the concentration of Se in plant foods varies between areas, and too much Se can lead to toxicity. As we discuss here, plant Se uptake and metabolism can be exploited for the purposes of developing high-Se crop cultivars and for plant-mediated removal of excess Se from soil or water. Here, we review key developments in the current understanding of Se in higher plants. We also discuss recent advances in the genetic engineering of Se metabolism, particularly for biofortification and phytoremediation of Se-contaminated environments.
Abstract:
High levels of As in groundwater, commonly found in Bangladesh and other parts of Asia, pose a risk not only through drinking-water consumption but also to agricultural sustainability and food safety. This review attempts to provide an overview of current knowledge and gaps related to the assessment and management of these risks, including the behaviour of As in the soil-plant system, uptake, phytotoxicity, As speciation in foods, dietary habits, and human health risks. Special emphasis has been given to the situation in Bangladesh, where groundwater drawn via shallow tube wells is the most important source of irrigation water in the dry season. Within the soil-plant system, there is a distinct difference in the behaviour of As under flooded conditions, where arsenite (As(III)) predominates, and under nonflooded conditions, where arsenate (As(V)) predominates. The former is regarded as the more toxic to humans and plants. Limited data indicate that As-contaminated irrigation water can result in a slow buildup of As in the topsoil. In some cases the buildup is reflected in the As levels in crops; in others it is not. It is not yet possible to predict As uptake and toxicity in plants based on soil parameters. It is unknown under what conditions and in what time frame As is building up in the soil. Representative phytotoxicity data necessary to evaluate current and future soil concentrations are not yet available. Although there are no indications that crop production is currently inhibited by As, long-term risks are clearly present. Therefore, alongside continuing assessment of the risks, management options to prevent further As accumulation in the topsoil should be explored now. With regard to human health, data on As speciation in foods in combination with food consumption data are needed to assess dietary exposure, and these data should include spatial and seasonal variability. It is important to control confounding factors in assessing the risks. In a country where malnutrition is prevalent, levels of inorganic As in foods should be balanced against the nutritional value of those foods. Regarding agriculture, As is only one of many factors that may pose a risk to the sustainability of crop production. Other risk factors, such as nutrient depletion and loss of organic matter, must also be taken into account to set priorities in terms of research, management, and overall strategy.
Abstract:
Aims: This study assessed the efficacy of a school-based healthy lifestyle intervention (Sport for LIFE) for increasing physical activity, decreasing sedentary behaviour, reducing screen time behaviour, encouraging healthy attitudes and behaviour to nutrition, and reducing body mass index (BMI) in 8–9-year-old primary school children from lower socioeconomic backgrounds in Northern Ireland.
Methods: In total, 416 children from 24 schools took part in a non-randomised controlled trial. Schools were randomly assigned to one of two groups, an intervention group or a control group, with 12 schools in each. The intervention group received a 12-week school-based programme based on social cognitive theory. At baseline and follow-up, groups completed questionnaires assessing physical activity, screen time behaviour and dietary patterns. On each occasion anthropometric assessments of height and weight were taken. Physical activity and sedentary behaviour were measured by accelerometry.
Results: Significant effects were observed for vigorous, moderate and light activity in the intervention group at follow-up. Sedentary behaviour was significantly reduced in the intervention group but not in the control group. No significant effects of the intervention were shown on BMI, screen time behaviour or attitudes to nutrition, with the exception of non-core foods.
Conclusions: The programme was effective in increasing physical activity and reducing sedentary behaviour; however, no significant changes in screen time behaviour or attitude to nutrition, with the exception of non-core foods, were observed. Future research ideas are offered for tackling low levels of physical activity in children.
Abstract:
Aflatoxins are a family of fungal toxins that are carcinogenic to man and cause immunosuppression, cancer and growth reduction in animals. We conducted a cross-sectional study among 480 children (age 9 months to 5 years) across 4 agro-ecological zones (SS, NGS, SGS and CS) in Benin and Togo to identify the effect of aflatoxin exposure on child growth and assess the pattern of exposure. Prior reports on this study [Gong, Y.Y., Cardwell, K., Hounsa, A., Egal, S., Turner, P.C., Hall, A.J., Wild, C.P., 2002. Dietary aflatoxin exposure and impaired growth in young children from Benin and Togo: cross sectional study. British Medical Journal 325, 20-21; Gong, Y.Y., Egal, S., Hounsa, A., Turner, P.C., Hall, A.J., Cardwell, K., Wild, C.P., 2003. Determinants of aflatoxin exposure in young children from Benin and Togo, West Africa: the critical role of weaning and weaning foods. International Journal of Epidemiology 32, 556-562] showed that aflatoxin exposure among these children is widespread (99%) and that growth faltering is associated with high blood aflatoxin-albumin adducts (AF-alb adducts), a measure of recent past exposure. The present report demonstrates that consumption of maize is an important source of aflatoxin exposure for the survey population. Higher AF-alb adducts were correlated with higher A. flavus infestation (CFU) of maize (p=0.006), higher aflatoxin contamination (ppb) of maize (p<0.0001) and higher consumption frequencies of maize (p=0.053). The likelihood of aflatoxin exposure from maize was particularly high in agro-ecological zones where the frequency of maize consumption (SGS and CS), the presence of aflatoxin in maize (SGS) or the presence of A. flavus on maize (NGS and SGS) was relatively high. Socio-economic background did not affect the presence of A. flavus and aflatoxin in maize, but better maternal education was associated with lower frequencies of maize consumption among children from the northernmost agro-ecological zone (SS) (p=0.001). The impact of groundnut consumption on aflatoxin exposure was limited in this population. High AF-alb adduct levels were correlated with high prevalence of A. flavus and aflatoxin in groundnut, but significance was weak after adjustment for weaning status, agro-ecological zone and maternal socio-economic status (p=0.091 and p=0.083, respectively). Ingestion of A. flavus and aflatoxin was high in certain agro-ecological zones (SS and SGS) and among the higher socio-economic strata due to higher frequencies of groundnut consumption. Contamination of groundnuts was similar across socio-economic and agro-ecological boundaries.
In conclusion, dietary exposure to aflatoxin from groundnuts was less than that from maize in young children from Benin and Togo. Intervention strategies that aim to reduce dietary exposure in this population need to focus on maize consumption in particular, but they should not ignore consumption of groundnuts.
Abstract:
Background Aflatoxins are fungal metabolites that frequently contaminate staple foods in much of sub-Saharan Africa, and are associated with increased risk of liver cancer and impaired growth in young children. We aimed to assess whether postharvest measures to restrict aflatoxin contamination of groundnut crops could reduce exposure in west African villages.
Methods We undertook an intervention study at subsistence farms in the lower Kindia region of Guinea. Farms from 20 villages were included: ten villages implemented a package of postharvest measures to restrict aflatoxin contamination of the groundnut crop, and ten control villages followed usual postharvest practices. We measured the concentrations of blood aflatoxin-albumin adducts in 600 people immediately after harvest and at 3 months and 5 months postharvest to monitor the effectiveness of the intervention.
Findings In control villages mean aflatoxin-albumin concentration increased postharvest (from 5.5 pg/mg [95% CI 4.7-6.1] immediately after harvest to 18.7 pg/mg [17.0-20.6] 5 months later). By contrast, mean aflatoxin-albumin concentration in intervention villages after 5 months of groundnut storage was much the same as that immediately postharvest (7.2 pg/mg [6.2-8.4] vs 8.0 pg/mg [7.0-9.2]). At 5 months, mean adduct concentration in intervention villages was less than 50% of that in control villages (8.0 pg/mg [7.0-9.2] vs 18.7 pg/mg [17.0-20.6], p<0.0001). About a third of people had non-detectable aflatoxin-albumin concentrations at harvest. At 5 months, five (2%) people in the control villages had non-detectable adduct concentrations, compared with 47 (20%) of those in the intervention group (p<0.0001). Mean concentrations of aflatoxin B1 in groundnuts in household stores in intervention and control villages were consistent with the measurements of aflatoxin-albumin adducts.
Interpretation Use of low-technology approaches at the subsistence-farm level in sub-Saharan Africa could substantially reduce the disease burden caused by aflatoxin exposure.