934 results for "Nutrition status"
Abstract:
Background: Body cell mass (BCM) may be estimated in clinical practice to assess functional nutritional status, e.g., in patients with anorexia nervosa. Interpretation of the data, especially in younger patients who are still growing, requires appropriate adjustment for size. Previous investigations of this general issue have addressed chemical rather than functional components of body composition and have not considered patients at the extremes of nutritional status, in whom the ability to make longitudinal comparisons is of particular importance. Objective: Our objective was to determine the power by which height should be raised to adjust BCM for height in women of differing nutritional status. Design: BCM was estimated by K-40 counting in 58 healthy women, 33 healthy female adolescents, and 75 female adolescents with anorexia nervosa. The relation between BCM and height was explored in each group by using log-log regression analysis. Results: The powers by which height should be raised to adjust BCM were 1.73, 1.73, and 2.07 in the women, healthy female adolescents, and anorexic female adolescents, respectively. A simplified version of the index, BCM/height², was appropriate for all 3 categories and was negligibly correlated with height. Conclusions: In normal-weight women, the relation between height and BCM is consistent with that reported previously between height and fat-free mass. Although the consistency of the relation between BCM and fat-free mass decreases with increasing weight loss, the relation between height and BCM is not significantly different between normal-weight and underweight women. The index BCM/height² is easy to calculate and applicable to both healthy and underweight women. This information may be helpful in interpreting body-composition data in clinical practice.
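The height-power adjustment described above can be illustrated with a short sketch. The snippet below (Python, with made-up example values rather than the study data) fits the log-log regression used to estimate the height exponent and then computes the simplified index BCM/height²:

    import numpy as np

    # Hypothetical example values (not the study data): stature in metres, BCM in kg.
    height_m = np.array([1.58, 1.62, 1.65, 1.70, 1.74, 1.79])
    bcm_kg = np.array([21.5, 22.8, 23.1, 25.0, 26.2, 27.9])

    # Fit log(BCM) = log(a) + b*log(height); the slope b is the power used to adjust BCM for height.
    b, log_a = np.polyfit(np.log(height_m), np.log(bcm_kg), 1)
    print(f"estimated power b = {b:.2f}")

    # Simplified index, analogous to BMI: BCM / height^2 (kg/m^2).
    index = bcm_kg / height_m**2
    # A near-zero correlation between the index and height indicates adequate adjustment.
    print(f"r(index, height) = {np.corrcoef(index, height_m)[0, 1]:.2f}")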
Abstract:
Malnutrition is a common problem in children with end-stage liver disease (ESLD), and accurate assessment of nutritional status is essential in managing these children. In a retrospective study, we compared nutritional assessment by anthropometry with that by body composition. We analyzed all consecutive measurements of total body potassium (TBK, n = 186) of children less than 3 years old with ESLD awaiting transplantation found in our database. The TBK values obtained by whole body counting of 40K were compared with reference TBK values of healthy children. The prevalence of malnutrition, as assessed by weight (weight Z score < -2), was 28%, which was significantly lower (chi-square test, p < 0.0001) than the prevalence of malnutrition (76%) assessed by TBK (< 90% of expected TBK for age). These results demonstrated that body weight underestimated the nutritional deficit and stressed the importance of measuring body composition as part of assessing nutritional status of children with ESLD.
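As a rough illustration of the comparison of classification rates reported above, the sketch below reconstructs approximate counts from the stated percentages (28% vs 76% of 186 measurements) and applies a chi-square test; it is not the study's analysis code.

    from scipy.stats import chi2_contingency

    n = 186                      # number of TBK measurements reported
    by_weight = round(0.28 * n)  # ~28% malnourished by weight Z score < -2
    by_tbk = round(0.76 * n)     # ~76% malnourished by TBK < 90% of expected for age

    # 2 x 2 table: rows = classification method, columns = malnourished / not malnourished.
    table = [[by_weight, n - by_weight],
             [by_tbk, n - by_tbk]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.1f}, p = {p:.2g}")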
Abstract:
Objective: To investigate measures aimed at defining the nutritional status of cystic fibrosis (CF) populations, this study compared standard anthropometric measurements and total body potassium (TBK) as indicators of malnutrition. Methods: Height, weight, and TBK measurements of 226 children with CF from Royal Children's Hospital, Brisbane, Australia, were analyzed. Z scores for height for age, weight for age, and weight for height were analyzed by means of the National Centre for Health Statistics reference. TBK was measured by means of whole body counting and compared with predicted TBK for age. Two criteria were evaluated with respect to malnutrition: (1) a z score < -2.0 and (2) a TBK for age <80% of predicted. Results: Males and females with CF had lower mean height-for-age and weight-for-age z scores than the National Centre for Health Statistics reference (P < .01), but mean weight-for-height z score was not significantly different. There were no significant gender differences. According to anthropometry, only 7.5% of this population were underweight and 7.6% were stunted. However, with TBK as an indicator of nutritional status, 29.9% of males and 22.0% of females were malnourished. Conclusion: There are large differences in the percentage of patients with CF identified as malnourished depending on whether anthropometry or body composition data are used as the nutritional indicator. At an individual level, weight-based indicators are not sensitive indicators of suboptimal nutritional status in CF, significantly underestimating the extent of malnutrition. Current recommendations in which anthropometry is used as the indicator of malnutrition in CF should be revised.
Abstract:
School canteens represent Australia's largest take-away food outlet. With changes in lifestyles and family roles, canteens are used increasingly as a source of food for students. The nutritional quality of foods offered can have a significant impact on the nutritional status of students both now and in the future. The Australian Nutrition Foundation has been developing its work in the field of school canteens over the past six years. Perhaps its most significant contribution to improving the healthiness of canteens has been the development of the "Food Selection Guidelines for Children and Adolescents". These Guidelines are used to assess foods most suitable for sale in school canteens and for purchasing food in boarding schools. Products meeting the Guidelines are added to the ANF Registered Product List, which school canteens and kitchens use as a type of "buying guide". This project has been successfully piloted in Queensland and this year has been expanded to a national campaign.
Abstract:
In the seasonally dry tropics of northern Australia, breeder cows may lose up to 30% of liveweight during the dry season when pasture is of low nutritive value. This is a major cause of low reproductive rates and high mortality. Weaning early in the dry season is effective in reducing this liveweight loss of the breeder (Holroyd et al. 1988). An experiment examined the dry-season liveweight loss of breeders for a range of weaning times and levels of nutrition. From April to October, through the dry season, 209 Bos indicus x Shorthorn cross cows 4-6 years of age grazed speargrass pastures in north Queensland. The cows had been joined with bulls from late January until April. Twenty-nine breeders had not suckled a calf during the previous wet season (DRY cows). In addition, 180 cows lactating in April were weaned in late April, mid-July or early September. The cows were allocated by stratified randomisation, based on lactational status, stage of pregnancy and body condition, to 15 x 40 ha paddocks. Five paddocks with low-fertility soils provided LOW nutrition, while 10 paddocks with medium-fertility soils, without or with supplementation, provided MEDIUM and HIGH nutrition, respectively. The supplement consisted of molasses containing 14% urea offered ad libitum. Liveweight was measured at intervals and conceptus-free liveweight (CF-LW) calculated. Data were analysed by AOV within groups of paddocks.
(Published in: Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies (AAAP), the 23rd Biennial Conference of the Australian Society of Animal Production (ASAP) and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), 2-7 July 2000, Sydney, Australia.)
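As a rough sketch of the stratified randomisation described above (not the authors' actual procedure), allocation within strata defined by lactational status, pregnancy stage and body condition might look like the following, with hypothetical field names and records:

    import random
    from collections import defaultdict

    # Hypothetical records: one dict per cow with the stratification variables.
    cows = [
        {"id": 1, "lactation": "weaned_april", "pregnancy": "early", "condition": "moderate"},
        {"id": 2, "lactation": "dry", "pregnancy": "none", "condition": "low"},
        # ... one record per cow (209 in the experiment)
    ]
    paddocks = list(range(15))  # 15 x 40 ha paddocks

    # Group cows into strata, shuffle within each stratum, then deal them out across
    # paddocks in rotation so that each stratum is spread as evenly as possible.
    strata = defaultdict(list)
    for cow in cows:
        strata[(cow["lactation"], cow["pregnancy"], cow["condition"])].append(cow)

    allocation = defaultdict(list)
    i = 0
    for group in strata.values():
        random.shuffle(group)
        for cow in group:
            allocation[paddocks[i % len(paddocks)]].append(cow["id"])
            i += 1
    print(dict(allocation))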
Abstract:
Urban encroachment on dense, coastal koala populations has ensured that their management has received increasing government and public attention. The recently developed National Koala Conservation Strategy calls for maintenance of viable populations in the wild. Yet the success of this, and other, conservation initiatives is hampered by a lack of reliable and generally accepted national and regional population estimates. In this paper we address this problem in a potentially large, but poorly studied, regional population in the State that is likely to have the largest wild populations. We draw on findings from previous reports in this series and apply the faecal standing-crop method (FSCM) to derive a regional estimate of more than 59 000 individuals. Validation trials in riverine communities showed that estimates of animal density obtained from the FSCM and direct observation were in close agreement. Bootstrapping and Monte Carlo simulations were used to obtain variance estimates for our population estimates in different vegetation associations across the region. The most favoured habitat was riverine vegetation, which covered only 0.9% of the region but supported 45% of the koalas. We also estimated that between 1969 and 1995 approximately 30% of the native vegetation associations that are considered potential koala habitat were cleared, leading to a decline of perhaps 10% in koala numbers. Management of this large regional population has significant implications for the national conservation of the species: the continued viability of this population is critically dependent on the retention and management of riverine and residual vegetation communities, and future vegetation-management guidelines should be cognisant of the potential impacts of clearing even small areas of critical habitat. We also highlight eight management implications.
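A minimal sketch of the kind of bootstrap used to attach variance estimates to a habitat-stratified population estimate (all values hypothetical, not taken from the study):

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical per-site densities (koalas/ha) from faecal standing-crop surveys,
    # and a hypothetical habitat area for one vegetation association.
    site_density = np.array([0.8, 1.4, 0.3, 2.1, 0.9, 1.7, 0.5, 1.1])
    habitat_area_ha = 5000

    boot = []
    for _ in range(10000):
        resample = rng.choice(site_density, size=site_density.size, replace=True)
        boot.append(resample.mean() * habitat_area_ha)   # population estimate per replicate
    boot = np.array(boot)

    print(f"point estimate: {site_density.mean() * habitat_area_ha:.0f} koalas")
    print(f"bootstrap SE: {boot.std(ddof=1):.0f}")
    print("95% CI:", np.percentile(boot, [2.5, 97.5]).round(0))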
Abstract:
In previous experiments, increased leaf phosphorus (P) content with increasing P supply enhanced the individual leaf expansion and water content of fresh cotton leaves in a severely drying soil. In this paper, we report on the bulk water content of leaves and its components, free and bound water, along with other measures of plant water status, in expanding cotton leaves of various ages in a drying soil with different P concentrations. The bound water in living tissue is more likely to play a major role in tolerance to abiotic stresses by maintaining the structural integrity and/or cell wall extensibility of the leaves, whilst an increased amount of free water might enhance solute accumulation, leading to better osmotic adjustment and tolerance to water stress, and maintenance of the volumes of sub-cellular compartments for expansive leaf growth. There were strong correlations between leaf P%, leaf water (total, free and bound water) and leaf expansion rate (LER) under water-stress conditions in a severely drying soil. Increased soil P enhanced the uptake of P from a drying soil, leading to increased supply of osmotically active inorganic solutes to the cells in growing leaves. This appears to have led to the accumulation of free water and more bound water, ultimately leading to increased leaf expansion rates as compared with plants in low-P soil under similar water-stress conditions. The greater amount of bound and free water in the high-P plants was not necessarily associated with changes in cell turgor, and appears to have maintained the cell-wall properties and extensibility under water-stressed conditions in soils that are nutritionally P-deficient.
Abstract:
Near infrared (NIR) spectroscopy, usually in reflectance mode, has been applied to the analysis of faeces to measure the concentrations of constituents such as total N, fibre, tannins and delta C-13. In addition, an unusual and exciting application of faecal NIR (F.NIR) analyses is to directly predict attributes of the diet of herbivores such as crude protein and fibre contents, proportions of plant species and morphological components, diet digestibility and voluntary DM intake. This is an unusual application of NIR spectroscopy insofar as the spectral measurements are made, not on the material of interest (i.e. the diet), but on a derived material (i.e. faeces). Predictions of diet attributes from faecal spectra clearly depend on there being sufficient NIR spectral information in the diet residues present in faeces to describe the diet, although endogenous components of faeces such as undigested debris of micro-organisms from the rumen and large intestine and secretions into the gastrointestinal tract will also contribute spectral information. Spectra of forage and of faeces derived from the forage are generally similar, and the observed differences are principally in the spectral regions associated with constituents of forages known to be of low, or of high, digestibility. Some diet components (for example, urea) which are likely to be entirely digested apparently cannot be predicted from faecal NIR spectra because they cannot contribute to faecal spectra except through modifying the microbial and endogenous components. The errors and robustness of F.NIR calibrations to predict the crude protein concentration and digestibility of the diet of herbivores are generally comparable with those to directly predict the same attributes in forage from NIR spectra of the forage. Some attributes of the animal, such as species, gender, pregnancy status and parasite burden, have been successfully discriminated into classes based on their faecal NIR spectra. Such discrimination was likely associated with differences in the diet selected and/or differences in the metabolites excreted in the faeces. NIR spectroscopy of faeces has usually involved scanning dried and ground samples in monochromators in the 400-2500 nm or 1100-2500 nm ranges. Results satisfactory for the purpose have also been reported for dried and ground faeces scanned using a diode array instrument in the 800-1700 nm range and for wet faeces and slurries of excreta scanned with monochromators. Chemometric analysis of faecal spectra has generally used the approaches established for forage analysis. The capacity to predict many attributes of the diet, and some aspects of animal physiology, from NIR spectra of faeces is particularly useful to study the quality and quantity of the diet selected by both domestic and feral grazing herbivores and to enhance production and management of both herbivores and their grazing environment.
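Chemometric calibrations of this kind are commonly built with partial least squares (PLS) regression; the abstract does not name a specific algorithm, so the following is only a generic, hypothetical sketch with simulated spectra and reference values:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_samples, n_bands = 120, 700          # e.g. 1100-2500 nm at 2 nm intervals
    spectra = rng.normal(size=(n_samples, n_bands))  # stand-in for log(1/R) faecal spectra
    # Stand-in diet crude protein reference values, loosely tied to one band plus noise.
    diet_cp = 8 + 2 * spectra[:, 150] + rng.normal(scale=0.5, size=n_samples)

    model = PLSRegression(n_components=8)
    r2_cv = cross_val_score(model, spectra, diet_cp, cv=5)   # default scorer is R^2
    print(f"cross-validated R2: {r2_cv.mean():.2f}")

    model.fit(spectra, diet_cp)
    predictions = model.predict(spectra)   # in practice: predict diet CP for new faecal spectra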
Abstract:
Varying the spatial distribution of applied nitrogen (N) fertilizer to match demand in crops has been shown to increase profits in Australia. Better matching the timing of N inputs to plant requirements has been shown to improve nitrogen use efficiency and crop yields and could reduce nitrous oxide emissions from broadacre grains. Farmers in the wheat production area of south-eastern Australia are increasingly splitting N application, with the second timing applied at stem elongation (Zadoks 30). Spectral indices have shown the ability to detect crop canopy N status, but a robust method using a consistent calibration that functions across seasons has been lacking. One spectral index, the canopy chlorophyll content index (CCCI), designed to detect canopy N using three wavebands along the "red edge" of the spectrum, was combined with the canopy nitrogen index (CNI), which was developed to normalize for crop biomass and correct for the N dilution effect of crop canopies. The CCCI-CNI index approach was applied to a 3-year study to develop a single calibration derived from a wheat crop sown in research plots near Horsham, Victoria, Australia. The index was able to predict canopy N (g m-2) from Zadoks 14-37 with an r2 of 0.97 and RMSE of 0.65 g N m-2 when dry-weight biomass by area was also considered. We suggest that measures of N estimated from remote methods use N per unit area as the metric, and that direct reference to canopy %N is not an appropriate method for estimating plant N concentration without first accounting for the N dilution effect. This approach provides a link to crop development rather than creating a purely numerical relationship. The sole biophysical input, biomass, is challenging to quantify robustly via spectral methods. Combining remote sensing with crop modelling could provide a robust method for estimating biomass and therefore a method to estimate canopy N remotely. Future research will explore this and the use of active and passive sensor technologies in precision farming for targeted N management.
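One commonly published formulation of the CCCI (not necessarily the exact calibration used in this study) normalizes a red-edge index (NDRE) between NDVI-dependent bounds. The sketch below uses placeholder boundary coefficients and illustrative reflectances purely to show the structure of the calculation:

    # Sketch of a common CCCI formulation from three bands around the red edge
    # (reflectances at ~670, ~720 and ~790 nm). Boundary coefficients are placeholders,
    # not the calibration developed in the study described above.
    def ndvi(r670: float, r790: float) -> float:
        return (r790 - r670) / (r790 + r670)

    def ndre(r720: float, r790: float) -> float:
        return (r790 - r720) / (r790 + r720)

    def ccci(r670: float, r720: float, r790: float,
             min_coef=(0.1, 0.3), max_coef=(0.3, 0.5)) -> float:
        """CCCI = (NDRE - NDRE_min) / (NDRE_max - NDRE_min), with the min/max
        boundaries modelled as linear functions of NDVI (placeholder coefficients)."""
        v = ndvi(r670, r790)
        ndre_min = min_coef[0] + min_coef[1] * v
        ndre_max = max_coef[0] + max_coef[1] * v
        return (ndre(r720, r790) - ndre_min) / (ndre_max - ndre_min)

    print(round(ccci(r670=0.05, r720=0.20, r790=0.45), 2))  # illustrative reflectances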
Abstract:
Background: Malnutrition is a common problem for residents of nursing homes and long-term care hospitals. It has a negative influence on elderly residents' and patients' health and quality of life. Nutritional care seems to have a positive effect on elderly individuals' nutritional status and well-being. Studies of Finnish elderly people's nutrition and nutritional care in institutions are scarce. Objectives: The primary aim was to investigate the nutritional status, and its associated factors, of elderly nursing home residents and long-term care patients in Finland; in particular, to find out whether nursing or nutritional care factors are associated with nutritional status, and how well carers and nurses recognize malnutrition. A further aim was to assess the energy and nutrient intake of the residents of dementia wards. A final objective was to find out whether the nutrition training of professionals leads to changes in their knowledge and further translates into better nutrition for the aged residents of dementia wards. Subjects and methods: The residents' (n=2114) and patients' (n=1043) nutritional status was assessed in all studies using the Mini Nutritional Assessment test (MNA). Information on the residents' and patients' daily routines and the nutritional care provided was gathered by questionnaire. Residents' energy and nutrient intake (n=23; n=21) in dementia wards was determined over three days by the precise weighing method. Constructive learning theory was the basis for educating the professionals (n=28). A semi-structured questionnaire was used to assess the professionals' learning. Studies I-IV were cross-sectional studies, whereas study V was an intervention study. Results: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. According to the MNA, 11% to 57% of the studied elderly people suffered from malnutrition, and 40-89% were at risk of malnutrition, whereas only 0-16% had a good nutritional status. Resident- and patient-related factors such as dementia, impaired ADL (Activities of Daily Living), swallowing difficulties and constipation mainly explained the malnutrition, but some nutritional care-related factors, such as eating less than half of the offered food portion and not receiving snacks, were also related to malnutrition. The intake of energy and some nutrients by the residents of dementia wards was lower than recommended, although the offered food contained enough energy and nutrients. The proportion of residents receiving vitamin D supplementation was low, despite the existing recommendation and the known benefits of an adequate intake of vitamin D. Nurses recognized malnutrition poorly, identifying only one in four (26.7%) of the actual cases. Keeping and analysing food diaries and reflecting on nutritional issues in small group discussions were effective training methods for professionals. The nutrition education of professionals had a positive impact on the energy and protein intake, BMIs, and MNA scores of some residents in dementia wards. Conclusions: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. Although resident- and patient-related factors mainly explained malnutrition, nurses recognized malnutrition poorly and the possibilities of nutritional care were underused. Professionals' nutrition education had a positive impact on the nutrition of elderly residents.
Further studies describing successful nutritional care and nutrition education of professionals are needed.
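For reference, MNA total scores are usually read against standard cut-offs; the values below are the commonly cited thresholds for the full MNA (assumed here, as the abstract does not state them), shown as a minimal sketch:

    # Sketch: classify nutritional status from a Mini Nutritional Assessment (MNA) total
    # score, using commonly cited cut-offs for the full MNA (maximum score 30).
    def mna_category(score: float) -> str:
        if not 0 <= score <= 30:
            raise ValueError("MNA total score must be between 0 and 30")
        if score < 17:
            return "malnourished"
        if score < 24:        # 17-23.5
            return "at risk of malnutrition"
        return "normal nutritional status"

    print(mna_category(16.5))  # -> malnourished
    print(mna_category(22.0))  # -> at risk of malnutrition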
Abstract:
The low solubility of iron (Fe) depresses plant growth in calcareous soils. In order to improve Fe availability, calcareous soils are treated with synthetic ligands, such as ethylenediaminetetraacetic acid (EDTA) and ethylenediiminobis(2-hydroxyphenyl)acetic acid (EDDHA). However, high expenses may hinder their use (EDDHA), and the recalcitrance of EDTA against biodegradation may increase the potential of cadmium (Cd) and lead (Pb) leaching. This study evaluated the ability of biodegradable ligands, i.e. different stereo-isomers of ethylenediaminedisuccinic acid (EDDS), to provide Fe for lettuce (Lactuca sativa L.) and ryegrass (Lolium perenne cv. Prego), their effects on uptake of other elements and solubility in soils and their subsequent effects on the activity of oxygen-scavenging enzymes in lettuce. Both EDTA and EDDHA were used as reference ligands. In unlimed and limed quartz sand both FeEDDS(S,S) and a mixture of stereo-isomers of FeEDDS (25% [S,S]-EDDS, 25% [R,R]-EDDS and 50% [S,R]/[R,S]-EDDS), FeEDDS(mix), were as efficient as FeEDTA and FeEDDHA in providing lettuce with Fe. However, in calcareous soils only FeEDDS(mix) was comparable to FeEDDHA when Fe was applied twice a week to mimic drip irrigation. The Fe deficiency increased the manganese (Mn) concentration in lettuce in both acidic and alkaline growth media, whereas Fe chelates depressed it. The same was observed with zinc (Zn) and copper (Cu) in acidic growth media. EDDHA probably affected the hormonal status of lettuce as well and thus depressed the uptake of Zn and Mn even more. The nutrient concentrations of ryegrass were only slightly affected by the Fe availability. After Fe chelate splitting in calcareous soils, EDDS and EDTA increased the solubility of Zn and Cu most, but only the Zn concentration was increased in lettuce. The availability of Fe increased the activity of oxygen-scavenging enzymes (ascorbate peroxidase, guaiacol peroxidase, catalase). The activity of Cu/ZnSOD (Cu/Zn superoxide dismutase) and MnSOD in lettuce leaves followed the concentrations of Zn and Mn. In acidic quartz sand low availability of Fe increased the cobalt (Co) and nickel (Ni) concentrations in lettuce, but Fe chelates decreased them. EDTA increased the solubility of Cd and Pb in calcareous soils, but not their uptake. The biodegradation of EDDS was not affected by the complexed element, and [S,S]-EDDS was biodegraded within 28 days in calcareous soils. EDDS(mix) was more recalcitrant, and after 56 days of incubation water-soluble elements (Fe, Mn, Zn, Cu, Co, Ni, Cd and Pb) corresponded to 10% of the added EDDS(mix) concentration.
Abstract:
This study examined the nutritional composition of the intertidal marine polychaete Perinereis helleri (Nereididae) when artificially cultured in sand filters treating mariculture wastewater. Moisture levels in harvested P. helleri ranged from 758 to 855 g kg-1, and ash from 23 to 61 g kg-1 wet matter (WM). Stocking density and graded size after harvest significantly affected their composition. Higher total lipid contents were found in large (>0.6 g) P. helleri (16–19 g kg-1 WM) and those grown at the lowest density (1000 m-2: 18 g kg-1 WM) than in small (≤0.6 g) ones (14 g kg-1 WM) and those grown at the highest densities (4000–6000 m-2: 13–16 g kg-1 WM). Several fatty acids within a very broad profile (some 30 identified) reflected this pattern, yet their ARA/EPA/DHA ratios were relatively unaffected. Feeding the polychaete-assisted sand filters (PASF) with fish meal to increase worm biomass productivity significantly increased their DHA content. Other components (e.g. protein, phospholipids, cholesterol, carbohydrate, amino acids, nitrogen, minerals and bromophenols) and nutritional factors (e.g. maturity, feeding seaweed and endemic shrimp viral content) were also investigated. Results suggest that PASF-produced P. helleri have a well-balanced nutritional profile for penaeid shrimp and fish broodstock.
Abstract:
Objective: To examine the combined effects of physical activity and weight status on blood pressure (BP) in preschool-aged children. Study design: The sample included 733 preschool-aged children (49% female). Physical activity was objectively assessed on 7 consecutive days by accelerometry. Children were categorized as sufficiently active if they met the recommendation of at least 60 minutes daily of moderate-to-vigorous physical activity (MVPA). Body mass index was used to categorize children as nonoverweight or overweight/obese, according to the International Obesity Task Force benchmarks. BP was measured using an automated BP monitor and categorized as elevated or normal using BP percentile-based cut-points for age, sex, and height. Results: The prevalence of elevated systolic BP (SBP) and diastolic BP was 7.7% and 3.0%, respectively. The prevalence of overweight/obesity was 32%, and about 15% of children did not accomplish the recommended 60 minutes of daily MVPA. After controlling for age and sex, overweight/obese children who did not meet the daily MVPA recommendation were 3 times more likely (OR 3.8; CI 1.6-8.6) to have elevated SBP than nonoverweight children who met the daily MVPA recommendation. Conclusions: Overweight or obese preschool-aged children with insufficient levels of MVPA are at significantly greater risk for elevated SBP than their nonoverweight and sufficiently active counterparts.
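The odds-ratio comparison reported above can be illustrated with a simple 2x2 calculation. The counts below are hypothetical and chosen only to show the arithmetic; unlike the study's estimate, this crude ratio is not adjusted for age and sex.

    import math

    # Hypothetical 2x2 table: elevated SBP yes/no by exposure group.
    a, b = 12, 78    # overweight/obese and insufficiently active: elevated SBP yes / no
    c, d = 20, 420   # nonoverweight and sufficiently active: elevated SBP yes / no

    or_ = (a * d) / (b * c)                          # crude odds ratio
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR) (Woolf method)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")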
Abstract:
Background: Malnutrition and unintentional weight loss are major clinical issues in people with dementia living in residential aged care facilities (RACFs) and are associated with serious adverse outcomes. However, evidence regarding effective interventions is limited and strategies to improve the nutritional status of this population are required. This presentation describes the implementation and results of a pilot randomised controlled trial of a multi-component intervention for improving the nutritional status of RACF residents with dementia. Method: Fifteen residents with moderate-severe dementia living in a secure long-term RACF participated in a five-week pilot study. Participants were randomly allocated to either an Intervention (n=8) or Control group (n=7). The intervention comprised four elements delivered in a separate dining room at lunch and dinner: the systematic reinforcement of residents' eating behaviours using a specific communication protocol; family-style dining; high-ambiance table presentation; and routine Dietary-Nutrition Champion supervision. Control group participants ate their meals according to the facility's standard practice. Baseline and follow-up assessments of nutritional status, food consumption, and body mass index were obtained by qualified nutritionists. Additional assessments included measures of cognitive functioning, mealtime agitation, depression, wandering status and multiple measures of intervention fidelity. Results: No participant was malnourished at study commencement, and participants in both groups gained weight from baseline to follow-up, with no significant difference between groups (t=0.43; p=0.67). A high degree of treatment fidelity was evident throughout the intervention. Qualitative data from staff indicate the intervention was perceived to be beneficial for residents. Conclusions: This multi-component nutritional intervention was well received and was feasible in the RACF setting. Participants' sound nutritional status at baseline likely accounts for the lack of an intervention effect. Further research using this protocol in malnourished residents is recommended. For success, a collaborative approach between researchers and facility staff, particularly dietary staff, is essential.