743 results for Nutrition Surveys
Abstract:
Diarrhoea is a common complication observed in critically ill patients. Relationships between diarrhoea, enteral nutrition and aerobic intestinal microflora have previously been examined only in isolation in this patient cohort. This research used a two-study, observational design to examine these associations. Higher diarrhoea incidence rates were observed when patients received enteral tube feeding, had abnormal serum blood results, received multiple medications and had aerobic microflora dysbiosis. Further, significant aerobic intestinal microflora changes were observed over time in patients who experienced diarrhoea. These results establish a platform for further work to improve the intestinal health of critically ill patients.
Abstract:
Background: Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. Materials and Methods: The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Results: Two affirmative answers to the PNST questions were found to maximize the specificity and sensitivity to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with the sensitivity and specificity being less than 69%. Conclusion: The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk.
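The sensitivity and specificity figures reported above come from a standard 2 x 2 comparison of the screening tool against the reference assessment. The sketch below shows the arithmetic; the confusion-table counts are hypothetical (chosen only to sum to a 295-patient cohort and approximate the reported percentages), not the study's raw data.

```python
# Illustrative sketch of how screening-tool sensitivity and specificity
# (as reported for the PNST against the pediatric SGNA) are derived
# from a 2x2 confusion table. The counts below are HYPOTHETICAL.

def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) for a binary screening tool."""
    sensitivity = tp / (tp + fn)  # proportion of truly malnourished who are flagged
    specificity = tn / (tn + fp)  # proportion of well-nourished who are not flagged
    return sensitivity, specificity

# Hypothetical counts for a 295-patient cohort (not the study's data)
sens, spec = sensitivity_specificity(tp=79, fn=22, tn=159, fp=35)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```

With these assumed counts the arithmetic lands near the reported 77.8% / 82.1%, which is the point of the exercise: the published percentages are fully determined by four cell counts.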
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.
Abstract:
The evidence for nutritional support in COPD is almost entirely based on oral nutritional supplements (ONS), yet despite this, dietary counseling and food fortification (DA) are often used as the first-line treatment for malnutrition. This study aimed to investigate the effectiveness of ONS vs. DA in improving nutritional intake in malnourished outpatients with COPD. 70 outpatients (BMI 18.4 SD 1.6 kg/m2, age 73 SD 9 years, severe COPD) were randomised to receive a 12-week intervention of either ONS or DA (n 33 ONS vs. n 37 DA). Paired t-test analysis revealed total energy intakes significantly increased with ONS at week 6 (+302 SD 537 kcal/d; p = 0.002), with a slight reduction at week 12 (+243 SD 718 kcal/d; p = 0.061) returning to baseline levels on stopping supplementation. DA resulted in small increases in energy that only reached significance 3 months post-intervention (week 6: +48 SD 623 kcal/d, p = 0.640; week 12: +157 SD 637 kcal/d, p = 0.139; week 26: +247 SD 592 kcal/d, p = 0.032). Protein intake was significantly higher in the ONS group at both week 6 and 12 (ONS: +19.0 SD 25.0 g/d vs. DA: +1.0 SD 13.0 g/d; p = 0.033 ANOVA) but no differences were found at week 26. Vitamin C, iron and zinc intakes significantly increased only in the ONS group. ONS significantly increased energy, protein and several micronutrient intakes in malnourished COPD patients but only during the period of supplementation. Trials investigating the effects of combined nutritional interventions are required.
Abstract:
Background: It is important to identify patients who are at risk of malnutrition upon hospital admission, as malnutrition results in poor outcomes such as longer length of hospital stay, readmission, higher hospitalisation cost and mortality. The aim of this study was to determine the prognostic validity of 3-Minute Nutrition Screening (3-MinNS) in predicting hospital outcomes in patients admitted to an acute tertiary hospital through a list of diagnosis-related groups (DRG). Methods: In this study, 818 adult patients were screened for risk of malnutrition using 3-MinNS within 24 hours of admission. Mortality data were collected from the National Registry, with other hospitalisation outcomes retrieved from electronic hospital records. The results were adjusted for age, gender and ethnicity, and matched for DRG. Results: Patients identified to be at risk of malnutrition (37%) using 3-MinNS had a significant positive association with longer length of hospital stay (6.6 ± 7.1 days vs. 4.5 ± 5.5 days, p<0.001), higher hospitalisation cost (S$4540 ± 7190 vs. S$3630 ± 4961, p<0.001) and increased mortality rate at 1 year (27.8% vs. 3.9%), 2 years (33.8% vs. 7.2%) and 3 years (39.1% vs. 10.5%); p<0.001 for all. Conclusions: The 3-MinNS is able to predict clinical outcomes and can be used to screen newly admitted patients for nutrition risk so that appropriate nutrition assessment and early nutritional intervention can be initiated.
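The mortality contrasts reported above imply large crude risk ratios between the at-risk and not-at-risk groups. The sketch below does that unadjusted arithmetic directly from the abstract's percentages; note the study itself adjusted for age, gender and ethnicity and matched on DRG, which crude ratios ignore.

```python
# Crude (unadjusted) risk ratios implied by the mortality percentages
# reported in the abstract: at-risk vs. not-at-risk by 3-MinNS.
mortality = {  # year: (at-risk %, not-at-risk %)
    1: (27.8, 3.9),
    2: (33.8, 7.2),
    3: (39.1, 10.5),
}

for year, (at_risk, not_at_risk) in mortality.items():
    rr = at_risk / not_at_risk  # crude risk ratio
    print(f"year {year}: crude RR = {rr:.1f}")
```

The crude ratio shrinks from roughly 7 at 1 year to under 4 at 3 years, i.e. the relative mortality gap narrows even as absolute mortality rises in both groups.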
Abstract:
Self-care management is needed for effective management of chronic kidney disease. The main aim of treatment or management of chronic kidney disease is to delay the worsening of kidney function, and to prevent or to manage the co-morbidities. Self-care management is not easy, and patients will face many challenges, especially when they cannot get used to the new treatment plan. One of the challenges they face is dietary restriction, which is a very important aspect in any self-care management programme. Chronic kidney disease patients require a low-protein, low-sodium, low-potassium, and low-phosphorus diet. There are several strategies patients can undertake to ensure adherence, such as self-monitoring their dietary habits and type of food consumed using a food diary; involving social support, such as family members and spouse, to help them adhere to their diet restrictions; setting goals and providing positive reinforcement when they achieve the targeted goals; joining self-management programmes to equip themselves with the necessary skills so that they can better adhere to the treatment regimes, including diet restriction; and lastly, having knowledge about their regime, and using this knowledge to help them understand and improve their adherence.
Abstract:
We conducted surveys of bats in China between 1999 and 2007, resulting in the identification of at least 62 species. In this paper we present data on 19 species, comprising 12 species from the family Rhinolophidae and seven from the Hipposideridae. Rhinolophids captured were Rhinolophus affinis, R. ferrumequinum, R. lepidus, R. luctus, R. macrotis, R. siamensis, R. marshalli, R. rex, R. pearsonii, R. pusillus, R. sinicus and R. stheno. Because of extensive morphological similarities we question the species distinctiveness of R. osgoodi (may be conspecific with R. lepidus), R. paradoxolophus (which may best be treated as a subspecies of R. rex), R. huananus (probably synonymous with R. siamensis), and we are skeptical as to whether R. sinicus is distinct from R. thomasi. Hipposiderids captured were Hipposideros armiger, H. cineraceus, H. larvatus, H. pomona, H. pratti, Aselliscus stoliczkanus and Coelops frithii. Of these species, two rhinolophids (Rhinolophus marshalli and R. stheno) and one hipposiderid (Hipposideros cineraceus) represent new species records for China. We present data on species' ranges, morphology and echolocation call frequencies, as well as some notes on ecology and conservation status. China hosts a considerable diversity of rhinolophid and hipposiderid bats, yet threats to their habitats and populations are substantial.
Abstract:
Objectives: To assess socio-economic differences in three components of nutrition knowledge, i.e. knowledge of (i) the relationship between diet and disease, (ii) the nutrient content of foods and (iii) dietary guideline recommendations; furthermore, to determine if socio-economic differences in nutrition knowledge contribute to inequalities in food purchasing choices. Design: The cross-sectional study considered household food purchasing, nutrition knowledge, socio-economic and demographic information. Household food purchasing choices were summarised by three indices, based on self-reported purchasing of sixteen groceries, nineteen fruits and twenty-one vegetables. Socio-economic position (SEP) was measured by household income and education. Associations between SEP, nutrition knowledge and food purchasing were examined using general linear models adjusted for age, gender, household type and household size. Setting: Brisbane, Australia in 2000. Subjects: Main household food shoppers (n 1003, response rate 66.4%), located in fifty small areas (Census Collectors Districts). Results: Shoppers in households of low SEP made food purchasing choices that were less consistent with dietary guideline recommendations: they were more likely to purchase grocery foods comparatively higher in salt, sugar and fat, and lower in fibre, and they purchased a narrower range of fruits and vegetables. Those of higher SEP had greater nutrition knowledge and this factor attenuated most associations between SEP and food purchasing choices. Among nutrition knowledge factors, knowledge of the relationship between diet and disease made the greatest and most consistent contribution to explaining socio-economic differences in food purchasing. Conclusions: Addressing inequalities in nutrition knowledge is likely to reduce socio-economic differences in compliance with dietary guidelines.
Improving knowledge of the relationship between diet and disease appears to be a particularly relevant focus for health promotion aimed to reduce socio-economic differences in diet and related health inequalities.
Abstract:
Repeatable and accurate seagrass mapping is required for understanding seagrass ecology and supporting management decisions. For shallow (< 5 m) seagrass habitats, these maps can be created by integrating high spatial resolution imagery with field survey data. Field survey data for seagrass is often collected via snorkelling or diving. However, these methods are limited by environmental and safety considerations. Autonomous Underwater Vehicles (AUVs) are used increasingly to collect field data for habitat mapping, albeit mostly in deeper waters (> 20 m). Here we demonstrate and evaluate the use and potential advantages of AUV field data collection for calibration and validation of seagrass habitat mapping of shallow waters (< 5 m), from multispectral satellite imagery. The study was conducted in the seagrass habitats of the Eastern Banks (142 km2), Moreton Bay, Australia. In the field, georeferenced photos of the seagrass were collected along transects via snorkelling or an AUV. Photos from both collection methods were analysed manually for seagrass species composition and then used as calibration and validation data to map seagrass using an established semi-automated object based mapping routine. A comparison of the relative advantages and disadvantages of AUV and snorkeller collected field data sets and their influence on the mapping routine was conducted. AUV data collection was more consistent, repeatable and safer in comparison to snorkeller transects. Inclusion of deeper water AUV data resulted in mapping of a larger extent of seagrass (~7 km2, 5% of study area) in the deeper waters of the site. Although overall map accuracies did not differ considerably, inclusion of the AUV data from deeper water transects corrected errors in seagrass mapped at depths beyond 5 m where the bottom is still visible on satellite imagery.
Our results demonstrate that further development of AUV technology is justified for the monitoring of seagrass habitats in ongoing management programs.
Abstract:
The preservation technique of drying offers a significant increase in the shelf life of food materials, along with the modification of quality attributes due to simultaneous heat and mass transfer. Variations in porosity are just one of the microstructural changes that take place during the drying of most food materials. Some studies found that there may be a relationship between porosity and the properties of dried foods. However, no conclusive relationship has yet been established in the literature. This paper presents an overview of the factors that influence porosity, as well as the effects of porosity on dried food quality attributes. The effect of heat and mass transfer on porosity is also discussed along with porosity development in various drying methods. After an extensive review of the literature concerning the study of porosity, it emerges that a relationship between process parameters, food qualities, and sample properties can be established. Therefore, we propose a hypothesis of relationships between process parameters, product quality attributes, and porosity.
Abstract:
Background: It is important for nutrition intervention in malnourished patients to be guided by accurate evaluation and detection of small changes in the patient's nutrition status over time. However, the current Subjective Global Assessment (SGA) is not able to detect changes in a short period of time. The aim of the study was to determine whether the 7-point SGA is more time sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from baseline assessment. Results: It took significantly shorter time to see a one-point change using the 7-point SGA compared to the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change is 6.74 times greater in the 7-point SGA compared to the conventional SGA after controlling for age, gender and medical specialties (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (k = 1, p < 0.001) between the 7-point SGA and 3-point SGA and 83% (k = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time sensitive in its response to nutrition changes than the conventional SGA. It can be used to guide nutrition intervention for patients.
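The inter-rater agreement statistic quoted above (83% agreement, k = 0.726) is Cohen's kappa: observed agreement corrected for the agreement expected by chance. The sketch below shows the computation on a hypothetical two-category rater table for 67 patients; it illustrates the formula only and does not reproduce the study's exact table or its kappa value.

```python
# Sketch of Cohen's kappa, the agreement statistic reported for the two
# blinded 7-point SGA assessors. The confusion matrix below is
# HYPOTHETICAL, not the study's data, so the resulting kappa differs
# from the published 0.726.

def cohens_kappa(matrix):
    """Cohen's kappa for a square inter-rater confusion matrix."""
    n = sum(sum(row) for row in matrix)
    # Observed agreement: proportion of cases on the diagonal
    observed = sum(matrix[i][i] for i in range(len(matrix))) / n
    # Chance agreement: product of each rater's marginal proportions
    expected = sum(
        (sum(matrix[i]) / n) * (sum(row[i] for row in matrix) / n)
        for i in range(len(matrix))
    )
    return (observed - expected) / (1 - expected)

# Hypothetical 2-category table for 67 patients
# (rows: rater A's rating, columns: rater B's rating)
table = [[40, 6],
         [5, 16]]
kappa = cohens_kappa(table)
print(f"kappa = {kappa:.3f}")
```

With this assumed table the raters agree on 56 of 67 cases (about 84%), yet kappa is noticeably lower, which is exactly why kappa rather than raw percent agreement is reported alongside it in the abstract.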
Abstract:
Optimal bone metabolism is the result of hormonal, nutritional, and mechanical harmony, and a deficit in one area is usually impossible to overcome by improvements in others. Exercise during growth influences bone modeling locally at the regions being loaded, whereas calcium is thought to act systemically to influence bone remodeling. Despite acting through different mechanisms, a growing body of research suggests that exercise and calcium may not operate independently. Low dietary calcium intake or reduced bioavailability may minimize the adaptive response to exercise-induced bone loading. Conversely, adequate levels of calcium intake can maximize the positive effect of physical activity on bone health during the growth period of children and adolescents. Research also suggests that adequate levels of calcium intake can maximize bone density at the regions being loaded during exercise. Achieving optimal bone health and minimizing one’s risk of osteoporotic fracture later in life depend on a lifelong approach. This approach relies on the establishment of an optimum level of bone during the growth years, with a subsequent goal to maintain and slow the rate of age-related bone loss thereafter. Exercise, adequate nutrition, and optimal hormone levels are the components that influence the bone outcome. Making healthy nutritional choices, engaging in weight-bearing physical activity, and ensuring optimal hormone levels during growth provides a window of opportunity to build optimal bone mass, to reduce the risk of fracture later in life. Concurrent management of fracture risk with a physical activity prescription, adequate nutrition, and pharmacotherapy for osteoporosis when required offers the best approach to optimal bone health throughout adulthood.
Abstract:
Antioxidants in acute physical exercise and exercise training remain a hot topic in sport nutrition, exercise physiology and biology, in general (Jackson, 2008; Margaritis and Rousseau, 2008; Gomez-Cabrera et al., 2012; Nikolaidis et al., 2012). During the past few decades, antioxidants have received attention predominantly as a nutritional strategy for preventing or minimising detrimental effects of reactive oxygen and nitrogen species (RONS), which are generated during and after strenuous exercise (Jackson, 2008, 2009; Powers and Jackson, 2008). Antioxidant supplementation has become a common practice among athletes as a means to (theoretically) reduce oxidative stress, promote recovery and enhance performance (Peternelj and Coombes, 2011). However, until now, requirements of antioxidant micronutrients and antioxidant compounds for athletes training for and competing in different sport events, including marathon running, triathlon races or team sport events involving repeated sprinting, have not been determined sufficiently (Williams et al., 2006; Margaritis and Rousseau, 2008). Crucially, evidence has been emerging that higher dosages of antioxidants may not necessarily be beneficial in this context, but can also elicit detrimental effects by interfering with performance-enhancing (Gomez-Cabrera et al., 2008) and health-promoting training adaptations (Ristow et al., 2009). As originally postulated in a pioneering study on exercise-induced production of RONS by Davies et al. (1982) in the early 1980s, evidence has been increasing in recent years that RONS are not only damaging agents, but also act as signalling molecules for regulating muscle function (Reid, 2001; Jackson, 2008) and for initiating adaptive responses to exercise (Jackson, 2009; Powers et al., 2010). 
The recognition that antioxidants could, vice versa, interact with the signalling pathways underlying the responses to acute (and repeated) bouts of exercise has contributed important novel aspects to the continued discussion on antioxidant requirements for athletes. In view of the recent advances in this field, it is the aim of this report to examine the current knowledge of antioxidants, in particular of vitamins C and E, in the basic nutrition of athletes. While overviews on related topics including basic mechanisms of exercise-induced oxidative stress, redox biology, antioxidant defence systems and a summary of studies on antioxidant supplementation during exercise training are provided, this does not mean that this report is comprehensive. Several issues of the expanding and multidisciplinary field of antioxidants and exercise are covered elsewhere in this book and/or in the literature. For example, the reader is referred to reviews on oxidative stress (Konig et al., 2001; Vollaard et al., 2005; Knez et al., 2006; Powers and Jackson, 2008; Nikolaidis et al., 2012), redox-sensitive signalling and muscle function (Reid, 2001; Vollaard et al., 2005; Jackson, 2008; Ji, 2008; Powers and Jackson, 2008; Powers et al., 2010; Radak et al., 2013) and antioxidant supplementation (Williams et al., 2006; Peake et al., 2007; Peternelj and Coombes, 2011) in the context of exercise. Within the scope of the report, we instead aim to address the question regarding requirements of antioxidants, specifically vitamins C and E, during exercise training, draw conclusions and provide practical implications from the recent research.
Abstract:
Parkinson’s disease is a common neurodegenerative disorder with a higher risk of hospitalization than the general population. Therefore, there is a high likelihood of encountering a person with Parkinson’s disease in acute or critical care. Most people with Parkinson’s disease are over the age of 60 years and are likely to have other concurrent medical conditions. Parkinson’s disease is more likely to be the secondary diagnosis during hospital admission. The primary diagnosis may be due to other medical conditions or as a result of complications from Parkinson’s disease symptoms. Symptoms include motor symptoms, such as slowness of movement and tremor, and non-motor symptoms, such as depression, dysphagia, and constipation. There is a large degree of variation in the presence and degree of symptoms as well as in the rate of progression. There is a range of medications that can be used to manage the motor or non-motor symptoms, and side effects can occur. Improper administration of medications can result in deterioration of the patient’s condition and potentially a life-threatening condition called neuroleptic malignant-like syndrome. Nutrients and delayed gastric emptying may also interfere with intestinal absorption of levodopa, the primary medication used for motor symptom management. Rates of protein-energy malnutrition can be up to 15 % in people with Parkinson’s disease in the community, and this is likely to be higher in the acute or critical care setting. Nutrition-related care in this setting should utilize the Nutrition Care Process and take into account each individual’s Parkinson’s disease motor and non-motor symptoms, the severity of disease, limitations due to the disease, medical management regimen, and nutritional status when planning nutrition interventions. Special considerations may need to be taken into account in relation to meal and medication times and the administration of enteral feeding. 
Nutrition screening, assessment, and monitoring should occur during admission to minimize the effects of Parkinson's disease symptoms and to optimise nutrition-related outcomes.
Abstract:
The Queensland Transport Industry Workplace Health Intervention project was a Participatory Action Research (PAR) project to investigate the effectiveness of workplace-based nutrition and physical activity health promotion interventions for truck drivers in transport industry workplaces in south-east Queensland. The project was conducted by a research team at the Queensland University of Technology (QUT), and was funded by the Queensland Government under the Healthier.Happier.Workplaces initiative.