866 results for Nutritional Epidemiology
Abstract:
Background Children are particularly vulnerable to the effects of extreme temperatures. Objective To examine the relationship between extreme temperatures and paediatric emergency department admissions (EDAs) in Brisbane, Australia, during 2003–2009. Methods A quasi-Poisson generalised linear model combined with a distributed lag non-linear model was used to examine the relationships between extreme temperatures and age-, gender- and cause-specific paediatric EDAs, while controlling for air pollution, relative humidity, day of the week, influenza epidemics, public holidays, season and long-term trends. The model residuals were checked to identify whether there was an added effect due to heat waves or cold spells. Results There were 131 249 EDAs among children during the study period. Both high (RR=1.27; 95% CI 1.12 to 1.44) and low (RR=1.81; 95% CI 1.66 to 1.97) temperatures were significantly associated with an increase in paediatric EDAs in Brisbane. Male children were more vulnerable to temperature effects. Children aged 0–4 years were more vulnerable to heat effects, and children aged 10–14 years were more sensitive to both hot and cold effects. High temperatures had a significant impact on several paediatric diseases, including intestinal infectious diseases, respiratory diseases, endocrine, nutritional and metabolic diseases, nervous system diseases and chronic lower respiratory diseases. Low temperatures were significantly associated with intestinal infectious diseases, respiratory diseases and endocrine, nutritional and metabolic diseases. An added effect of heat waves on childhood chronic lower respiratory diseases was seen, but no added effect of cold spells was found. Conclusions As climate change continues, children are at particular risk of a variety of diseases which might be triggered by extremely high temperatures. This study suggests that preventing the effects of extreme temperature on children with respiratory diseases might reduce the number of EDAs.
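As an illustration of the modelling approach, the following is a minimal sketch, not the study's code: it assumes a hypothetical daily pandas DataFrame `df` with columns `eda_count`, `temp`, `humidity` and `pm10` on a date index, and it approximates the distributed lag non-linear model with simple shifted temperature terms rather than a full cross-basis (in R this would typically use the dlnm package).

```python
import pandas as pd
import statsmodels.api as sm

def fit_quasi_poisson(df: pd.DataFrame, max_lag: int = 3):
    """Quasi-Poisson regression of daily admission counts on lagged temperature."""
    X = pd.DataFrame(index=df.index)
    for lag in range(max_lag + 1):              # crude distributed temperature lags
        X[f"temp_lag{lag}"] = df["temp"].shift(lag)
    X["humidity"] = df["humidity"]              # confounders noted in the abstract
    X["pm10"] = df["pm10"]
    dow = pd.get_dummies(df.index.dayofweek, prefix="dow",
                         drop_first=True, dtype=float)
    dow.index = df.index                        # day-of-week indicator variables
    X = sm.add_constant(X.join(dow)).dropna()
    y = df.loc[X.index, "eda_count"]
    # scale="X2" estimates overdispersion from Pearson chi-square, which is what
    # turns a plain Poisson fit into a quasi-Poisson one.
    return sm.GLM(y, X, family=sm.families.Poisson()).fit(scale="X2")

# exp(coefficient) on a temperature term is then read as a rate ratio (RR).
```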
Abstract:
Purpose Paper-based nutrition screening tools can be challenging to implement in the ambulatory oncology setting. The aim of this study was to determine the validity of the Malnutrition Screening Tool (MST) and a novel, automated nutrition screening system compared to a ‘gold standard’ full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA). Methods An observational, cross-sectional study was conducted in an outpatient oncology day treatment unit (ODTU) within an Australian tertiary health service. Eligibility criteria were as follows: ≥18 years, receiving outpatient anticancer treatment and English literate. Patients self-administered the MST. A dietitian assessed nutritional status using the PG-SGA, blinded to the MST score. Automated screening system data were extracted from an electronic oncology prescribing system. This system used weight loss over the 3 to 6 weeks prior to the most recent weight record, or age-categorised body mass index (BMI), to identify nutritional risk. Sensitivity and specificity against the PG-SGA (malnutrition) were calculated using contingency tables and receiver operating characteristic (ROC) curves. Results There were a total of 300 oncology outpatients (51.7% male, 58.6 ± 13.3 years). The area under the curve (AUC) for weight loss alone was 0.69, with a cut-off value of ≥1% weight loss yielding 63% sensitivity and 76.7% specificity. The MST (score ≥2) resulted in 70.6% sensitivity and 69.5% specificity, AUC 0.77. Conclusions Both the MST and the automated method fell short of the accepted professional standard for sensitivity (~≥80%) derived against the PG-SGA. Further investigation into other automated nutrition screening options and the most appropriate parameters available electronically is warranted to support targeted service provision.
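The validation arithmetic can be sketched as follows; this is an illustration on hypothetical inputs (array names are assumptions, not the study's code), computing sensitivity and specificity at a given cut-off plus the ROC AUC:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def screen_validity(score, malnourished, cutoff):
    """Sensitivity/specificity at a cut-off, plus AUC, against a reference standard."""
    score = np.asarray(score)
    truth = np.asarray(malnourished, dtype=bool)   # PG-SGA-rated malnutrition
    pos = score >= cutoff                          # screen positive at this cut-off
    sens = np.sum(pos & truth) / np.sum(truth)     # true positives / all cases
    spec = np.sum(~pos & ~truth) / np.sum(~truth)  # true negatives / all non-cases
    auc = roc_auc_score(truth, score)              # discrimination over all cut-offs
    return sens, spec, auc

# e.g. for the MST: sens, spec, auc = screen_validity(mst, malnourished, cutoff=2)
```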
Nutritional influences over the life course on lean body mass of individuals in developing countries
Abstract:
The double burden of childhood undernutrition and adult-onset adiposity in transitioning societies poses a significant public health challenge. The development of suboptimal lean body mass (LBM) could partly explain the link between these two forms of malnutrition. This review examines the evidence on both the role of nutrition in “developmental programming” of LBM and the nutritional influences that affect LBM throughout the life course. Studies from developing countries assessing the relationship of early nutrition with later LBM provide important insights. Overall, the evidence is consistent in suggesting a positive association of early nutritional status (indicated by birth weight and growth during the first 2 years) with LBM in later life. Evidence on the impact of maternal nutritional supplementation during pregnancy on later LBM is inconsistent. In addition, the role of nutrients (protein, zinc, calcium, vitamin D) that can affect LBM throughout the life course is described. Promoting optimal intakes of these nutrients throughout the life course is important both for reducing childhood undernutrition and for improving the LBM of adults.
Abstract:
Background The learning and teaching of epidemiology is core to many public health programs. Many students find the content of epidemiology, and specifically risk of bias assessment, challenging to learn. However, learning is enhanced when knowledge can be acquired through an active-learning, hands-on experience. Methods The innovative use of wireless audience response technology (“clickers”) was incorporated into the lectures of the university’s post-graduate epidemiology units and the tailored epidemiological modules delivered for professional disciplines (e.g. optometry). Clickers were used to apply several pedagogical approaches to active learning, including peer instruction and real-world simulation. Students were also assessed for their gain in knowledge within the lecture (pre-post) and for their perceptions of how the use of clickers helped them learn. The routine university-wide end-of-semester Insight Survey provided further information on student satisfaction with the approach. Results The technology was useful in identifying deficits in knowledge of key concepts either before or after instruction. Where key concepts were re-tested post-lecture, knowledge increased significantly, as expected, and students received immediate feedback. Across the lecture series, typically 85% of students reported that the technology helped them learn and increased their opportunity to interact with the lecturer, and recommended its use for future classes. The Insight Survey report showed that 93% of respondents felt the unit in which clickers were consistently used provided good learning opportunities. Numerous student comments supported the teaching method. Conclusions Epidemiological subject matter lends itself to the incorporation of audience response technology. Using the technology to facilitate interactive voting provides instant responses and participation from everyone, enhancing the classroom experience. The pedagogical approach increases students’ knowledge and their satisfaction with the unit.
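For the within-lecture pre/post assessment, one natural paired analysis is McNemar's test on students' correct/incorrect answers before and after instruction. The abstract does not state which test was used; the sketch below is an assumed illustration with hypothetical boolean arrays.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

def prepost_gain(pre_correct, post_correct):
    """McNemar's test on paired pre/post binary answers from the same students."""
    pre = np.asarray(pre_correct, dtype=bool)
    post = np.asarray(post_correct, dtype=bool)
    table = np.array([[np.sum(pre & post), np.sum(pre & ~post)],
                      [np.sum(~pre & post), np.sum(~pre & ~post)]])
    result = mcnemar(table, exact=True)   # exact binomial test on discordant pairs
    return table, result.pvalue
```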
Abstract:
Background: International epidemic clones (ribotypes 027 and 078) of Clostridium difficile have been associated with death, toxic megacolon and other adverse outcomes in North America and Europe. In 2010, the first local transmission of an epidemic strain (027) of C. difficile was reported in the state of Victoria, Australia, but no cases of infection with this strain were reported in the state of Queensland. In 2012, a prevalence study was undertaken in all public and selected private hospitals to examine the epidemiology of CDI and determine the prevalence of epidemic C. difficile strains in Queensland. Methods: Enhanced surveillance was undertaken on all hospital-identified CDI cases aged over 2 years between 10 April and 15 June 2012. Where available, patient samples were cultured and isolates of C. difficile ribotyped. The toxin profile of each isolate was determined by PCR. Results: In total, 168 cases of CDI were identified during the study period. A majority (58.3%) of cases had onset of symptoms in hospital. Of the 62 patients with community onset of symptoms, most (74%) had had a hospital admission in the previous 3 months. Only 4 of 168 patients had onset of symptoms within a residential care facility. Thirteen of the 168 (7.7%) patients included in the study had severe disease (ICU admission and/or death within 30 days of onset). Overall, 136/168 (81%) of cases had been prescribed antibiotics in the previous month. Of concern was the emergence of a novel ribotype (244), which has recently been described in other parts of Australia and is genetically related to ribotype 027. Seven patients were infected with C. difficile ribotype 244 (8% of the 83 samples ribotyped), including one patient requiring ICU admission and one patient who died. Ribotype 244 was tcdA, tcdB and CDT positive and contained a tcdC mutation at position 117. Conclusion: Ongoing surveillance is required to determine the origin and epidemiology of C. difficile ribotype 244 infections in Australia.
Abstract:
Background Few studies have examined acute injuries in track and field in both elite and sub-elite athletes. Purpose To observe the absolute and relative rates of injury in track and field athletes across a wide range of competition levels and ages during three years of the Penn Relays Carnival, to assist with future medical coverage planning and injury prevention strategies. Study design Descriptive epidemiology study. Methods Over a 3-year period, all injuries treated by the medical staff were recorded on a standardised injury report form. Absolute injury rates (absolute number of injuries) and relative injury rates (number of injuries per 1000 participants) were determined, and odds ratios (OR) of injury rates were calculated between sexes, competition levels and events. Injuries were also classified as major or minor medical or orthopedic injuries. Results Throughout the study period, 48,473 competing athletes participated in the Penn Relays Carnival, and 436 injuries were sustained. For medical coverage purposes, the relative rate of injury subtypes was greatest for minor orthopedic injuries (5.71 injuries per 1000 participants), followed by minor medical injuries (3.42 injuries per 1000 participants), major medical injuries (0.69 injuries per 1000 participants) and major orthopedic injuries (0.18 injuries per 1000 participants). College/elite level athletes displayed the lowest relative injury rate (7.99 injuries per 1000 participants), which was significantly less than high school (9.87 injuries per 1000 participants) and masters level athletes (16.33 injuries per 1000 participants). Males were more likely to suffer a minor orthopedic injury than females (OR = 1.36, 95% CI = 1.06 to 1.75; χ2 = 5.73, p = 0.017) but less likely to sustain a major medical injury (OR = 0.33, 95% CI = 0.15 to 0.75; χ2 = 7.75, p = 0.005). Of the three events with the highest participation, the 4 x 400m relay displayed the greatest relative injury rate (13.6 injuries per 1000 participants) compared with the 4 x 100m and 4 x 200m relays. Conclusions Medical coverage teams for future large-scale track and field events need to plan for at least two major orthopedic and seven major medical injuries per 10,000 participants. Male track and field athletes, particularly at masters level, are at greater risk of injury than female athletes and those at other competition levels.
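A minimal sketch of the odds ratio and chi-square arithmetic reported above (e.g. male vs. female minor orthopedic injuries), using hypothetical 2x2 counts of injured and uninjured participants per group; the Woolf confidence interval shown is a standard choice, not necessarily the one the authors used:

```python
import math
from scipy.stats import chi2_contingency

def injury_odds_ratio(inj_a, uninj_a, inj_b, uninj_b):
    """Odds ratio for group A vs. group B with a Woolf 95% CI and chi-square test."""
    or_ = (inj_a * uninj_b) / (inj_b * uninj_a)
    se = math.sqrt(1/inj_a + 1/uninj_a + 1/inj_b + 1/uninj_b)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    chi2, p, _, _ = chi2_contingency([[inj_a, uninj_a], [inj_b, uninj_b]])
    return or_, (lo, hi), chi2, p

# A relative injury rate per 1000 participants is simply injuries / participants * 1000.
```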
Abstract:
Aim Low prevalence rates of malnutrition, at 2.5% to 4%, have previously been reported in two tertiary paediatric Australian hospitals. The current study is the first to measure the prevalence of malnutrition, obesity and nutritional risk among paediatric inpatients in multiple hospitals throughout Australia. Methods Malnutrition and obesity prevalence were investigated in 832 paediatric inpatients, and nutritional risk in 570, across eight tertiary paediatric hospitals and eight regional hospitals in Australia on a single day. Malnutrition and obesity prevalence were determined using z-scores and body mass index (BMI) percentiles. High nutritional risk was defined as a Paediatric Yorkhill Malnutrition Score of 2 or more. Results The prevalence rates of malnourished, wasted, stunted, overweight and obese paediatric patients were 15%, 13.8%, 11.9%, 8.8% and 9.9%, respectively. Patients who identified as Aboriginal and Torres Strait Islander were more likely to have lower height-for-age z-scores (P < 0.01); however, BMI and weight-for-age z-scores were not significantly different. Children who were younger, from regional hospitals or with a primary diagnosis of cardiac disease or cystic fibrosis had significantly lower anthropometric z-scores (P = 0.05). Forty-four per cent of patients were identified as at high nutritional risk and requiring further nutritional assessment. Conclusions The prevalence of malnutrition and nutritional risk among Australian paediatric inpatients on a given day was much higher than in the healthy population. In contrast, the proportion of overweight and obese patients was lower.
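The anthropometric definitions above translate into simple cut-off logic. The sketch below assumes z-scores have already been computed against a growth reference and uses conventional cut-offs (z < -2; 85th/95th BMI percentiles); the study's exact operational definitions may differ.

```python
def classify_nutritional_status(waz, whz, haz, bmi_percentile):
    """Flag a child against conventional anthropometric cut-offs (illustrative)."""
    flags = []
    if waz < -2:                    # weight-for-age z-score
        flags.append("underweight")
    if whz < -2:                    # weight-for-height z-score
        flags.append("wasted")
    if haz < -2:                    # height-for-age z-score
        flags.append("stunted")
    if bmi_percentile >= 95:
        flags.append("obese")
    elif bmi_percentile >= 85:
        flags.append("overweight")
    return flags or ["within normal range"]
```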
Abstract:
Background: Quality of life is poorer in Parkinson’s disease than in other conditions and in the general population without Parkinson’s disease. Malnutrition also results in poorer quality of life. This study aimed to determine the relationship between quality of life and nutritional status. Methods: Community-dwelling people with Parkinson’s disease aged >18 years were recruited. The Patient-Generated Subjective Global Assessment (PG-SGA) assessed nutritional status. The Parkinson’s Disease Questionnaire 39 (PDQ-39) measured quality of life. Phase I was cross-sectional. Participants identified as malnourished in Phase I were eligible for the nutrition intervention phase (Phase II) and were randomised into two groups: standard care (SC), with provision of nutrition education materials only, and intervention (INT), with individualised dietetic advice and regular weekly follow-up. Data were collected at baseline, 6 weeks and 12 weeks. Results: Phase I consisted of 120 people who completed the PDQ-39. Phase II consisted of 9 in the SC group and 10 in the INT group. In Phase I, quality of life was poorer in the malnourished, particularly for the mobility and activities of daily living domains. There was a significant correlation between PG-SGA and PDQ-39 scores (Phase I, rs = 0.445, p < .001; Phase II, rs = .426, p = .002). In Phase II, no significant difference in the PDQ-39 total or sub-scores was observed between the INT and SC groups; however, there was a significant improvement in the emotional well-being domain for the entire group, χ2(2) = 8.84, p = .012. Conclusions: Malnourished people with Parkinson’s disease had poorer quality of life than the well-nourished, and improvements in nutritional status resulted in quality of life improvements. Attention to nutritional status is therefore an important component of the total care of people with Parkinson’s disease.
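The reported rs values are Spearman rank correlations; a minimal sketch on hypothetical paired score arrays:

```python
from scipy.stats import spearmanr

def qol_nutrition_correlation(pg_sga_scores, pdq39_scores):
    """Spearman correlation: higher PG-SGA (worse nutrition) vs. higher PDQ-39 (worse QoL)."""
    rs, p = spearmanr(pg_sga_scores, pdq39_scores)
    return rs, p
```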
Abstract:
Rationale Nutritional support is effective in managing malnutrition in COPD (Collins et al., 2012), leading to functional improvements (Collins et al., 2013). However, comparative trials of first-line interventions are lacking. This randomised trial compared the effectiveness of individualised dietary advice from a dietitian (DA) versus oral nutritional supplements (ONS). Methods A target sample of 200 stable COPD outpatients at risk of malnutrition (‘MUST’ medium + high risk) was randomised to either a 12-week intervention of ONS (~400 kcal/d, ~40 g/d protein) or DA with supportive written advice. The primary outcome was quality of life (QoL) measured using St George’s Respiratory Questionnaire, with secondary outcomes including handgrip strength, body weight and nutritional intake. Both the change from baseline and the differences between groups were analysed using SPSS version 20. Results 84 outpatients were recruited (ONS: 41 vs. DA: 43), and 72 completed the intervention (ONS: 33 vs. DA: 39). Mean BMI was 18.2 SD 1.6 kg/m2, age 72.6 SD 10 years, FEV1% predicted 36 SD 15% (severe COPD). In comparison to the DA group, the ONS group experienced significantly greater improvements in protein intake above baseline values at both week 6 (+21.0 SEM 4.3 g/d vs. +0.52 SEM 4.3 g/d; p < 0.001) and week 12 (+19.0 SEM 5.0 g/d vs. +1.0 SEM 3.6 g/d; p = 0.033; ANOVA). QoL and secondary outcomes remained stable at 12 weeks in both groups, with slight improvements in the ONS group but no differences between groups. Conclusion In outpatients at risk of malnutrition with severe COPD, nutritional support involving either ONS or DA appears to maintain nutritional status, functional capacity and QoL. However, larger trials of earlier, multi-modal nutritional interventions delivered over an extended duration should be explored.
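The between-group comparison of change from baseline can be sketched as follows, assuming hypothetical per-participant intake arrays; for two groups a one-way ANOVA is equivalent to an unpaired t-test:

```python
import numpy as np
from scipy.stats import f_oneway

def compare_change_from_baseline(ons_followup, ons_baseline, da_followup, da_baseline):
    """One-way ANOVA on per-participant change scores, ONS vs. DA (illustrative)."""
    ons_change = np.asarray(ons_followup) - np.asarray(ons_baseline)
    da_change = np.asarray(da_followup) - np.asarray(da_baseline)
    f_stat, p = f_oneway(ons_change, da_change)   # between-group test on changes
    return ons_change.mean(), da_change.mean(), p
```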
Abstract:
The evidence for nutritional support in COPD is almost entirely based on oral nutritional supplements (ONS), yet despite this, dietary counseling and food fortification (DA) are often used as the first-line treatment for malnutrition. This study aimed to investigate the effectiveness of ONS vs. DA in improving nutritional intake in malnourished outpatients with COPD. 70 outpatients (BMI 18.4 SD 1.6 kg/m2, age 73 SD 9 years, severe COPD) were randomised to receive a 12-week intervention of either ONS or DA (ONS: n = 33 vs. DA: n = 37). Paired t-test analysis revealed that total energy intake significantly increased with ONS at week 6 (+302 SD 537 kcal/d; p = 0.002), with a slight reduction at week 12 (+243 SD 718 kcal/d; p = 0.061), returning to baseline levels on stopping supplementation. DA resulted in small increases in energy intake that only reached significance 3 months post-intervention (week 6: +48 SD 623 kcal/d, p = 0.640; week 12: +157 SD 637 kcal/d, p = 0.139; week 26: +247 SD 592 kcal/d, p = 0.032). Protein intake was significantly higher in the ONS group at both week 6 and week 12 (ONS: +19.0 SD 25.0 g/d vs. DA: +1.0 SD 13.0 g/d; p = 0.033; ANOVA), but no differences were found at week 26. Vitamin C, iron and zinc intakes significantly increased only in the ONS group. ONS significantly increased energy, protein and several micronutrient intakes in malnourished COPD patients, but only during the period of supplementation. Trials investigating the effects of combined nutritional interventions are required.
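The within-group test named above is a paired t-test of each patient's intake at follow-up against their own baseline; a minimal sketch on hypothetical arrays:

```python
from scipy.stats import ttest_rel

def within_group_intake_change(intake_followup, intake_baseline):
    """Paired t-test: did intake change significantly from baseline within a group?"""
    t, p = ttest_rel(intake_followup, intake_baseline)
    return t, p
```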
Abstract:
Aim To estimate the prevalence of cannabis dependence and its contribution to the global burden of disease. Methods Systematic reviews of epidemiological data on cannabis dependence (1990-2008) were conducted in line with PRISMA and Meta-analysis of Observational Studies in Epidemiology (MOOSE) guidelines. Culling and data extraction followed protocols, with cross-checking and consistency checks. DisMod-MR, the latest version of the generic disease modelling system, redesigned as a Bayesian meta-regression tool, imputed prevalence by age, year and sex for 187 countries and 21 regions. The disability weight associated with cannabis dependence was estimated through population surveys and multiplied by prevalence data to calculate the years of life lived with disability (YLDs) and disability-adjusted life years (DALYs). YLDs and DALYs attributed to regular cannabis use as a risk factor for schizophrenia were also estimated. Results There were an estimated 13.1 million cannabis-dependent people globally in 2010 (point prevalence 0.19% (95% uncertainty: 0.17-0.21%)). Prevalence peaked at ages 20-24 years and was higher in males (0.23% (0.2-0.27%)) than females (0.14% (0.12-0.16%)) and in high-income regions. Cannabis dependence accounted for 2 million DALYs globally (0.08%; 0.05-0.12%) in 2010, a 22% increase in crude DALYs since 1990, largely due to population growth. Countries with statistically higher age-standardised DALY rates included the United States, Canada, Australia, New Zealand and Western European countries such as the United Kingdom; those with lower DALY rates were from Sub-Saharan Africa-West and Latin America. Regular cannabis use as a risk factor for schizophrenia accounted for an estimated 7,000 DALYs globally. Conclusion Cannabis dependence is a disorder primarily experienced by young adults, especially in higher-income countries. It has not been shown to increase mortality as opioid and other forms of illicit drug dependence do. Our estimates suggest that cannabis use as a risk factor for schizophrenia is not a major contributor to population-level disease burden.
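The burden arithmetic described above reduces to prevalence multiplied by a disability weight. The figures below come from the abstract; the disability weight is back-calculated from them for illustration and is not a number reported by the study (with no attributed mortality, DALYs here are essentially all YLDs).

```python
prevalent_cases = 13.1e6                 # cannabis-dependent people, 2010
dalys = 2.0e6                            # DALYs reported for 2010
implied_dw = dalys / prevalent_cases     # back-calculated disability weight (~0.15)
ylds = prevalent_cases * implied_dw      # YLDs = prevalence x disability weight
print(f"implied disability weight: {implied_dw:.3f}; YLDs ~= {ylds:,.0f}")
```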
Abstract:
Head and neck cancers are among the leading cancers in the coloured and black South African male population, and the perception exists that incidence rates are rising. Aims: To determine the standardised morbidity rates and some of the risk factors for oral cancer in South Africa. Methods: Using histologically verified data from the National Cancer Registry, the age-standardised incidence rates (ASIR) and lifetime risks (LR) of oral cancer in South Africa were calculated for 1988-1991. In an ongoing case-control study (1995 onwards) among black patients in Johannesburg/Soweto, adjusted odds ratios for developing oral cancers in relation to tobacco and alcohol consumption were calculated. Results: Coloured males vs. females: ASIR 13.13 vs. 3.5 (/100,000/year), LR 1:65 vs. 1:244. Black males vs. females: ASIR 9.06 vs. 1.75, LR 1:86 vs. 1:455. White males vs. females: ASIR 8.06 vs. 3.18, LR 1:104 vs. 1:278. Asian males vs. females: ASIR 5.24 vs. 6.66, LR 1:161 vs. 1:125. The odds ratio for oral cancer in black males was 7.0 (95% CI 3.0-14.6) in relation to smoking and 1.3 (95% CI 0.6-2.8) in relation to daily alcohol consumption. In black females, the odds ratios were 3.9 (95% CI 1.7-8.9) in relation to smoking and 1.7 (95% CI 0.7-4.1) in relation to daily alcohol consumption. Conclusions: The risk factors for oral cancer in South Africa are multiple, and gender discrepancies in ASIR and LR signal differences in exposure to carcinogens. It is unclear whether the incidence of oral cancers will rise in the future.
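The ASIR and lifetime risk figures rest on standard epidemiological arithmetic; a minimal sketch on hypothetical per-age-band arrays (direct standardisation against an assumed standard population):

```python
import numpy as np

def age_standardised_rate(cases, person_years, standard_pop):
    """Direct standardisation: age-specific rates weighted by a standard population."""
    rates = np.asarray(cases) / np.asarray(person_years)
    weights = np.asarray(standard_pop) / np.sum(standard_pop)
    return np.sum(rates * weights) * 100_000          # per 100,000 per year

def lifetime_risk(rates, band_widths):
    """Cumulative lifetime risk from age-specific rates; report as '1 in N'."""
    cum_risk = 1 - np.exp(-np.sum(np.asarray(rates) * np.asarray(band_widths)))
    return 1 / cum_risk                                # the N in '1 in N'
```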
Abstract:
Background: The majority of people with dementia live at home until quite late in the disease trajectory, supported by family caregivers who typically take increasing responsibility for providing nutrition. Caregiving is highly stressful, and thus both dyad partners are at risk of nutritional issues. Objective: This study evaluated the nutritional status of both dyad members and the associations between these. Design: Descriptive, correlational. Setting: Community. Participants: 26 dyads of persons with dementia and their caregivers. Measurements: The nutritional status of each dyad member was evaluated at home using a comprehensive battery of measures including the Mini-Nutritional Assessment (MNA), Corrected Arm Muscle Area and a 3-day food diary. Stage of dementia and functional eating capacity were measured for the person with dementia. Caregivers completed a brief burden scale. Results: Of those with dementia (n = 26), a large proportion had nutritional issues (one was malnourished and another 16 were at risk). Six of the caregivers were at risk of malnutrition. In addition, fifteen of the people with dementia did not meet their recommended daily energy requirements. A moderate, significant positive correlation was found between functional eating skills and nutritional status (MNA score) among participants with dementia (r = .523, n = 26, p = .006). Conclusion: These findings suggest that a dyadic perspective on nutritional status provides important insights into risk in this vulnerable group. Specifically, monitoring of the functional eating independence skills of the person with dementia is critical, along with assisting caregivers to be aware of their own eating patterns and intake.