957 results for Ageing, nutrition, hospital
Abstract:
Despite the increasing number of immigrants, there is a limited body of literature describing the use of hospital emergency department (ED) care by immigrants in Australia. This study aims to describe how immigrants from refugee source countries (IRSC) utilise ED care compared to immigrants from the main English-speaking countries (MESC), immigrants from other countries (IOC) and the local population in Queensland. A retrospective analysis of a Queensland state-wide hospital ED dataset (ED Information System) from 1-1-2008 to 31-12-2010 was conducted. Our study showed that immigrants are not a homogenous group. We found that IRSC are more likely to use interpreters (8.9%) in the ED compared to IOC. Furthermore, IRSC have a higher rate of ambulance use (odds ratio 1.2, 95% confidence interval (CI) 1.2–1.3), are less likely to be admitted to the hospital from the ED (odds ratio 0.7, 95% CI 0.7–0.8) and have a longer length of stay (LOS) in the ED (mean difference 33.0 minutes, 95% CI 28.8–37.2) compared to the Australian-born population. Our findings highlight the need to develop policies and educational interventions to ensure the equitable use of health services among vulnerable immigrant populations.
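Odds ratios with Wald confidence intervals, as quoted in this abstract, can be computed directly from a 2×2 exposure table. The sketch below is illustrative only: the counts are invented for the example and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal counts.
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts (not the study's data): ambulance vs non-ambulance
# arrivals for one immigrant group against the comparison group.
print(odds_ratio_ci(300, 700, 2000, 5600))
```

With these invented counts the point estimate happens to be 1.2; the real study estimates would come from the full ED dataset.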
Abstract:
Background Falls are among the most frequently occurring adverse events that impact upon the recovery of older hospital inpatients. Falls can threaten both immediate and longer-term health and independence. There is a need to identify cost-effective means of preventing falls in hospitals. Hospital-based falls prevention interventions tested in randomized trials have not yet been subjected to economic evaluation. Methods An incremental cost-effectiveness analysis was undertaken from the health service provider perspective, over the period of hospitalization (time horizon), using the Australian Dollar (A$) at 2008 values. Analyses were based on data from a randomized trial among n = 1,206 acute and rehabilitation inpatients. Decision tree modeling with three-way sensitivity analyses was conducted using burden-of-disease estimates developed from trial data and previous research. The intervention was a multimedia patient education program, provided with trained health professional follow-up, shown to reduce falls among cognitively intact hospital patients. Results The short-term cost to a health service of one cognitively intact patient becoming a faller could be as high as A$14,591 (2008). The education program cost A$526 (2008) to prevent one cognitively intact patient becoming a faller and A$294 (2008) to prevent one fall, based on primary trial data. These estimates were unstable due to high variability in the hospital costs accrued by individual patients in the trial. There was a 52% probability that the complete program was both more effective and less costly (from the health service perspective) than usual care alone. Decision tree sensitivity analyses identified that, when provided in real-life contexts, the program would be both more effective in preventing falls among cognitively intact inpatients and cost saving where the proportion of these patients who would otherwise fall under usual care conditions is at least 4.0%.
Conclusions This economic evaluation was designed to help health care providers decide in which circumstances this intervention should be provided. If the proportion of cognitively intact patients falling on a ward under usual care conditions is 4% or greater, then providing the complete program in addition to usual care will likely both prevent falls and reduce costs for a health service.
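The "cost per fall prevented" figure reported above is an incremental cost-effectiveness ratio: the extra cost of the program divided by the number of falls it prevents relative to usual care. A minimal sketch, with hypothetical per-arm totals invented for illustration (not the trial's actual cost data):

```python
def cost_per_event_prevented(cost_int, cost_usual, events_int, events_usual):
    """Incremental cost per adverse event prevented (here: per fall),
    from the health service perspective."""
    incremental_cost = cost_int - cost_usual
    events_prevented = events_usual - events_int
    if events_prevented <= 0:
        raise ValueError("intervention prevented no events")
    return incremental_cost / events_prevented

# Hypothetical totals: the program adds A$29,400 in delivery costs
# and prevents 100 falls compared with usual care.
print(cost_per_event_prevented(129_400, 100_000, 50, 150))  # 294.0
```

The same ratio with "fallers" rather than "falls" in the denominator yields the cost per faller prevented; sensitivity analyses then vary these inputs over plausible ranges, as the decision tree modeling in the study does.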
Abstract:
Exercise-based cardiac rehabilitation (CR) is efficacious in reducing mortality and hospital admissions; however, it remains inaccessible to large proportions of the patient population. Removal of attendance barriers for hospital or centre-based CR has seen the promotion of home-based CR. Delivery of safe and appropriately prescribed exercise in the home was first documented 25 years ago, with the utilisation of fixed land-line telecommunications to monitor ECG. The advent of miniature ECG sensors, in conjunction with smartphones, now enables CR to be delivered with greater flexibility with regard to location, time and format, while retaining the capacity for real-time patient monitoring. A range of new systems allow other signals, including speed, location, pulse oximetry and respiration, to be monitored, and these may have application in CR. There is compelling evidence that telemonitoring-based CR is an effective alternative to traditional CR practice. The long-standing barrier of access to centre-based CR, combined with new delivery platforms, raises the question of when telemonitoring-based CR could replace conventional approaches as standard practice.
Abstract:
Based on a series of interviews with Australians between the ages of 55 and 75, this paper explores the relations between our participants' attitudes towards and use of communication, social and tangible technologies and three relevant themes from our data: staying active, friends and families, and cultural selves. While common across our participants' experiences of ageing, these themes were notable for the diverse ways they were experienced and expressed within individual lives, and for the different roles technology played within each. A brief discussion of the implications of our ageing population's diversity for the design of emerging technologies concludes the paper.
Abstract:
This study is the first to employ an epidemiological framework to evaluate the fitness for purpose of ICD-10-AM external cause of injury codes and of ambulance and hospital clinical documentation for injury surveillance. Importantly, this thesis develops an evidence-based platform to guide future improvements in the routine data collections used to inform the design of effective injury prevention strategies. Quantification of the impact of ambulance clinical records on the overall information quality of Queensland hospital morbidity data collections for injury causal information is a unique and notable contribution of this study.
Abstract:
Generally, the magnitude of pollutant emissions from diesel engines is ultimately coupled to the structure of the fuel molecules. The presence of oxygen, the level of unsaturation and the carbon chain length of the respective molecules influence the combustion chemistry. It is speculated that increased oxygen content in the fuel may lead to increased oxidative potential (Stevanovic, S. 2013). In addition, upon exposure to UV and ozone in the atmosphere, the chemical composition of the exhaust changes: the presence of an oxidant and UV triggers a cascade of photochemical reactions, as well as the partitioning of semi-volatile compounds between the gas and particle phases. To gain insight into the relationship between the molecular structures of the esters, their volatile organic content and the potential toxicity of diesel exhaust particulate matter, measurements were conducted on a modern common rail diesel engine. This research also investigates the contribution of atmospheric conditions to the transfer of the semi-volatile fraction of diesel exhaust from the gas phase to the particle phase, and the extent to which semi-volatile organic compounds (SVOCs) are related to the oxidative potential, expressed through the concentration of reactive oxygen species (ROS) (Stevanovic, S. 2013)...
Abstract:
AIM: To document and compare current practice in the nutrition assessment of Parkinson's disease by dietitians in Australia and Canada, in order to identify priority areas for the review and development of practice guidelines and to direct future research. METHODS: An online survey was distributed to DAA members and PEN subscribers through their email newsletters. The survey captured current practice across the phases of the Nutrition Care Plan; the results of the assessment phase are presented here. RESULTS: Eighty-four dietitians responded. Practice differed in the choice of nutrition screening and assessment tools, including the BMI ranges considered appropriate. Nutrition impact symptoms were commonly assessed, but information about Parkinson's disease medication interactions was not consistently assessed. CONCLUSIONS: The variation in practice related to the use of screening and assessment methods may result in the identification of different goals for subsequent interventions. Even more practice variation was evident for those items more specific to Parkinson's disease, which may be due to the lack of evidence to guide practice. Further research is required to support decisions in the nutrition assessment of Parkinson's disease.
Abstract:
Aim This study aimed to demonstrate how supervisors and students use their time across the three domains of nutrition and dietetic clinical placement, and to what extent patient care and non-patient activities change during placement compared with pre- and post-placement. Methods A cohort survey design was used with students from two Queensland universities, and their supervisors, in 2010. Participants recorded their time use in either a paper-based or an electronic survey. Supervisors' and students' time use was calculated as independent daily means according to the time-use categories reported over the length of the placement. The mean daily number of occasions of service, length of occasions of service, and project and other time use in minutes were reported as productivity output indicators, and the data were imputed. A linear mixed modelling approach was used to describe the relationship between the stage of placement and time use in minutes. Results Combined students' (n=21) and supervisors' (n=29) time use, as occasions of service or length of occasions of service in patient care activities, differed significantly pre-, during and post-placement. On project-based placements in food service management and community public health nutrition, supervisors' project activity time decreased significantly during placements, with students spending more time on project activities. Conclusions This study showed that students do not reduce occasions of service in patient care, and that they enhance project activities in food service and community public health nutrition while on placement. A larger study is required to confirm these results.
Abstract:
The International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as "BMI < 18.5 kg/m2 or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting". The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients from Australian and New Zealand hospitals. This study determined whether malnourished participants were assigned malnutrition-related codes as per ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants' nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI < 18.5 kg/m2, SGA-B (moderately malnourished) or SGA-C (severely malnourished). After three months, in this prospective cohort study, the hospitals' health information/medical records departments provided coding results for malnourished participants. Although malnutrition was prevalent in 32% (n=993) of the cohort (N=3122), a significantly smaller proportion was coded for malnutrition (n=162, 16%, p<0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could potentially result in a significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
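The ICD-10-AM definition quoted above is a simple rule over a few clinical observations, and can be sketched as a small classifier. This is only an illustration of the quoted definition's logic; the parameter names are invented, and the study's SGA-B/SGA-C criteria involve clinical judgment that a rule like this cannot capture.

```python
def meets_icd10am_malnutrition(bmi=None, weight_loss_pct=None,
                               suboptimal_intake=False,
                               fat_loss_or_wasting=False):
    """Sketch of the quoted ICD-10-AM malnutrition definition:
    BMI < 18.5 kg/m2, OR unintentional weight loss of >= 5% with
    evidence of suboptimal intake resulting in subcutaneous fat
    loss and/or muscle wasting."""
    if bmi is not None and bmi < 18.5:
        return True
    if (weight_loss_pct is not None and weight_loss_pct >= 5
            and suboptimal_intake and fat_loss_or_wasting):
        return True
    return False

# A patient with BMI 17.9 meets the definition on the BMI arm alone:
print(meets_icd10am_malnutrition(bmi=17.9))  # True
```

Note that the weight-loss arm requires all three elements (loss ≥ 5%, suboptimal intake, and fat loss or wasting) together, which is one reason documentation gaps translate directly into missed codes.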
Abstract:
Background: Nutrition screening is usually administered by nurses. However, most studies of nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with the composite score of the three used to determine the risk of malnutrition. The aim of the study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods: In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results: The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with a sensitivity of 89% and a specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between the two nurses conducting the 3-MinNS tool was 78.3%. Conclusion: 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
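The sensitivity and specificity figures behind a ROC cutoff like the one above come from a confusion matrix at each candidate cutoff. A minimal sketch with toy data (invented for illustration, not the study's patients):

```python
def sensitivity_specificity(scores, malnourished, cutoff):
    """Sensitivity and specificity of a screening score against a
    reference classification (True = malnourished on the reference
    assessment). Scores at or above the cutoff count as 'at risk'."""
    pairs = list(zip(scores, malnourished))
    tp = sum(1 for s, m in pairs if s >= cutoff and m)
    fn = sum(1 for s, m in pairs if s < cutoff and m)
    tn = sum(1 for s, m in pairs if s < cutoff and not m)
    fp = sum(1 for s, m in pairs if s >= cutoff and not m)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: screening scores and reference malnutrition status.
scores       = [0, 1, 2, 3, 4, 5, 2, 6]
malnourished = [False, False, False, True, True, True, True, True]
print(sensitivity_specificity(scores, malnourished, cutoff=3))  # (0.8, 1.0)
```

Sweeping the cutoff over all observed score values and picking the point that best balances the two rates (or, as here, the lowest cutoff that still captures all severely malnourished patients) is the essence of the ROC-based choice the study describes.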
Abstract:
It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished, and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated, and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status; anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken; and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B), while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and that it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred and twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score also differed significantly between the groups (2 vs 8, p<.05). The nutrition impact symptoms which differentiated the well-nourished from the malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in the community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for the early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use; none of these tools have been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for under 65 years, ≤23.5 kg/m2 for 65 years and older) and the revised Mini Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with a sensitivity (Sn) of 94.7% and a specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B. Regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (4% more severe SGA-A vs 21% SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by a lack of social support.

Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. INT gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (18.3 kJ/kg vs 3.8 kJ/kg and 0.3 g/kg vs 0.15 g/kg). The increase in protein intake was significant only in the SC group. The changes in intake did not differ between the groups.
There were no significant changes in any motor or non-motor symptoms, or in "off" times or dyskinesias, in either group. Aspects of quality of life, especially emotional well-being, also improved over the 12 weeks. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease, as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high-protein, high-energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices, with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
Abstract:
This paper presents the outcome of a study that investigated the relationships between prior experience with technology, self-efficacy, technology anxiety, complexity of interface (nested versus flat) and intuitive use in older people. The findings show that, as expected, older people took less time to complete the task on the interface that used a flat structure than on the interface that used a complex nested structure. All age groups also used the flat interface more intuitively. However, contrary to what was hypothesised, older age groups did better under anxious conditions. Interestingly, older participants did not make significantly more errors than younger age groups on either interface structure.
Abstract:
As Earth's climate is rapidly changing, the impact of ambient temperature on health outcomes has attracted increasing attention in recent times. A considerable number of excess deaths have been reported because of exposure to hot and cold ambient temperatures. However, relatively little research has been conducted on the relation between temperature and morbidity. The aim of this study was to characterize the relationship between both hot and cold temperatures and emergency hospital admissions in Brisbane, Australia, and to examine whether the relation varied by age and socioeconomic factors. It also aimed to explore the lag structure of the temperature–morbidity association for respiratory causes, and to estimate the magnitude of emergency hospital admissions for cardiovascular diseases attributable to hot and cold temperatures, given the large contribution of these diseases to total emergency hospital admissions. A time series study design was applied using routinely collected data on daily emergency hospital admissions, weather and air pollution in Brisbane during 1996–2005. A Poisson regression model with a distributed lag non-linear structure was adopted to assess the impact of temperature on emergency hospital admissions after adjustment for confounding factors. Both hot and cold effects were found, with a higher risk for hot temperatures than for cold. Increases in mean temperature above 24.2°C were associated with increased morbidity, with the largest effect among the elderly (≥75 years). The magnitude of the risk estimates for hot temperature varied by age and socioeconomic factors: high population density, low household income and unemployment appeared to modify the temperature–morbidity relation. There were different lag structures for hot and cold temperatures, with an acute hot effect within 3 days of exposure and a cold effect on respiratory diseases lagged by about 2 weeks.
A strong harvesting effect after 3 days was evident for respiratory diseases. People suffering from cardiovascular diseases were found to be more vulnerable to hot temperatures than to cold. However, more cardiovascular admissions in Brisbane were attributable to cold temperatures than to hot temperatures. This study contributes to the knowledge base on the association between temperature and morbidity, which is vitally important in the context of ongoing climate change. The findings may provide useful information for the development and implementation of public health policy and strategic initiatives designed to reduce and prevent the burden of disease due to the impact of climate change.
Abstract:
Background The role of fathers in shaping their child’s eating behaviour and weight status through their involvement in child feeding has rarely been studied. This study aims to describe the fathers’ perceived responsibility for child feeding, and to identify predictors of how frequently fathers eat meals with their child. Methods Four hundred and thirty-six Australian fathers (M age=37 years, SD=6 years; 34% university educated) of a 2-5 year old child (M age=3.5 years, SD=0.9 years; 53% boys) were recruited via contact with mothers enrolled in existing research projects or a University staff and student email list. Data were collected from fathers via a self-report questionnaire. Descriptive and hierarchical linear regression analyses were conducted. Results The majority of fathers reported that the family often/mostly ate meals together (79%). Many fathers perceived that they were responsible at least half of the time for feeding their child in terms of organizing meals (42%); amount offered (50%) and deciding if their child eats the ‘right kind of foods’ (60%). Time spent in paid employment was inversely associated with how frequently fathers ate meals with their child (β=-0.23, p<0.001); however, both higher perceived responsibility for child feeding (β=-0.16, p<0.004) and a more involved and positive attitude toward their role as a father (β=0.20, p<0.001) were positively related to how often they ate meals with their child, adjusting for a range of paternal and child covariates, including time spent in paid employment. Conclusions Fathers from a broad range of educational backgrounds appear willing to participate in research studies on child feeding. Most fathers were engaged and involved in family meals and child feeding. This suggests that fathers, like mothers, should be viewed as potential agents for the implementation of positive feeding practices within the family.
Abstract:
Researchers over the last decade have documented the association between general parenting style and numerous factors related to childhood obesity (e.g., children's eating behaviors, physical activity, and weight status). Many recent childhood obesity prevention programs are family focused and designed to modify parenting behaviors thought to contribute to childhood obesity risk. This article presents a brief consideration of conceptual, methodological, and translational issues that can inform future research on the role of parenting in childhood obesity. They include: (1) general versus domain-specific parenting styles and practices; (2) the role of ethnicity and culture; (3) assessing bidirectional influences; (4) broadening assessments beyond the immediate family; (5) novel approaches to parenting measurement; and (6) designing effective interventions. Numerous directions for future research are offered.