423 results for Bartz, Nicholas
Abstract:
Background Malnutrition is common in patients with advanced epithelial ovarian cancer (EOC) and is associated with impaired quality of life (QoL), longer hospital stay and a higher risk of treatment-related adverse events. This phase III multi-centre randomised clinical trial tested early enteral feeding versus standard care on postoperative QoL. Methods From 2009 to 2013, 109 moderately to severely malnourished patients requiring surgery for suspected advanced EOC were enrolled at five sites across Queensland and randomised to intervention (n = 53) or control (n = 56) groups. The intervention involved intraoperative nasojejunal tube placement and enteral feeding until adequate oral intake could be maintained. Despite being randomised to intervention, 20 patients did not receive feeds (13 did not receive the feeding tube; 7 had it removed early). Control involved postoperative diet as tolerated. QoL was measured at baseline, 6 weeks postoperatively and 30 days after the third cycle of chemotherapy. The primary outcome measure was the difference in QoL between the intervention and control groups. Secondary endpoints included treatment-related adverse event occurrence, length of stay, use of postoperative services, and nutritional status. Results Baseline characteristics were comparable between treatment groups. No significant difference in QoL was found between the groups at any time point. There was a trend towards better nutritional status in patients who received the intervention, but the differences did not reach statistical significance except in the intention-to-treat analysis at 7 days postoperatively (11.8 intervention vs. 13.8 control, p = 0.04). Conclusion Early enteral feeding did not significantly improve patients' QoL compared to standard of care but may improve nutritional status.
Abstract:
Objective: To examine whether streamlining a medical research funding application process saved time for applicants. Design: Cross-sectional surveys before and after the streamlining. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant applications in 2012 or 2014. Main outcome measures: Average researcher time spent preparing an application and the total time for all applications in working days. Results: The average time per application increased from 34 working days before streamlining (95% CI 33 to 35) to 38 working days after streamlining (95% CI 37 to 39; mean difference 4 days, bootstrap p-value <0.001). The estimated total time spent by all researchers on applications after streamlining was 614 working years, a 67-year increase from before streamlining. Conclusions: Streamlined applications were shorter but took longer to prepare on average. Researchers may be allocating a fixed amount of time to preparing funding applications based on their expected return, or may be increasing their time in response to increased competition. Many potentially productive years of researcher time are still being lost to preparing failed applications.
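The reported bootstrap p-value for the 4-day mean difference can be illustrated with a short sketch. The per-applicant data below are simulated placeholders (the survey responses are not reproduced in the abstract); only the resampling logic is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-applicant preparation times (working days), not NHMRC data.
before = rng.normal(34, 10, 500)  # 2012 survey
after = rng.normal(38, 10, 500)   # 2014 survey

observed = after.mean() - before.mean()

# Bootstrap the null distribution of the difference in means by
# resampling from the pooled responses with replacement.
pooled = np.concatenate([before, after])
n_boot = 10_000
diffs = np.empty(n_boot)
for i in range(n_boot):
    sample = rng.choice(pooled, size=pooled.size, replace=True)
    diffs[i] = sample[: before.size].mean() - sample[before.size :].mean()

p_value = np.mean(np.abs(diffs) >= abs(observed))
print(f"mean difference: {observed:.1f} days, bootstrap p = {p_value:.4f}")
```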
Abstract:
The cost effectiveness of antimicrobial stewardship (AMS) programmes was reviewed in hospital settings of Organisation for Economic Co-operation and Development (OECD) countries, limited to adult patient populations. For each of the 36 included studies, the type of AMS strategy and the clinical and cost outcomes were evaluated. The main AMS strategy implemented was prospective audit with intervention and feedback (PAIF), followed by the use of rapid technology, including rapid polymerase chain reaction (PCR)-based methods and matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) technology, for the treatment of bloodstream infections. All but one of the 36 studies reported that AMS resulted in a reduction in pharmacy expenditure. Among the 27 studies measuring changes to health outcomes, either no change was reported post-AMS or the additional benefits achieved from these outcomes were not quantified. Only two studies performed a full economic evaluation: one on a PAIF-based AMS intervention, and the other on the use of rapid technology to select appropriate treatment for serious Staphylococcus aureus infections. Both studies found the interventions to be cost effective. AMS programmes achieved a reduction in pharmacy expenditure, but the lack of consistency in the reported cost outcomes makes it difficult to compare interventions, and a failure to capture complete costs in terms of resource use makes it difficult to determine the true cost of these interventions. There is an urgent need for full economic evaluations that compare relative changes in both clinical and cost outcomes to enable identification of the most cost-effective AMS strategies in hospitals.
Abstract:
- Objective This study examined chronic disease risks and the use of a smartphone activity-tracking application during an intervention in Australian truck drivers (April-October 2014). - Methods Forty-four men (mean age = 47.5 [SD 9.8] years) completed baseline health measures and were subsequently offered access to a free wrist-worn activity tracker and smartphone application (Jawbone UP) to monitor step counts and dietary choices during a 20-week intervention. Chronic disease risks were evaluated against guidelines; weekly step count and dietary logs registered by drivers in the application were analysed to evaluate use of the Jawbone UP. - Results Chronic disease risks were high (e.g. 97% had a high waist circumference [≥94 cm]). Eighteen drivers (41%) did not start the intervention; smartphone technical barriers were the main reason for dropout. Across the 20 weeks, drivers who used the Jawbone UP logged step counts for an average of 6 [SD 1] days/week; mean step counts remained consistent across the intervention (weeks 1–4 = 8,743 [SD 2,867] steps/day; weeks 17–20 = 8,994 [SD 3,478] steps/day). The median number of dietary logs decreased significantly from the start (17 [IQR 38] logs/week) to the end of the intervention (0 [IQR 23] logs/week; p<0.01); the median proportion of healthy diet choices relative to total diet choices logged increased across the intervention (weeks 1–4 = 38 [IQR 21]%; weeks 17–20 = 58 [IQR 18]%). - Conclusions Step counts were more successfully monitored than dietary choices in those drivers who used the Jawbone UP. - Implications Smartphone technology facilitated active living and healthy dietary choices, but also impeded intervention engagement in a number of these high-risk Australian truck drivers.
Abstract:
In the context of an international economic shift from manufacturing to services and the constant expansion of industries towards online services (Sheth and Sharma, 2008), this study is concerned with the design of self-service technologies (SSTs) for online environments. One industry heavily adopting SSTs across a variety of services is health and wellness, where figures show an ever-growing number of health and wellness apps being developed, downloaded and abandoned (Kelley, 2014). Little is known about how to enhance people's engagement with online wellness SSTs to support self-health management and self-efficacy. This literature review argues that the service design of online wellness SSTs can be improved by developing an enhanced understanding from a people perspective and a customer-experience point of view. Customer value, quality of service, usability and self-efficacy all play an important role in understanding how to design SSTs for wellness and keep users engaged. Further study is needed on how people interact and engage with online services in the context of wellness in order to design engaging wellness services.
Abstract:
- Objectives Falls are the most frequent adverse event reported in hospitals. Patient and staff education delivered by trained educators significantly reduced falls and injurious falls in an older rehabilitation population. The purpose of this study was to explore the educators' perspectives on delivering the education and to conceptualise how the programme worked to prevent falls among the older patients who received it. - Design A qualitative exploratory study. - Methods Data were gathered from three sources: a focus group and an interview (n=10 educators), written educator notes, and reflective researcher field notes based on interactions with the educators during the primary study. The educators delivered the programme on eight rehabilitation wards for periods of between 10 and 40 weeks. They provided older patients with individualised education to engage in falls prevention and provided staff with education to support patient actions. Data were thematically analysed and presented using a conceptual framework. - Results Falls prevention education led to mutual understanding between staff and patients, which assisted patients to engage in falls prevention behaviours. The educators perceived that they could facilitate an effective three-way interaction between staff actions, patient actions and the ward environment, which led to behaviour change on the wards. This included engaging with staff and patients and assisting them to reconcile differing perspectives about falls prevention behaviours. - Conclusions Individualised falls prevention education effectively provides patients who receive it with the capability and motivation to develop and undertake behavioural strategies that reduce their falls, if supported by staff and the ward environment.
Abstract:
Background The objective is to estimate the incremental cost-effectiveness of the Australian National Hand Hygiene Initiative implemented between 2009 and 2012, using healthcare-associated Staphylococcus aureus bacteraemia as the outcome. Baseline comparators are the eight existing state and territory hand hygiene programmes. The setting is the Australian public healthcare system, and 1,294,656 admissions from the 50 largest Australian hospitals are included. Methods The design is a cost-effectiveness modelling study using a before-and-after quasi-experimental design. The primary outcome is cost per life year saved from reduced cases of healthcare-associated Staphylococcus aureus bacteraemia, with cost estimated as the annual on-going maintenance costs less the costs saved from fewer infections. Data were harvested from existing sources or collected prospectively, and the time horizon for the model was 12 months, 2011–2012. Findings No usable pre-implementation Staphylococcus aureus bacteraemia data were made available from the 11 study hospitals in Victoria or the single hospital in the Northern Territory, leaving 38 hospitals among six states and territories available for cost-effectiveness analyses. Total annual costs increased by $2,851,475 for a return of 96 years of life, giving an incremental cost-effectiveness ratio (ICER) of $29,700 per life year gained. Probabilistic sensitivity analysis revealed a 100% chance the initiative was cost effective in the Australian Capital Territory and Queensland, with ICERs of $1,030 and $8,988, respectively. There was an 81% chance it was cost effective in New South Wales with an ICER of $33,353, a 26% chance for South Australia with an ICER of $64,729, and a 1% chance for Tasmania and Western Australia. The 12 hospitals in Victoria and the Northern Territory incur annual on-going maintenance costs of $1.51M; no information was available to describe cost savings or health benefits. Conclusions The Australian National Hand Hygiene Initiative was cost-effective against an Australian threshold of $42,000 per life year gained. The return on investment varied among the states and territories of Australia.
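The headline ICER follows directly from the incremental cost and the life years gained; a minimal sketch of the arithmetic, using only figures quoted in the abstract:

```python
# Incremental cost-effectiveness ratio (ICER) from the figures above.
incremental_cost = 2_851_475   # additional annual cost, AUD
life_years_gained = 96         # years of life returned

icer = incremental_cost / life_years_gained
print(f"ICER = ${icer:,.0f} per life year gained")  # ~$29,700

# Cost effective against the stated Australian threshold of $42,000 per life year.
print(icer < 42_000)  # True
```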
Abstract:
The Distributed Network Protocol v3.0 (DNP3) is one of the most widely used protocols for controlling national infrastructure. The move from point-to-point serial connections to Ethernet-based network architectures allows for large and complex critical infrastructure networks. However, networks and configurations change, so auditing tools are needed to aid in critical infrastructure network discovery. In this paper we present a series of intrusive techniques used for reconnaissance on DNP3 critical infrastructure. Our algorithms discover DNP3 outstation slaves along with their DNP3 addresses, their corresponding master, and class object configurations. To validate the presented DNP3 reconnaissance algorithms and demonstrate their practicality, we present an implementation of a software tool using a DNP3 plug-in for Scapy. Our implementation validates the utility of our DNP3 reconnaissance technique. The presented techniques are useful for penetration testing, vulnerability assessments and DNP3 network discovery.
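The address-discovery step can be pictured with a short standalone sketch. The paper's Scapy plug-in is not reproduced here; instead, raw TCP sockets send header-only REQUEST LINK STATUS frames (link function code 9) to candidate outstation link addresses on the registered DNP3 port and treat any well-formed reply as a hit. The target host, master source address and scanned range are placeholders.

```python
#!/usr/bin/env python3
"""Hedged sketch of DNP3 outstation address discovery over TCP."""
import socket
import struct

DNP3_PORT = 20000  # IANA-registered port for DNP3 over TCP

def dnp3_crc(data: bytes) -> bytes:
    """CRC-16/DNP (reflected poly 0x3D65, final complement), sent LSB first."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA6BC if crc & 1 else crc >> 1
    return struct.pack("<H", (~crc) & 0xFFFF)

def link_status_request(dst: int, src: int = 1) -> bytes:
    """Header-only link frame: REQUEST LINK STATUS (CTRL 0xC9, LEN 5)."""
    header = struct.pack("<BBBBHH", 0x05, 0x64, 5, 0xC9, dst, src)
    return header + dnp3_crc(header)  # header CRC covers all eight octets

def probe(host: str, dst: int, timeout: float = 1.0) -> bool:
    """Return True if an outstation answers for link address `dst`."""
    try:
        with socket.create_connection((host, DNP3_PORT), timeout=timeout) as s:
            s.settimeout(timeout)
            s.sendall(link_status_request(dst))
            reply = s.recv(64)
    except (socket.timeout, OSError):
        return False
    # Any frame echoing the DNP3 start octets implies the address is in use.
    return reply[:2] == b"\x05\x64"

if __name__ == "__main__":
    HOST = "192.0.2.10"  # placeholder outstation IP (TEST-NET-1)
    for addr in range(0, 100):  # scan a slice of the 16-bit address space
        if probe(HOST, addr):
            print(f"outstation found at DNP3 link address {addr}")
```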
Abstract:
Head motion (HM) is a well-known confound in analyses of functional MRI (fMRI) data. Neuroimaging researchers therefore typically treat HM as a nuisance covariate in their analyses. Even so, it is possible that HM shares a common genetic influence with the trait of interest. Here we investigate the extent to which this relationship is due to shared genetic factors, using HM extracted from resting-state fMRI (RS-fMRI) and maternal- and self-report measures of Inattention and Hyperactivity-Impulsivity from the Strengths and Weaknesses of ADHD Symptoms and Normal Behaviour (SWAN) scales. Our sample consisted of healthy young adult twins (N = 627 (63% females), including 95 MZ and 144 DZ twin pairs, mean age 22, who had mother-reported SWAN; N = 725 (58% females), including 101 MZ and 156 DZ pairs, mean age 25, with self-reported SWAN). This design enabled us to distinguish genetic from environmental factors in the association between head movement and ADHD scales. HM was moderately correlated with maternal reports of Inattention (r = 0.17, p-value = 7.4E-5) and Hyperactivity-Impulsivity (r = 0.16, p-value = 2.9E-4), and these associations were mainly due to pleiotropic genetic factors, with genetic correlations [95% CIs] of rg = 0.24 [0.02, 0.43] and rg = 0.23 [0.07, 0.39]. Correlations between self-reports and HM were not significant, due largely to increased measurement error. These results indicate that treating HM as a nuisance covariate in neuroimaging studies of ADHD will likely reduce power to detect between-group effects, as the implicit assumption of independence between HM and Inattention or Hyperactivity-Impulsivity is not warranted. The implications of this finding are problematic for fMRI studies of ADHD, as failing to apply HM correction is known to increase the likelihood of false positives. We discuss two ways to circumvent this problem: censoring the motion-contaminated frames of the RS-fMRI scan or explicitly modelling the relationship between HM and Inattention or Hyperactivity-Impulsivity.
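Of the two remedies, frame censoring is the more mechanical and can be sketched briefly. The framewise displacement formula below follows Power et al. (2012); the 0.3 mm threshold is one common choice, not a value taken from this study.

```python
import numpy as np

def framewise_displacement(motion: np.ndarray, radius: float = 50.0) -> np.ndarray:
    """Framewise displacement (Power et al., 2012). `motion` is (T, 6):
    three translations in mm and three rotations in radians; rotations are
    converted to arc length on a sphere of `radius` mm."""
    deltas = np.abs(np.diff(motion, axis=0))
    deltas[:, 3:] *= radius  # radians -> mm
    return np.concatenate([[0.0], deltas.sum(axis=1)])

def keep_mask(motion: np.ndarray, threshold: float = 0.3) -> np.ndarray:
    """Boolean mask of frames whose FD stays below the censoring threshold."""
    return framewise_displacement(motion) < threshold

# Hypothetical usage with a (T, voxels) time series and (T, 6) motion parameters:
# clean_timeseries = timeseries[keep_mask(motion_params)]
```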
Abstract:
- Objective To compare health service cost and length of stay between a traditional and an accelerated diagnostic approach to assessing acute coronary syndromes (ACS) among patients who presented to the emergency department (ED) of a large tertiary hospital in Australia. - Design, setting and participants This historically controlled study analysed data collected from two independent patient cohorts presenting to the ED with potential ACS. The first cohort of 938 patients was recruited in 2008–2010, and these patients were assessed using the traditional diagnostic approach detailed in the national guideline. The second cohort of 921 patients was recruited in 2011–2013 and assessed with the accelerated diagnostic approach known as the Brisbane protocol. The Brisbane protocol applied early serial troponin testing at 0 and 2 h after presentation to the ED, compared with 0 and 6 h testing in the traditional assessment process. The Brisbane protocol also defined a low-risk group of patients in whom no objective testing was performed. A decision tree model was used to compare the expected cost and length of stay in hospital between the two approaches, and probabilistic sensitivity analysis was used to account for model uncertainty. - Results Compared with the traditional diagnostic approach, the Brisbane protocol was associated with a reduced expected cost of $1229 (95% CI −$1266 to $5122) and a reduced expected length of stay of 26 h (95% CI −14 to 136 h). The Brisbane protocol allowed physicians to discharge a higher proportion of low-risk and intermediate-risk patients from the ED within 4 h (72% vs 51%). Results from the sensitivity analysis suggested the Brisbane protocol had a high chance of being cost-saving and time-saving. - Conclusions This study provides some evidence of cost savings from a decision to adopt the Brisbane protocol. Benefits would arise for the hospital and for patients and their families.
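The decision-tree comparison can be pictured as a probability-weighted sum of pathway costs. The pathways, probabilities and costs below are illustrative placeholders, not the study's fitted values.

```python
# Expected cost of one diagnostic strategy = sum over patient pathways of
# (pathway probability) x (pathway cost). Values here are illustrative only.
pathways = {
    "low risk, discharged from ED within 4 h": (0.40, 1_200),
    "intermediate risk, early serial testing": (0.35, 3_500),
    "high risk, admitted for objective tests": (0.25, 9_000),
}

expected_cost = sum(p * cost for p, cost in pathways.values())
print(f"expected cost per presentation: ${expected_cost:,.0f}")

# Repeating the calculation for both strategies and differencing the two
# expected costs gives an incremental figure like the one reported above;
# the probabilistic sensitivity analysis redraws (probability, cost) pairs
# from fitted distributions and recomputes this difference many times.
```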
Abstract:
An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called 'Meltdown', performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry (DSF) experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data to make this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
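As background to what Meltdown automates, the melting temperature of a single well is commonly read off the first derivative of the fluorescence-versus-temperature curve. The sketch below uses that generic dF/dT approach on simulated data; it is not Meltdown's Bayesian pipeline, which is not reproduced here.

```python
import numpy as np

def estimate_tm(temps: np.ndarray, fluorescence: np.ndarray) -> float:
    """Estimate the melting temperature as the temperature of steepest
    fluorescence increase (peak of the smoothed first derivative)."""
    dF = np.gradient(fluorescence, temps)
    kernel = np.ones(5) / 5  # light smoothing to suppress instrument noise
    dF_smooth = np.convolve(dF, kernel, mode="same")
    return float(temps[np.argmax(dF_smooth)])

# Simulated sigmoidal melt curve with a true Tm of ~55 degrees C.
rng = np.random.default_rng(1)
t = np.linspace(25.0, 95.0, 351)
f = 1.0 / (1.0 + np.exp(-(t - 55.0) / 2.0)) + rng.normal(0, 0.01, t.size)
print(f"estimated Tm: {estimate_tm(t, f):.1f} C")
```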
Abstract:
Caesarean section is the most common surgical procedure performed on women worldwide and infections following caesarean section occur in approximately 10% of Australian women...