224 results for Health Outcomes
Abstract:
Quantifying spatial and/or temporal trends in environmental modelling data requires that measurements be taken at multiple sites. The number of sites and the duration of measurement at each site must be balanced against the costs of equipment and the availability of trained staff. The split panel design comprises short measurement campaigns at multiple locations and continuous monitoring at reference sites [2]. Here we present a spatio-temporal modelling approach for ultrafine particle number concentration (PNC) recorded according to a split panel design. The model describes the temporal trends and background levels at each site. The data were measured as part of the “Ultrafine Particles from Transport Emissions and Child Health” (UPTECH) project, which aims to link air quality measurements, child health outcomes and a questionnaire on each child’s history and demographics. The UPTECH project involves measuring aerosol and particle counts and local meteorology at each of 25 primary schools for two weeks and at three long-term monitoring stations, as well as health outcomes for a cohort of students at each school [3].
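As a rough illustration of the split panel idea (not the UPTECH model itself), the sketch below fits site-specific background levels as random intercepts and a shared diurnal trend as fixed harmonic terms; the data, site names and effect sizes are all invented for the example.

```python
# A minimal sketch, assuming synthetic data: site-specific background levels
# as random intercepts plus a shared diurnal trend as fixed harmonic terms.
# Site names and effect sizes are invented; the UPTECH model is more elaborate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
hours = np.arange(24 * 14)  # one two-week measurement campaign per site
frames = []
for site, background in [("school_A", 9.2), ("school_B", 9.6),
                         ("school_C", 9.4), ("ref_station", 9.0)]:
    diurnal = 0.4 * np.sin(2 * np.pi * hours / 24)      # shared temporal trend
    log_pnc = background + diurnal + rng.normal(0, 0.3, hours.size)
    frames.append(pd.DataFrame({"site": site, "hour": hours % 24,
                                "log_pnc": log_pnc}))
data = pd.concat(frames, ignore_index=True)

# Harmonic terms capture the shared trend; the random intercept absorbs
# each site's background level.
data["hour_sin"] = np.sin(2 * np.pi * data["hour"] / 24)
data["hour_cos"] = np.cos(2 * np.pi * data["hour"] / 24)
fit = smf.mixedlm("log_pnc ~ hour_sin + hour_cos", data, groups=data["site"]).fit()
print(fit.summary())
```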
Abstract:
Airborne particulate matter pollution is of concern for a number of reasons and has been widely recognised as an important risk factor for human health. A number of toxicological and epidemiological studies have reported negative health effects on both the respiratory and cardiovascular systems. Despite the availability of a huge body of research, the underlying toxicological mechanisms by which particles induce adverse health effects are not yet entirely understood. The production of reactive oxygen species (ROS) has been shown to induce oxidative stress, which is proposed as a mechanism for many of the adverse health outcomes associated with exposure to particulate matter (PM). Therefore, it is crucial to introduce a technique that allows rapid and routine screening of the oxidative potential of PM.
Abstract:
Recommendations to improve national diabetes-related foot disease (DRFD) care:
• National data collection on incidence and outcomes of DRFD.
• Improved access to care, through the Medicare Benefits Schedule, for people with diabetes who have a current or past foot complication.
• Standardised national model for interdisciplinary DRFD care.
• National accreditation of interdisciplinary foot clinics and staff.
• Subsidies for evidence-based treatments for DRFD, including medical-grade footwear and pressure off-loading devices.
• Holistic diabetes care initiatives to “close the gap” on inequities in health outcomes for Aboriginal and Torres Strait Islander peoples.
Abstract:
Background: Total hip arthroplasty (THA) is a commonly performed procedure and numbers are increasing with ageing populations. One of the most serious complications in THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.
Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. Identification of competing strategies or combinations of strategies that are clinically relevant to the control of SSI related to hip arthroplasty
2. Evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty
3. Construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies
4. Quantification of the effect of uncertainty in the model
5. Assessment of the value of perfect information among model parameters to inform future data collection
Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts and to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences, as well as to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. Strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients’ transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty of model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.
Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty).
Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared with using antibiotic prophylaxis alone. The error probability that this strategy does not have the largest NMB was very low (<5%). Not using antibiotic prophylaxis (No AP) or combining antibiotic prophylaxis with laminar air operating rooms (AP & LOR) resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.
Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
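The decision rule behind these results can be made concrete. The sketch below ranks strategies by net monetary benefit, NMB = willingness-to-pay × QALYs − cost; the threshold and the cost/QALY figures are invented placeholders, not the thesis model's outputs.

```python
# A minimal sketch of the decision rule, assuming an invented willingness-to-pay
# threshold and invented cost/QALY totals (these are NOT the thesis outputs):
# rank strategies by net monetary benefit, NMB = WTP * QALYs - cost.
WTP = 50_000  # assumed willingness to pay per QALY, AUD

strategies = {                    # strategy: (total cost, total QALYs) per cohort
    "No AP":    (120.0e6, 250_000.0),
    "AP":       (118.4e6, 250_163.0),
    "AP & ABC": (118.0e6, 250_218.0),
    "AP & LOR": (121.0e6, 250_150.0),
}

nmb = {name: WTP * qalys - cost for name, (cost, qalys) in strategies.items()}
for name in sorted(nmb, key=nmb.get, reverse=True):
    print(f"{name:9s} NMB = ${nmb[name]:,.0f}")
print("Most cost-effective strategy:", max(nmb, key=nmb.get))
```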
Abstract:
Background: Greater research utilisation in cancer nursing practice is needed in order to provide well-informed and effective nursing care to people affected by cancer. This paper aims to report on the implementation of evidence-based practice in a tertiary cancer centre. Methods: Using a case report design, this paper reports on the use of the Collaborative Model for Evidence Based Practice (CMEBP) in an Australian tertiary cancer centre. The clinical case is the uptake of routine application of chlorhexidine-impregnated sponge dressings for preventing centrally inserted catheter-related bloodstream infections. In this case report, a number of processes that resulted in a service-wide practice change are described. Results: This model was considered a feasible method for successful research utilisation. In this case report, chlorhexidine-impregnated sponge dressings were proposed and implemented in the tertiary cancer centre with the aim of reducing the incidence of centrally inserted catheter-related bloodstream infections and potentially improving patient health outcomes. Conclusion: The CMEBP is feasible and effective for implementing clinical evidence into cancer nursing practice. Cancer nurses and health administrators need to ensure that a supportive infrastructure and environment for clinical inquiry and research utilisation exist, in order to enable successful implementation of evidence-based practice in their cancer centres.
Abstract:
Objective: To investigate whether hospital utilisation and health outcomes in Victoria differ between people born in refugee-source countries and those born in Australia. Design and setting: Analysis of a statewide hospital discharge dataset for the 6 financial years from 1 July 1998 to 30 June 2004. Hospital admissions of people born in eight countries for which the majority of entrants to Australia arrived as refugees were included in the analysis. Main outcome measures: Age-standardised rates and rate ratios for: total hospital admissions; emergency admissions; surgical admissions; total days in hospital; discharge at own risk; hospital deaths; admissions due to infectious and parasitic diseases; and admissions due to mental and behavioural disorders. Results: In 2003–04, compared with the Australian-born Victorian population, people born in refugee-source countries had lower rates of surgical admission (rate ratio [RR], 0.85; 95% CI, 0.81–0.88), total days in hospital (RR, 0.74; 95% CI, 0.73–0.75), and admission due to mental and behavioural disorders (RR, 0.70; 95% CI, 0.65–0.76). Over the 6-year period, rates of total days in hospital and rates of admission due to mental and behavioural disorders for people born in refugee-source countries increased towards Australian-born averages, while rates of total admissions, emergency admissions, and admissions due to infectious and parasitic diseases increased above the Australian-born averages. Conclusions: Use of hospital services among people born in refugee-source countries is not higher than that of the Australian-born population and shows a trend towards Australian-born averages. Our findings indicate that the Refugee and Humanitarian Program does not currently place a burden on the Australian hospital system.
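For readers unfamiliar with the metric, a minimal sketch of a directly age-standardised rate and a rate ratio with a log-normal 95% CI follows; the age bands, counts and populations are invented, and the standard-error formula shown is a crude approximation that ignores the standardisation weights.

```python
# A minimal sketch, with invented age bands, counts and populations: direct
# age standardisation and a rate ratio with a log-normal 95% CI. The SE
# formula is a crude approximation that ignores the standardisation weights.
import math

std_pop = {"0-39": 550_000, "40-64": 300_000, "65+": 150_000}  # assumed standard

def standardised_rate(cases, pop):
    total = sum(std_pop.values())
    return sum(std_pop[a] / total * cases[a] / pop[a] for a in std_pop)

refugee_cases = {"0-39": 410, "40-64": 520, "65+": 300}
refugee_rate = standardised_rate(refugee_cases,
                                 {"0-39": 20_000, "40-64": 9_000, "65+": 2_500})
ausborn_cases = {"0-39": 52_000, "40-64": 61_000, "65+": 45_000}
ausborn_rate = standardised_rate(ausborn_cases,
                                 {"0-39": 2.1e6, "40-64": 1.2e6, "65+": 0.5e6})

rr = refugee_rate / ausborn_rate
se = math.sqrt(1 / sum(refugee_cases.values()) + 1 / sum(ausborn_cases.values()))
lo, hi = rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```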
Abstract:
Introduction: Delirium is a serious issue associated with high morbidity and mortality in older hospitalised people. Early recognition enables diagnosis and treatment of the underlying cause/s, which can lead to improved patient outcomes. However, research shows that nurses' knowledge and accurate recognition of delirium are poor, and a lack of education appears to be a key issue related to this problem. Thus, the purpose of this randomised controlled trial (RCT) was to evaluate, in a sample of registered nurses, the usability and effectiveness of a web-based learning site, designed using constructivist learning principles, to improve acute care nurses' knowledge and recognition of delirium. Prior to undertaking the RCT, preliminary phases were completed, involving validation of vignettes, video-taping of five of the validated vignettes, website development and pilot testing. Methods: The cluster RCT involved consenting registered nurse participants (N = 175) from twelve clinical areas within three acute health care facilities in Queensland, Australia. Data were collected through a variety of measures and instruments. Primary outcomes were improved ability of nurses to recognise delirium, assessed using written validated vignettes, and improved knowledge of delirium, assessed using a delirium knowledge questionnaire. The secondary outcomes were aimed at determining nurse satisfaction and usability of the website. Primary outcome measures were taken at baseline (T1), directly after the intervention (T2) and two months later (T3). The secondary outcomes were measured at T2 by participants in the intervention group. Following baseline data collection, the remaining participants were assigned to either the intervention (n=75) or control (n=72) group. Participants in the intervention group were given access to the learning intervention, while the control group continued to work in their clinical areas and, at that time, did not receive access to the learning intervention. Data from the primary outcome measures were examined in mixed model analyses. Results: Overall, the effect of the online learning intervention over time, comparing the intervention group and the control group, was positive. The intervention group's scores were higher and the change over time was statistically significant [T3 vs. T1: t=3.78, p<0.001; T2 vs. T1: t=5.83, p<0.001]. Statistically significant improvements were also seen for delirium recognition when comparing T2 and T1 results (t=2.58, p=0.012) between the control and intervention groups, but not for changes in delirium recognition scores between the two groups from T3 to T1 (t=1.80, p=0.074). The majority of the participants rated the website highly on the visual, functional and content elements. Additionally, nearly 80% of the participants liked the overall website features, and there were self-reported improvements in delirium knowledge and recognition by the registered nurses in the intervention group. Discussion: Findings from this study support the concept that online learning is an effective and satisfying method of information delivery. Embedded within a constructivist learning environment, the site produced a high level of satisfaction and usability for the registered nurse end-users. Additionally, the results showed that the website significantly improved delirium knowledge and recognition scores, and the improvement in delirium knowledge was retained at the two-month follow-up.
Given the strong effect of the intervention, the online delirium intervention should be utilised as a way of providing information to registered nurses. It is envisaged that this knowledge would lead to improved recognition of delirium as well as improvement in patient outcomes; however, translation of this knowledge attainment into clinical practice was outside the scope of this study. A critical next step is demonstrating the effect of the intervention in changing clinical behaviour and improving patient health outcomes.
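A minimal sketch of the kind of mixed model analysis reported follows, assuming simulated scores, a group-by-time interaction and a random intercept per nurse; it illustrates the method only and is not the trial's actual analysis.

```python
# A minimal sketch of a mixed-model analysis of this design, assuming simulated
# scores: a group-by-time interaction with a random intercept per nurse.
# This illustrates the method only; it is not the trial's analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for nurse in range(147):                      # 75 intervention, 72 control
    group = "intervention" if nurse < 75 else "control"
    ability = rng.normal(0, 2)                # stable between-nurse variation
    for t, time in enumerate(["T1", "T2", "T3"]):
        gain = 4.0 if (group == "intervention" and t > 0) else 0.0
        rows.append({"nurse": nurse, "group": group, "time": time,
                     "score": 20 + ability + gain + rng.normal(0, 1.5)})
data = pd.DataFrame(rows)

fit = smf.mixedlm("score ~ group * time", data, groups=data["nurse"]).fit()
print(fit.summary())   # the group:time terms estimate the intervention effect
```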
Abstract:
In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well-documented. Other risk factors, including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety, have also been suggested to contribute to poor nutritional status. In Parkinson’s disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition. However, the risks for malnutrition in this population are not well understood. The current study’s aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data about age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke’s Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with an average age of 70.2±9.3 (range 35–92) years and an average time since diagnosis of 7.3±5.9 (range 0–31) years. Average body mass index (BMI) was 26.0±5.5 kg/m². Of these, 15% (n=19) were malnourished (SGA-B). Multivariate logistic regression analysis revealed that older age (OR=1.16, CI=1.02–1.31), more depressive symptoms (OR=1.26, CI=1.07–1.48), lower levels of anxiety (OR=0.90, CI=0.82–0.99) and higher LDED per kg body weight (OR=1.57, CI=1.14–2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes. Proactively addressing the risk factors can help prevent declines in nutritional status. In the current study, older people with PD with depression and greater amounts of levodopa per body weight were at increased malnutrition risk.
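A minimal sketch of this type of analysis follows: a multivariate logistic regression with odds ratios obtained by exponentiating the coefficients. The data are simulated and the variable names (bdi, lded_kg, etc.) are illustrative, not the study's dataset.

```python
# A minimal sketch of the analysis type, assuming simulated data and
# illustrative variable names (bdi, lded_kg): multivariate logistic regression
# with odds ratios from exponentiated coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 125
df = pd.DataFrame({
    "age": rng.normal(70, 9, n),
    "bdi": rng.normal(10, 5, n),        # depressive symptoms
    "anxiety": rng.normal(40, 10, n),   # trait anxiety
    "lded_kg": rng.normal(8, 3, n),     # levodopa equivalent dose per kg
})
linpred = (-14 + 0.15 * df.age + 0.23 * df.bdi
           - 0.10 * df.anxiety + 0.45 * df.lded_kg)
df["malnourished"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

fit = smf.logit("malnourished ~ age + bdi + anxiety + lded_kg", df).fit(disp=0)
print(np.exp(fit.params))       # odds ratio per one-unit increase
print(np.exp(fit.conf_int()))   # 95% confidence intervals
```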
Abstract:
Objective: Malnutrition results in poor health outcomes, and people with Parkinson’s disease may be at greater risk of malnutrition. However, the prevalence of malnutrition in Parkinson’s disease is not yet well defined. The aim of this study is to provide an estimate of the extent of malnutrition in community-dwelling people with Parkinson’s disease. Methods: This is a cross-sectional study of people with Parkinson’s disease residing within a 2-hour driving radius of Brisbane, Australia. The Subjective Global Assessment (SGA) and the scored Patient-Generated Subjective Global Assessment (PG-SGA) were used to assess nutritional status. Body weight, standing or knee height, mid-arm circumference and waist circumference were measured. Results: Nineteen (15%) of the participants were moderately malnourished (SGA-B). The median PG-SGA score of the SGA-B group was 8 (range 4–15), significantly higher than that of the SGA-A group (U=1860.5, p<.05). The symptoms most influencing intake were loss of appetite, constipation, early satiety and problems swallowing. Conclusions: As with other populations, malnutrition remains under-recognised and undiagnosed in people with Parkinson’s disease. Regular screening of nutritional status in people with Parkinson’s disease by health professionals with whom they have regular contact should occur to identify those who may benefit from further nutrition assessment and intervention.
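The group comparison reported (U=1860.5) is a Mann-Whitney U test; a minimal sketch with invented PG-SGA scores follows.

```python
# A minimal sketch of the reported comparison, assuming invented PG-SGA scores:
# a Mann-Whitney U test between SGA-A (well-nourished) and SGA-B (malnourished).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(3)
sga_a = rng.integers(1, 10, size=106)   # well-nourished: generally lower scores
sga_b = rng.integers(4, 16, size=19)    # malnourished: generally higher scores

u, p = mannwhitneyu(sga_b, sga_a, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.4f}")
print("median SGA-B:", np.median(sga_b), "| median SGA-A:", np.median(sga_a))
```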
Abstract:
Aim. A protocol for a new peer-led self-management programme for community-dwelling older people with diabetes in Shanghai, China. Background. The increasing prevalence of type 2 diabetes poses major public health challenges. Appropriate education programmes could help people with diabetes to achieve self-management and better health outcomes. Providing education programmes to the fast-growing number of people with diabetes presents a real challenge to the Chinese healthcare system, which is strained by personnel and funding shortages. Empirical literature and expert opinions suggest that peer education programmes are promising. Design. Quasi-experimental. Methods. This study is a non-equivalent control group design (protocol approved in January 2008). A total of 190 people, with 95 participants in each group, will be recruited from two different, but similar, communities. The programme, based on Social Cognitive Theory, will consist of basic diabetes instruction and social support and self-efficacy-enhancing group activities. Basic diabetes instruction sessions will be delivered by health professionals, whereas social support and self-efficacy-enhancing group activities will be led by peer leaders. Outcome variables include self-efficacy, social support, self-management behaviours, depressive status, quality of life and healthcare utilisation, which will be measured at baseline, 4 and 12 weeks. Discussion. This theory-based programme tailored to Chinese patients has potential for improving diabetes self-management and subsequent health outcomes. In addition, the delivery mode, through involvement of peer leaders and existing community networks, is especially promising considering the healthcare resource shortage in China.
Abstract:
Summary. Background: The final phase of a three-phase study analysing the implementation and impact of the nurse practitioner role in Australia (the Australian Nurse Practitioner Project, or AUSPRAC) was undertaken in 2009, requiring nurse telephone interviewers to gather information about health outcomes directly from patients and their treating nurse practitioners. A team of several registered nurses was recruited and trained as telephone interviewers. The aim of this paper is to report on the development and evaluation of the training process for the telephone interviewers. Methods: The training process involved planning the content and methods to be used in the training session; delivering the session; testing the skills and understanding of interviewers post-training; collecting and analysing data to determine the degree to which the training process was successful in meeting its objectives; and post-training follow-up. All aspects of the training process were informed by established educational principles. Results: Interrater reliability between interviewers was high for well-validated sections of the survey instrument, resulting in 100% agreement between interviewers. Other sections with unvalidated questions showed lower agreement (between 75% and 90%). Overall, the agreement between interviewers was 92%. Each interviewer was also measured against a specifically developed master script, or gold standard, and each achieved a percentage of correct answers of 94.7% or better. This equated to a Kappa value of 0.92 or better. Conclusion: The telephone interviewer training process was very effective and achieved high interrater reliability. We argue that the high reliability was due to the use of well-validated instruments and the carefully planned programme based on established educational principles. There is limited published literature on how to successfully operationalise educational principles and tailor them for specific research studies; this report addresses this knowledge gap.
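Percent agreement and Cohen's kappa, the two reliability statistics reported, can be computed as in the sketch below; the toy responses are invented, and kappa corrects the observed agreement for agreement expected by chance.

```python
# A minimal sketch, assuming toy responses: percent agreement and Cohen's
# kappa between two interviewers. Kappa discounts the agreement expected by
# chance from each rater's marginal frequencies.
from collections import Counter

rater1 = ["yes", "yes", "no", "no", "yes", "unsure", "no", "yes", "no", "yes"]
rater2 = ["yes", "yes", "no", "yes", "yes", "unsure", "no", "yes", "no", "yes"]

n = len(rater1)
observed = sum(a == b for a, b in zip(rater1, rater2)) / n

c1, c2 = Counter(rater1), Counter(rater2)
expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2

kappa = (observed - expected) / (1 - expected)
print(f"agreement = {observed:.0%}, kappa = {kappa:.2f}")
```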
Abstract:
Objectives: This study examines the hypothesis that a past history of heart interventions will moderate the relationship between psychosocial factors (stressful life events, social support, perceived stress, having a current partner, having a past diagnosis of depression or anxiety over the past 3 years, time pressure, education level, and the mental health index) and the presence of chest pain in a sample of older women. Design: Longitudinal survey over a 3-year period. Methods: The sample was taken from a prospective cohort study of 10,432 women initially aged between 70 and 75 years, who were surveyed in 1996 and then again in 1999. Two groups of women were identified: those reporting to have heart disease but no past history of heart interventions (i.e., coronary artery bypass graft/angioplasty) and those reporting to have heart disease with a past history of heart interventions. Results: Binary logistic regression analysis was used to show that for the women with self-reported coronary heart disease but without a past history of heart intervention, feelings of time pressure as well as the number of stressful life events experienced in the 12 months prior to 1996 were independent risk factors for the presence of chest pain, even after accounting for a range of traditional risk factors. In comparison, for the women with self-reported coronary heart disease who did report a past history of heart interventions, a diagnosis of depression in the previous 3 years was the significant independent risk factor for chest pain even after accounting for traditional risk factors. Conclusion: The results indicate that it is important to consider a history of heart interventions as a moderator of the associations between psychosocial variables and the frequency of chest pain in older women. Statement of Contribution: What is already known on this subject? Psychological factors have been shown to be independent predictors of a range of health outcomes in individuals with coronary heart disease, including the presence of chest pain. Most research has been conducted with men or with small samples of women; however, the evidence does suggest that these relationships exist in women as well as in men. What does this study add? Most studies have looked at overall relationships between psychological variables and health outcomes. The few studies that have looked at moderators have mainly examined gender as a moderator. To our knowledge, this is the first published study to examine a history of heart interventions as a moderator of the relationship between psychological variables and the presence of chest pain.
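Moderation of this kind is commonly tested with an interaction term in a logistic model; the sketch below, on simulated data with hypothetical variable names, shows the pattern: a significant interaction between intervention history and a psychosocial predictor indicates moderation.

```python
# A minimal sketch of a moderation test, assuming simulated data and
# hypothetical variable names: a history-by-life-events interaction in a
# logistic model for chest pain. A significant interaction indicates moderation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "intervention_history": rng.integers(0, 2, n),  # past CABG/angioplasty
    "life_events": rng.poisson(2, n),               # stressful events, past year
    "depression_dx": rng.integers(0, 2, n),         # recent depression diagnosis
})
linpred = (-1.5 + 0.30 * df.life_events * (1 - df.intervention_history)
           + 0.80 * df.depression_dx * df.intervention_history)
df["chest_pain"] = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(int)

fit = smf.logit("chest_pain ~ intervention_history * life_events + depression_dx",
                df).fit(disp=0)
print(fit.summary())   # the interaction term carries the moderation effect
```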
Abstract:
Background: A range of health outcomes at a population level are related to differences in levels of social disadvantage. Understanding the impact of any such differences in palliative care is important. The aim of this study was to assess, by level of socio-economic disadvantage, referral patterns to specialist palliative care and proximity to inpatient services. Methods: All inpatient and community palliative care services nationally were geocoded (using postcode) to one nationally standardised measure of socio-economic deprivation, the Socio-Economic Indexes for Areas (SEIFA; 2006 census data). Referral to palliative care services and the characteristics of referrals were described through data collected routinely at clinical encounters. Distance to the inpatient location was measured from each person’s home postcode and stratified by socio-economic disadvantage. Results: This study covered July to December 2009, with data from 10,064 patients. People from the highest SEIFA group (least disadvantaged) were significantly less likely to be referred to a specialist palliative care service, were more likely to be referred closer to death, and had more episodes of inpatient care for longer periods. Physical proximity of a person’s home to inpatient care showed a gradient, with distance increasing as the level of socio-economic advantage decreased. Conclusion: These data suggest that a simple relationship between low socio-economic status and poor access to a referral-based specialty such as palliative care does not exist. Different patterns of referral, and hence different patterns of care, emerge.
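The linkage step described, attaching an area-level SEIFA score to each record by postcode and then stratifying, looks roughly like the sketch below; the postcodes and decile values are invented, and real SEIFA tables are published by the ABS.

```python
# A minimal sketch of the linkage step, assuming invented postcodes and decile
# values: join each record to a SEIFA disadvantage decile by postcode, then
# stratify referrals. Real SEIFA tables are published by the ABS.
import pandas as pd

seifa = pd.DataFrame({"postcode": ["2000", "2170", "4000", "4825"],
                      "seifa_decile": [10, 3, 9, 1]})
patients = pd.DataFrame({"patient_id": [1, 2, 3, 4, 5],
                         "postcode": ["2170", "2170", "2000", "4825", "4000"],
                         "referred": [True, True, False, True, False]})

linked = patients.merge(seifa, on="postcode", how="left")
print(linked.groupby("seifa_decile")["referred"].mean())  # referral rate by stratum
```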
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake, recorded as 0, 25, 50, 75, or 100% of the offered food. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than for well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p=0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03–3.22, p=0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54–4.68, p=0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
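An odds ratio of this kind can, before adjustment, be derived from a 2×2 table as in the sketch below (Woolf method for the CI); the counts are invented placeholders, and the survey's published ORs were additionally adjusted for age, gender and disease severity via logistic regression.

```python
# A minimal sketch, assuming invented counts: an unadjusted odds ratio with a
# Woolf 95% CI for mortality by nutritional status. The survey's published ORs
# were additionally adjusted for age, gender and disease severity.
import math

# rows: malnourished / well-nourished; columns: died / survived
a, b = 45, 955       # malnourished: died, survived (placeholders)
c, d = 55, 2067      # well-nourished: died, survived (placeholders)

odds_ratio = (a * d) / (b * c)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = odds_ratio * math.exp(-1.96 * se), odds_ratio * math.exp(1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```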