851 results for Pre-hospital care.


Relevance: 30.00%

Abstract:

Background: An estimated 285 million people worldwide have diabetes, and its prevalence is predicted to increase to 439 million by 2030. For the year 2010, an estimated 3.96 million excess deaths in the age group 20-79 years are attributable to diabetes worldwide. Self-management is recognised as an integral part of diabetes care. This paper describes the protocol of a randomised controlled trial of an automated interactive telephone system aiming to improve the uptake and maintenance of essential diabetes self-management behaviours.

Methods/Design: A total of 340 individuals with type 2 diabetes will be randomised either to the routine care arm or to the intervention arm, in which participants receive the Telephone-Linked Care (TLC) Diabetes program in addition to their routine care. The intervention requires participants to telephone the TLC Diabetes phone system weekly for 6 months. They receive the study handbook and a glucose meter linked to a data uploading device. The TLC system consists of a computer with software designed to provide monitoring, tailored feedback and education on key aspects of diabetes self-management, based on answers voiced or entered during the current or previous conversations. Data collection is conducted at baseline (Time 1), 6-month follow-up (Time 2) and 12-month follow-up (Time 3). The primary outcomes are glycaemic control (HbA1c) and quality of life (Short Form-36 Health Survey version 2). Secondary outcomes include anthropometric measures, blood pressure, blood lipid profile and psychosocial measures, as well as measures of diet, physical activity, blood glucose monitoring, foot care and medication taking. Information on utilisation of healthcare services, including hospital admissions, medication use and costs, is collected. An economic evaluation is also planned.

Discussion: Outcomes will provide evidence concerning the efficacy of a telephone-linked care intervention for self-management of diabetes. Furthermore, the study will provide insight into the potential for more widespread uptake of automated telehealth interventions globally.
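
A minimal sketch of the 1:1 allocation such a two-arm trial implies is shown below. It assumes permuted-block randomisation with a block size of 4; neither the allocation method nor the block size is stated in the protocol, so the function and its parameters are purely illustrative.

    import random

    def block_randomise(n_participants=340, block_size=4,
                        arms=("routine_care", "TLC_intervention"), seed=2024):
        """Allocate participants 1:1 to two arms using permuted blocks.

        Hypothetical illustration only: the protocol does not state the
        allocation method, block size or software actually used.
        """
        rng = random.Random(seed)
        allocation = []
        while len(allocation) < n_participants:
            block = list(arms) * (block_size // len(arms))
            rng.shuffle(block)                 # random order within each block
            allocation.extend(block)
        return allocation[:n_participants]

    schedule = block_randomise()
    print(schedule[:4], schedule.count("TLC_intervention"))   # one block, arm total = 170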

Relevance: 30.00%

Abstract:

Methicillin-resistant Staphylococcus aureus (MRSA) is a pathogen that continues to be of major concern in hospitals. We develop models and computational schemes based on observed weekly incidence data to estimate MRSA transmission parameters. We extend the deterministic model of McBryde, Pettitt, and McElwain (2007, Journal of Theoretical Biology 245, 470–481), involving an underlying population of MRSA-colonized patients and health-care workers, which describes, among other processes, transmission between uncolonized patients and colonized health-care workers and vice versa. We develop new bivariate and trivariate Markov models that include incidence, so that estimated transmission rates can be based directly on new colonizations rather than indirectly on prevalence. Imperfect sensitivity of pathogen detection is modeled using a hidden Markov process. The advantages of our approach include: (i) the number of colonized health-care workers is treated as a discrete-valued quantity; (ii) two transmission parameters can be incorporated into the likelihood; (iii) the likelihood depends on the number of new cases, which improves the precision of inference; (iv) individual patient records are not required; and (v) imperfect detection of colonization is accommodated. We compare our approach with that of McBryde et al. (2007), which is based on an approximation that eliminates the health-care workers from the model, uses Markov chain Monte Carlo, and requires individual patient data. We apply these models to MRSA colonization data collected in a small intensive care unit at the Princess Alexandra Hospital, Brisbane, Australia.
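
The observation layer described here (imperfect sensitivity handled via a hidden Markov process) can be illustrated with a small simulation. This sketch assumes each truly colonised case is detected independently with a fixed sensitivity, so the observed weekly count is binomial in the true count; the paper's actual transmission dynamics between patients and health-care workers are not reproduced.

    import numpy as np

    def observe_incidence(true_weekly_incidence, sensitivity=0.8, seed=0):
        """Illustrative observation layer for imperfect detection of colonisation.

        true_weekly_incidence : counts of genuinely new MRSA colonisations per week
        sensitivity           : assumed probability that a new colonisation is detected

        Each true case is detected independently, so the observed count for a week
        is Binomial(true count, sensitivity). The hidden state (true incidence) is
        what models of this kind infer from the observed surveillance series.
        """
        rng = np.random.default_rng(seed)
        true_counts = np.asarray(true_weekly_incidence)
        return rng.binomial(true_counts, sensitivity)

    true_series = [3, 1, 4, 0, 2, 5]          # hypothetical true weekly incidence
    print(observe_incidence(true_series))      # what the surveillance data would record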

Relevance: 30.00%

Abstract:

Objective: To assess the accuracy of data linkage across the spectrum of emergency care in the absence of a unique patient identifier, and to use the linked data to examine service delivery outcomes in an emergency department setting. Design: Automated data linkage and manual data linkage were compared to determine their relative accuracy. Data were extracted from three separate health information systems (ambulance, ED and hospital inpatients), then linked to provide information about the emergency journey of each patient. The linking was done manually, through physical review of records, and automatically, using a data linking tool (Health Data Integration) developed by the CSIRO. Match rate and quality of the linking were compared. Setting: 10,835 patient presentations to a large, regional teaching hospital ED over a two-month period (August-September 2007). Results: Comparison of the manual and automated linkage outcomes for each pair of linked datasets demonstrated a sensitivity of between 95% and 99%, a specificity of between 75% and 99%, and a positive predictive value of between 88% and 95%. Conclusions: Our results indicate that automated linking provides a sound basis for health service analysis, even in the absence of a unique patient identifier. The use of an automated linking tool yields accurate data suitable for planning and service delivery purposes and enables the data to be linked regularly to examine service delivery outcomes.
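
The accuracy measures reported (sensitivity, specificity, positive predictive value) can be computed from a simple cross-classification of the automated links against the manual review, treated as the reference standard. The sketch below assumes both linkage passes are represented as sets of record-pair identifiers; the matching internals of the Health Data Integration tool are not shown.

    def linkage_accuracy(auto_links, manual_links, assessed_pairs):
        """Sensitivity, specificity and PPV of automated linkage vs manual review.

        auto_links, manual_links : sets of (record_a, record_b) pairs judged to match
        assessed_pairs           : set of all candidate pairs that were assessed
        Manual review is treated as the reference standard.
        """
        tp = len(auto_links & manual_links)          # linked by both
        fp = len(auto_links - manual_links)          # linked automatically only
        fn = len(manual_links - auto_links)          # missed by the automated tool
        tn = len(assessed_pairs) - tp - fp - fn      # correctly left unlinked
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        return sensitivity, specificity, ppv

    # Hypothetical toy example
    auto = {(1, "a"), (2, "b"), (3, "c")}
    manual = {(1, "a"), (2, "b"), (4, "d")}
    pairs = {(1, "a"), (2, "b"), (3, "c"), (4, "d"), (5, "e")}
    print(linkage_accuracy(auto, manual, pairs))     # approximately (0.67, 0.50, 0.67)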

Relevance: 30.00%

Abstract:

Introduction: Little is known about the risk perceptions and attitudes of healthcare personnel, especially emergency prehospital medical care personnel, regarding the possibility of an outbreak or epidemic event. Problem: This study was designed to investigate pre-event knowledge and attitudes of a national sample of emergency prehospital medical care providers in relation to a potential human influenza pandemic, and to determine predictors of these attitudes. Methods: Surveys were distributed to a random, cross-sectional sample of 20% of the Australian emergency prehospital medical care workforce (n = 2,929), stratified by the nine services operating in Australia, as well as by gender and location. The surveys included: (1) demographic information; (2) knowledge of influenza; and (3) attitudes and perceptions related to working during influenza pandemic conditions. Multiple logistic regression models were constructed to identify predictors of pandemic-related risk perceptions. Results: Among the 725 Australian emergency prehospital medical care personnel who responded, 89% were very anxious about working during pandemic conditions, and 85% perceived a high personal risk associated with working in such conditions. In general, respondents demonstrated poor knowledge of avian influenza, influenza generally, and methods of infection transmission. Less than 5% of respondents perceived that they had adequate education/training about avian influenza. Logistic regression analyses indicate that, in managing the attitudes and risk perceptions of emergency prehospital medical care staff, particular attention should be directed toward the paid, male workforce (as opposed to volunteers), and toward personnel whose relationship partners do not work in the health industry. Conclusions: These results highlight the potentially crucial role of education and training in pandemic preparedness. Organizations that provide emergency prehospital medical care must address this apparent lack of knowledge regarding infection transmission and procedures for protection and decontamination. Careful management of the perceptions of emergency prehospital medical care personnel during a pandemic is likely to be critical in achieving an effective response to a widespread outbreak of infectious disease.
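
The kind of multiple logistic regression described in the Methods, with a binary risk-perception outcome and workforce characteristics as predictors, might look like the following Python (statsmodels) sketch. The data are simulated and the variable names (paid_staff, male, partner_in_health) are hypothetical; the survey's actual coding is not given in the abstract.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 400                                            # simulated respondents only
    df = pd.DataFrame({
        "paid_staff":        rng.integers(0, 2, n),    # 1 = paid, 0 = volunteer
        "male":              rng.integers(0, 2, n),
        "partner_in_health": rng.integers(0, 2, n),    # partner works in the health industry
    })
    # Simulated outcome: perceiving a high personal risk of working during a pandemic
    logit_p = -0.5 + 0.8 * df.paid_staff + 0.5 * df.male - 0.7 * df.partner_in_health
    df["high_risk_perception"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    model = smf.logit("high_risk_perception ~ paid_staff + male + partner_in_health",
                      data=df).fit(disp=0)
    print(np.exp(model.params))                        # odds ratios for each predictor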

Relevance: 30.00%

Abstract:

Background Concern about skin cancer is a common reason for people from predominantly fair-skinned populations to present to primary care doctors. Objectives To examine the frequency and body-site distribution of malignant, pre-malignant and benign pigmented skin lesions excised in primary care. Methods This prospective study conducted in Queensland, Australia, included 154 primary care doctors. For all excised or biopsied lesions, doctors recorded the patient's age and sex, body site, level of patient pressure to excise, and the clinical diagnosis. Histological confirmation was obtained through pathology laboratories. Results Of 9650 skin lesions, 57.7% were excised in males and 75.0% were excised in patients ≥50 years. The most common diagnoses were basal cell carcinoma (BCC) (35.1%) and squamous cell carcinoma (SCC) (19.7%). Compared with the whole body, the highest densities for SCC, BCC and actinic keratoses were observed on chronically sun-exposed areas of the body, including the face in males and females, the scalp and ears in males, and the hands in females. The density of BCC was also high on intermittently or rarely exposed body sites. Females, younger patients and patients with melanocytic naevi were significantly more likely to exert moderate/high levels of pressure on the doctor to excise. Conclusions More than half the excised lesions were skin cancer, which mostly occurred on the more chronically sun-exposed areas of the body. Information on the type and body-site distribution of skin lesions can aid in the diagnosis and planned management of skin cancer and other skin lesions commonly presented in primary care.

Relevance: 30.00%

Abstract:

This study investigated changes in pre-service teachers' personal epistemologies as they engaged in an integrated teaching program. Personal epistemology refers to individual beliefs about the nature of knowing and knowledge and has been shown to influence teaching practice. An integrated approach to teaching, based on both an implicit and an explicit focus on personal epistemology, was developed by an academic team within a Bachelor of Education (Early Childhood). The teaching program integrated content across four units of study, modelling personal epistemologies implicitly through collaborative reflexive practice. The students were also required to engage in explicit reflections on their personal epistemologies. Quantitative measures of personal epistemology were collected at the beginning and end of the semester using the Epistemological Beliefs Survey (EBS) to assess changes across the teaching period. Results indicated that pre-service teachers' epistemological beliefs about the integration of knowledge became more sophisticated over the course of the teaching period. Qualitative data, collected at the end of the semester, included pre-service teachers' responses to open-ended questions and field experience journal reflections about their perceptions of the teaching program. These data showed that pre-service teachers held different conceptions about learning as integration, which provided a more nuanced understanding of the EBS data. Understanding pre-service teachers' epistemological beliefs provides promising directions for teacher preparation and professional enrichment.

Relevance: 30.00%

Abstract:

In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public-health-sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in having a large state-wide scope that includes 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts, including the identification of clinical indicators that met the set criteria of: high disease burden; a well-defined single diagnostic group or intervention; significant variations in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; identify improvements to the design, administration, and monitoring of CPIP; and determine the financial and economic costs of the scheme. Three key studies were undertaken to address these research questions. First, a survey of clinicians was undertaken to examine their levels of knowledge and understanding and their attitudes to the scheme. Second, the study sought to apply Statistical Process Control (SPC) to the process indicators to assess whether this enhanced the scheme. Third, a simple economic cost analysis was examined. The CPIP survey elicited 192 clinician respondents. Over 70% of these respondents were supportive of the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey, which identified positive attitudes in 6 of the 7 domains, including impact, awareness and understanding, and clinical relevance, all scored positively across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness for the CPIP scheme. This evaluative research supports a previously identified need in the literature for a phased introduction of pay-for-performance (P4P) programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms.
It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous pay-for-performance targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and will build the literacy and competency of clinicians and the organisation as implementation progresses. Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over $5 million, from a potential $10 million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite issues being identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000) compared with funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
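
As a concrete illustration of how SPC could be used to trend a CPIP process indicator, the sketch below builds 3-sigma limits for a proportion (p) chart from monthly compliance counts. The indicator and figures are made up; the scheme's actual indicators and charting rules are not specified here.

    import numpy as np

    def p_chart(met, eligible):
        """3-sigma control limits for a p-chart of a clinical process indicator.

        met      : patients meeting the indicator each month (hypothetical counts)
        eligible : patients eligible for the indicator each month
        Returns the monthly proportions, centre line and per-month control limits.
        """
        met = np.asarray(met, dtype=float)
        eligible = np.asarray(eligible, dtype=float)
        p = met / eligible
        centre = met.sum() / eligible.sum()                    # overall proportion
        sigma = np.sqrt(centre * (1 - centre) / eligible)      # varies with subgroup size
        lcl = np.clip(centre - 3 * sigma, 0, 1)
        ucl = np.clip(centre + 3 * sigma, 0, 1)
        return p, centre, lcl, ucl

    # e.g. monthly compliance with a discharge-medication indicator (invented data)
    p, centre, lcl, ucl = p_chart([42, 45, 38, 50, 30, 47], [55, 60, 50, 62, 58, 59])
    signals = (p < lcl) | (p > ucl)        # months showing special-cause variation
    print(np.round(p, 2), round(centre, 2), signals)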

Relevance: 30.00%

Abstract:

Background There is little scientific evidence to support the usual practice of providing outpatient rehabilitation to patients undergoing total knee replacement (TKR) surgery immediately after discharge from the orthopaedic ward. It is hypothesised that the lack of clinical benefit is due to the low exercise intensity tolerated at this time, with patients still recovering from the effects of major orthopaedic surgery. The aim of the proposed clinical trial is to investigate the clinical and cost effectiveness of a novel rehabilitation strategy, consisting of an initial home exercise programme followed, approximately six weeks later, by higher-intensity outpatient exercise classes. Methods/Design In this multicentre randomised controlled trial, 600 patients undergoing primary TKR will be recruited at the orthopaedic pre-admission clinics of 10 large public and private hospitals in Australia. There will be no change to the medical or rehabilitative care usually provided while the participant is admitted to the orthopaedic ward. After TKR, but prior to discharge from the orthopaedic ward, participants will be randomised to either the novel rehabilitation strategy or usual rehabilitative care as provided by the hospital or recommended by the orthopaedic surgeon. Outcome assessments will be conducted at baseline (pre-admission clinic) and at 6 weeks, 6 months and 12 months following randomisation. The primary outcomes will be self-reported knee pain and physical function. Secondary outcomes include quality of life and objective measures of physical performance. Health economic data (health sector and community service utilisation, loss of productivity) will be recorded prospectively by participants in a patient diary. This patient cohort will also be followed up annually for five years for knee pain, physical function and the need for, or actual incidence of, further joint replacement surgery. Discussion The results of this pragmatic clinical trial can be directly implemented into clinical practice. If beneficial, the novel rehabilitation strategy of utilising outpatient exercise classes during a later rehabilitation phase would provide a feasible and potentially cost-effective intervention to optimise the physical well-being of the large number of people undergoing TKR.

Relevance: 30.00%

Abstract:

Older adults, especially those who are acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in Asian elderly populations and none has been validated in Singapore. Due to the multiethnic, cultural, and language differences among Singaporean older adults, the results from non-Asian validation studies may not be applicable. It is therefore important to identify validated, population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore. The aims of this study are therefore to: i) characterise hospitalised elderly patients in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] administered in no particular order. The total scores were not computed during the screening process, so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward.
The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher-level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression. The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using the single question "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). The prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutrition screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to decline in nutritional status (defined by weight loss >1% per week). Those with reduced nutritional status were more likely to be discharged to higher-level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutrition screening and assessment tools amongst hospitalised older adults in an Asian population. The results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the nutrition screening and assessment tools currently used among hospitalised older adults in Singapore.
As older adults may have developed malnutrition prior to hospital admission, or may experience clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community, and continuous nutritional monitoring should extend beyond hospitalisation.
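
The diagnostic-performance comparison against the SGA (AUC, sensitivity, specificity) can be expressed compactly. The sketch below assumes a numeric screening score with a single cut-off, which is a simplification of how tools such as the TTSH NST are actually scored; the data and cut-off are hypothetical.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    def screening_vs_sga(scores, sga_malnourished, cutoff):
        """Performance of a nutrition screening tool against SGA as the reference.

        scores           : screening-tool scores (higher = greater malnutrition risk)
        sga_malnourished : 1 if SGA rates the patient malnourished, else 0
        cutoff           : score at or above which the tool flags malnutrition risk
        Returns AUC, sensitivity and specificity; scoring and cut-off are hypothetical.
        """
        scores = np.asarray(scores, dtype=float)
        truth = np.asarray(sga_malnourished)
        flagged = scores >= cutoff
        tp = np.sum(flagged & (truth == 1))
        fn = np.sum(~flagged & (truth == 1))
        tn = np.sum(~flagged & (truth == 0))
        fp = np.sum(flagged & (truth == 0))
        return roc_auc_score(truth, scores), tp / (tp + fn), tn / (tn + fp)

    # Made-up scores for six patients, three of whom SGA rates as malnourished
    print(screening_vs_sga([4, 1, 5, 2, 3, 0], [1, 0, 1, 0, 1, 0], cutoff=3))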

Relevance: 30.00%

Abstract:

Background: The aim of this study was to determine the documentation of pharmacotherapy optimization goals in the discharge letters of patients with a principal diagnosis of chronic heart failure. Methods: A retrospective practice audit of 212 patients discharged to the care of their local general practitioner from the general medical units of a large tertiary hospital. Details of recommendations regarding ongoing pharmacological and non-pharmacological management were reviewed. The doses of medications on discharge were noted, along with whether they met current guidelines recommending titration of angiotensin-converting enzyme inhibitors and beta-blockers. Ongoing arrangements for specialist follow-up were also reviewed. Results: The mean age of patients whose letters were reviewed was 78.4 years (standard deviation ± 8.6); 50% were men. Patients had an overall median of six comorbidities and eight regular medications on discharge. Mean length of stay for each admission was 6 days. Discharge letters were posted a median of 4 days after discharge, with 25% not posted at 10 days. No discharge letter was sent in 9.4% (20) of cases. Only six letters (2.8%) had any recommendations regarding future titration of angiotensin-converting enzyme inhibitors, and only 14 (6.6%) regarding beta-blockers. Recommendations for future non-pharmacological management, for example diuretic action plans, regular weight monitoring and exercise plans, were not found in the letters in this audit. Conclusion: Hospital discharge is an opportunity to communicate management plans for treatment optimization effectively, and while this opportunity is spurned, implementation gaps in the management of cardiac failure will probably remain.

Relevance: 30.00%

Abstract:

Objective: To determine whether primary care management of chronic heart failure (CHF) differed between rural and urban areas in Australia. Design: A cross-sectional survey stratified by Rural, Remote and Metropolitan Areas (RRMA) classification. The primary source of data was the Cardiac Awareness Survey and Evaluation (CASE) study. Setting: Secondary analysis of data obtained from 341 Australian general practitioners and 23 845 adults aged 60 years or more in 1998. Main outcome measures: CHF determined by criteria recommended by the World Health Organization, diagnostic practices, use of pharmacotherapy, and CHF-related hospital admissions in the 12 months before the study. Results: There was a significantly higher prevalence of CHF among general practice patients in large and small rural towns (16.1%) compared with capital city and metropolitan areas (12.4%) (P < 0.001). Echocardiography was used less often for diagnosis in rural towns compared with metropolitan areas (52.0% v 67.3%, P < 0.001). Rates of specialist referral were also significantly lower in rural towns than in metropolitan areas (59.1% v 69.6%, P < 0.001), as were prescribing rates of angiotensin-converting enzyme inhibitors (51.4% v 60.1%, P < 0.001). There was no geographical variation in prescribing rates of β-blockers (12.6% [rural] v 11.8% [metropolitan], P = 0.32). Overall, few survey participants received recommended “evidence-based practice” diagnosis and management for CHF (metropolitan, 4.6%; rural, 3.9%; and remote areas, 3.7%). Conclusions: This study found a higher prevalence of CHF, and significantly lower use of recommended diagnostic methods and pharmacological treatment among patients in rural areas.
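
The rural-versus-metropolitan comparison of prevalences (16.1% vs 12.4%, P < 0.001) is a two-proportion test. A sketch is below with hypothetical group sizes, since the abstract reports only the overall sample of 23 845 adults, not the stratum denominators.

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical rural and metropolitan denominators (not reported in the abstract)
    rural_n, metro_n = 8000, 15000
    rural_cases = round(0.161 * rural_n)      # 16.1% prevalence
    metro_cases = round(0.124 * metro_n)      # 12.4% prevalence

    z, p_value = proportions_ztest([rural_cases, metro_cases], [rural_n, metro_n])
    print(f"z = {z:.2f}, p = {p_value:.4f}")  # tests the rural vs metropolitan difference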

Relevance: 30.00%

Abstract:

Aim To explore and discuss, from recent literature, the common factors contributing to nurse job satisfaction in the acute hospital setting. Background Nursing dissatisfaction is linked to high rates of nurses leaving the profession, poor morale, poor patient outcomes and increased financial expenditure. Understanding the factors that contribute to job satisfaction could increase nurse retention. Evaluation A literature search from January 2004 to March 2009 was conducted using the keywords nursing, (dis)satisfaction and job (dis)satisfaction to identify factors contributing to satisfaction for nurses working in acute hospital settings. Key issues This review identified 44 factors in three clusters (intra-, inter- and extra-personal). Job satisfaction for nurses in acute hospitals can be influenced by a combination of any or all of these factors. Important factors included coping strategies, autonomy, co-worker interaction, direct patient care, organizational policies, resource adequacy and educational opportunities. Conclusions Research suggests that job satisfaction is a complex and multifactorial phenomenon. Collaboration between individual nurses, their managers and others is crucial to increasing nurses' satisfaction with their jobs. Implications for nursing management Recognition, and regular review by nurse managers, of the factors that contribute to job satisfaction for nurses working in acute care areas is pivotal to the retention of valued staff.

Relevance: 30.00%

Abstract:

Objective: To assess the cost-effectiveness of screening, isolation and decolonisation strategies for the control of methicillin-resistant Staphylococcus aureus (MRSA) in intensive care units (ICUs). Design: Economic evaluation. Setting: England and Wales. Population: ICU patients. Main outcome measures: Infections, deaths, costs, quality-adjusted life years (QALYs), incremental cost-effectiveness ratios for alternative strategies, and net monetary benefits (NMBs). Results: All strategies using isolation but not decolonisation improved health outcomes but increased costs. When MRSA prevalence on admission to the ICU was 5% and the willingness to pay per QALY gained was between £20,000 and £30,000, the best such strategy was to isolate only those patients at high risk of carrying MRSA (either pre-emptively or following identification by admission and weekly MRSA screening using chromogenic agar). Universal admission and weekly screening using polymerase chain reaction (PCR)-based MRSA detection coupled with isolation was unlikely to be cost-effective unless prevalence was high (10% colonised with MRSA on admission to the ICU). All decolonisation strategies improved health outcomes and reduced costs. While universal decolonisation (regardless of MRSA status) was the most cost-effective in the short term, strategies using screening to target MRSA carriers may be preferred due to the reduced risk of selecting for resistance. Amongst such targeted strategies, universal admission and weekly PCR screening coupled with decolonisation with nasal mupirocin was the most cost-effective. This finding was robust to ICU size, MRSA admission prevalence, the proportion of patients classified as high risk, and the precise value of willingness to pay for health benefits. Conclusions: MRSA control strategies that use decolonisation are likely to be cost-saving in an ICU setting provided resistance is lacking, and combining universal PCR-based screening with decolonisation is likely to represent good value for money if untargeted decolonisation is considered unacceptable. In ICUs where decolonisation is not implemented, there is insufficient evidence to support universal MRSA screening outside high-prevalence settings.
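
The net-monetary-benefit framing used to compare strategies at a given willingness to pay can be written down directly. In the sketch below, the QALY and cost figures are invented; only the £20,000 to £30,000 per QALY thresholds come from the abstract.

    def net_monetary_benefit(qalys, cost, wtp=20_000):
        """NMB = willingness to pay per QALY x QALYs gained - cost (per patient)."""
        return wtp * qalys - cost

    def incremental_nmb(strategy, comparator, wtp=20_000):
        """Positive values favour `strategy` over `comparator` at the chosen threshold.

        Each argument is a (QALYs, cost) pair. The figures used below are invented;
        the evaluation's actual QALY and cost estimates are not reproduced here.
        """
        return net_monetary_benefit(*strategy, wtp) - net_monetary_benefit(*comparator, wtp)

    # e.g. targeted screening plus decolonisation vs isolation alone, at £20,000 per QALY
    print(incremental_nmb(strategy=(8.12, 1_450), comparator=(8.05, 1_600)))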

Relevance: 30.00%

Abstract:

Purpose: Communication is integral to effective trauma care provision. This presentation will report on barriers to meaningful information transfer for multi-trauma patients upon discharge from the Emergency Department (ED) to the care areas of the Intensive Care Unit, High Dependency Unit, and Perioperative Services. This is an ongoing study at one tertiary-level hospital in Queensland. Method: This is a multi-phase, mixed-method study. In Phase 1, data were collected about information transfer. This phase was initially informed by a comprehensive literature review, then via focus groups, a chart audit, a staff survey and a review of national and international trauma forms. Results: The barriers identified related to nursing handover, documented information, time inefficiency, patient complexity and stability, and time of transfer. Specifically, these included differences in staff expectations, variation in nursing handover processes, no agreed minimum dataset of information to be handed over, missing, illegible or difficult-to-find information in documentation (both medical and nursing), and low compliance with some forms used for documentation. Handover of these patients is complex, with information coming from many sources, and dealing with issues is more difficult when these patients are transferred out of hours. Conclusions and further directions: This study investigated current communication processes and standards of information transfer to identify barriers and issues. The barriers identified were the structure used for documentation, the processes used (e.g. handover), patient acuity and time. This information is informing the development, implementation and evaluation of strategies to ameliorate the issues identified.