348 results for hospital discharge
Abstract:
It is important to detect and treat malnutrition in hospital patients so as to improve clinical outcomes and reduce hospital stay. The aim of this study was to develop and validate a nutrition screening tool with a simple and quick scoring system for acute hospital patients in Singapore. In this study, 818 newly admitted patients aged over 18 years were screened using five parameters that contribute to the risk of malnutrition. A dietitian blinded to the nutrition screening score assessed the same patients within 48 hours using the reference standard, Subjective Global Assessment (SGA). Sensitivity and specificity were established using the Receiver Operating Characteristic (ROC) curve and the best cutoff scores determined. The combination of parameters with the largest Area Under the ROC Curve (AUC) was chosen as the final screening tool, which was named 3-Minute Nutrition Screening (3-MinNS). The combination of weight loss, intake and muscle wastage (3-MinNS) gave the largest AUC when compared with SGA. Using 3-MinNS, the best cutoff point to identify malnourished patients is three (sensitivity 86%, specificity 83%); the cutoff score to identify subjects at risk of severe malnutrition is five (sensitivity 93%, specificity 86%). 3-Minute Nutrition Screening is a valid, simple and rapid tool to identify risk of malnutrition among acute hospital patients in Singapore. It is able to differentiate patients at risk of moderate malnutrition and severe malnutrition for prioritization and management purposes.
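As a sketch of how a "best cutoff" can be read off a ROC analysis like the one above, the snippet below scores hypothetical patients and picks the cutoff maximizing Youden's J (sensitivity + specificity − 1). Youden's J is one common criterion, not necessarily the exact rule the study used, and all scores here are invented.

```python
# Illustrative only: choosing a screening cutoff by maximizing
# Youden's J = sensitivity + specificity - 1 over candidate cutoffs.
def best_cutoff(scores_malnourished, scores_wellnourished):
    cutoffs = sorted(set(scores_malnourished + scores_wellnourished))
    best = None
    for c in cutoffs:
        # A patient screens positive when their score is >= the cutoff.
        sens = sum(s >= c for s in scores_malnourished) / len(scores_malnourished)
        spec = sum(s < c for s in scores_wellnourished) / len(scores_wellnourished)
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j, sens, spec)
    return best

# Hypothetical 3-MinNS-style scores: malnourished patients score higher.
mal = [3, 4, 5, 6, 3, 7, 5, 4]
well = [0, 1, 2, 1, 0, 2, 3, 1]
cutoff, j, sens, spec = best_cutoff(mal, well)
```

With this toy data the procedure lands on a cutoff of three, mirroring the study's reported threshold for malnutrition.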
Temperature variation and emergency hospital admissions for stroke in Brisbane, Australia, 1996-2005
Abstract:
Stroke is a leading cause of disability and death. This study evaluated the association between temperature variation and emergency admissions for stroke in Brisbane, Australia. Daily emergency admissions for stroke, meteorologic and air pollution data were obtained for the period January 1996 to December 2005. The relative risk of emergency admissions for stroke was estimated with a generalized estimating equations (GEE) model. For primary intracerebral hemorrhage (PIH) emergency admissions, average daily PIH admissions for the group aged < 65 increased by 15% (95% Confidence Interval (CI): 5, 26%) and 12% (95% CI: 2, 22%) for a 1°C increase in daily maximum temperature and minimum temperature in summer, respectively, after controlling for the potential confounding effects of humidity and air pollutants. For ischemic stroke (IS) emergency admissions, average daily IS admissions for the group aged ≥ 65 decreased by 3% (95% CI: -6, 0%) for a 1°C increase in daily maximum temperature in winter after adjustment for confounding factors. Temperature variation was significantly associated with emergency admissions for stroke, and its impact varied with the type of stroke. Health authorities should pay greater attention to a possible increase in emergency care for stroke when temperatures change, in both summer and winter.
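Assuming the GEE used a log link (the usual choice when relative risks are reported), a coefficient β per 1°C maps to a relative risk of exp(β), and the percent change is (exp(β) − 1) × 100. The sketch below shows that arithmetic with a hypothetical β and standard error chosen to land near the reported 15% (95% CI: 5, 26%); neither value is taken from the study.

```python
import math

# Convert a hypothetical log-link GEE coefficient (per 1 degree C) into
# the percent change in admissions and its 95% confidence interval.
def pct_change_per_degree(beta, se):
    rr = math.exp(beta)                 # relative risk per 1 degree C
    lo = math.exp(beta - 1.96 * se)     # lower 95% CI bound
    hi = math.exp(beta + 1.96 * se)     # upper 95% CI bound
    return (rr - 1) * 100, (lo - 1) * 100, (hi - 1) * 100

# Hypothetical beta = 0.14, se = 0.05 (illustrative, not the study's fit).
pct, lo, hi = pct_change_per_degree(0.14, 0.05)
```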
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs), or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision, and can a research agenda to improve decision-making in this area be identified? Methods: A decision-analytic, model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model describing the epidemiology and prognosis of CR-BSI was developed in conjunction with a panel of clinical experts. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts and its sensitivity to modelling assumptions were assessed.
Two separate economic evaluations were conducted. The first compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated, along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements for its implementation are, and what their value would be in an Australian context. As such, we undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The MR catheters dominate the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefit ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high: 62% in the baseline scenario. Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters.
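The catheter ranking rests on net monetary benefit, NMB = λ × QALYs gained − incremental cost, where a cost-saving enters as a negative incremental cost. A minimal sketch of that formula using the baseline per-1,000-catheter figures quoted above (1.64 QALYs, $130,289 saved, λ = $40,000 per QALY); the resulting deterministic per-catheter value illustrates the arithmetic only and is not the probabilistic $948 average reported under uncertainty.

```python
# Net monetary benefit per catheter at willingness-to-pay `wtp`.
# A cost-SAVING is passed as a negative incremental cost.
def nmb_per_unit(qalys_gained, incremental_cost, wtp, n_units):
    return (wtp * qalys_gained - incremental_cost) / n_units

# Baseline figures for MR catheters per 1,000 catheters, as quoted above.
nmb = nmb_per_unit(qalys_gained=1.64, incremental_cost=-130_289,
                   wtp=40_000, n_units=1_000)
```

The strategy with the highest NMB is preferred; under uncertainty the abstract averages this quantity over the model's probabilistic draws.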
In the second evaluation of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of at least 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature) it is cost-effective, relative to MR catheters, if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial, and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%.
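The per-ICU figures quoted above follow from spreading each national cost threshold evenly over the 46 Level III ICUs; a quick check of that arithmetic:

```python
# National-to-per-ICU threshold conversion used in the abstract's figures.
def per_icu_threshold(national_threshold, n_icus=46):
    return national_threshold / n_icus

# $2.2 million national threshold once uncertainty is considered.
t = per_icu_threshold(2_200_000)
```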
There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
Abstract:
Costly hospital readmissions among chronic heart failure (CHF) patients are expected to increase dramatically with the ageing population. This study prospectively investigated the prognostic value of depression, anger and anxiety, after adjusting for illness severity, for the number of hospital readmissions and the total length of stay over one year. Participants comprised 175 inpatients with CHF. Depression, anger, anxiety and illness severity were measured at baseline. One year later, the number of readmissions and length of stay for each patient were obtained from medical records. Depression and anger played a detrimental role in the health profile of CHF patients.
Abstract:
Aims: To describe a local data linkage project matching hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases on all patients aged 18 years and over admitted to either of two intensive care units at a tertiary-referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from the time of ICU admission, with a censoring date of 14 February 2007. For patients with multiple hospital admissions requiring intensive care, data were analysed only from the first admission. Summary and descriptive statistics were used for preliminary data analysis, and Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed, with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient-years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%) and other medical (4%). Unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. One-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, 1-, 5- and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are feasible for determining long-term outcomes of ICU patients.
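The survival percentages above come from a Kaplan-Meier analysis, in which each death multiplies the survival estimate by the fraction of subjects still at risk, and subjects censored at the linkage cutoff simply leave the risk set. A minimal product-limit sketch with made-up follow-up data, not the linked ICU cohort:

```python
# Minimal Kaplan-Meier product-limit estimator (illustrative data).
# Each subject is (time, event): event=1 is death, event=0 is censoring
# (e.g. still alive at the NDI linkage cutoff date).
def kaplan_meier(subjects):
    subjects = sorted(subjects)          # process in time order
    n_at_risk = len(subjects)
    surv, curve = 1.0, []
    for t, event in subjects:
        if event:
            # Death: survival drops by the at-risk fraction surviving.
            surv *= (n_at_risk - 1) / n_at_risk
            curve.append((t, surv))
        # Death or censoring: subject leaves the risk set either way.
        n_at_risk -= 1
    return curve

# Five hypothetical subjects: deaths at t=1, 3, 5; censored at t=2, 4.
curve = kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 0), (5, 1)])
```

Note how the censored subjects at t=2 and t=4 shrink the denominator without forcing a drop in the curve; this is what lets the cohort's incomplete follow-up still yield unbiased 1-, 5- and 10-year estimates.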
Abstract:
Aim: This paper is a report of a study conducted to describe emergency department nurses' understanding and experiences of implementing discharge planning. ---------- Background: Discharge planning in the emergency department is an important issue because of increased healthcare costs and a greater emphasis on continuity of care. When executed as a collaborative process involving a multidisciplinary team together with the patient and family, discharge planning provides continuity of care for patients, reduces demand on hospitals, and improves community services and the services of other healthcare organizations. ---------- Method: The qualitative approach of phenomenography was used in this study. Thirty-two emergency department nurses were recruited between July and September 2005, and semi-structured interviews were conducted. ---------- Findings: From interviewees' descriptions of implementing discharge planning, six categories were established: implementing discharge planning as 'getting rid of my patients', completing routines, being involved in patient education, professionally accountable practice, autonomous practice, and demonstrating professional emergency department nursing care. The referential meaning of implementing discharge planning in the outcome space was the professional commitment to emergency department provision of effective discharge services. ---------- Conclusion: The results of this research contribute to knowledge of emergency department nurses' experience in implementing the discharge planning process. Key requirements for the provision of manageable discharge services, both in Taiwan and worldwide, highlighted by this study include adequate workloads, sufficient time, clear policies and standards for discharge planning, and enhancement of professional commitment.
Abstract:
Disability following a stroke can impose various restrictions on patients’ attempts at participating in life roles. The measurement of social participation, for instance, is important in estimating recovery and assessing quality of care at the community level. Thus, identifying the factors that influence social participation is essential to developing effective measures for promoting the reintegration of stroke survivors into the community. Data were collected from 188 stroke survivors (mean age 71.7 years) 12 months after discharge from a stroke rehabilitation hospital. Of these survivors, 128 (61%) had suffered a first-ever stroke, and 81 (43%) had a right hemisphere lesion. Most (n = 156, 83%) were living in their own home, though 32 (17%) were living in residential care facilities. Path analysis was used to test a hypothesized model of participation restriction which included the direct and indirect effects between social, psychological and physical outcomes and demographic variables. Participation restriction was the dependent variable. Exogenous independent variables were age, functional ability, living arrangement and gender; endogenous independent variables were depressive symptoms, state self-esteem and social support satisfaction. The path coefficients showed functional ability having the largest direct effect on participation restriction. The results also showed that more depressive symptoms, low state self-esteem, female gender, older age and living in a residential care facility had direct effects on participation restriction. The explanatory variables accounted for 71% of the variance in participation restriction. Prediction models have empirical and practical applications, such as suggesting important factors to be considered in promoting stroke recovery. The findings suggest that interventions offered over the course of rehabilitation should aim to improve functional ability and promote psychological aspects of recovery.
These interventions are likely to help stroke survivors resume or maximize their social participation so that they may fulfill productive and positive life roles.
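In a path model like the one described, a variable's total effect on participation restriction is its direct path coefficient plus the products of coefficients along each mediated route (e.g. functional ability acting through depressive symptoms). The bookkeeping can be sketched as below; all coefficients are hypothetical, not the study's estimates.

```python
# Hypothetical standardized path coefficients, keyed by (cause, effect).
paths = {
    ("functional_ability", "participation_restriction"): -0.45,  # direct path
    ("functional_ability", "depressive_symptoms"): -0.30,        # to mediator
    ("depressive_symptoms", "participation_restriction"): 0.25,  # mediator's path
}

# Total effect of x on y: direct path plus one-step mediated paths.
def total_effect(x, y, paths):
    direct = paths.get((x, y), 0.0)
    indirect = sum(paths[(x, m)] * paths[(m, y)]
                   for (a, m) in paths
                   if a == x and m != y and (m, y) in paths)
    return direct + indirect

te = total_effect("functional_ability", "participation_restriction", paths)
```

With these made-up numbers the indirect route reinforces the direct one: better function both reduces restriction directly and lowers depressive symptoms, which further reduces restriction.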
Abstract:
Clients with acquired brain injury often demonstrate hypertonicity and decreased function in their upper limbs, requiring appropriate intervention. Splinting is one intervention method widely used to address these issues. The literature shows that some clients do not use their splints following fabrication; however, there is a paucity of research on the factors that influence clients to use or not use splints. This study aimed to investigate these influential factors for clients with upper limb hypertonicity. Two survey tools, a therapist questionnaire and a client questionnaire, were developed; six therapists and 14 clients participated in the study and completed the relevant questionnaire. The results show that most clients (13 out of 14) were continuing to use their splints four weeks after discharge from hospital. The main goal in choosing splints, for both therapists and clients, was prevention of contracture and deformity. The most commonly indicated reason clients adhered to the splint-wearing program was therapist-related factors, including clients’ trust in and reliance on their therapists. Further reasons for clients implementing the recommended splint-wearing program, and clinical implications, are discussed.
Abstract:
Objective: To explore the specific factors that impact on nursing resources in relation to the ‘unoccupied bed’. Design: Descriptive observational study. Setting: Multiple wards in a single-site tertiary referral hospital. Main outcome measure: Identification and classification of tasks related to the unoccupied bed. Results: Our study identified three main areas of nursing work centred on the ‘unoccupied bed’: 1) bed preparation for admission; 2) temporary transfer; 3) bed preparation after patient discharge. Conclusion: The unoccupied bed is not resource neutral and may involve considerable nursing time. The time associated with each of the reasons for a bed being unoccupied remains to be quantified.
Abstract:
Background: Ambiguity remains about the effectiveness of wearing surgical face masks. The purpose of this study was to assess the impact on surgical site infections when non-scrubbed operating room staff did not wear surgical face masks. Design: Randomised controlled trial. Participants: Patients undergoing elective or emergency obstetric, gynecological, general, orthopaedic, breast or urological surgery in an Australian tertiary hospital. Intervention: 827 participants were enrolled, and complete follow-up data were available for 811 (98.1%) patients. Operating room lists were randomly allocated to a ‘Mask group’ (all non-scrubbed staff wore a mask) or a ‘No Mask group’ (none of the non-scrubbed staff wore masks). Primary end point: Surgical site infection (identified using in-patient surveillance, post-discharge follow-up and chart reviews). Patients were followed for up to six weeks. Results: Overall, 83 (10.2%) surgical site infections were recorded: 46/401 (11.5%) in the Mask group and 37/410 (9.0%) in the No Mask group; odds ratio (OR) 0.77 (95% confidence interval (CI) 0.49 to 1.21), p = 0.151. Independent predictors of surgical site infection included any pre-operative stay (adjusted odds ratio [aOR] 0.43; 95% CI 0.20, 0.95), high BMI (aOR 0.38; 95% CI 0.17, 0.87) and any previous surgical site infection (aOR 0.40; 95% CI 0.17, 0.89). Conclusion: Surgical site infection rates did not increase when non-scrubbed operating room personnel did not wear a face mask.
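The reported OR of 0.77 can be recovered directly from the trial's 2×2 table: 46/401 infections in the Mask group versus 37/410 in the No Mask group, with the odds ratio expressed for No Mask relative to Mask (values below 1 favouring the No Mask arm).

```python
# Odds ratio from a 2x2 table: odds of infection in group A divided by
# odds of infection in group B.
def odds_ratio(events_a, n_a, events_b, n_b):
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# No Mask group (37/410) relative to Mask group (46/401), as in the trial.
or_no_mask = odds_ratio(37, 410, 46, 401)
```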
Abstract:
BACKGROUND: A number of epidemiological studies have examined the adverse effects of air pollution on mortality and morbidity. Several studies have also investigated the associations between air pollution and specific diseases, including arrhythmia, myocardial infarction and heart failure. However, little is known about the relationship between air pollution and the onset of hypertension. OBJECTIVE: To explore the effect of particulate matter air pollution on emergency hospital visits (EHVs) for hypertension in Beijing, China. METHODS: We gathered data on daily EHVs for hypertension, fine particulate matter less than 2.5 µm in aerodynamic diameter (PM2.5), particulate matter less than 10 µm in aerodynamic diameter (PM10), sulfur dioxide, and nitrogen dioxide in Beijing, China during 2007. A time-stratified case-crossover design with a distributed lag model was used to evaluate associations between ambient air pollutants and hypertension. Daily mean temperature and relative humidity were controlled for in all models. RESULTS: There were 1,491 EHVs for hypertension during the study period. In single-pollutant models, a 10 µg/m³ increase in PM2.5 and PM10 was associated with EHVs for hypertension, with odds ratios (overall effect over five days) of 1.084 (95% confidence interval (CI): 1.028, 1.139) and 1.060 (95% CI: 1.020, 1.101), respectively. CONCLUSION: Elevated levels of ambient particulate matter are associated with an increase in EHVs for hypertension in Beijing, China.
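Case-crossover designs are typically fitted by conditional logistic regression, which yields a log-odds coefficient per unit of exposure; the per-10 µg/m³ odds ratio quoted in abstracts like this one is then exp(10β). The sketch below shows that scaling, with β back-calculated from the reported PM2.5 OR of 1.084 purely for illustration.

```python
import math

# Scale a per-unit log-odds coefficient to an odds ratio per `increment`
# units of exposure (here, per 10 ug/m^3 of particulate matter).
def or_per_increment(beta_per_unit, increment=10):
    return math.exp(beta_per_unit * increment)

# Implied per-unit coefficient, back-calculated from the reported OR.
beta = math.log(1.084) / 10
or10 = or_per_increment(beta)   # recovers the per-10 ug/m^3 odds ratio
```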