Abstract:
Objective: We explore how accurately and quickly nurses can identify melodic medical equipment alarms when no mnemonics are used, when alarms may overlap, and when concurrent tasks are performed. Background: The international standard IEC 60601-1-8 (International Electrotechnical Commission, 2005) has proposed simple melodies to distinguish seven alarm sources. Previous studies with nonmedical participants reveal poor learning of melodic alarms and persistent confusions between some of them. The effects of domain expertise, concurrent tasks, and alarm overlaps are unknown. Method: Fourteen intensive care and general medical unit nurses learned the melodic alarms without mnemonics in two sessions on separate days. In the second half of Day 2 the nurses identified single alarms or pairs of alarms played in sequential, partially overlapping, or nearly completely overlapping configurations. For half the experimental blocks nurses performed a concurrent mental arithmetic task. Results: Nurses' learning was poor and was no better than the learning of nonnurses in a previous study. Nurses showed the previously noted confusions between alarms. Overlapping alarms were exceptionally difficult to identify. The concurrent task affected response time but not accuracy. Conclusion: Because of a failure of auditory stream segregation, the melodic alarms cannot be discriminated when they overlap. Directives to sequence the sounding of alarms in medical electrical equipment must be strictly adhered to, or the alarms must be redesigned to support better auditory streaming. Application: Actual or potential uses of this research include the implementation of IEC 60601-1-8 alarms in medical electrical equipment.
Abstract:
Conventional training methods for nurses involve many physical factors that place limits on potential class sizes. Alternate training methods with lower physical requirements may support larger class sizes but, given the tactile quality of nurse training, are most appropriately applied to supplement conventional methods. However, where physical factors are peripheral, such alternate training methods can provide an important way to increase upper class-size limits and therefore the rate of trained nurses entering the important role of critical care. A major issue in ICU training is that the trainee can be released into a real-life intensive care scenario with suboptimal preparation, creating a level of anxiety for the student concerned and some risk for management-level nurses, as patient safety is paramount. This lack of preparation places a strain on the allocation of human and non-human resources to teaching, as students require greater levels of supervision. Such issues are a concern to ICU management, as they relate to nursing skill development and patient health outcomes: nursing training is potentially dangerous for patients who are placed in the care of inexperienced staff. As a solution to this problem, we present a prototype ICU handover training environment developed in a socially interactive virtual world. Nurses in training can connect remotely via the Internet to this environment and engage in collaborative ICU handover training classes.
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the health sector's characteristics and needs. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two aspects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause identification within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates of change point parameters, since the results are based on probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared to a priori known causes detected by control charts, in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
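The Bayesian step-change estimation described in this abstract can be illustrated with a minimal sketch: simulated Poisson event counts with a step change, conjugate Gamma(1, 1) priors on the two rates, a uniform prior over the change time, and an exact posterior computed by enumeration rather than MCMC. All names and numbers below are illustrative assumptions, not the thesis code:

```python
import math
import random


def log_marginal(counts, a=1.0, b=1.0):
    """Log marginal likelihood of Poisson counts under a Gamma(a, b) prior
    on the rate (conjugate, so the integral has a closed form)."""
    n, s = len(counts), sum(counts)
    out = a * math.log(b) - math.lgamma(a)
    out += math.lgamma(a + s) - (a + s) * math.log(b + n)
    out -= sum(math.lgamma(c + 1) for c in counts)
    return out


def changepoint_posterior(counts):
    """Posterior over the step-change time tau: counts[:tau] share one rate,
    counts[tau:] another, with a uniform prior over tau."""
    n = len(counts)
    logp = [log_marginal(counts[:tau]) + log_marginal(counts[tau:])
            for tau in range(1, n)]
    m = max(logp)
    w = [math.exp(v - m) for v in logp]       # stabilised exponentiation
    z = sum(w)
    return {tau: wi / z for tau, wi in zip(range(1, n), w)}


def poisson(lam):
    """Poisson sampler (Knuth's product-of-uniforms method)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1


random.seed(1)
# Simulated adverse-event counts: rate steps from 5 to 15 at tau = 30.
counts = [poisson(5) for _ in range(30)] + [poisson(15) for _ in range(30)]
post = changepoint_posterior(counts)
tau_hat = max(post, key=post.get)  # posterior mode of the change time
```

With a rate shift this large the posterior mode lands at or very near the true change time of 30, which is the sense in which the thesis argues a Bayesian estimate tightens the search window for root causes before a control chart signals.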
Abstract:
Introduction and objectives Early recognition of deteriorating patients results in better patient outcomes. Modified early warning scores (MEWS) attempt to identify deteriorating patients early so that timely interventions can occur, thus reducing serious adverse events. We compared frequencies of vital sign recording 24 h post-ICU discharge and 24 h preceding unplanned ICU admission, before and after a new observation chart using MEWS and an associated educational programme was implemented in an Australian tertiary referral hospital in Brisbane. Design Prospective before-and-after intervention study, using a convenience sample of ICU patients discharged to the hospital wards, and of patients with an unplanned ICU admission, during November 2009 (before implementation; n = 69) and February 2010 (after implementation; n = 70). Main outcome measures Any change in the frequency of full vital sign sets or individual vital signs before and after the new MEWS observation chart and associated education programme was implemented. A full set of vital signs included blood pressure (BP), heart rate (HR), temperature (T°), oxygen saturation (SaO2), respiratory rate (RR) and urine output (UO). Results After the MEWS observation chart implementation, we identified a statistically significant increase (210%) in the overall frequency of full vital sign set documentation during the first 24 h post-ICU discharge (95% CI 148, 288%, p value <0.001). The frequency of all individual vital sign recordings increased after the MEWS observation chart was implemented. In particular, T° recordings increased by 26% (95% CI 8, 46%, p value = 0.003). An increased frequency of full vital sign set recordings for unplanned ICU admissions was also found (44%, 95% CI 2, 102%, p value = 0.035). The only statistically significant improvement in individual vital sign recordings was urine output, demonstrating a 27% increase (95% CI 3, 57%, p value = 0.029).
Conclusions The implementation of a new MEWS observation chart plus a supporting educational programme was associated with statistically significant increases in the frequency of both full vital sign sets and individual vital sign recordings during the first 24 h post-ICU discharge. There were no significant changes to the frequency of individual vital sign recordings in unplanned admissions to ICU after the MEWS observation chart was implemented, except for urine output. Overall increases in the frequency of full vital sign sets were seen.
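For context, a MEWS is an aggregate of banded vital-sign scores. The sketch below uses thresholds drawn from one commonly cited MEWS variant; the study's own observation chart (which also tracked SaO2 and urine output) will differ, so treat these bands as illustrative assumptions rather than a clinical tool:

```python
def mews(sbp, hr, rr, temp, avpu):
    """Illustrative MEWS aggregate. Thresholds are from one published
    variant and are assumptions for this sketch; institutional charts vary."""
    score = 0
    # Systolic blood pressure (mmHg)
    if sbp <= 70: score += 3
    elif sbp <= 80: score += 2
    elif sbp <= 100: score += 1
    elif sbp >= 200: score += 2
    # Heart rate (beats/min)
    if hr < 40: score += 2
    elif hr <= 50: score += 1
    elif hr <= 100: pass          # normal band scores 0
    elif hr <= 110: score += 1
    elif hr <= 129: score += 2
    else: score += 3
    # Respiratory rate (breaths/min)
    if rr < 9: score += 2
    elif rr <= 14: pass
    elif rr <= 20: score += 1
    elif rr <= 29: score += 2
    else: score += 3
    # Temperature (degrees C)
    if temp < 35.0: score += 2
    elif temp >= 38.5: score += 2
    # Level of consciousness (AVPU scale)
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score
```

A rising aggregate (e.g. from deranged heart rate plus respiratory rate) is what triggers the escalation pathway that charts like the one in this study are designed to support.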
Abstract:
Introduction Critical care patients frequently receive blood transfusions. Some reports show an association between aged or stored blood and increased morbidity and mortality, including the development of transfusion-related acute lung injury (TRALI). However, the existence of conflicting data endorses the need for research to either reject this association, or to confirm it and elucidate the underlying mechanisms. Methods Twenty-eight sheep were randomised into two groups, receiving saline or lipopolysaccharide (LPS). Sheep were further randomised to also receive a transfusion of pooled and heat-inactivated supernatant from fresh (Day 1) or stored (Day 42) non-leucoreduced human packed red blood cells (PRBC), or an infusion of saline. TRALI was defined by hypoxaemia during or within two hours of transfusion and histological evidence of pulmonary oedema. Regression modelling compared physiology between groups, and with a previous study that used stored platelet concentrates (PLT). Samples of the transfused blood products also underwent cytokine array and biochemical analyses, and their neutrophil priming ability was measured in vitro. Results TRALI did not develop in sheep that first received a saline infusion. In contrast, 80% of sheep that first received an LPS infusion developed TRALI following transfusion with "stored PRBC." The decreases in mean arterial pressure and cardiac output, as well as the increases in central venous pressure and body temperature, were more severe for TRALI induced by "stored PRBC" than by "stored PLT." Storage-related accumulation of several factors was demonstrated in both "stored PRBC" and "stored PLT", and was associated with increased in vitro neutrophil priming. Concentrations of several factors were higher in the "stored PRBC" than in the "stored PLT"; however, there was no difference in neutrophil priming in vitro. Conclusions In this in vivo ovine model, both recipient and blood product factors contributed to the development of TRALI.
Sick (LPS-infused) sheep, rather than healthy (saline-infused) sheep, predominantly developed TRALI when transfused with supernatant from stored, but not fresh, PRBC. "Stored PRBC" induced a more severe injury than "stored PLT" and had a different storage lesion profile, suggesting that these outcomes may be associated with storage lesion factors unique to each blood product type. Therefore, the transfusion of fresh rather than stored PRBC may minimise the risk of TRALI.
Abstract:
Clinical information systems have become important tools in contemporary clinical patient care. However, it is questionable whether current clinical information systems effectively support clinicians in decision-making processes. We conducted a survey to identify some of the decision-making issues related to the use of existing clinical information systems. The survey was conducted among the end users of the cardiac surgery unit, quality and safety unit, intensive care unit and clinical costing unit at The Prince Charles Hospital (TPCH). Based on the survey results and the reviewed literature, we identified that support from the current information systems for decision making is limited. The survey results also showed that the majority of respondents considered lack of data integration to be one of the major issues, followed by limited access to various databases, lack of time, and lack of efficient reporting and analysis tools. Furthermore, respondents pointed out that data quality is an issue, the three major data quality problems being lack of completeness, lack of consistency and lack of accuracy. Conclusion: Current clinical information systems' support for decision-making processes in cardiac surgery at this institution is limited, and this could be addressed by integrating the isolated clinical information systems.
Abstract:
Purpose Endotracheal suctioning causes significant lung derecruitment. Closed suction (CS) minimizes lung volume loss during suction, and therefore volumes are presumed to recover more quickly postsuctioning. Conflicting evidence exists regarding this. We examined the effects of open suction (OS) and CS on lung volume loss during suctioning, and the recovery of end-expiratory lung volume (EELV) up to 30 minutes postsuction. Material and Methods Randomized crossover study examining 20 patients postcardiac surgery. CS and OS were performed in random order, 30 minutes apart. Lung impedance was measured during suction, and end-expiratory lung impedance was measured at baseline and postsuctioning using electrical impedance tomography. Oximetry, the partial pressure of oxygen in the alveoli/fraction of inspired oxygen ratio, and compliance were collected. Results Reductions in lung impedance during suctioning were less for CS than for OS (mean difference, −905 impedance units; 95% confidence interval [CI], −1234 to −587; P < .001). However, at all points postsuctioning, EELV recovered more slowly after CS than after OS. There were no statistically significant differences in the other respiratory parameters. Conclusions Closed suctioning minimized lung volume loss during suctioning but, counterintuitively, resulted in slower recovery of EELV postsuction compared with OS. Therefore, the use of CS cannot be assumed to be protective of lung volumes postsuctioning. Consideration should be given to restoring EELV after either suction method via a recruitment maneuver.
Abstract:
Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements. However, data related to sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements of adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8, 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4, 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53, 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval 1269, 4337 µg, P=0.94). There is a significant increase in dose requirements for morphine and midazolam during ECMO. Patients on venovenous ECMO received higher sedative doses than patients on venoarterial ECMO. Future research should focus on the mechanisms behind these changes and identify the drugs most suitable for sedation during ECMO.
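The dose-over-time findings above rest on a regression slope. A minimal ordinary least-squares sketch on invented daily dose data (the `days` and `doses` values are hypothetical, not the study's data) shows how such a per-day estimate is obtained:

```python
def ols_slope(days, doses):
    """Ordinary least-squares slope of dose against day:
    the estimated change in daily dose per day on support."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(doses) / n
    sxx = sum((x - mean_x) ** 2 for x in days)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(days, doses))
    return sxy / sxx


# Hypothetical midazolam doses (mg) on successive ECMO days.
days = [0, 1, 2, 3, 4]
doses = [100, 125, 130, 155, 175]
slope = ols_slope(days, doses)  # estimated mg-per-day increase
```

The study's model will have handled repeated measures per patient more carefully, but the reported "18 mg per day" is an estimate of exactly this kind of slope.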
Abstract:
BACKGROUND: Given the expanding scope of extracorporeal membrane oxygenation (ECMO) and its variable impact on drug pharmacokinetics as observed in neonatal studies, it is imperative that the effects of the device on the drugs commonly prescribed in the intensive care unit (ICU) are further investigated. Currently, there are no data to confirm the appropriateness of standard drug dosing in adult patients on ECMO. Ineffective drug regimens in these critically ill patients can seriously worsen patient outcomes. This study was designed to describe the pharmacokinetics of commonly used antibiotic, analgesic and sedative drugs in adult patients receiving ECMO. METHODS: This is a multi-centre, open-label, descriptive pharmacokinetic (PK) study. Eligible patients will be adults treated with ECMO for severe cardiac and/or respiratory failure at five intensive care units in Australia and New Zealand. Patients will receive the study drugs as part of their routine management. Blood samples will be taken from indwelling catheters to investigate plasma concentrations of several antibiotics (ceftriaxone, meropenem, vancomycin, ciprofloxacin, gentamicin, piperacillin-tazobactam, ticarcillin-clavulanate, linezolid, fluconazole, voriconazole, caspofungin, oseltamivir), sedatives and analgesics (midazolam, morphine, fentanyl, propofol, dexmedetomidine, thiopentone). The PK of each drug will be characterised to determine the variability of PK in these patients and to develop dosing guidelines for prescription during ECMO. DISCUSSION: The evidence-based dosing algorithms generated from this analysis can be evaluated in later clinical studies. This knowledge is vitally important for optimising pharmacotherapy in these most severely ill patients, to maximise the opportunity for therapeutic success and minimise the risk of therapeutic failure.
Abstract:
Objective: A literature review to examine the incorporation of respiratory assessment into everyday surgical nursing practice, possible barriers to this, and the relationship to patient outcomes. Primary argument: Escalating demands on intensive care beds have led to highly dependent patients being cared for in general surgical ward areas. This change in patient demographics has meant that the knowledge and skills required of registered nurses in these areas have expanded exponentially. The literature supported the notion that postoperative monitoring of vital signs should include the fundamental assessment of respiratory rate, depth and rhythm; work of breathing; use of accessory muscles and symmetrical chest movement; as well as auscultation of lung fields using a stethoscope. Early intervention in response to changes in a patient's respiratory health status impacts positively on patient health outcomes. Substantial support exists for the contention that technologically adept nurses who also possess competent respiratory assessment skills make a difference to respiratory care. Conclusions: Sub-clinical respiratory problems have been demonstrated to contribute to adverse events. There is a paucity of research knowledge as to whether respiratory education programs and associated in-service training make a difference to nursing clinical practice. Similarly, the implications for associated respiratory educational needs are not well documented, nor has a research base been sufficiently developed to guide nursing practice. Further research has the potential to influence the future role and function of the registered nurse by determining the importance of respiratory education programs to post-operative patient outcomes.
Abstract:
The implementation guide for the surveillance of CLABSI in intensive care units (ICU) was produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the ACSQHC HAI Advisory Committee. State surveillance units, the ACSQHC and the Australian and New Zealand Intensive Care Society (ANZICS) have representatives on the Technical Working Group, and have provided input into this document.
Abstract:
The measurement of ventilation distribution is currently performed using inhaled tracer gases for multiple-breath inhalation studies, or imaging techniques to quantify spatial gas distribution. Most tracer gases used for these studies have properties different from those of air. The effect of gas density on regional ventilation distribution has not been studied. This study aimed to measure that effect. Methods Ventilation distribution was measured in seven rats using electrical impedance tomography (EIT) in supine, prone, left and right lateral positions while they were mechanically ventilated with air, heliox (30% oxygen, 70% helium) or sulfur hexafluoride (20% SF6, 20% oxygen, 60% air). The effect of gas density on regional ventilation distribution was assessed. Results Gas density did not affect regional ventilation distribution. The non-dependent lung was better ventilated in all four body positions. Gas density had no further impact on regional filling characteristics. The filling characteristics followed an anatomical pattern, with the anterior and left lung showing a greater impedance change during the initial phase of inspiration. Conclusion Gas density did not affect convection-dependent ventilation distribution in rats as measured with EIT.