4 results for paediatric intensive care
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Background: This study examines perceived stress and its potential causal factors in nurses. Stress has been seen as a routine and accepted part of the healthcare worker’s role, and the lack of research on stress in nurses in Ireland motivated this study. Aims: The aims of this study are to examine the level of stress experienced by nurses working in an Irish teaching hospital, and to investigate differences in perceived stress levels by ward area and associations with work characteristics. Method: A cross-sectional study design was employed, with a two-stage cluster sampling process. Data were collected with a self-administered questionnaire, and nurses across ten wards were surveyed using the Nursing Stress Scale and the Demand Control Support Scales. Results: The response rate was 62%. Using outpatients as the reference ward, perceived stress levels were significantly higher in the medical ward, accident and emergency, the intensive care unit, and the paediatric wards (p<0.05). There was no significant difference between the wards with regard to job strain; however, differences did occur in levels of support, with the day unit and paediatric ward reporting the lowest levels of supervisor support (p<0.01). A significant association between ward and perceived stress remained even after adjustment (p<0.05). Conclusion: The findings suggest that perceived stress does vary between work areas within the same hospital. Work factors such as demand and support are important with regard to perceived stress; job control was not found to play an important role.
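The adjusted comparison described in the Results (ward effects on perceived stress, with outpatients as the reference and work characteristics as covariates) can be sketched as a regression with dummy-coded wards. This is a minimal illustration on simulated toy data; the variable names, effect sizes, and use of ordinary least squares are assumptions, not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: perceived-stress scores for nurses on three wards,
# plus one work-characteristics covariate (supervisor support).
n = 90
ward = rng.integers(0, 3, n)      # 0 = outpatients (reference), 1 = medical, 2 = paediatric
support = rng.normal(0, 1, n)
stress = (20 + 3.0 * (ward == 1) + 4.0 * (ward == 2)
          - 1.5 * support + rng.normal(0, 2, n))

# Design matrix: intercept, ward dummies (outpatients as reference), support.
X = np.column_stack([
    np.ones(n),
    (ward == 1).astype(float),
    (ward == 2).astype(float),
    support,
])

# Ordinary least squares: ward effects adjusted for support.
beta, *_ = np.linalg.lstsq(X, stress, rcond=None)
print(beta)  # [intercept, medical vs outpatients, paediatric vs outpatients, support slope]
```

With this design, the ward coefficients estimate the difference in mean stress from the reference ward after adjusting for the covariate, which mirrors the "significant association even after adjustment" reported above.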
Improving the care of preterm infants: before, during, and after stabilisation in the delivery room
Abstract:
Introduction: Up to 10% of infants require stabilisation during the transition to extrauterine life. Enhanced monitoring of cardiorespiratory parameters during this time may improve stabilisation outcomes. In addition, technology may facilitate improved preparation, through educational techniques, for delivery room stabilisation as well as NICU procedures. Aim: To improve infant care 1) before birth, via improved training; 2) during stabilisation, via enhanced physiological monitoring and improved practice; and 3) after delivery, in the neonatal intensive care unit (NICU), via improved procedural care. Methods: A multifaceted approach was utilised, combining questionnaire-based surveys, mannequin-based investigations, prospective observational investigations, and a randomised controlled trial involving preterm infants of less than 32 weeks' gestation in the delivery room. Forms of technology utilised included different types of mannequins (including a CO2-producing mannequin), qualitative end-tidal CO2 (EtCO2) detectors, a bespoke quantitative EtCO2 detector, and annotated videos of infant stabilisation as well as NICU procedures. Results: Manual ventilation improved with the use of EtCO2 detection and was positively assessed by trainees. Quantitative EtCO2 detection in the delivery room is feasible; EtCO2 increased over the first 4 minutes of life in preterm infants, and EtCO2 was higher in preterm infants who were intubated. Current methods of heart rate assessment were found to be unreliable. Electrocardiography (ECG) application warrants further evaluation. Perfusion index (PI) monitoring in the delivery room was feasible. Video recording technology was utilised in several ways; it has many potential benefits, including debriefing and coaching in procedural healthcare, and warrants further evaluation. Parents would welcome the introduction of webcams in the NICU.
Conclusions: I have evaluated new methods of improving infant care before, during, and after stabilisation in the delivery room. Specifically, I have developed novel educational tools to facilitate training, and evaluated EtCO2, PI, and ECG during infant stabilisation. I have identified barriers to using webcams in the NICU, to be addressed prior to webcam implementation.
Abstract:
The electroencephalogram (EEG) is an important noninvasive tool used in the neonatal intensive care unit (NICU) for the neurologic evaluation of the sick newborn infant. It provides an excellent assessment of at-risk newborns and formulates a prognosis for long-term neurologic outcome. The automated analysis of neonatal EEG data in the NICU can provide valuable information to the clinician, facilitating medical intervention. The aim of this thesis is to develop a system for automatic classification of neonatal EEG, which can be divided into two main parts: (1) classifying neonatal EEG as seizure or nonseizure, and (2) classifying neonatal background EEG into several grades, based on the severity of the injury, using atomic decomposition. Atomic decomposition techniques use redundant time-frequency dictionaries for sparse signal representations or approximations. The first novel contribution of this thesis is the development of a time-frequency dictionary coherent with the neonatal EEG seizure states. This dictionary was able to track the time-varying nature of the EEG signal. It was shown that, by using atomic decomposition and the proposed novel dictionary, the neonatal EEG transition from nonseizure to seizure states could be detected efficiently. The second novel contribution of this thesis is the development of a neonatal seizure detection algorithm using several time-frequency features from the proposed novel dictionary. It was shown that the time-frequency features obtained from the atoms in the novel dictionary improved the seizure detection accuracy when compared to that obtained from the raw EEG signal. With the assistance of a supervised multiclass SVM classifier and several time-frequency features, several methods to automatically grade EEG were explored. In summary, the novel techniques proposed in this thesis contribute to the application of advanced signal processing techniques for automatic assessment of neonatal EEG recordings.
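The atomic-decomposition idea above can be sketched with greedy matching pursuit over a small redundant dictionary of Gabor-like (windowed-sinusoid) atoms. The dictionary parameters, toy "EEG" signal, and three-iteration budget here are illustrative assumptions, not the thesis's bespoke seizure-coherent dictionary.

```python
import numpy as np

def gabor_atom(n, centre, width, freq):
    """Unit-norm Gaussian-windowed cosine: one time-frequency atom."""
    t = np.arange(n)
    atom = np.exp(-0.5 * ((t - centre) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return atom / np.linalg.norm(atom)

n = 256
# Redundant dictionary: all combinations of centre, width, and frequency.
dictionary = [gabor_atom(n, c, w, f)
              for c in (64, 128, 192)
              for w in (8, 16, 32)
              for f in (0.05, 0.1, 0.2)]

# Toy signal: one dictionary atom plus noise (a crude stand-in for EEG).
rng = np.random.default_rng(1)
signal = 3.0 * dictionary[4] + 0.1 * rng.normal(size=n)

# Matching pursuit: repeatedly project the residual onto the dictionary
# and subtract the best-matching atom, yielding a sparse approximation.
residual = signal.copy()
picks = []
for _ in range(3):
    scores = [abs(np.dot(residual, a)) for a in dictionary]
    k = int(np.argmax(scores))
    coeff = np.dot(residual, dictionary[k])
    residual -= coeff * dictionary[k]
    picks.append((k, coeff))

print(picks[0])  # dominant atom index and its coefficient
```

The selected atom indices and coefficients are the kind of time-frequency features the thesis feeds to a classifier; a time-varying signal is tracked by how the dominant atoms change between analysis windows.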
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance, so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
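The produce/interpret/consume workflow with a maintainable provenance record can be sketched as a pipeline that hashes each step's input and output, so a third party can re-derive and verify every conclusion. The names here (Step, run_pipeline) and the toy heart-rate stream are illustrative assumptions, not the platform's actual API or data.

```python
import hashlib
import json
from datetime import datetime, timezone

class Step:
    """One named analysis technique applied to the data stream."""
    def __init__(self, name, func):
        self.name = name
        self.func = func

def run_pipeline(raw, steps):
    """Apply each step in turn, recording a provenance entry (step name,
    input/output digests, timestamp) that makes the derivation verifiable."""
    provenance = []
    data = raw
    for step in steps:
        before = hashlib.sha256(json.dumps(data).encode()).hexdigest()
        data = step.func(data)
        provenance.append({
            "step": step.name,
            "input_sha256": before,
            "output_sha256": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
            "at": datetime.now(timezone.utc).isoformat(),
        })
    return data, provenance

# Toy stream: heart-rate samples with one artefact; two analysis steps
# applied to the same raw data, each leaving a verifiable trail.
raw = [132, 130, 141, 138, 95, 133]
steps = [
    Step("deartefact", lambda xs: [x for x in xs if x > 100]),
    Step("mean", lambda xs: sum(xs) / len(xs)),
]
result, trail = run_pipeline(raw, steps)
print(result, [t["step"] for t in trail])
```

Because the trail records digests of the data before and after every step, a different set of steps can be run against the same raw data and compared, which is the separation of raw data from analysis technique that the dissertation argues for.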