7 results for Continuous emission monitoring
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Aim: To examine the relationship between electrographic seizures and long-term outcome in neonates with hypoxic-ischemic encephalopathy (HIE). Method: Full-term neonates with HIE born in Cork University Maternity Hospital from 2003 to 2006 (pre-hypothermia era) and 2009 to 2012 (hypothermia era) were included in this observational study. All had early continuous electroencephalography monitoring. All electrographic seizures were annotated. The total seizure burden and hourly seizure burden were calculated. Outcome (normal/abnormal) was assessed at 24 to 48 months in surviving neonates using either the Bayley Scales of Infant and Toddler Development, Third Edition or the Griffiths Mental Development Scales; a diagnosis of cerebral palsy or epilepsy was also considered an abnormal outcome. Results: Continuous electroencephalography was recorded for a median of 57.1 hours (interquartile range 33.5-80.5h) in 47 neonates (31 males, 16 females); 29 out of 47 (62%) had electrographic seizures and 25 out of 47 (53%) had an abnormal outcome. The presence of seizures per se was not associated with abnormal outcome (p=0.126); however, the odds of an abnormal outcome increased over ninefold (odds ratio [OR] 9.56; 95% confidence interval [95% CI] 2.43-37.67) if a neonate had a total seizure burden of more than 40 minutes (p=0.001), and eightfold (OR: 8.00; 95% CI: 2.06-31.07) if a neonate had a maximum hourly seizure burden of more than 13 minutes per hour (p=0.003). Controlling for electrographic HIE grade or treatment with hypothermia did not change the direction of the relationship between seizure burden and outcome. Interpretation: In HIE, a high electrographic seizure burden is significantly associated with abnormal outcome, independent of HIE severity or treatment with hypothermia.
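As an illustration of the seizure-burden measures described above, the following is a minimal sketch (not the authors' analysis pipeline) of how total seizure burden and maximum hourly seizure burden could be computed from annotated seizure onset/offset times; the interval format and the sliding 60-minute window are assumptions.

```python
# Minimal sketch (assumed data format): each seizure is an (onset, offset) pair
# in minutes from the start of the EEG recording.
from typing import List, Tuple

def total_seizure_burden(seizures: List[Tuple[float, float]]) -> float:
    """Total seizure burden in minutes: summed duration of all annotated seizures."""
    return sum(end - start for start, end in seizures)

def max_hourly_seizure_burden(seizures: List[Tuple[float, float]],
                              record_minutes: float,
                              step: float = 1.0) -> float:
    """Maximum seizure burden (minutes) within any 60-minute window,
    scanned across the recording in `step`-minute increments."""
    best = 0.0
    t = 0.0
    while t + 60.0 <= record_minutes:
        window_burden = sum(max(0.0, min(end, t + 60.0) - max(start, t))
                            for start, end in seizures)
        best = max(best, window_burden)
        t += step
    return best

# Example: three seizures in a 3-hour recording.
seizures = [(10.0, 25.0), (70.0, 82.0), (95.0, 101.0)]
print(total_seizure_burden(seizures))            # 33.0 minutes in total
print(max_hourly_seizure_burden(seizures, 180))  # 18.0 minutes in the densest hour
```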
Abstract:
A comprehensive user model, built by monitoring a user's current use of applications, can be an excellent starting point for building adaptive user-centred applications. The BaranC framework monitors all user interaction with a digital device (e.g. smartphone), and also collects all available context data (such as from sensors in the digital device itself, in a smart watch, or in smart appliances) in order to build a full model of user application behaviour. The model built from the collected data, called the UDI (User Digital Imprint), is further augmented by analysis services, for example, a service to produce activity profiles from smartphone sensor data. The enhanced UDI model can then be the basis for building an appropriate adaptive application that is user-centred, as it is based on an individual user model. As BaranC supports continuous user monitoring, an application can be dynamically adaptive in real time to the current context (e.g. time, location or activity). Furthermore, since BaranC continuously augments the user model with more monitored data, the user model changes over time, and the adaptive application can adapt gradually to changing user behaviour patterns. BaranC has been implemented as a service-oriented framework in which the collection of data for the UDI and all sharing of the UDI data are kept strictly under the user's control. In addition, being service-oriented allows its monitoring and analysis services to be easily used (with the user's permission) by third parties in order to provide third-party adaptive assistant services. An example third-party service demonstrator, built on top of BaranC, proactively assists a user by dynamically predicting, based on the current context, which apps and contacts the user is likely to need. BaranC introduces an innovative, user-controlled, unified service model for monitoring and using personal digital activity data in order to provide adaptive user-centred applications. This aims to improve on the current situation, where the diversity of adaptive applications results in a proliferation of applications monitoring and using personal data, leading to a lack of clarity, a dispersal of data, and a diminution of user control.
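As a hedged illustration only (the class and function names below are hypothetical and not the BaranC API), the following sketch shows how a user digital imprint might be represented as a stream of timestamped interaction and context events, and how a naive frequency-based predictor could suggest likely apps for the current context.

```python
# Hypothetical sketch of a user-model store and a naive context-based predictor.
# Names (UDIEvent, UserDigitalImprint, predict_apps) are illustrative, not BaranC's API.
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class UDIEvent:
    timestamp: datetime
    kind: str                 # e.g. "app_launch", "sensor", "contact"
    value: str                # e.g. app name or contact id
    context: Dict[str, str]   # e.g. {"location": "home", "activity": "walking"}

@dataclass
class UserDigitalImprint:
    events: List[UDIEvent] = field(default_factory=list)

    def record(self, event: UDIEvent) -> None:
        """Append one monitored interaction or context event to the model."""
        self.events.append(event)

    def predict_apps(self, context: Dict[str, str], top_n: int = 3) -> List[str]:
        """Rank apps by how often they were launched in a matching context."""
        counts: Counter = Counter()
        for e in self.events:
            if e.kind == "app_launch" and all(e.context.get(k) == v for k, v in context.items()):
                counts[e.value] += 1
        return [app for app, _ in counts.most_common(top_n)]

# Usage: udi.record(UDIEvent(datetime.now(), "app_launch", "maps", {"location": "car"}))
# followed by udi.predict_apps({"location": "car"}).
```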
Abstract:
This article describes preliminary tests performed to evaluate the best electrode configuration (width and spacing) for cell culture analyses. Biochips packaged with indium tin oxide (ITO) interdigitated electrodes (IDEs) were used to perform impedance measurements on A549 cells cultured on the surface of the biochip. Several tests were carried out using a 10 mM sodium chloride (NaCl) solution, cell medium, and the cell culture itself to characterize some of the configurations already fabricated in the facilities at Tyndall National Institute.
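To make the kind of measurement described above concrete, here is a small sketch (an assumed workflow, not the experimental setup actually used) that converts a complex voltage/current frequency sweep from an interdigitated electrode into impedance magnitude and phase, which is how electrode configurations are typically compared.

```python
# Illustrative only: impedance magnitude and phase from a measured
# complex voltage/current frequency sweep on an interdigitated electrode.
import numpy as np

frequencies = np.logspace(2, 6, 9)                                      # 100 Hz .. 1 MHz (assumed sweep)
voltage = np.full_like(frequencies, 0.01, dtype=complex)                # 10 mV excitation (assumed)
current = 1e-6 * np.exp(-1j * np.linspace(0.1, 1.2, frequencies.size))  # placeholder measured current

impedance = voltage / current                # Z = V / I (complex)
magnitude = np.abs(impedance)                # |Z| in ohms
phase_deg = np.degrees(np.angle(impedance))  # phase in degrees

for f, m, p in zip(frequencies, magnitude, phase_deg):
    print(f"{f:10.0f} Hz  |Z| = {m:8.1f} ohm  phase = {p:6.1f} deg")
```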
Abstract:
Science Foundation Ireland (CSET - Centre for Science, Engineering and Technology, Grant No. 07/CE/11147)
Abstract:
At a time when technological advances are providing new sensor capabilities, novel network capabilities, long-range communications technologies, and data interpretation and delivery formats via the World Wide Web, we have never before had such opportunities to sense and analyse the environment around us. However, challenges remain. While measurement and detection of environmental pollutants can be successful under laboratory-controlled conditions, continuous in-situ monitoring remains one of the most challenging aspects of environmental sensing. This paper describes the development and testing of a multi-sensor heterogeneous real-time water monitoring system. A multi-sensor system was deployed in the River Lee, County Cork, Ireland, to monitor water quality parameters such as pH, temperature, conductivity, turbidity and dissolved oxygen. The River Lee is a tidal water system, which makes it an interesting test site to monitor. The multi-sensor system set-up is described, and the results of the sensor deployment and the various challenges encountered are discussed.
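As a minimal sketch of how readings from such a multi-sensor deployment might be aggregated and checked against alert thresholds (the parameter names and limits below are illustrative assumptions, not the deployed system's configuration):

```python
# Illustrative sketch: aggregate multi-parameter water-quality readings and flag
# values outside assumed acceptable ranges. Thresholds are placeholders.
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

# Assumed acceptable ranges per parameter (illustrative, not regulatory limits).
LIMITS = {
    "pH": (6.5, 9.0),
    "temperature_C": (0.0, 25.0),
    "conductivity_uS_cm": (50.0, 1500.0),
    "turbidity_NTU": (0.0, 25.0),
    "dissolved_oxygen_mg_L": (5.0, 14.0),
}

@dataclass
class Reading:
    parameter: str
    value: float

def summarise(readings: List[Reading]) -> Dict[str, float]:
    """Mean value per parameter over a batch of readings."""
    grouped: Dict[str, List[float]] = {}
    for r in readings:
        grouped.setdefault(r.parameter, []).append(r.value)
    return {p: mean(vals) for p, vals in grouped.items()}

def alerts(summary: Dict[str, float]) -> List[str]:
    """Parameters whose mean falls outside the assumed range."""
    out = []
    for p, v in summary.items():
        lo, hi = LIMITS.get(p, (float("-inf"), float("inf")))
        if not lo <= v <= hi:
            out.append(f"{p}={v:.2f} outside [{lo}, {hi}]")
    return out

# Usage: alerts(summarise([Reading("pH", 9.4), Reading("turbidity_NTU", 12.0)]))
```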
Abstract:
Structural Health Monitoring (SHM) is an integral part of infrastructure maintenance and management systems for socio-economic, safety and security reasons. The behaviour of a structure under vibration depends on its structural characteristics, and a change in those characteristics may indicate a change in system behaviour due to the presence of damage within. Consistent, output-signal-guided, and dependable system markers would therefore be a convenient tool for online monitoring, maintenance and rehabilitation strategies, and optimized decision-making policies, as required by engineers, owners, managers and users from both safety and serviceability perspectives. SHM has a very significant advantage over traditional investigations, where tangible and intangible costs of a very high degree are often incurred due to the disruption of service. Additionally, SHM through bridge-vehicle interaction opens up opportunities for continuous tracking of the condition of a structure. Research in this area is still at an initial stage and is extremely promising. This PhD focuses on using the bridge-vehicle interaction response for SHM of damaged or deteriorating bridges, in order to monitor or assess them under operating conditions. In the present study, a number of damage detection markers have been investigated and proposed in order to identify the existence, location, and extent of an open crack in a structure. Theoretical and experimental investigations have been conducted on single-degree-of-freedom linear systems and simply supported beams. The novel Delay Vector Variance (DVV) methodology has been employed for characterization of structural behaviour by time-domain response analysis. In addition, the analysis of responses of actual bridges using the DVV method has been employed for the first time for this kind of investigation.
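Since the abstract above relies on the Delay Vector Variance (DVV) method for time-domain characterisation, the following is a simplified sketch of the core DVV computation: the normalised target variance of neighbourhoods of delay vectors as a function of standardised neighbourhood radius. The embedding length, span parameter and minimum-neighbour count are assumed defaults; this is not the thesis code.

```python
# Simplified Delay Vector Variance (DVV) sketch. Parameter defaults are assumptions.
import numpy as np

def dvv(x: np.ndarray, m: int = 3, n_d: float = 3.0, n_radii: int = 25,
        min_neighbours: int = 30):
    x = np.asarray(x, dtype=float)
    # Delay vectors DV[k] = [x[k], ..., x[k+m-1]] with targets x[k+m].
    dv = np.lib.stride_tricks.sliding_window_view(x[:-1], m)
    targets = x[m:]
    # Pairwise Euclidean distances between delay vectors.
    d = np.linalg.norm(dv[:, None, :] - dv[None, :, :], axis=-1)
    mu, sigma = d[d > 0].mean(), d[d > 0].std()
    radii = np.linspace(max(mu - n_d * sigma, 0.0), mu + n_d * sigma, n_radii)
    total_var = targets.var()
    target_var = np.full(n_radii, np.nan)
    for j, r in enumerate(radii):
        variances = []
        for k in range(dv.shape[0]):
            # Neighbourhood of DV[k]: all other delay vectors within radius r.
            idx = np.where((d[k] <= r) & (np.arange(dv.shape[0]) != k))[0]
            if idx.size >= min_neighbours:
                variances.append(targets[idx].var())
        if variances:
            target_var[j] = np.mean(variances) / total_var
    # Standardised radii and normalised target variances (the DVV profile).
    return (radii - mu) / sigma, target_var

# Example: DVV profile of a noisy sinusoid (illustrative signal only).
t = np.linspace(0, 20 * np.pi, 600)
signal = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
std_radii, profile = dvv(signal)
```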
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system, and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and hence the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
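As a hedged sketch of the provenance idea described above (the record structure, hashing scheme, and function names are assumptions for illustration, not the platform actually built), each analysis step could append an entry linking its output to its inputs, the analysis routine used, its parameters, and a timestamp:

```python
# Illustrative sketch: a provenance-aware analysis step that records which inputs,
# which analysis routine, and which parameters produced each derived result.
import hashlib
import json
from datetime import datetime, timezone
from typing import Any, Callable, Dict, List

provenance_log: List[Dict[str, Any]] = []

def fingerprint(obj: Any) -> str:
    """Stable content hash used to link provenance records to the data they describe."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True, default=str).encode()).hexdigest()[:16]

def run_step(name: str, analysis: Callable[..., Any], data: Any, **params: Any) -> Any:
    """Run one analysis step and append a provenance record for its output."""
    result = analysis(data, **params)
    provenance_log.append({
        "step": name,
        "input": fingerprint(data),
        "output": fingerprint(result),
        "params": params,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return result

# Example: one derived result over a raw sample, independently auditable afterwards.
raw = [3.1, 2.9, 3.4, 3.0]
smoothed = run_step("moving_average",
                    lambda d, window: [sum(d[i:i + window]) / window
                                       for i in range(len(d) - window + 1)],
                    raw, window=2)
print(json.dumps(provenance_log, indent=2))
```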