5 results for Climate monitoring and alerting
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
In this paper, the hardware design and implementation of a wireless sensor network mote are introduced for a building deployment application. The core of the mote design is based on an 8-bit AVR microcontroller, the ATmega1281, and a 2.4 GHz wireless communication chip, the CC2420. The module PCB is fabricated using stackable technology, which provides powerful configuration capability. Three main layers, each 25 mm², are stacked to form the mote: an RF layer, a sensor layer and a power layer. The sensors were carefully selected to meet both the building monitoring and design requirements. Besides sensing, the mote provides actuation and interfaces to external meters/sensors to perform various management, control and data-recording tasks. Experiments show that the developed mote works effectively, giving stable data acquisition and good communication and power performance.
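To make the hardware pairing concrete, a minimal firmware sketch is given below: it brings up the ATmega1281's SPI bus as master and issues a start-oscillator command strobe to the CC2420. This is an illustrative assumption based on the public ATmega1281 and CC2420 datasheets, not the mote's actual firmware; the clock frequency and chip-select wiring in particular are hypothetical.

```c
/* Minimal sketch (not the paper's firmware): waking the CC2420 radio
 * from an ATmega1281 over SPI. Pin mapping follows the ATmega1281
 * datasheet (SPI on PB0-PB3); the chip-select choice is assumed. */
#define F_CPU 8000000UL       /* assumed clock frequency */
#include <avr/io.h>
#include <util/delay.h>

#define CC2420_SXOSCON 0x01   /* command strobe: start crystal oscillator */
#define CS_PIN         PB0    /* assumed chip-select line for the radio */

static void spi_master_init(void)
{
    /* SS, SCK, MOSI as outputs; MISO (PB3) stays an input */
    DDRB |= _BV(PB0) | _BV(PB1) | _BV(PB2);
    PORTB |= _BV(CS_PIN);                     /* deselect the radio */
    SPCR = _BV(SPE) | _BV(MSTR) | _BV(SPR0);  /* SPI master, F_CPU/16 */
}

static uint8_t cc2420_strobe(uint8_t strobe)
{
    PORTB &= ~_BV(CS_PIN);                    /* select the radio */
    SPDR = strobe;
    while (!(SPSR & _BV(SPIF)))               /* wait for the transfer */
        ;
    uint8_t status = SPDR;                    /* CC2420 echoes a status byte */
    PORTB |= _BV(CS_PIN);                     /* deselect the radio */
    return status;
}

int main(void)
{
    spi_master_init();
    cc2420_strobe(CC2420_SXOSCON);            /* start the radio oscillator */
    for (;;)
        _delay_ms(1000);                      /* sensing/actuation loop here */
}
```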
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?

The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
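The platform itself is not reproduced here, but the workflow idea of production, interpretation and consumption backed by a maintainable provenance record can be sketched in a few lines of C. All names and structures below are hypothetical illustrations, not the dissertation's actual API.

```c
/* Illustrative sketch only: a derived value that carries its own
 * provenance trail, so an independent third party can audit how a
 * conclusion was produced.  All names are hypothetical. */
#include <stdio.h>
#include <time.h>

#define MAX_STEPS 16

typedef struct {
    double value;                     /* current derived datum */
    char   provenance[MAX_STEPS][96]; /* one entry per processing step */
    int    n_steps;
} TrackedDatum;

static void record_step(TrackedDatum *d, const char *desc)
{
    if (d->n_steps < MAX_STEPS) {
        snprintf(d->provenance[d->n_steps], sizeof d->provenance[0],
                 "%ld: %s", (long)time(NULL), desc);
        d->n_steps++;
    }
}

/* "Production": raw data enters the workflow. */
static TrackedDatum produce(double raw)
{
    TrackedDatum d = { .value = raw, .n_steps = 0 };
    record_step(&d, "produced: raw sensor reading");
    return d;
}

/* "Interpretation": an analysis technique transforms the datum;
 * the transformation is logged rather than hidden. */
static void interpret(TrackedDatum *d, double (*technique)(double),
                      const char *name)
{
    d->value = technique(d->value);
    record_step(d, name);
}

static double to_celsius(double f) { return (f - 32.0) * 5.0 / 9.0; }

/* "Consumption": the result is used together with its full history. */
static void consume(const TrackedDatum *d)
{
    printf("value = %.2f\n", d->value);
    for (int i = 0; i < d->n_steps; i++)
        printf("  step %d  %s\n", i, d->provenance[i]);
}

int main(void)
{
    TrackedDatum d = produce(98.6);
    interpret(&d, to_celsius, "interpreted: Fahrenheit -> Celsius");
    consume(&d);
    return 0;
}
```

Because every transformation is appended to the record rather than applied silently, two different analysis techniques can be run over the same raw data and their conclusions compared without hidden bias, which is the property the abstract emphasises.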
Abstract:
Predicting the evolution of a coastal cell requires the identification of the key drivers of its morphology. Soft coastlines are naturally dynamic, but severe storm events and even human intervention can accelerate any changes that are occurring. However, when erosive events such as barrier breaching occur with no obvious contributory factors, a deeper understanding of the underlying coastal processes is required. Ideally, conclusions on morphological drivers should be drawn from field data collection and remote sensing over a long period of time. Unfortunately, when the Rossbeigh barrier beach in Dingle Bay, County Kerry, began to erode rapidly in the early 2000s, eventually leading to it breaching in 2008, no such baseline data existed. This thesis presents a study of the morphodynamic evolution of the Inner Dingle Bay coastal system. The study combines existing coastal zone analysis approaches with experimental field data collection techniques and a novel approach to long-term morphodynamic modelling to predict the evolution of the barrier beach inlet system. A conceptual model describing the long-term evolution of Inner Dingle Bay in five stages post-breaching was developed. The dominant coastal processes driving the evolution of the coastal system were identified and quantified. A new methodology for long-term, process-based numerical modelling of coastal evolution was developed, and this method was used to predict over 20 years of coastal evolution in Inner Dingle Bay. In a broader context, this thesis utilised several experimental coastal zone data collection and analysis methods, such as ocean radar and grain size trend analysis. These were applied during the study and their suitability for a dynamic coastal system was assessed.
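Grain size trend analysis, mentioned above, infers sediment transport directions from spatial changes in grain-size statistics. The sketch below follows the two classic transport cases of Gao and Collins (1992), which may or may not match the exact method used in the thesis; the sample values are hypothetical.

```c
/* Illustrative sketch, not the thesis's implementation: classify the
 * grain-size trend between two neighbouring sediment samples using
 * the two transport cases of Gao & Collins (1992). */
#include <stdio.h>

typedef struct {
    double mean;      /* mean grain size in phi units (larger = finer) */
    double sorting;   /* standard deviation (smaller = better sorted) */
    double skewness;  /* positive = tail of fine material */
} GrainStats;

/* Returns +1 for a CB+ trend, -1 for an FB- trend, 0 for no trend. */
static int trend(GrainStats from, GrainStats to)
{
    int finer         = to.mean     > from.mean;
    int coarser       = to.mean     < from.mean;
    int better_sorted = to.sorting  < from.sorting;
    int more_negative = to.skewness < from.skewness;
    int more_positive = to.skewness > from.skewness;

    if (finer && better_sorted && more_negative)
        return -1;              /* FB-: finer, better sorted, -skew */
    if (coarser && better_sorted && more_positive)
        return +1;              /* CB+: coarser, better sorted, +skew */
    return 0;
}

int main(void)
{
    GrainStats a = { 2.10, 0.75, 0.15 };   /* hypothetical samples */
    GrainStats b = { 2.45, 0.60, -0.05 };

    switch (trend(a, b)) {
    case -1: puts("FB- trend from a to b"); break;
    case +1: puts("CB+ trend from a to b"); break;
    default: puts("no defined trend");      break;
    }
    return 0;
}
```

In practice, vectors produced this way for every pair of neighbouring sample sites are summed and smoothed to reveal net transport pathways across the coastal cell.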
Abstract:
An overview is given of a user interaction monitoring and analysis framework called BaranC. Monitoring and analysing human-digital interaction is an essential part of developing a user model as the basis for investigating user experience. The primary human-digital interaction, such as on a laptop or smartphone, is best understood and modelled in the wider context of the user and their environment. The BaranC framework provides monitoring and analysis capabilities that not only record all user interaction with a digital device (e.g. a smartphone), but also collect all available context data (such as from sensors in the digital device itself, a fitness band or a smart appliance). The data collected by BaranC is recorded as a User Digital Imprint (UDI), which is, in effect, the user model and provides the basis for data analysis. BaranC provides functionality that is useful for user experience studies, user interface design evaluation, and user assistance services. An important concern for personal data is privacy, and the framework gives the user full control over the monitoring, storing and sharing of their data.
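The abstract does not detail the UDI's internal structure, so the sketch below is only a guess at what a privacy-aware interaction record might look like: an event plus its context, gated by user-controlled consent flags. All field and flag names are hypothetical and do not come from BaranC itself.

```c
/* Hypothetical sketch of a User Digital Imprint (UDI) entry: an
 * interaction event plus context, gated by user-controlled consent
 * flags.  None of these names are BaranC's actual API. */
#include <stdio.h>
#include <time.h>

typedef enum {                /* user-granted permissions */
    CONSENT_MONITOR = 1 << 0,
    CONSENT_STORE   = 1 << 1,
    CONSENT_SHARE   = 1 << 2
} Consent;

typedef struct {
    time_t when;
    char   device[32];        /* e.g. "smartphone" */
    char   action[64];        /* the user interaction observed */
    double heart_rate;        /* context, e.g. from a fitness band */
} UdiEvent;

static void record_event(const UdiEvent *e, unsigned consent)
{
    if (!(consent & CONSENT_MONITOR))
        return;                           /* user opted out entirely */
    if (consent & CONSENT_STORE)
        printf("stored: [%s] %s (hr=%.0f)\n",
               e->device, e->action, e->heart_rate);
    if (consent & CONSENT_SHARE)
        printf("shared with analysis service\n");
}

int main(void)
{
    UdiEvent e = { time(NULL), "smartphone", "opened mail app", 72.0 };
    /* monitoring and storage allowed, sharing withheld */
    record_event(&e, CONSENT_MONITOR | CONSENT_STORE);
    return 0;
}
```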