42 results for sensor networks
Abstract:
In this research we focus on energy-aware topology management for the Tyndall 25mm and 10mm nodes, with the aim of extending sensor network lifespan and optimising node power consumption. The two-tiered Tyndall Heterogeneous Automated Wireless Sensors (THAWS) tool is used to quickly create and configure application-specific sensor networks. To this end, we propose to implement a distributed route discovery algorithm and a practical energy-aware reaction model on the 25mm nodes. Triggered by energy-warning events, the miniaturised Tyndall 10mm data collector nodes adaptively and periodically change their association to 25mm base station nodes, while the 25mm nodes also change the interconnections between themselves, resulting in a reconfiguration of the 25mm node tier topology. The distributed routing protocol uses combined weight functions to balance sensor network traffic. A system-level simulation is used to quantify the benefit of the route management framework, in terms of system power savings, compared to other state-of-the-art approaches.
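The combined-weight route selection described above can be sketched as follows. The particular weight function, its coefficients and the example graph are illustrative assumptions, not the paper's actual protocol: the idea is simply that a route metric mixing link cost with the residual energy of the next hop steers traffic away from energy-poor nodes.

```python
import heapq

def combined_weight(link_cost, residual_energy, alpha=0.5, beta=0.5):
    """Hypothetical combined weight: favours cheap links and energy-rich
    next hops. Both inputs are assumed normalised to [0, 1]."""
    return alpha * link_cost + beta * (1.0 - residual_energy)

def best_route(graph, energy, src, dst):
    """Dijkstra over the combined weights. graph maps node -> {neighbour:
    link_cost}; energy maps node -> residual energy fraction."""
    dist, prev = {src: 0.0}, {}
    pq, visited = [(0.0, src)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, cost in graph[u].items():
            nd = d + combined_weight(cost, energy[v])
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [dst]                      # walk predecessors back to the source
    while path[-1] != src:
        path.append(prev[path[-1]])
    return list(reversed(path))
```

With equal link costs, the route through the energy-rich relay wins even though both paths have the same hop count, which is the load-balancing effect the abstract alludes to.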
Abstract:
Embedded wireless sensor network (WSN) systems have been developed and used in a wide variety of applications, such as local automatic environmental monitoring; medical applications analysing aspects of fitness and health; energy metering and management in the built environment; and traffic pattern analysis and control applications. While the purposes and functions of embedded wireless sensor networks have a myriad of applications and possibilities in the future, a particular implementation of these ambient sensors is in the area of wearable electronics incorporated into body area networks and everyday garments. Some of these systems will incorporate inertial sensing devices and other physical and physiological sensors, with a particular focus on the application areas of athlete performance monitoring and e-health. Some of the important physical requirements for wearable antennas are that they are light-weight, small and robust; they should also use materials that are compatible with a standard manufacturing process, such as flexible polyimide or FR4, where low-cost consumer-market-oriented products are being produced. The substrate material is required to be low loss and flexible, and often necessitates the use of thin dielectric and metallization layers. This paper describes the development of such a wearable, flexible antenna system for ISM band wearable wireless sensor networks. The material selected for the development of the wearable system in question is DE104i, characterized by a dielectric constant of 3.8 and a loss tangent of 0.02. The antenna feed line is a 50 Ohm microstrip topology suitable for use with standard, high-performance and low-cost SMA-type RF connector technologies, widely used for these types of applications. The desired centre frequency is aimed at the 2.4 GHz ISM band, to be compatible with IEEE 802.15.4 Zigbee communication protocols and the Bluetooth standard, which operate in this band.
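A 50 Ohm microstrip feed line of the kind mentioned above is conventionally sized with the Hammerstad synthesis formulas. The sketch below is an illustration of that standard textbook procedure for the stated dielectric constant of 3.8, not the authors' actual design flow; function names are ours.

```python
import math

def microstrip_w_over_h(z0, er):
    """Hammerstad synthesis: width-to-height ratio of a microstrip trace
    for target characteristic impedance z0 (ohms) on a substrate with
    relative permittivity er."""
    # Narrow-line (W/h < 2) estimate first.
    a = (z0 / 60.0) * math.sqrt((er + 1) / 2) \
        + (er - 1) / (er + 1) * (0.23 + 0.11 / er)
    w_h = 8 * math.exp(a) / (math.exp(2 * a) - 2)
    if w_h < 2:
        return w_h
    # Otherwise use the wide-line branch.
    b = 377 * math.pi / (2 * z0 * math.sqrt(er))
    return (2 / math.pi) * (b - 1 - math.log(2 * b - 1)
            + (er - 1) / (2 * er) * (math.log(b - 1) + 0.39 - 0.61 / er))
```

For a DE104i-like substrate (er = 3.8) this gives W/h of roughly 2, so a thin dielectric, as the abstract requires, keeps the 50 Ohm trace narrow enough for a wearable form factor.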
Abstract:
Though the motivation for developing Ambient Assisted Living (AAL) systems is incontestable, significant challenges exist in realizing the ambience that is essential to the success of such systems. By definition, an AAL system must be omnipresent, tracking occupant activities in the home and identifying those situations where assistance is needed or would be welcomed. Embedded sensors offer an attractive mechanism for realizing ambience, as their form factor and harnessing of wireless technologies aid their seamless integration into pre-existing environments. However, the heterogeneity of the end-user population, their disparate needs and the differing environments they inhabit all pose particular problems regarding sensor integration and management.
Abstract:
Emerging healthcare applications can benefit enormously from recent advances in pervasive technology and computing. This paper introduces the CLARITY Modular Ambient Health and Wellness Measurement Platform, which is a heterogeneous and robust pervasive healthcare solution currently under development at the CLARITY Center for Sensor Web Technologies. This intelligent and context-aware platform comprises the Tyndall Wireless Sensor Network prototyping system, augmented with an agent-based middleware and frontend computing architecture. The key contribution of this work is to highlight how interoperability, expandability, reusability and robustness can be manifested in the modular design of the constituent nodes and the inherently distributed nature of the controlling software architecture.
Abstract:
This work performs an extensive characterisation of precision targeted throwing in professional and recreational darts. The goal is to identify the contributing factors for lateral drift, or throwing inaccuracy in the horizontal plane. A multi-technology approach is adopted whereby a custom-built body area network of wireless inertial measurement devices monitors tilt, force and timing; an optical 3D motion capture system provides a complete kinematic model of the subject; electromyography sensors monitor muscle activation patterns; and a force plate and pressure mat capture tactile pressure and force measurements. The study introduces the concept of constant throwing rhythm and highlights how landing errors in the horizontal plane can be attributable to a number of variations in arm force and speed, centre of gravity, and the movements of some of the body's non-throw-related extremities.
Abstract:
This work presents the design and evaluation of the REAM (Remote Electricity Actuation and Monitoring) node, based around the modular Tyndall Mote platform. The REAM node enables the user to remotely actuate power to a mains power extension board while sampling the current, voltage, power and power factor of the attached load. The node contains a current transformer interfaced to an energy metering IC which continuously samples current and voltage. These values are periodically read from the IC by a PIC24 microcontroller, which calculates the RMS current and voltage, power factor and overall power. The resultant values can then be queried wirelessly using the Tyndall 802.15.4-compliant wireless module.
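The metering arithmetic described above (RMS values, real power and power factor from synchronised current and voltage samples) can be sketched in a few lines. This is a generic illustration of the standard definitions, not the firmware of the REAM node; function names are ours.

```python
import math

def rms(samples):
    """Root-mean-square of a list of instantaneous samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def power_metrics(voltage, current):
    """RMS voltage and current, real power and power factor from
    synchronised sample lists, as an energy metering IC computes them.
    Real power is the mean of the instantaneous v*i product; power
    factor is real power over apparent power (Vrms * Irms)."""
    v_rms = rms(voltage)
    i_rms = rms(current)
    real_power = sum(v * i for v, i in zip(voltage, current)) / len(voltage)
    power_factor = real_power / (v_rms * i_rms)
    return v_rms, i_rms, real_power, power_factor
```

For a sinusoidal load lagging the mains voltage by 60 degrees, this yields a power factor of cos(60°) = 0.5, which is the kind of figure the node reports for reactive loads.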
Abstract:
When miniaturized wireless sensors are placed on or close to the human body, they can experience a significant loss in performance due to antenna detuning, resulting in degradation of wireless performance as well as decreased battery lifetime. Several antenna tuning technologies have been proposed for mobile wireless devices, but devices suitable for widespread integration have yet to emerge. This paper highlights the possible advantages of antenna tuning for wearable wireless sensors and presents the design and characterization of a prototype 433 MHz tuner module.
Abstract:
Evaluation of temperature distribution in cold rooms is an important consideration in the design of food storage solutions. Two common approaches used in both industry and academia to address this question are the deployment of wireless sensors, and modelling with Computational Fluid Dynamics (CFD). However, for a real-world evaluation of temperature distribution in a cold room, both approaches have their limitations. For wireless sensors, large-scale deployment (to obtain a high-resolution picture of the temperature distribution) is economically unfeasible, while CFD modelling is usually not accurate enough to give a reliable result. In this paper, we propose a model-based framework which combines wireless sensing with CFD modelling to achieve a satisfactory trade-off between the number of wireless sensors and the accuracy of the temperature profile in cold rooms. A case study is presented to demonstrate the usability of the framework.
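One minimal way to combine sparse sensor readings with a model-predicted temperature field, in the spirit of the trade-off described above (the abstract does not specify the actual method), is to spread the sensor-versus-model residuals across the model grid by inverse-distance weighting. Everything below, including the 1-D grid, is an illustrative assumption.

```python
def idw_correction(model_temp, sensors, power=2):
    """Hypothetical sketch: correct a 1-D CFD temperature profile using a
    few sensor readings. sensors is a list of (grid_index, reading) pairs;
    the residuals (reading - model value) are interpolated across the grid
    with inverse-distance weights and added to the model prediction."""
    corrected = []
    for x, t in enumerate(model_temp):
        num = den = 0.0
        for xs, ts in sensors:
            r = ts - model_temp[xs]      # residual at the sensor location
            d = abs(x - xs)
            if d == 0:                   # a sensor sits on this grid point
                num, den = r, 1.0
                break
            w = 1.0 / d ** power
            num += w * r
            den += w
        corrected.append(t + num / den)
    return corrected
```

A handful of sensors then anchors the CFD field to reality, rather than requiring a sensor per grid cell, which is the economic point the paper makes.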
Abstract:
In the area of food and pharmaceutical cold storage, temperature distribution is considered a key factor. Inappropriate distribution of temperature during the cooling process in cold rooms causes deterioration in the quality of products and therefore shortens their life-span. In practice, in order to maintain the temperature distribution at an appropriate level, a large amount of electrical energy has to be consumed to cool the volume of space, based on the reading of a single temperature sensor placed in each cold room. However, how energy consumption and temperature distribution change over time is neither clear nor visible, and effective tools to visualise this phenomenon are lacking. In this poster, we present an initial solution which combines a visualisation tool with a Computational Fluid Dynamics (CFD) model to enable users to explore this phenomenon.
Abstract:
Error correcting codes are combinatorial objects designed to enable reliable transmission of digital data over noisy channels. They are ubiquitously used in communication, data storage and elsewhere. Error correction allows reconstruction of the original data from the received word. Classical decoding algorithms are constrained to output just one codeword; however, in the late 1950s researchers proposed a relaxed error correction model for potentially large error rates, known as list decoding. The research presented in this thesis focuses on reducing the computational effort and enhancing the efficiency of decoding algorithms for several codes, from both an algorithmic and an architectural standpoint. The codes in consideration are linear block codes closely related to Reed-Solomon (RS) codes. A high-speed, low-complexity algorithm and architecture are presented for encoding and decoding RS codes based on evaluation. The implementation results show that the hardware resources and the total execution time are significantly reduced compared to the classical decoder. The evaluation-based encoding and decoding schemes are modified and extended for shortened RS codes, and a software implementation shows a substantial reduction in memory footprint at the expense of latency. Hermitian codes can be seen as concatenated RS codes and are much longer than RS codes over the same alphabet. A fast, novel and efficient VLSI architecture for Hermitian codes is proposed based on interpolation decoding; the proposed architecture is shown to outperform Kötter's decoder for high-rate codes. The thesis also explores a method of constructing optimal codes by computing the subfield subcodes of Generalized Toric (GT) codes, which are a natural extension of RS codes over several dimensions. The polynomial generators, or evaluation polynomials, for subfield subcodes of GT codes are identified, from which the dimension and a bound on the minimum distance are computed.
The algebraic structure of the polynomials evaluating to the subfield is used to simplify the list decoding algorithm for BCH codes. Finally, an efficient and novel approach is proposed for exploiting powerful codes that have complex decoding but a simple encoding scheme (comparable to RS codes) for multihop wireless sensor network (WSN) applications.
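The evaluation view of RS codes that the thesis builds on can be illustrated over a small prime field: the message symbols are the coefficients of a low-degree polynomial, the codeword is its values at distinct field points, and any k correct values recover the polynomial by interpolation (the erasure-only case of decoding). GF(929) and the helper names here are illustrative; practical RS implementations work over binary extension fields GF(2^m).

```python
P = 929  # a small prime; all arithmetic below is over GF(929)

def poly_eval(coeffs, x, p=P):
    """Horner's rule: evaluate c[0] + c[1]*x + ... over GF(p)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

def rs_encode(message, n, p=P):
    """Evaluation-style encoding: the k message symbols are the
    coefficients of a degree < k polynomial; the codeword is its values
    at n distinct field points (here 0..n-1)."""
    return [poly_eval(message, x, p) for x in range(n)]

def rs_interpolate(points, p=P):
    """Recover the message polynomial from any k correct (x, y) pairs by
    Lagrange interpolation over GF(p) -- the erasure-only decoder."""
    k = len(points)
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis, denom = [1], 1        # numerator polynomial, denominator
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            # multiply the basis polynomial by (x - xj)
            basis = [(b - xj * a) % p for a, b in zip(basis + [0], [0] + basis)]
            denom = denom * (xi - xj) % p
        scale = yi * pow(denom, p - 2, p) % p   # Fermat modular inverse
        for d in range(k):
            coeffs[d] = (coeffs[d] + scale * basis[d]) % p
    return coeffs
```

Since any k of the n values pin down the polynomial, an (n, k) code tolerates n - k erasures, and up to (n - k) // 2 symbol errors with a full error-locating decoder.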
Abstract:
Can my immediate physical environment affect how I feel? The instinctive answer to this question must be a resounding “yes”. What might seem a throwaway remark is increasingly borne out by research in environmental and behavioural psychology, and in the more recent discipline of Evidence-Based Design. Research outcomes are beginning to converge with findings in neuroscience and neurophysiology, as we discover more about how the human brain and body function and react to environmental stimuli. What we see, hear, touch, and sense affects each of us psychologically and, by extension, physically, on a continual basis. The physical characteristics of our daily environment thus have the capacity to profoundly affect all aspects of our functioning, from biological systems to cognitive ability. This has long been understood on an intuitive basis, and utilised on a more conscious basis by architects and other designers. Recent research in evidence-based design, coupled with advances in neurophysiology, confirms what was previously held as common intuition, but also illuminates an almost frightening potential to do enormous good, or alternatively terrible harm, by virtue of how we make our everyday surroundings. The thesis adopts a design methodology in its approach to exploring the potential use of wireless sensor networks in environments for elderly people. Vitruvian principles of “commodity, firmness and delight” inform the research process and become embedded in the final design proposals and research conclusions. The issue of person-environment fit becomes a key principle in describing a model of continuously-evolving responsive architecture which makes the individual user its focus, with the intention of promoting wellbeing. The key research questions are: What are the key system characteristics of an adaptive therapeutic single-room environment?
How can embedded technologies be utilised to maximise the adaptive and therapeutic aspects of the personal life-space of an elderly person with dementia?
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.