898 results for Sensor data


Relevance:

30.00%

Publisher:

Abstract:

Existing Building/Energy Management Systems (BMS/EMS) fail to convey holistic performance to the building manager. Efficiently operated buildings can achieve a 20% reduction in energy consumption compared with current practice. However, in the majority of buildings, analysis of occupant comfort and energy consumption is primarily restricted by the available sensor and meter data. Installing a continuous monitoring process can significantly improve the performance of building systems. We present WSN-BMDS, an IP-based wireless sensor network building monitoring and diagnostic system. The main focus of WSN-BMDS is to obtain a much higher degree of information about building operation than current BMSs are able to provide. Our system integrates a heterogeneous set of wireless sensor nodes with IEEE 802.11 backbone routers and the Global Sensor Network (GSN) web server. Sensing data is stored in a database at the back office via the UDP protocol and can be accessed over the Internet using GSN. Through this demonstration, we show that WSN-BMDS provides accurate measurements of air temperature, air humidity, light, and energy consumption for particular rooms in our target building. Our interactive graphical user interface provides a user-friendly environment that shows the live network topology, monitors network statistics, and executes run-time management actions on the network. We also demonstrate actuation by changing the artificial light level in one of the rooms.
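As a rough illustration of the data path described above (sensor node to a back-office database over UDP), the sketch below shows how a single reading could be serialised and pushed to a collector. The host name, port and JSON field names are illustrative assumptions, not part of WSN-BMDS.

```python
# Minimal sketch (not the WSN-BMDS implementation): pack one sensor reading
# and send it to a back-office collector over UDP. Host, port and field
# names are illustrative assumptions.
import json
import socket
import time

COLLECTOR_ADDR = ("backoffice.example.org", 5005)  # hypothetical host/port

def send_reading(node_id: str, temperature_c: float, humidity_pct: float, lux: float) -> None:
    """Serialise one reading and push it to the collector via UDP."""
    payload = json.dumps({
        "node": node_id,
        "ts": time.time(),
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
        "light_lux": lux,
    }).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, COLLECTOR_ADDR)

if __name__ == "__main__":
    send_reading("room-2.13", temperature_c=21.4, humidity_pct=48.0, lux=310.0)
```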

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the hardware design and implementation of a wireless sensor network mote are introduced for a building deployment application. The core of the mote design is based on the 8-bit AVR microcontroller, the ATmega1281, and the 2.4 GHz wireless communication chip, the CC2420. The module PCB is fabricated using stackable technology, providing powerful configuration capability. Three main layers, each of size 25 mm², are stacked to form the mote: the RF, sensor and power layers. The sensors were selected carefully to meet both the building monitoring and design requirements. Besides the sensing capability, actuation and interfacing to external meters/sensors are provided to perform different management control and data recording tasks. Experiments show that the developed mote works effectively, giving stable data acquisition with good communication and power performance.

Relevance:

30.00%

Publisher:

Abstract:

Science Foundation Ireland (CSET - Centre for Science, Engineering and Technology, grant 07/CE/I1147)

Relevance:

30.00%

Publisher:

Abstract:

In this work, we investigate tennis stroke recognition using a single inertial measurement unit attached to a player's forearm during a competitive match. The paper evaluates the best approach for stroke detection using either the accelerometers, gyroscopes or magnetometers embedded in the inertial measurement unit. This work identifies the optimal training data set for stroke classification and shows that classifiers can perform well when tested on players who were not used to train the classifier. This work provides a significant step towards our overall goal, which is to develop next-generation sports coaching tools using both inertial and visual sensors in an instrumented indoor sporting environment.
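The abstract does not specify the features or classifier used; the sketch below only illustrates the evaluation idea of testing on players who were not used for training, via a leave-one-player-out split over windowed inertial features. The feature set, classifier choice and synthetic data are assumptions.

```python
# Illustrative sketch (not the paper's pipeline): window an inertial stream,
# extract simple per-axis statistics, and evaluate a classifier on players
# held out from training. scikit-learn is assumed to be available.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def window_features(stream: np.ndarray, window: int = 100) -> np.ndarray:
    """stream: (n, 3) accelerometer or gyroscope samples; returns per-window stats."""
    n_windows = len(stream) // window
    feats = []
    for i in range(n_windows):
        seg = stream[i * window:(i + 1) * window]
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0),
                                     seg.min(axis=0), seg.max(axis=0)]))
    return np.array(feats)

# Synthetic stand-in data: 4 players, each with a labelled inertial stream.
rng = np.random.default_rng(0)
X_parts, y_parts, group_parts = [], [], []
for player in range(4):
    feats = window_features(rng.normal(size=(10_000, 3)))
    X_parts.append(feats)
    y_parts.append(rng.integers(0, 3, size=len(feats)))     # e.g. forehand/backhand/serve
    group_parts.append(np.full(len(feats), player))
X, y, groups = np.vstack(X_parts), np.concatenate(y_parts), np.concatenate(group_parts)

# Leave-one-player-out evaluation: each fold tests on a player unseen in training.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("per-player accuracy:", np.round(scores, 2))
```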

Relevance:

30.00%

Publisher:

Abstract:

The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profiles of WSNs under real-world conditions. The aim of this work is to understand, and eventually eliminate, the uncertainties in WSN power consumption during long-term deployments and to assess compatibility with existing and emerging energy-harvesting technologies. This paper investigates the key metrics in data processing, wireless data transmission, data sensing and duty-cycle parameters to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each metric on power consumption in a typical BEM application is presented and the corresponding low-power solutions are investigated.
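As a minimal illustration of how the duty-cycle parameter drives a node's power profile, the following sketch computes average power from per-state current draw. The supply voltage, currents and duty cycle are assumed example values, not figures from the paper.

```python
# Back-of-the-envelope sketch: estimate average power of a duty-cycled WSN
# node from per-state current draw. All numbers are illustrative assumptions.
def average_power_mw(v_supply: float, i_active_ma: float, i_sleep_ma: float,
                     duty_cycle: float) -> float:
    """P_avg = V * (d * I_active + (1 - d) * I_sleep), in milliwatts."""
    i_avg_ma = duty_cycle * i_active_ma + (1.0 - duty_cycle) * i_sleep_ma
    return v_supply * i_avg_ma

# Example: 3 V supply, 20 mA while sensing/transmitting, 5 uA asleep, 1% duty cycle.
p = average_power_mw(3.0, 20.0, 0.005, 0.01)
print(f"average power ≈ {p:.3f} mW")   # ≈ 0.615 mW
```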

Relevance:

30.00%

Publisher:

Abstract:

Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance becomes an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, which can switch from energy-efficient operation during normal monitoring to reliable and fast delivery for emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet delivery from all sensor nodes. Topology design supports fault-tolerance by ensuring that there are alternative acceptable routes to data sinks when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes – those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal-cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays with minimal cost to make the network double-covered and non-critical, i.e. all sensor nodes must have length-bounded alternative paths to sinks when an arbitrary sensor node fails. We then evaluate the fault-tolerance of each topology in data-gathering simulations using ER-MAC.
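To make the (k,l)-connectivity requirement concrete, the sketch below gives a greedy lower bound on the number of vertex-disjoint paths of length ≤ l between a node and a sink. It is not the Counting-Paths algorithm proposed in the work, only a simple stand-in on a toy topology.

```python
# Sketch only: greedily find shortest paths with BFS and remove their interior
# vertices, giving a lower bound on vertex-disjoint paths of length <= max_len.
from collections import deque

def shortest_path(adj, src, dst, blocked):
    """BFS shortest path avoiding 'blocked' vertices; returns a list or None."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj.get(u, []):
            if v not in prev and v not in blocked:
                prev[v] = u
                q.append(v)
    return None

def greedy_disjoint_paths(adj, src, dst, max_len):
    blocked, count = set(), 0
    while True:
        path = shortest_path(adj, src, dst, blocked)
        if path is None or len(path) - 1 > max_len:
            return count
        count += 1
        blocked.update(path[1:-1])      # interior vertices may not be reused

# Toy topology: two vertex-disjoint 2-hop routes from node 'a' to sink 's'.
adj = {"a": ["b", "c"], "b": ["a", "s"], "c": ["a", "s"], "s": ["b", "c"]}
print(greedy_disjoint_paths(adj, "a", "s", max_len=2))   # -> 2
```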

Relevance:

30.00%

Publisher:

Abstract:

Body Sensor Network (BSN) technology is seeing rapid emergence in application areas such as health, fitness and sports monitoring. Current BSN wireless sensors typically operate on a single frequency band (e.g. utilizing the IEEE 802.15.4 standard, which operates at 2.45 GHz) and employ a single radio transceiver for wireless communications. This allows a simple wireless architecture to be realized with low cost and power consumption. However, network congestion or failure can create potential issues in terms of reliability of data transfer, quality of service (QoS) and data throughput for the sensor. These issues can be especially critical in healthcare monitoring applications, where data availability and integrity are crucial. The addition of more than one radio has the potential to address some of the above issues. For example, multi-radio implementations can allow access to more than one network, providing increased coverage and data processing as well as improved interoperability between networks. A small number of multi-radio wireless sensor solutions exist at present, but they require more than one radio transceiver device to achieve multi-band operation. This paper presents the design of a novel prototype multi-radio hardware platform that uses a single radio transceiver. The proposed design allows multi-band operation in the 433/868 MHz ISM bands and this, together with its low complexity and small form factor, makes it suitable for a wide range of BSN applications.

Relevance:

30.00%

Publisher:

Abstract:

A wireless sensor network can become partitioned due to node failure, requiring the deployment of additional relay nodes in order to restore network connectivity. This introduces an optimisation problem involving a trade-off between the number of additional nodes required and the cost of moving through the sensor field for the purpose of node placement. This trade-off is application-dependent, influenced for example by the relative urgency of network restoration. In addition, minimising the number of relay nodes might lead to long routing paths to the sink, which may cause problems of data latency. This latency is extremely important in wireless sensor network applications such as battlefield surveillance, intrusion detection, disaster rescue and highway traffic coordination, where real-time constraints must not be violated. Therefore, we also consider the problem of deploying multiple sinks in order to improve the network performance. Previous research has only considered parts of this problem in isolation, and has not properly considered the problems of moving through a constrained environment, of discovering changes to that environment during the repair, or of network quality after the restoration. In this thesis, we first consider a base problem in which we assume the exploration tasks have already been completed, so our aim is to optimise our use of resources in the static, fully observed problem. In the real world, we would not know the radio and physical environments after damage, and this creates a dynamic problem where damage must be discovered. Therefore, we extend to the dynamic problem, in which the network repair problem considers both exploration and restoration. We then add a hop-count constraint for network quality, in which the desired locations must be able to talk to a sink within a hop-count limit after the network is restored. For each new variant of the network repair problem, we propose different solutions (heuristics and/or complete algorithms) which prioritise different objectives. We evaluate our solutions in simulation, assessing the quality of solutions (node cost, movement cost, computation time, and total restoration time) while varying the problem types and the capability of the agent that makes the repair. We show that the relative importance of the objectives influences the choice of algorithm, and that different movement speeds for the repairing agent have a significant impact on performance and must be taken into account when selecting the algorithm. In particular, the node-based approaches are best in terms of node cost, and the path-based approaches are best in terms of mobility cost. For total restoration time, the node-based approaches are best with a fast-moving agent, while the path-based approaches are best with a slow-moving agent. For an agent moving at a medium speed, the total restoration times of the node-based and path-based approaches are almost balanced.
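As a small illustration of the hop-count constraint and the multiple-sink idea mentioned above, the sketch below checks whether every sensor node can reach at least two sinks within a hop limit. The graph, node names and hop limit are illustrative assumptions, not the thesis algorithms.

```python
# Minimal sketch: verify that every sensor node has a path within a hop limit
# to at least two distinct sinks, using BFS hop distances from each sink.
from collections import deque

def hop_distances(adj, source):
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def is_double_covered(adj, sensors, sinks, hop_limit):
    per_sink = [hop_distances(adj, s) for s in sinks]
    return all(sum(d.get(n, float("inf")) <= hop_limit for d in per_sink) >= 2
               for n in sensors)

# Toy topology: two sensor nodes between two sinks.
adj = {"n1": ["s1", "n2"], "n2": ["n1", "s2"], "s1": ["n1"], "s2": ["n2"]}
print(is_double_covered(adj, ["n1", "n2"], ["s1", "s2"], hop_limit=2))  # True
```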

Relevance:

30.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting these data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near-real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse those data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
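A short worked example of the storage figures quoted above, using the binary interpretation of a zettabyte given in the text:

```python
# Figures taken from the abstract above; only the arithmetic is added here.
ZETTABYTE = 2 ** 70                    # binary interpretation used in the text

data_now = 4.4 * ZETTABYTE             # bytes in existence at any one time
data_2020 = data_now * 10              # ~44 ZB expected by 2020

storable_now = 0.33 * data_now         # 33% can currently be stored
storable_2020 = 0.15 * data_2020       # expected to fall to 15% by 2020

print(f"now:  {data_now:.3e} B total, {storable_now:.3e} B storable")
print(f"2020: {data_2020:.3e} B total, {storable_2020:.3e} B storable")
```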

Relevance:

30.00%

Publisher:

Abstract:

Noise is one of the main factors degrading the quality of original multichannel remote sensing data, and its presence influences classification efficiency, object detection, etc. Thus, pre-filtering is often used to remove noise and improve the solving of the final tasks of multichannel remote sensing. Recent studies indicate that the classical model of additive noise is not adequate for images formed by modern multichannel sensors operating in the visible and infrared bands. However, this fact is often ignored by researchers designing noise removal methods and algorithms. Because of this, we focus on the classification of multichannel remote sensing images in the case of signal-dependent noise present in the component images. Three approaches to filtering multichannel images for the considered noise model are analysed, all based on the discrete cosine transform (DCT) applied in blocks. The study is carried out not only in terms of the conventional efficiency metric used in filtering (MSE) but also in terms of multichannel data classification accuracy (probability of correct classification, confusion matrix). The proposed classification system combines a pre-processing stage, where a DCT-based filter processes the blocks of the multichannel remote sensing image, with the classification stage. Two modern classifiers are employed: a radial basis function neural network and support vector machines. Simulations are carried out for a three-channel image from the Landsat TM sensor. Different cases of learning are considered: using noise-free samples of the test multichannel image, the noisy multichannel image, and the pre-filtered one. It is shown that using the pre-filtered image for training produces better classification than learning from the noisy image. It is demonstrated that the best results for both groups of quantitative criteria are obtained if the proposed 3D discrete cosine transform filter equipped with a variance-stabilising transform is applied. The classification results obtained for data pre-filtered in different ways are in agreement for both classifiers considered. A comparison of classifier performance is carried out as well. The radial basis function neural network classifier is less sensitive to noise in the original images, but after pre-filtering the performance of both classifiers is approximately the same.
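As an illustration of the general idea (block-wise DCT filtering preceded by a variance-stabilising transform for signal-dependent noise), the sketch below applies a simple Anscombe-style VST and hard-thresholds DCT coefficients per block. The block size, threshold and noise model are assumptions; this is not the 3D filter proposed in the paper.

```python
# Illustrative sketch: VST + per-block DCT hard thresholding on one channel.
import numpy as np
from scipy.fft import dctn, idctn

def vst(x):            # Anscombe-like forward transform (assumed noise model)
    return 2.0 * np.sqrt(np.maximum(x, 0.0) + 3.0 / 8.0)

def inv_vst(y):        # simple algebraic inverse (ignores inversion bias)
    return (y / 2.0) ** 2 - 3.0 / 8.0

def dct_block_denoise(img, block=8, threshold=3.0):
    """Denoise a single channel by hard-thresholding DCT coefficients per block."""
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(img[i:i + block, j:j + block], norm="ortho")
            coeffs[np.abs(coeffs) < threshold] = 0.0
            out[i:i + block, j:j + block] = idctn(coeffs, norm="ortho")
    return out

# Toy example: 64x64 channel with signal-dependent (Poisson-like) noise.
rng = np.random.default_rng(1)
clean = np.tile(np.linspace(1.0, 50.0, 64), (64, 1))
noisy = rng.poisson(clean).astype(float)
filtered = inv_vst(dct_block_denoise(vst(noisy)))
print("MSE noisy vs clean:   ", round(float(np.mean((noisy - clean) ** 2)), 2))
print("MSE filtered vs clean:", round(float(np.mean((filtered - clean) ** 2)), 2))
```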

Relevance:

30.00%

Publisher:

Abstract:

Coccolithophores are the primary oceanic phytoplankton responsible for the production of calcium carbonate (CaCO3). These climatically important plankton play a key role in the oceanic carbon cycle as a major contributor of carbon to the open-ocean carbonate pump (~50%), and their calcification can affect the atmosphere-to-ocean (air-sea) uptake of carbon dioxide (CO2) by increasing the seawater partial pressure of CO2 (pCO2). Here we document variations in the areal extent of surface blooms of the globally important coccolithophore, Emiliania huxleyi, in the North Atlantic over a 10-year period (1998-2007), using Earth observation data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS). We calculate the annual mean sea surface areal coverage of E. huxleyi in the North Atlantic to be 474,000 ± 104,000 km², which results in a net CaCO3 carbon (CaCO3-C) production of 0.14-1.71 Tg CaCO3-C per year. However, this surface coverage (and, thus, net production) can fluctuate inter-annually by -54/+81% about the mean value and is strongly correlated with the El Niño/Southern Oscillation (ENSO) climate oscillation index (r = 0.75, p < 0.02). Our analysis evaluates the spatial extent over which E. huxleyi blooms in the North Atlantic can increase pCO2 and, thus, decrease the localised air-sea flux of atmospheric CO2. In regions where the blooms are prevalent, the average reduction in the monthly air-sea CO2 flux can reach 55%. The maximum reduction of the monthly air-sea CO2 flux in the time series is 155%. This work suggests that the high variability, frequency and distribution of these calcifying plankton, and their impact on pCO2, should be considered if we are to fully understand the variability of the North Atlantic air-to-sea flux of CO2. We estimate that these blooms can reduce the annual North Atlantic net sink of atmospheric CO2 by between 3 and 28%.
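A short worked example of the coverage figures quoted above, applying the stated -54/+81% interannual fluctuation to the mean areal coverage:

```python
# Figures taken from the abstract above; only the arithmetic is added here.
mean_area_km2 = 474_000
low = mean_area_km2 * (1 - 0.54)     # -54% about the mean
high = mean_area_km2 * (1 + 0.81)    # +81% about the mean
print(f"coverage range ≈ {low:,.0f} – {high:,.0f} km²")   # ≈ 218,040 – 857,940 km²
```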

Relevance:

30.00%

Publisher:

Abstract:

Coccolithophores are the primary oceanic phytoplankton responsible for the production of calcium carbonate (CaCO3). These climatically important plankton play a key role in the oceanic carbon cycle as a major contributor of carbon to the open-ocean carbonate pump (~50%), and their formation can affect the atmosphere-to-ocean (air-sea) uptake of carbon dioxide (CO2) through increasing the seawater partial pressure of CO2 (pCO2). Here we document variations in the areal extent of surface blooms of the globally important coccolithophore, Emiliania huxleyi, in the North Atlantic over a 10-year period (1998-2007), using Earth observation data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS). We calculate the annual mean surface areal coverage of E. huxleyi in the North Atlantic to be 474,000 ± 119,000 km² yr⁻¹, which results in a net CaCO3 production of 0.62 ± 0.15 Tg CaCO3 carbon per year. However, this surface coverage and net production can fluctuate by -54/+81% about these mean values and are strongly correlated with the El Niño/Southern Oscillation (ENSO) climate oscillation index (r = 0.75, p < 0.02). Our analysis evaluates the spatial extent over which the E. huxleyi blooms in the North Atlantic can increase pCO2 and thus decrease the localised sink of atmospheric CO2. In regions where the blooms are prevalent, the average reduction in the monthly CO2 sink can reach 12%. The maximum reduction of the monthly CO2 sink in the time series is 32%. This work suggests that the high variability, frequency and distribution of these calcifying plankton and their impact on pCO2 should be considered within modelling studies of the North Atlantic if we are to fully understand the variability of its air-to-sea CO2 flux.

Relevance:

30.00%

Publisher:

Abstract:

Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring and atmospheric studies. A single flight line of airborne hyperspectral data is often in the region of tens of gigabytes in size. This means that a single aircraft can collect terabytes of remotely sensed hyperspectral data during a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude, and then geocorrected. To enable efficient processing of these large datasets, the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common band-interleaved-by-line (BIL) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the facility's automated processing; to this end, APL can be used under both Windows and Linux environments, on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor, and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008 together with in situ surveyed ground control points.
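For readers unfamiliar with the BIL layout that APL outputs, the sketch below shows one generic way such a file could be read with numpy. The file name, dimensions and data type are illustrative assumptions that would normally come from the accompanying header file; this is not part of APL.

```python
# Generic sketch: read a band-interleaved-by-line (BIL) cube with numpy.
import numpy as np

def read_bil(path, lines, bands, samples, dtype=np.uint16):
    """BIL layout stores, for each image line, all bands of that line in sequence."""
    raw = np.memmap(path, dtype=dtype, mode="r", shape=(lines, bands, samples))
    return np.transpose(raw, (1, 0, 2))        # -> (bands, lines, samples)

# Example usage (file name, dimensions and dtype are illustrative):
# cube = read_bil("flightline01.bil", lines=2000, bands=244, samples=1024)
# band50 = cube[50]                            # one spectral band as a 2-D image
```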

Relevance:

30.00%

Publisher:

Abstract:

Novel techniques have been developed for increasing the value of cloud-affected sequences of Advanced Very High Resolution Radiometer (AVHRR) sea-surface temperature (SST) data and Sea-viewing Wide Field-of-view Sensor (SeaWiFS) ocean colour data for visualising dynamic physical and biological oceanic processes such as fronts, eddies and blooms. The proposed composite front map approach is to combine the location, strength and persistence of all fronts observed over several days into a single map, which allows intuitive interpretation of mesoscale structures. This method achieves a synoptic view without blurring dynamic features, an inherent problem with conventional time-averaging compositing methods. Objective validation confirms a significant improvement in feature visibility on composite maps compared to individual front maps. A further novel aspect is the automated detection of ocean colour fronts, correctly locating 96% of chlorophyll fronts in a test data set. A sizeable data set of 13,000 AVHRR and 1200 SeaWiFS scenes automatically processed using this technique is applied to the study of dynamic processes off the Iberian Peninsula such as mesoscale eddy generation, and many additional applications are identified. Front map animations provide a unique insight into the evolution of upwelling and eddies.
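As a conceptual illustration of the compositing idea (combining the location, strength and persistence of fronts observed over several days into one map), the sketch below keeps the strongest front value per pixel together with a simple persistence fraction. It is not the published algorithm; the data and combination rules are assumptions.

```python
# Conceptual sketch: combine daily front-strength maps into a composite map.
import numpy as np

def composite_front_map(daily_strengths):
    """daily_strengths: list of 2-D arrays, 0 where no front was detected."""
    stack = np.stack(daily_strengths)                   # (days, rows, cols)
    strongest = stack.max(axis=0)                       # strongest front per pixel
    persistence = (stack > 0).sum(axis=0) / len(stack)  # fraction of days with a front
    return strongest, persistence

# Toy example: three days of 4x4 front-strength maps.
rng = np.random.default_rng(2)
days = [np.where(rng.random((4, 4)) > 0.7, rng.random((4, 4)), 0.0) for _ in range(3)]
strongest, persistence = composite_front_map(days)
print(persistence)
```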

Relevance:

30.00%

Publisher:

Abstract:

Ocean Virtual Laboratory is an ESA-funded project to prototype the concept of a single point of access, for a given region, to all satellite remote-sensing data together with ancillary model output and in situ measurements. The idea is to give non-specialists easy access to both the data and state-of-the-art processing techniques, and to enable straightforward analysis and display. The project, led by OceanDataLab, is being trialled in the region of the Agulhas Current, as it contains signals of strong contrast (due to very energetic upper-ocean dynamics) and special SAR data acquisitions have been recorded there. The project also encourages the take-up of Earth observation data by developing training material to help those outside large scientific or governmental organisations make the best use of the data that are available. The website for access is: http://ovl-project.oceandatalab.com/