864 results for Sensor Data Fusion Applicazioni
Abstract:
Ubiquitous computing (one person, many computers) is the third era in the history of computing. It follows the mainframe era (many people, one computer) and the PC era (one person, one computer). Ubiquitous computing empowers people to communicate with services by interacting with their surroundings. Most of these so-called smart environments contain sensors that sense users' actions and try to predict users' intentions and needs from sensor data. The main drawback of this approach is that the system may perform unexpected or unwanted actions, making the user feel out of control. In this master's thesis we propose a different approach based on Interactive Spaces: instead of predicting users' intentions from sensor data, the system reacts to users' explicit, predefined actions. To that end, we present REACHeS, a server platform that enables communication among services, resources and users located in the same environment. With REACHeS, a user controls services and resources by interacting with everyday objects, using a mobile phone as a mediator between himself/herself, the system and the environment. REACHeS' user interfaces are built upon NFC (Near Field Communication) technology: NFC tags are attached to objects in the environment, and a tag stores commands that are sent to services when a user touches the tag with his/her NFC-enabled device. The prototypes and usability tests presented in this thesis show the great potential of NFC for building such user interfaces.
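A minimal sketch of the tag-to-service flow the abstract describes: a command stored on a tag is parsed and forwarded to the server when the tag is touched. The payload format, service names and path scheme below are illustrative assumptions, not the actual REACHeS protocol.

```python
# Hedged sketch: how a tag-stored command might be parsed and dispatched.
# The 'service:resource:command' payload format is our invention.

def parse_tag_payload(payload: str) -> dict:
    """Split a tag payload into service, resource and command fields."""
    service, resource, command = payload.split(":", 2)
    return {"service": service, "resource": resource, "command": command}

def dispatch(msg: dict, send) -> str:
    """Forward the parsed command via a caller-supplied sender function."""
    return send(f"/{msg['service']}/{msg['resource']}", msg["command"])

# Example: a tag attached to a projector could carry this payload.
msg = parse_tag_payload("display:projector1:show_next_slide")
result = dispatch(msg, lambda path, cmd: f"{path}?cmd={cmd}")
```

The sender is injected as a callable so the same parsing logic could sit in front of any transport (HTTP, sockets, a message queue).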
Abstract:
Effective automatic summarization usually requires simulating human reasoning such as abstraction or relevance reasoning. In this paper we describe a solution for this type of reasoning in the particular case of surveillance of the behavior of a dynamic system using sensor data. The paper first presents the approach, describing the required types of knowledge and a possible representation; this includes knowledge about the system's structure, behavior, interpretation and saliency. The paper then shows the inference algorithm that produces a summarization tree by exploiting the physical characteristics of the system. Finally, the paper illustrates how the method is used for automatic generation of summaries of behavior in an application for basin surveillance in the presence of river floods.
Abstract:
Dynamic measurements will become a standard for bridge monitoring in the near future, producing an important cost reduction for maintenance. The US administration has a long-term, intensive research program aimed at reducing the estimated current maintenance cost of US$7 billion per year over 20 years. An optimal maintenance intervention program demands a historical dynamic record as well as an updated mathematical model of the structure to be monitored. If a model of the structure is not available it can be produced; no such possibility exists, however, for missing measurement records from the past. Current acquisition systems for monitoring structures can be made more efficient by introducing the following improvements, under development in the Spanish research project "Low cost bridge health monitoring by ambient vibration tests using wireless sensors": (a) a completely wireless system to acquire sensor data; (b) a wireless system that permits localization and hardware identification of the whole sensor network (the applied localization system has been the object of a recent patent); and (c) automation of the modal identification process, aimed at reducing human intervention. The system is assembled from cheap components and allows the simultaneous use of a large number of sensors at a low placement cost. The engineer's intervention is limited to the selection of sensor positions, probably based on a preliminary FE analysis. In the case of multiple setups, the positions of a number of fixed reference sensors also have to be decided. The wireless localization system then obtains the exact coordinates of all these sensor positions. When the selection of optimal positions is difficult, for example because of the lack of a proper FE model, this can be compensated for by using a larger number of measuring (and reference) points.
The described low-cost acquisition system allows the responsible bridge administration to obtain historical dynamic identification records at reasonable cost for use in future maintenance programs. Given the importance of a baseline monitoring record for a new bridge, a monitoring test just after construction is therefore highly recommended, if not compulsory.
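The simplest form of the modal identification step that the abstract aims to automate is peak-picking a natural frequency from the spectrum of ambient vibration data. The sketch below is ours, with a synthetic signal and invented frequencies; real pipelines use more robust methods such as stochastic subspace identification.

```python
import numpy as np

# Synthetic ambient acceleration record with two modes.
fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)                 # 60 s of data
accel = np.sin(2 * np.pi * 2.5 * t)          # dominant mode at 2.5 Hz
accel += 0.1 * np.sin(2 * np.pi * 7.3 * t)   # weaker second mode

# Peak-picking: the natural frequency is the largest spectral peak.
spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(accel.size, 1 / fs)
dominant = freqs[np.argmax(spectrum)]
```

With a 60 s record the frequency resolution is 1/60 Hz, so the picked peak lands on the 2.5 Hz mode; longer records sharpen the estimate.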
Abstract:
Services in smart environments aim to improve the quality of people's lives. Among the most important issues when developing such environments are testing and validating the services, tasks that usually imply high costs and annoying or unfeasible real-world testing. In such cases, artificial societies may be used to simulate the smart environment (i.e. the physical environment, equipment and humans). To this end, the CHROMUBE methodology guides test engineers in modeling human beings; such models reproduce behaviors that are highly similar to the real ones. Originally, these models are based on automata whose transitions are governed by random variables, and the automaton's structure and the probability distribution function of each random variable are determined by a manual trial-and-error process. In this paper we present an extension of this methodology that avoids this manual process by learning human behavior patterns automatically from sensor data using machine learning techniques. The presented approach has been tested on a real scenario, where this extension has produced highly accurate human behavior models.
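One basic way to learn the transition structure of such an automaton from sensor data is a maximum-likelihood estimate of transition probabilities from an observed sequence of states. This sketch (our own, not the CHROMUBE algorithm) counts consecutive state pairs; the location names are invented.

```python
from collections import Counter, defaultdict

def estimate_transitions(events):
    """Maximum-likelihood transition probabilities from a state sequence:
    P(b | a) = count(a -> b) / count(a -> anything)."""
    counts = defaultdict(Counter)
    for a, b in zip(events, events[1:]):
        counts[a][b] += 1
    return {state: {nxt: n / sum(c.values()) for nxt, n in c.items()}
            for state, c in counts.items()}

# Hypothetical sequence of room-level sensor events for one inhabitant.
seq = ["sleep", "kitchen", "kitchen", "sleep", "kitchen", "livingroom"]
p = estimate_transitions(seq)
```

The same counting generalizes to timed transitions by additionally fitting a duration distribution per state.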
Abstract:
Although context could be exploited to improve performance, elasticity and adaptation in most distributed systems that adopt the publish/subscribe (P/S) communication model, only a few researchers have focused on context-aware matching in P/S systems and explored its implications in domains with highly dynamic context, such as wireless sensor networks (WSNs) and IoT-enabled applications. Most adopted P/S models are context agnostic or do not differentiate context from other application data. In this article, we present a novel context-aware P/S model. SilboPS manages context explicitly, focusing on minimizing network overhead in domains with recurrent context changes, as found, for example, in mobile ad hoc networks (MANETs). Our approach helps to efficiently share and use sensor data coming from ubiquitous WSNs across the plethora of applications intent on using these data to build context awareness. Specifically, we empirically demonstrate that decoupling a subscription from the changing context in which it is produced, and leveraging contextual scoping in the filtering process, notably reduces the (un)subscription cost per node while improving the global performance/throughput of the network of brokers without incurring the cost of SIENA-like topology changes.
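The key idea, decoupling a subscription's content filter from its mutable context scope, can be sketched as follows. This is an illustrative toy, not the SilboPS API: a context change mutates the scope locally instead of forcing an unsubscribe/resubscribe cycle.

```python
# Illustrative sketch (not the SilboPS API): the content filter and the
# context scope are held separately, so context churn is a cheap update.

class Subscription:
    def __init__(self, content_filter, context_scope):
        self.content_filter = content_filter   # predicate on event payload
        self.context_scope = context_scope     # dict of required context values

    def update_context(self, **scope):
        """Context changes mutate the scope only; no re-subscription needed."""
        self.context_scope.update(scope)

    def matches(self, event, context):
        """An event matches only if it passes the filter AND the event's
        context falls within this subscription's scope."""
        in_scope = all(context.get(k) == v for k, v in self.context_scope.items())
        return in_scope and self.content_filter(event)

sub = Subscription(lambda e: e["temp"] > 30, {"room": "lab"})
sub.matches({"temp": 35}, {"room": "lab"})   # matches: in scope, filter passes
sub.update_context(room="office")            # cheap local change of scope
```

In a broker network the scope check would run before the (more expensive) content match, which is where the filtering savings come from.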
Abstract:
With the development of embedded applications and driving-assistance systems, it becomes relevant to develop parallel mechanisms to check and diagnose these new systems. In this thesis we focus on one such parallel mechanism, analytical redundancy, for fault diagnosis of an automotive suspension system. We consider a quarter-car passive suspension model and use an ARX-model parameter-estimation method to detect faults occurring in the damper and spring of the system. We then deploy a neural network classifier to isolate the faults and identify where each fault is happening, so that safety measures and redundancies can take effect to prevent failure of the system. It is shown that the ARX estimator can quickly detect faults online using vertical acceleration and displacement sensor data, both of which come from sensors common in today's vehicles. The clear divergence in the ARX response makes it easy to set a threshold that alarms the vehicle's intelligent system, and the neural classifier can quickly indicate the location of the fault.
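The residual-based detection step can be sketched with a first-order ARX model fitted by least squares: y[k] = a·y[k-1] + b·u[k-1]. A fault is flagged when the one-step prediction error exceeds a threshold. The model order, parameter values and threshold below are illustrative choices, not those of the thesis.

```python
import numpy as np

def fit_arx(y, u):
    """Least-squares fit of y[k] = a*y[k-1] + b*u[k-1]."""
    phi = np.column_stack([y[:-1], u[:-1]])        # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta                                    # [a, b]

def detect_fault(y, u, theta, threshold):
    """Flag samples whose one-step prediction error exceeds the threshold."""
    y_hat = theta[0] * y[:-1] + theta[1] * u[:-1]
    return np.abs(y[1:] - y_hat) > threshold

# Healthy system simulated as y[k] = 0.8*y[k-1] + 0.5*u[k-1] (invented values).
rng = np.random.default_rng(0)
u = rng.standard_normal(200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]

theta = fit_arx(y, u)
flags = detect_fault(y, u, theta, threshold=0.1)   # all False while healthy
```

A damper or spring fault changes the underlying parameters, so the prediction error diverges and the flags switch on, which is the divergence the abstract refers to.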
Abstract:
The quality of water level time series strongly varies, with periods of high- and low-quality sensor data. In this paper we present the processing steps used to generate high-quality water level data from water pressure measured at the Time Series Station (TSS) Spiekeroog. The TSS is positioned in a tidal inlet between the islands of Spiekeroog and Langeoog in the East Frisian Wadden Sea (southern North Sea). The processing steps cover sensor drift, outlier identification, interpolation of data gaps and quality control. A central step is the removal of outliers; for this, an absolute threshold of 0.25 m/10 min was selected, which still preserves the water level increase and decrease during extreme events, as shown during the quality control process. A second important feature of the processing is the interpolation of data gaps, which is accomplished with high certainty of generating trustworthy data. Applying these methods, a ten-year dataset (December 2002-December 2012) of water level information at the TSS was processed, resulting in a seven-year time series (2005-2011).
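Two of the steps above, rate-of-change outlier flagging on a 10-minute grid (using the paper's 0.25 m/10 min threshold) and linear interpolation of short gaps, can be sketched as follows. The implementation details and sample values are ours.

```python
import numpy as np

def flag_outliers(level, max_step=0.25):
    """Mark samples whose jump from the previous sample exceeds max_step
    (in metres per 10-minute step)."""
    step = np.abs(np.diff(level, prepend=level[0]))
    return step > max_step

def fill_gaps(level):
    """Linearly interpolate NaN gaps between valid samples."""
    idx = np.arange(level.size)
    ok = ~np.isnan(level)
    return np.interp(idx, idx[ok], level[ok])

# Invented 10-minute water levels (m) with one spike and one missing sample.
level = np.array([1.00, 1.05, 2.00, 1.10, np.nan, 1.20])
bad = flag_outliers(level)       # the 2.00 m spike exceeds 0.25 m/10 min
level[bad] = np.nan              # treat flagged samples as gaps
clean = fill_gaps(level)
```

Note that a simple step threshold also flags the sample after a spike; production processing would refine this, e.g. by comparing against an interpolated estimate instead of the raw neighbour.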
Abstract:
Pressing scientific questions concerning the Greenland ice sheet's climatic sensitivity, hydrology, and contributions to current and future sea level rise require hydrological datasets to resolve. While direct observations of ice sheet meltwater losses can be obtained from terrestrial rivers draining the ice sheet and from lake levels, few such datasets exist. We present a new dataset of meltwater river discharge for the vicinity of Kangerlussuaq, Southwest Greenland. The dataset contains measurements of river stage and discharge for three sites along the northern tributary of the Akuliarusiarsuup Kuua (Watson) River, with 30-minute temporal resolution between June 2008 and August 2010. Additional data on water temperature, air pressure, and lake water depth and temperature are also provided. Discharge data were measured at sites with near-ideal properties for such data collection. Nevertheless, high bedload and turbulent flow introduce considerable uncertainty; these uncertainties were constrained and quantified using statistical techniques, thereby providing a high-quality dataset from this important site. The greatest uncertainties are associated with streambed elevation change and its measurement: large portions of the stream channels deepened according to statistical tests, but the poor precision of streambed depth measurements also added uncertainty. Quality-checked data are freely available for scientific use as supplementary online material.
Abstract:
An approach and strategy for automatic detection of buildings from aerial images, using combined image analysis and interpretation techniques, is described in this paper. It proceeds in several steps. A dense DSM is obtained by stereo image matching; the results of multi-band classification, the DSM and the Normalized Difference Vegetation Index (NDVI) are then used to reveal preliminary areas of interest for buildings. From these areas, a shape modeling algorithm precisely delineates building boundaries. The Dempster-Shafer data fusion technique is then applied to detect buildings from the combination of the three data sources by a statistically based classification. A number of test areas, which include buildings of different sizes, shapes and roof colors, have been investigated. The tests are encouraging and demonstrate that all processes in this system are important for effective building detection.
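The fusion step named above is Dempster's rule of combination, which merges mass functions from independent evidence sources while renormalizing away conflicting mass. The frame of discernment {building, not_building} and the mass values below are illustrative, not taken from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions given as
    dicts mapping frozenset hypotheses to mass values."""
    raw, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass assigned to disjoint hypotheses
    return {s: m / (1.0 - conflict) for s, m in raw.items()}

B, N = frozenset({"building"}), frozenset({"not_building"})
theta = B | N                            # the full frame (ignorance)
m_dsm  = {B: 0.6, theta: 0.4}            # hypothetical evidence from the DSM
m_ndvi = {B: 0.5, N: 0.2, theta: 0.3}    # hypothetical evidence from NDVI
fused = combine(m_dsm, m_ndvi)
```

Fusing a third source (the multi-band classification) is just another call to `combine`, since the rule is associative.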
Abstract:
Recovering position from sensor information is an important problem in mobile robotics, known as localisation. Localisation requires a map or some other description of the environment to provide the robot with a context in which to interpret sensor data. The mobile robot system under discussion uses an artificial neural representation of position. Building a geometrical map of the environment with a single camera and artificial neural networks is difficult; instead, it is simpler to learn position as a function of the visual input. Usually when learning images, an intermediate representation is employed. An appropriate starting point for a biologically plausible image representation is the complex cells of the visual cortex, which have invariance properties that appear useful for localisation. The effectiveness for localisation of two different complex cell models is evaluated. Finally, the ability of a simple neural network with single-shot learning to recognise these representations and localise a robot is examined.
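A standard minimal model of a complex cell is the "energy model": the squared responses of a quadrature pair of Gabor filters are summed, giving a response that is largely invariant to the phase (small shifts) of a matching grating. This sketch and its filter parameters are ours; the abstract does not specify which complex cell models were evaluated.

```python
import numpy as np

def gabor_pair(size, freq):
    """Quadrature pair of 1-D Gabor filters (even and odd phase)."""
    x = np.arange(size) - size // 2
    env = np.exp(-x**2 / (2 * (size / 6) ** 2))   # Gaussian envelope
    return (env * np.cos(2 * np.pi * freq * x),
            env * np.sin(2 * np.pi * freq * x))

def complex_cell_response(signal, freq):
    """Energy model: sum of squared quadrature filter responses."""
    even, odd = gabor_pair(signal.size, freq)
    return np.dot(signal, even) ** 2 + np.dot(signal, odd) ** 2

# A grating and a phase-shifted copy elicit nearly the same response.
x = np.arange(64)
r0 = complex_cell_response(np.sin(2 * np.pi * 0.1 * x), 0.1)
r1 = complex_cell_response(np.sin(2 * np.pi * 0.1 * x + 1.0), 0.1)
```

This phase invariance is exactly the kind of shift tolerance that makes such representations attractive as localisation features.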
Abstract:
Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the un-rotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
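The rotation-based significance test can be sketched as follows: compute the mutual information (MI) between the anatomical map and the activity estimate, then compare it against a null distribution of MI values obtained from rotated anatomy. For simplicity this toy uses synthetic 2-D maps and 90-degree rotations (`np.rot90`); the study used real grey-matter sections and finer rotations.

```python
import numpy as np

def mutual_info(a, b, bins=8):
    """Histogram-based mutual information between two equally shaped maps."""
    pxy, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                      # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(1)
anatomy = rng.standard_normal((32, 32))
activity = anatomy + 0.1 * rng.standard_normal((32, 32))  # activity tracks anatomy

observed = mutual_info(anatomy, activity)
null = [mutual_info(np.rot90(anatomy, k), activity) for k in (1, 2, 3)]
significant = observed > max(null)
```

With only three rotations the "p-value" here is coarse; the real test builds a full distribution of rotated-MI values and reads significance from its tail.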
Abstract:
Distributive tactile sensing is a method of tactile sensing in which a small number of sensors monitors the behaviour of a flexible substrate which is in contact with the object being sensed. This paper describes the first use of fibre Bragg grating sensors in such a system. Two systems are presented: the first is a one-dimensional metal strip with an array of four sensors, which is capable of detecting the magnitude and position of a contacting load. This system is favourably compared experimentally with a similar system using resistive strain gauges. The second system is a two-dimensional steel plate with nine sensors which is able to distinguish the position and shape of a contacting load, or the positions of two loads simultaneously. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact. Issues and limitations of the systems are discussed, along with proposed solutions to some of the difficulties.
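The core idea, a handful of sensors whose joint response encodes the load position, can be shown with a toy strip model. A linear least-squares readout stands in here for the paper's neural networks, and the sensor response model is invented.

```python
import numpy as np

SENSOR_POS = np.array([0.2, 0.4, 0.6, 0.8])   # four sensors along a unit strip

def readings(load_pos):
    """Toy deflection model: response falls off with distance to the load."""
    return 1.0 - np.abs(load_pos - SENSOR_POS)

# Train a linear readout on loads placed along the strip.
train_pos = np.linspace(0.2, 0.8, 25)
X = np.array([readings(p) for p in train_pos])
X = np.column_stack([np.ones(len(X)), X])      # add bias column
theta, *_ = np.linalg.lstsq(X, train_pos, rcond=None)

def predict(load_pos):
    """Recover load position from the four sensor readings."""
    return float(np.concatenate(([1.0], readings(load_pos))) @ theta)
```

The paper's networks additionally recover load magnitude and shape; the same pattern applies, with extra outputs trained against those targets.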
Abstract:
This thesis presents an investigation into the application of methods of uncertain reasoning to the biological classification of river water quality. Existing biological methods for reporting river water quality are critically evaluated, and the adoption of a discrete biological classification scheme advocated. Reasoning methods for managing uncertainty are explained, in which the Bayesian and Dempster-Shafer calculi are cited as primary numerical schemes. Elicitation of qualitative knowledge on benthic invertebrates is described. The specificity of benthic response to changes in water quality leads to the adoption of a sensor model of data interpretation, in which a reference set of taxa provide probabilistic support for the biological classes. The significance of sensor states, including that of absence, is shown. Novel techniques of directly eliciting the required uncertainty measures are presented. Bayesian and Dempster-Shafer calculi were used to combine the evidence provided by the sensors. The performance of these automatic classifiers was compared with the expert's own discrete classification of sampled sites. Variations of sensor data weighting, combination order and belief representation were examined for their effect on classification performance. The behaviour of the calculi under evidential conflict and alternative combination rules was investigated. Small variations in evidential weight and the inclusion of evidence from sensors absent from a sample improved classification performance of Bayesian belief and support for singleton hypotheses. For simple support, inclusion of absent evidence decreased classification rate. The performance of Dempster-Shafer classification using consonant belief functions was comparable to Bayesian and singleton belief. 
Recommendations are made for further work in biological classification using uncertain reasoning methods, including the combination of multiple-expert opinion, the use of Bayesian networks, and the integration of classification software within a decision support system for water quality assessment.
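The Bayesian sensor-model combination described above can be sketched as a sequential Bayes update over discrete quality classes, where each taxon is a "sensor" whose presence or absence contributes a likelihood, so that absence also counts as evidence. The classes, taxa and likelihood values below are invented for illustration.

```python
import numpy as np

CLASSES = ["good", "moderate", "poor"]

# P(taxon present | class) -- illustrative numbers, not elicited values.
P_PRESENT = {
    "mayfly":    np.array([0.9, 0.5, 0.1]),   # pollution-sensitive taxon
    "bloodworm": np.array([0.1, 0.4, 0.9]),   # pollution-tolerant taxon
}

def classify(observations, prior=None):
    """Combine presence/absence evidence for each taxon by Bayes' rule."""
    post = np.full(len(CLASSES), 1 / 3) if prior is None else np.array(prior)
    for taxon, present in observations.items():
        lik = P_PRESENT[taxon] if present else 1.0 - P_PRESENT[taxon]
        post = post * lik
        post /= post.sum()                    # renormalize after each sensor
    return dict(zip(CLASSES, post))

# Absence of the tolerant taxon is itself evidence for good quality.
result = classify({"mayfly": True, "bloodworm": False})
```

A Dempster-Shafer variant would replace the per-class likelihoods with mass functions and the product update with Dempster's rule, which is the comparison the thesis carries out.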
Abstract:
Two distributive tactile sensing systems are presented, based on fibre Bragg grating sensors. The first is a one-dimensional metal strip with an array of four sensors, which is capable of detecting the magnitude and position of a contacting load. This system is compared experimentally with a similar system using resistive strain gauges. The second is a two-dimensional steel plate with nine sensors which is able to distinguish the position and shape of a contacting load. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact.