13 results for Sensor data
in Aston University Research Archive
Abstract:
For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, “wearable,” sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that “learn” from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
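A minimal sketch of the interpolation step this abstract describes, assuming inverse-distance weighting (the abstract does not name a specific method); the function, station layout and the weighted-variance "uncertainty" are all illustrative, not the paper's implementation:

```python
import math

def idw_interpolate(stations, query, power=2.0):
    """Inverse-distance-weighted estimate of a value at an uninstrumented
    query point, from (x, y, value) station tuples. Also returns a weighted
    variance of station values as a crude stand-in for the interpolation
    uncertainty a standard like UncertML would encode."""
    weights, values = [], []
    for x, y, v in stations:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0:
            return v, 0.0  # query coincides with a station
        weights.append(1.0 / d ** power)
        values.append(v)
    total = sum(weights)
    estimate = sum(w * v for w, v in zip(weights, values)) / total
    variance = sum(w * (v - estimate) ** 2
                   for w, v in zip(weights, values)) / total
    return estimate, variance

# Three hypothetical temperature stations around a query point
stations = [(0.0, 0.0, 10.0), (2.0, 0.0, 14.0), (0.0, 2.0, 12.0)]
temp, var = idw_interpolate(stations, (1.0, 1.0))
```

With equidistant stations the weights are equal, so the estimate reduces to the plain mean of the station values.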
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
Abstract:
Magnetoencephalography (MEG) is a non-invasive brain imaging technique with the potential for very high temporal and spatial resolution of neuronal activity. The main stumbling block for the technique has been that the estimation of a neuronal current distribution, based on sensor data outside the head, is an inverse problem with an infinity of possible solutions. Many inversion techniques exist, all using different a-priori assumptions in order to reduce the number of possible solutions. Although all techniques can be thoroughly tested in simulation, implicit in the simulations are the experimenter's own assumptions about realistic brain function. To date, the only way to test the validity of inversions based on real MEG data has been through direct surgical validation, or through comparison with invasive primate data. In this work, we constructed a null hypothesis that the reconstruction of neuronal activity contains no information on the distribution of the cortical grey matter. To test this, we repeatedly compared rotated sections of grey matter with a beamformer estimate of neuronal activity to generate a distribution of mutual information values. The significance of the comparison between the un-rotated anatomical information and the electrical estimate was subsequently assessed against this distribution. We found that there was significant (P < 0.05) anatomical information contained in the beamformer images across a number of frequency bands. Based on the limited data presented here, we can say that the assumptions behind the beamformer algorithm are not unreasonable for the visual-motor task investigated.
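The significance test described above amounts to a permutation-style comparison: the rotated grey-matter sections yield a null distribution of mutual-information values, and the unrotated comparison is assessed against it. A generic sketch of that final step, with a toy null distribution standing in for the real mutual-information values:

```python
import random

def permutation_p_value(observed, null_values):
    """One-sided p-value: the fraction of null statistics at least as
    large as the observed one, with the +1 correction that counts the
    observed value itself as a member of the null distribution."""
    exceed = sum(1 for v in null_values if v >= observed)
    return (exceed + 1) / (len(null_values) + 1)

# Toy null distribution of mutual-information values from rotated
# anatomy, plus a clearly larger unrotated (observed) value.
random.seed(0)
null = [random.gauss(0.10, 0.02) for _ in range(999)]
p = permutation_p_value(0.25, null)
```

An observed value well inside the null distribution would instead yield a large p-value, i.e. no evidence of anatomical information in the reconstruction.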
Abstract:
Distributive tactile sensing is a method of tactile sensing in which a small number of sensors monitors the behaviour of a flexible substrate which is in contact with the object being sensed. This paper describes the first use of fibre Bragg grating sensors in such a system. Two systems are presented: the first is a one-dimensional metal strip with an array of four sensors, which is capable of detecting the magnitude and position of a contacting load. This system is favourably compared experimentally with a similar system using resistive strain gauges. The second system is a two-dimensional steel plate with nine sensors which is able to distinguish the position and shape of a contacting load, or the positions of two loads simultaneously. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact. Issues and limitations of the systems are discussed, along with proposed solutions to some of the difficulties.
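The core idea — a few sensors on a flexible substrate, with a learned mapping from their joint readings to the contact — can be sketched with a toy strip model. A nearest-neighbour lookup stands in for the paper's neural network here, and the sensor layout, decay model and constants are all invented for illustration:

```python
import math

def simulate_strip(position, sensors=(0.1, 0.37, 0.63, 0.9)):
    """Synthetic response of a 4-sensor flexible strip to a unit load:
    each sensor's reading decays with distance from the load (a toy
    model, not the fibre Bragg grating physics)."""
    return [math.exp(-abs(position - s) / 0.2) for s in sensors]

def estimate_position(reading, training):
    """Nearest-neighbour stand-in for the paper's neural network:
    return the training position whose sensor pattern is closest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda pv: dist(pv[1], reading))[0]

# Calibration patterns for loads at 21 known positions along the strip
training = [(p / 20, simulate_strip(p / 20)) for p in range(21)]
est = estimate_position(simulate_strip(0.42), training)
```

The point the sketch illustrates is that four readings jointly determine the load position even though no single sensor does.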
Abstract:
This thesis presents an investigation into the application of methods of uncertain reasoning to the biological classification of river water quality. Existing biological methods for reporting river water quality are critically evaluated, and the adoption of a discrete biological classification scheme advocated. Reasoning methods for managing uncertainty are explained, in which the Bayesian and Dempster-Shafer calculi are cited as primary numerical schemes. Elicitation of qualitative knowledge on benthic invertebrates is described. The specificity of benthic response to changes in water quality leads to the adoption of a sensor model of data interpretation, in which a reference set of taxa provide probabilistic support for the biological classes. The significance of sensor states, including that of absence, is shown. Novel techniques of directly eliciting the required uncertainty measures are presented. Bayesian and Dempster-Shafer calculi were used to combine the evidence provided by the sensors. The performance of these automatic classifiers was compared with the expert's own discrete classification of sampled sites. Variations of sensor data weighting, combination order and belief representation were examined for their effect on classification performance. The behaviour of the calculi under evidential conflict and alternative combination rules was investigated. Small variations in evidential weight and the inclusion of evidence from sensors absent from a sample improved classification performance of Bayesian belief and support for singleton hypotheses. For simple support, inclusion of absent evidence decreased classification rate. The performance of Dempster-Shafer classification using consonant belief functions was comparable to Bayesian and singleton belief. 
Recommendations are made for further work in biological classification using uncertain reasoning methods, including the combination of multiple-expert opinion, the use of Bayesian networks, and the integration of classification software within a decision support system for water quality assessment.
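The evidence-combination step at the heart of this thesis can be illustrated with Dempster's rule, which the work applies alongside Bayesian updating. The water-quality frame, mass values and taxon interpretations below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over
    subsets of a frame of discernment (subsets as frozensets).
    Mass assigned to conflicting (empty-intersection) pairs is
    renormalised away."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
BOTH = GOOD | POOR  # ignorance: mass on the whole frame
# Two taxon "sensors" lending support to water-quality classes
m1 = {GOOD: 0.6, BOTH: 0.4}   # taxon present: supports "good"
m2 = {POOR: 0.3, BOTH: 0.7}   # taxon absent: weakly supports "poor"
m = dempster_combine(m1, m2)
```

Note how mass on `BOTH` models ignorance rather than forcing a 50/50 split, which is the main representational difference from the Bayesian calculus the thesis compares against.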
Abstract:
Two distributive tactile sensing systems are presented, based on fibre Bragg grating sensors. The first is a one-dimensional metal strip with an array of 4 sensors, which is capable of detecting the magnitude and position of a contacting load. This system is compared experimentally with a similar system using resistive strain gauges. The second is a two-dimensional steel plate with 9 sensors which is able to distinguish the position and shape of a contacting load. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact.
Abstract:
Distributive tactile sensing is a method of tactile sensing in which a small number of sensors monitors the behaviour of a flexible substrate which is in contact with the object being sensed. This paper describes the first use of fibre Bragg grating sensors in such a system. Two systems are presented: the first is a one-dimensional metal strip with an array of four sensors, which is capable of detecting the magnitude and position of a contacting load. This system is favourably compared experimentally with a similar system using resistive strain gauges. The second system is a two-dimensional steel plate with nine sensors which is able to distinguish the position and shape of a contacting load, or the positions of two loads simultaneously. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact. Issues and limitations of the systems are discussed, along with proposed solutions to some of the difficulties. © 2007 IOP Publishing Ltd.
Abstract:
Two distributive tactile sensing systems are presented, based on fibre Bragg grating sensors. The first is a one-dimensional metal strip with an array of 4 sensors, which is capable of detecting the magnitude and position of a contacting load. This system is compared experimentally with a similar system using resistive strain gauges. The second is a two-dimensional steel plate with 9 sensors which is able to distinguish the position and shape of a contacting load. This system is compared with a similar system using 16 infrared displacement sensors. Each system uses neural networks to process the sensor data to give information concerning the type of contact.
Abstract:
Background: The Unified Huntington's Disease Rating Scale (UHDRS) is the principal means of assessing motor impairment in Huntington disease but is subjective and generally limited to in-clinic assessments. Objective: To evaluate the feasibility and ability of wearable sensors to measure motor impairment in individuals with Huntington disease in the clinic and at home. Methods: Participants with Huntington disease and controls were asked to wear five accelerometer-based sensors attached to the chest and each limb for standardized, in-clinic assessments and for one day at home. A second chest sensor was worn for six additional days at home. Gait measures were compared between controls, participants with Huntington disease, and participants with Huntington disease grouped by UHDRS total motor score using Cohen's d values. Results: Fifteen individuals with Huntington disease and five controls completed the study. Sensor data were successfully captured from 18 of the 20 participants at home. In the clinic, the standard deviation of step time (time between consecutive steps) was increased in Huntington disease (p<0.0001; Cohen's d=2.61) compared to controls. At home with additional observations, significant differences were observed in seven additional gait measures. The gait of individuals with higher total motor scores (50 or more) differed significantly from those with lower total motor scores (below 50) on multiple measures at home. Conclusions: In this pilot study, the use of wearable sensors in clinic and at home was feasible and demonstrated gait differences between controls, participants with Huntington disease, and participants with Huntington disease grouped by motor impairment.
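The effect size used throughout this study is Cohen's d with a pooled standard deviation. A short sketch with invented step-time values (the numbers below are illustrative, not the study's data):

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: the difference between two group means, standardised
    by the pooled (sample) standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled

# Hypothetical step-time standard deviations (s) per participant
hd = [0.09, 0.11, 0.10, 0.12, 0.10]    # Huntington disease group
ctrl = [0.04, 0.05, 0.04, 0.05, 0.05]  # control group
d = cohens_d(hd, ctrl)
```

By the usual convention, d above roughly 0.8 is a large effect, so values like the study's 2.61 indicate strong group separation.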
Abstract:
The main objective of the project is to enhance the already effective health and usage monitoring system (HUMS) for helicopters by analysing structural vibrations to recognise different flight conditions directly from sensor information. The goal of this paper is to develop a new method to select those sensors and frequency bands that are best for detecting changes in flight conditions. We projected frequency information to a 2-dimensional space in order to visualise flight-condition transitions using the Generative Topographic Mapping (GTM) and a variant which supports simultaneous feature selection. We created an objective measure of the separation between different flight conditions in the visualisation space by calculating the Kullback-Leibler (KL) divergence between Gaussian mixture models (GMMs) fitted to each class: the higher the KL-divergence, the better the interclass separation. To find the optimal combination of sensors, they were considered in pairs, triples and groups of four sensors. The sensor triples provided the best result in terms of KL-divergence. We also found that the use of a variational training algorithm for the GMMs gave more reliable results.
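The KL divergence between two Gaussian mixtures has no closed form, so in practice it is often estimated by Monte Carlo sampling (the abstract does not say which estimator the authors used; the 1-D mixtures below are illustrative):

```python
import math
import random

def gmm_pdf(x, weights, means, stds):
    """Density of a 1-D Gaussian mixture at x."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2)
               / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, stds))

def gmm_sample(weights, means, stds):
    """Draw one sample: pick a component, then sample its Gaussian."""
    i = random.choices(range(len(weights)), weights=weights)[0]
    return random.gauss(means[i], stds[i])

def mc_kl(p, q, n=20000):
    """Monte Carlo estimate of KL(p || q): average log density ratio
    over samples drawn from p. p, q are (weights, means, stds) triples."""
    total = 0.0
    for _ in range(n):
        x = gmm_sample(*p)
        total += math.log(gmm_pdf(x, *p) / gmm_pdf(x, *q))
    return total / n

random.seed(1)
p = ([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5])  # one flight condition
q = ([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5])  # identical condition
r = ([0.5, 0.5], [-3.0, 3.0], [0.5, 0.5])  # well-separated condition
kl_same = mc_kl(p, q)
kl_diff = mc_kl(p, r)
```

As the abstract's separation measure requires, identical class models give (near-)zero divergence while well-separated ones give a large value.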
Abstract:
Energy consumption has been a key concern of data gathering in wireless sensor networks. Previous research shows that modulation scaling is an efficient technique for reducing energy consumption. However, such a technique also affects both packet delivery latency and packet loss, and may therefore have adverse effects on the quality of applications. In this paper, we study the problem of modulation scaling and energy optimization. A mathematical model is proposed to analyze the impact of modulation scaling on the overall energy consumption, end-to-end mean delivery latency and mean packet loss rate. A centralized optimal management mechanism is developed based on the model, which adaptively adjusts the modulation levels to minimize energy consumption while ensuring the QoS for data gathering. Experimental results show that the management mechanism saves significant energy in all the investigated scenarios. Some valuable results are also observed in the experiments. © 2004 IEEE.
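The trade-off this abstract describes can be sketched with a toy model: higher-order modulation sends more bits per symbol, so transmissions finish sooner (lower latency, less circuit energy per bit) but require more radiated energy per bit. All constants and the selection rule below are illustrative, not the paper's model:

```python
def energy_per_bit(b, c_rf=1.0e-9, c_elec=50.0e-9):
    """Toy energy-per-bit model: RF energy grows roughly like
    (2**b - 1) with constellation size, while circuit energy per bit
    falls as 1/b because the transmission is shorter."""
    return c_rf * (2 ** b - 1) / b + c_elec / b

def packet_latency(b, bits=1024, symbol_rate=250_000):
    """Time to send one packet: bits/b symbols at a fixed symbol rate."""
    return (bits / b) / symbol_rate

def best_level(levels=range(1, 7), max_latency=2.0e-3, bits=1024):
    """Lowest-energy modulation level whose latency meets the bound,
    mimicking the kind of adjustment an adaptive manager would make."""
    feasible = [b for b in levels if packet_latency(b, bits) <= max_latency]
    return min(feasible, key=energy_per_bit)

b = best_level()
```

Relaxing the latency bound changes which level wins, which is exactly why an energy-optimal setting must be chosen jointly with the QoS constraints.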
Abstract:
In wireless sensor networks where nodes are powered by batteries, it is critical to prolong the network lifetime by minimizing the energy consumption of each node. In this paper, the cooperative multiple-input-multiple-output (MIMO) and data-aggregation techniques are jointly adopted to reduce the energy consumption per bit in wireless sensor networks by reducing the amount of data for transmission and better using network resources through cooperative communication. For this purpose, we derive a new energy model that considers the correlation between data generated by nodes and the distance between them for a cluster-based sensor network by employing the combined techniques. Using this model, the effect of the cluster size on the average energy consumption per node can be analyzed. It is shown that the energy efficiency of the network can significantly be enhanced in cooperative MIMO systems with data aggregation, compared with either cooperative MIMO systems without data aggregation or data-aggregation systems without cooperative MIMO, if sensor nodes are properly clustered. Both centralized and distributed data-aggregation schemes for the cooperating nodes to exchange and compress their data are also proposed and appraised, which lead to diverse impacts of data correlation on the energy performance of the integrated cooperative MIMO and data-aggregation systems.
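The cluster-size effect this abstract analyzes can be illustrated with a toy per-node energy model: aggregation shrinks correlated data before the shared long-haul transmission, but larger clusters cost more intra-cluster traffic. The cost terms, the correlation model and all constants below are invented for illustration and are not the paper's derivation:

```python
def cluster_energy_per_node(n, rho=0.8, bits=1000,
                            e_local=50e-9, e_long=200e-9):
    """Toy per-node energy for a cluster of n nodes with data
    correlation rho. Intra-cluster cost grows with cluster span
    (~sqrt(n)); the head aggregates the readings down to
    bits * (1 + (n - 1) * (1 - rho)) bits, and the long-haul
    cooperative-MIMO cost of sending them is shared by the cluster."""
    local = bits * e_local * n ** 0.5
    aggregated = bits * (1 + (n - 1) * (1 - rho))
    long_haul = aggregated * e_long / n
    return local + long_haul

# A moderate cluster size minimises energy: too small wastes long-haul
# energy on redundant data, too large inflates intra-cluster traffic.
best_n = min(range(1, 21), key=cluster_energy_per_node)
```

Even this crude model reproduces the paper's qualitative finding that energy per node is minimised at an intermediate cluster size when data are strongly correlated.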