993 results for sensor uncertainty


Relevance: 30.00%

Abstract:

In many CCTV and sensor-network-based intelligent surveillance systems, a number of attributes or criteria are used to individually evaluate the degree of potential threat posed by a suspect. The outcomes for these attributes generally come from analytical algorithms operating on data that are pervaded with uncertainty and incompleteness. As a result, the individual threat evaluations are often inconsistent, and they can change as time elapses. Integrating heterogeneous threat evaluations with temporal influence to obtain a better overall evaluation is therefore a challenging issue, and one that has rarely been considered by existing frameworks for event reasoning under uncertainty in sensor-network-based surveillance. In this paper, we first propose a weighted aggregation operator based on a set of principles that constrain the fusion of individual threat evaluations. Then, we propose a method to integrate the temporal influence on threat evaluation changes. Finally, we demonstrate the usefulness of our system with a decision-support event-modelling framework using an airport security surveillance scenario.
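The abstract does not spell out the aggregation principles, so as a rough illustration only, here is a minimal sketch of one way individual threat scores could be fused with a per-attribute reliability weight and an exponential temporal discount. The function name, the half-life parameter and the weighting scheme are invented for this sketch and are not the paper's operator.

```python
def fuse_threat(evaluations, weights, timestamps, now, half_life=60.0):
    """Fuse individual threat scores (each in [0, 1]) into one score.

    Older evaluations are discounted exponentially (half_life in the
    same time unit as the timestamps); weights reflect the assumed
    reliability of each attribute's analytical algorithm.
    """
    decayed = [w * 0.5 ** ((now - t) / half_life)
               for w, t in zip(weights, timestamps)]
    total = sum(decayed)
    if total == 0:
        return 0.0
    return sum(d * e for d, e in zip(decayed, evaluations)) / total

# Three attribute evaluations observed at t = 0, 30 and 55 (seconds)
score = fuse_threat([0.9, 0.4, 0.7], [1.0, 0.5, 0.8],
                    [0.0, 30.0, 55.0], now=60.0)
```

Because the weights are renormalised after decay, the fused score stays in [0, 1] whenever the inputs do.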

Relevance: 30.00%

Abstract:

There has been much interest in the belief–desire–intention (BDI) agent-based model for developing scalable intelligent systems, e.g. using the AgentSpeak framework. However, reasoning from sensor information in these large-scale systems remains a significant challenge. For example, agents may be faced with information from heterogeneous sources which is uncertain and incomplete, while the sources themselves may be unreliable or conflicting. In order to derive meaningful conclusions, it is important that such information be correctly modelled and combined. In this paper, we choose to model uncertain sensor information in Dempster–Shafer (DS) theory. Unfortunately, as in other uncertainty theories, simple combination strategies in DS theory are often too restrictive (losing valuable information) or too permissive (resulting in ignorance). For this reason, we investigate how a context-dependent strategy originally defined for possibility theory can be adapted to DS theory. In particular, we use the notion of largely partially maximal consistent subsets (LPMCSes) to characterise the context for when to use Dempster’s original rule of combination and for when to resort to an alternative. To guide this process, we identify existing measures of similarity and conflict for finding LPMCSes along with quality of information heuristics to ensure that LPMCSes are formed around high-quality information. We then propose an intelligent sensor model for integrating this information into the AgentSpeak framework which is responsible for applying evidence propagation to construct compatible information, for performing context-dependent combination and for deriving beliefs for revising an agent’s belief base. Finally, we present a power grid scenario inspired by a real-world case study to demonstrate our work.
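Dempster's original rule of combination, which the context-dependent strategy above falls back on inside an LPMCS, can be sketched for two mass functions. The frame of discernment {fault, ok} and the mass values here are invented for illustration; they are not from the paper's power grid scenario.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset focal
    elements) with Dempster's rule: multiply masses of every pair of
    focal elements, assign the product to their intersection, and
    renormalise by the total non-conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset({"fault"}), frozenset({"ok"})
theta = A | B                      # the full frame (ignorance)
m1 = {A: 0.6, theta: 0.4}
m2 = {A: 0.7, B: 0.2, theta: 0.1}
fused = dempster_combine(m1, m2)
```

The renormalisation step is exactly what becomes problematic under high conflict, which is why the paper resorts to an alternative rule outside an LPMCS.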

Relevance: 30.00%

Abstract:

Wireless sensor networks (WSNs) are used in health monitoring, tracking and security applications. Such networks transfer data from specific areas to a nominated destination. In the network, each sensor node acts as a routing element for other sensor nodes during data transmission, which can increase its energy consumption. In this paper, we propose a routing protocol for improving network lifetime and performance. The proposed protocol uses type-2 fuzzy logic to minimize the effects of uncertainty produced by environmental noise. Simulation results show that the proposed protocol outperforms a recently developed routing protocol in extending network lifetime, saving energy and reducing data packet loss.
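The abstract does not give the protocol's membership functions, so as a hedged illustration of the interval type-2 idea alone: a membership value is bounded between a lower and an upper function, and the gap between them (the footprint of uncertainty) absorbs noise in the measured quantity. All parameters below are invented.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership value of x with feet a, d and shoulders b, c."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def it2_membership(x):
    """Interval type-2 membership for 'good residual energy' (x in [0, 1]).
    The lower and upper trapezoids bound the footprint of uncertainty
    caused by noisy energy readings; parameters are illustrative."""
    lower = trapezoid(x, 0.3, 0.6, 1.0, 1.2)
    upper = trapezoid(x, 0.2, 0.5, 1.0, 1.3)
    return lower, upper

lo, hi = it2_membership(0.4)   # a noisy mid-range energy reading
```

A routing protocol would combine such intervals over several criteria (energy, distance, link quality) and rank candidate next hops by a defuzzified value such as the interval midpoint.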

Relevance: 30.00%

Abstract:

We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data, yet it is often infeasible to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise of the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach in the efficient real-time measurement of stormwater systems.
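The control logic can be sketched in a few lines. The thresholds, interval lengths and interpolation rule below are illustrative assumptions, not the node's actual policy.

```python
def next_interval(rain_prob, flow_trend, base=3600, fast=300):
    """Choose the next sampling interval in seconds.

    rain_prob:  forecast probability of precipitation (0..1)
    flow_trend: local estimate of the flow derivative (positive = rising)

    Sample quickly when rain is likely or the hydrograph is rising;
    otherwise interpolate back toward the slow base rate.
    """
    if rain_prob > 0.5 or flow_trend > 0.0:
        return fast
    # dry and steady: scale the interval down as rain probability grows
    return int(base - (base - fast) * (rain_prob / 0.5))
```

The point of the design is that the expensive resource (a reagent-based measurement or a sample bottle) is spent preferentially where an abrupt signal change is likely.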

Relevance: 30.00%

Abstract:

Aerodynamic balances are employed in wind tunnels to estimate the forces and moments acting on the model under test. This paper proposes a methodology for the assessment of uncertainty in the calibration of an internal multi-component aerodynamic balance. In order to obtain a suitable model to provide aerodynamic loads from the balance sensor responses, a calibration is performed prior to the tests by applying known weights to the balance. A multivariate polynomial fitting by the least squares method is used to interpolate the calibration data points. The uncertainties of both the applied loads and the readings of the sensors are considered in the regression. The data reduction includes the estimation of the calibration coefficients, the predicted values of the load components and their corresponding uncertainties, as well as the goodness of fit.
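A much-simplified single-component version of such a calibration can be sketched with ordinary least squares. Note the simplification: the paper's multivariate regression also accounts for uncertainty in the applied loads themselves, which this OLS sketch omits, and the data values here are invented.

```python
import numpy as np

# Known calibration loads (N) and the corresponding sensor readings (mV)
loads = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
reads = np.array([0.02, 1.01, 2.05, 2.98, 4.03])

# Design matrix for a quadratic calibration polynomial in the reading
X = np.column_stack([np.ones_like(reads), reads, reads**2])
coef, res, *_ = np.linalg.lstsq(X, loads, rcond=None)

# Residual variance and coefficient covariance (ordinary least squares)
dof = len(loads) - X.shape[1]
s2 = float(res[0]) / dof if res.size else 0.0
cov = s2 * np.linalg.inv(X.T @ X)

# Predict the load and its standard uncertainty for a new reading
x = np.array([1.0, 1.5, 1.5**2])
load_hat = float(x @ coef)
u_load = float(np.sqrt(x @ cov @ x))
```

The covariance matrix of the fitted coefficients is what lets the predicted load carry an uncertainty, which is the essence of the data reduction the abstract describes.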

Relevance: 30.00%

Abstract:

Gas sensors are used widely in important areas including industrial control, environmental monitoring, counter-terrorism and chemical production. Micro-fabrication offers a promising way to achieve sensitive and inexpensive gas sensors, and over the years various MEMS gas sensors have been investigated and fabricated. One significant type of MEMS gas sensor is based on mass-change detection and integration with a specific polymer. This dissertation aims to contribute to the design and fabrication of MEMS resonant mass sensors with capacitive actuation and sensing that achieve improved sensitivity. To accomplish this goal, the research has several objectives: (1) define an effective measure for evaluating the sensitivity of resonant mass devices; (2) model the effects of air damping on microcantilevers and validate the models using a laser measurement system; (3) develop design guidelines for improving sensitivity in the presence of air damping; (4) characterize the degree of uncertainty in performance arising from fabrication variation for one or more process sequences, and establish design guidelines for improved robustness. Work has been completed toward these objectives. An evaluation measure has been developed and compared to an RMS-based measure. Analytic models of air damping for parallel plates that include holes have been compared with a COMSOL model and used to identify cantilever design parameters that maximize sensitivity. Additional designs have been modelled with COMSOL, and an analytical model for fixed-free cantilever geometries with holes has been developed. Two process flows have been implemented and compared. A number of cantilever designs have been fabricated, and the uncertainty in the process has been investigated; variability from processing has been evaluated and characterized.
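The principle behind resonant mass sensing can be illustrated with a lumped spring-mass model: adsorbed mass lowers the resonant frequency, and for small shifts the mass change follows from the first-order relation Δm ≈ -2mΔf/f₀. The numbers and the lumped model are illustrative, not the dissertation's cantilever designs.

```python
import math

def resonant_freq(k, m):
    """Resonant frequency (Hz) of a lumped spring-mass model:
    f0 = (1 / 2*pi) * sqrt(k / m)."""
    return math.sqrt(k / m) / (2.0 * math.pi)

def added_mass(k, m, delta_f):
    """Mass change inferred from a measured frequency shift delta_f,
    using the first-order relation delta_m ~ -2 m delta_f / f0
    (valid only for shifts small compared with f0)."""
    f0 = resonant_freq(k, m)
    return -2.0 * m * delta_f / f0

# Illustrative microcantilever-scale values: k = 10 N/m, m = 1 ng
dm = added_mass(10.0, 1e-9, -1.0)   # a 1 Hz downward shift
```

Air damping matters because it broadens the resonance peak, making small frequency shifts (and hence small Δm) harder to resolve, which is what the sensitivity measure above has to capture.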

Relevance: 30.00%

Abstract:

The analysis of interference modes has an increasingly broad application, especially in the field of optical biosensors. In this type of sensor, a displacement Δν of the interference modes of the transduction signal is observed when a particular biological agent is placed over the biosensor. To measure this displacement, the position of a maximum (or a minimum) of the signal must be detected before and after placing the agent over the sensor. A parameter of great importance for these sensors is the period Pν of the signal, which is inversely proportional to the optical thickness h0 of the sensor in the absence of the biological agent. Increasing this period improves the sensitivity of the sensor but worsens the detection of the extremum, so its effect on the measurement uncertainty is twofold: improved sensitivity set against a growing difficulty in locating the maximum or minimum. In this paper, the authors analyze the propagation of uncertainties in these sensors when least squares techniques are used to detect the maxima (or minima) of the signal, applying the techniques described in Supplement 2 of the ISO GUM Guide. The result of the analysis gives a metrologically justified answer to the question of the optimal period Pν of the signal.
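The core of extremum detection by least squares can be sketched as a quadratic fit around the peak: the vertex of the fitted parabola estimates the extremum position. This is only the detection step; the paper's GUM Supplement 2 uncertainty propagation is not reproduced here, and the signal below is synthetic.

```python
import numpy as np

def peak_position(x, y):
    """Locate an extremum by a least-squares quadratic fit:
    the vertex of y ~ a*x**2 + b*x + c sits at x = -b / (2a)."""
    a, b, _ = np.polyfit(x, y, 2)
    return -b / (2.0 * a)

# Synthetic fringe samples around a maximum of a signal with period P = 2
x = 0.05 + np.linspace(-0.3, 0.3, 7)
y = np.cos(2.0 * np.pi * (x - 0.05) / 2.0)
x_max = peak_position(x, y)   # recovers the true maximum at x = 0.05
```

The trade-off the abstract describes shows up here: a longer period Pν makes the peak flatter over the fitting window, so noise perturbs the fitted vertex more, even as the displacement Δν per unit of analyte grows.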

Relevance: 30.00%

Abstract:

Sea surface temperature (SST) estimates from the daytime and night-time 11 μm products and the night-time 4 μm product of the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor aboard the Aqua platform have been compared with in-situ measurements at three depths (15, 50 and 100 cm) in a coastal zone of the Western Mediterranean. This comparison makes it possible to analyse the uncertainty in estimating this parameter in shallow, near-shore waters from low-spatial-resolution satellite imagery. The results show that the daytime SST_11 μm product achieves the best RMSE (root mean square error) and r² (Pearson correlation coefficient) statistics, with values of 1 °C and 0.96, respectively, at a depth of 50 cm.
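The two validation statistics quoted above are standard and easy to state precisely. The sample data below are invented for illustration, not the study's measurements.

```python
import math

def rmse(obs, est):
    """Root mean square error between paired observations and estimates."""
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs))

def r2(obs, est):
    """Squared Pearson correlation coefficient of two paired samples."""
    n = len(obs)
    mo, me = sum(obs) / n, sum(est) / n
    cov = sum((o - mo) * (e - me) for o, e in zip(obs, est))
    vo = sum((o - mo) ** 2 for o in obs)
    ve = sum((e - me) ** 2 for e in est)
    return cov * cov / (vo * ve)

buoy = [18.2, 19.1, 20.5, 22.0, 23.4]   # illustrative in-situ SST, deg C
modis = [17.5, 18.9, 20.1, 21.6, 22.8]  # illustrative satellite estimates
```

Note that r² rewards linear agreement even in the presence of a constant bias, which is why RMSE is reported alongside it.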

Relevance: 30.00%

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by 'naive' users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.

Relevance: 30.00%

Abstract:

This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.

Relevance: 30.00%

Abstract:

A distributed temperature sensor for transient threshold monitoring with a 22 km sensing length, based on the Brillouin loss in standard communications fibre, is demonstrated. The system can be used for real-time monitoring of a preset temperature threshold. Good S/N ratios were achieved with only 8–16 sample averages giving a response time of 2 to 4 s with a temperature uncertainty of ±1 °C.
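The quoted figures are mutually consistent under the usual trade-off between averaging and response time: 8 averages in 2 s and 16 in 4 s imply roughly 0.25 s per trace, and averaging N independent traces improves the signal-to-noise ratio by √N. The per-trace time is an inference from the quoted numbers, not a figure stated in the paper.

```python
import math

def response_time(n_averages, scan_time):
    """Total response time when n_averages traces are averaged,
    assuming scan_time seconds per trace."""
    return n_averages * scan_time

def snr_gain_db(n_averages):
    """SNR improvement (dB) from averaging n independent traces:
    amplitude SNR grows as sqrt(n)."""
    return 10.0 * math.log10(math.sqrt(n_averages))
```

Doubling the averages from 8 to 16 thus buys about 1.5 dB of SNR at the cost of doubling the response time, which frames the ±1 °C uncertainty figure.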

Relevance: 30.00%

Abstract:

Bayesian nonparametric models, such as the Gaussian process and the Dirichlet process, have been extensively applied for target kinematics modeling in various applications including environmental monitoring, traffic planning, endangered species tracking, dynamic scene analysis, autonomous robot navigation, and human motion modeling. As shown by these successful applications, Bayesian nonparametric models are able to adjust their complexities adaptively from data as necessary, and are resistant to overfitting or underfitting. However, most existing works assume that the sensor measurements used to learn the Bayesian nonparametric target kinematics models are obtained a priori, or that the target kinematics can be measured by the sensor at any given time throughout the task. Little work has been done on controlling a sensor with a bounded field of view to obtain measurements of mobile targets that are most informative for reducing the uncertainty of the Bayesian nonparametric models. To present the systematic sensor planning approach to learning Bayesian nonparametric models, the Gaussian process target kinematics model is first introduced; it is capable of describing time-invariant spatial phenomena, such as ocean currents, temperature distributions and wind velocity fields. The Dirichlet process-Gaussian process target kinematics model is subsequently discussed for modeling mixtures of mobile targets, such as pedestrian motion patterns.

Novel information theoretic functions are developed for these introduced Bayesian nonparametric target kinematics models to represent the expected utility of measurements as a function of sensor control inputs and random environmental variables. A Gaussian process expected Kullback Leibler divergence is developed as the expectation of the KL divergence between the current (prior) and posterior Gaussian process target kinematics models with respect to the future measurements. Then, this approach is extended to develop a new information value function that can be used to estimate target kinematics described by a Dirichlet process-Gaussian process mixture model. A theorem is proposed that shows the novel information theoretic functions are bounded. Based on this theorem, efficient estimators of the new information theoretic functions are designed, which are proved to be unbiased with the variance of the resultant approximation error decreasing linearly as the number of samples increases. Computational complexities for optimizing the novel information theoretic functions under sensor dynamics constraints are studied, and are proved to be NP-hard. A cumulative lower bound is then proposed to reduce the computational complexity to polynomial time.
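At a measured point, the prior and posterior of a Gaussian process are both Gaussian, so the KL divergence underlying the expected-utility function above has a closed form. The univariate version is sketched here for illustration; the dissertation's expected-KL functions are taken over future measurements and sensor controls, which this sketch does not reproduce.

```python
import math

def kl_gaussian(mu0, var0, mu1, var1):
    """KL(N(mu0, var0) || N(mu1, var1)) for univariate Gaussians, in nats:
    ln(sigma1/sigma0) + (var0 + (mu0 - mu1)^2) / (2 var1) - 1/2."""
    return (math.log(math.sqrt(var1 / var0))
            + (var0 + (mu0 - mu1) ** 2) / (2.0 * var1) - 0.5)

# Divergence between a broad prior and a sharper, shifted posterior
gain = kl_gaussian(1.0, 0.25, 0.0, 1.0)
```

A measurement that shifts the mean or shrinks the variance produces a larger KL value, which is why maximising the expected KL steers the sensor toward informative targets.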

Three sensor planning algorithms are developed according to the assumptions on the target kinematics and the sensor dynamics. For problems where the control space of the sensor is discrete, a greedy algorithm is proposed. The efficiency of the greedy algorithm is demonstrated by a numerical experiment with data of ocean currents obtained by moored buoys. A sweep line algorithm is developed for applications where the sensor control space is continuous and unconstrained. Synthetic simulations as well as physical experiments with ground robots and a surveillance camera are conducted to evaluate the performance of the sweep line algorithm. Moreover, a lexicographic algorithm is designed based on the cumulative lower bound of the novel information theoretic functions, for the scenario where the sensor dynamics are constrained. Numerical experiments with real data collected from indoor pedestrians by a commercial pan-tilt camera are performed to examine the lexicographic algorithm. Results from both the numerical simulations and the physical experiments show that the three sensor planning algorithms proposed in this dissertation based on the novel information theoretic functions are superior at learning the target kinematics with little or no prior knowledge.

Relevance: 30.00%

Abstract:

Continuous sensor streams are often recorded as a series of discrete points in a database, from which knowledge can be retrieved through queries. Two classes of uncertainty inevitably arise in sensor streams. The first is Uncertainty due to Discrete Sampling (DS Uncertainty): even if every discrete point is correct, the discrete sensor stream is uncertain, in that it is not exactly like the continuous stream, since some critical points are missing due to the limited capabilities of the sensing equipment and the database server. The second is Uncertainty due to Sampling Error (SE Uncertainty): sensor readings for the same situation cannot be repeated exactly when we record them at different times or use different sensors, since different sampling errors exist. These two uncertainties reduce the efficiency and accuracy of querying common patterns, yet known algorithms generally resolve only SE Uncertainty. In this paper, we propose a novel method of Correcting Imprecise Readings and Compressing Excrescent (CIRCE) points. In particular, to resolve DS Uncertainty, a novel CIRCE core algorithm is developed within the CIRCE method to recover the missing critical points while compressing the original sensor streams. An experimental study on sensor stream datasets of various sizes validates that the CIRCE core algorithm is more efficient and more accurate than a counterpart algorithm at compressing sensor streams. We also resolve the SE Uncertainty problem within the CIRCE method, and an application querying longest common route patterns validates the effectiveness of our approach.
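The tension between compression and preserving critical points can be illustrated with a deliberately simplified sketch: drop any point that the midpoint of its immediate neighbours predicts within a tolerance, but always keep local extrema. This is an illustration of the underlying idea only, not the CIRCE algorithm.

```python
def compress(stream, eps):
    """Compress a list of readings: drop a point when the midpoint of
    its immediate neighbours predicts it within eps, but always keep
    local extrema (critical points) and the two endpoints."""
    if len(stream) <= 2:
        return list(stream)
    kept = [stream[0]]
    for prev, cur, nxt in zip(stream, stream[1:], stream[2:]):
        is_extremum = (cur > prev and cur > nxt) or (cur < prev and cur < nxt)
        predicted = (prev + nxt) / 2.0
        if is_extremum or abs(cur - predicted) > eps:
            kept.append(cur)
    kept.append(stream[-1])
    return kept

# A ramp up to a peak and back down compresses to its turning points
trace = compress([0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0], eps=0.1)
```

A query over the compressed stream then sees the same extrema as the original, which is precisely the property DS Uncertainty threatens when critical points are missed at sampling time.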