982 results for Multiple sensors


Relevance: 70.00%

Publisher:

Abstract:

Reliable robotic perception and planning are critical to performing autonomous actions in uncertain, unstructured environments. In field robotic systems, automation is achieved by interpreting exteroceptive sensor information to infer something about the world. This is then mapped to provide a consistent spatial context, so that actions can be planned around the predicted future interaction of the robot and the world. The whole system is as reliable as the weakest link in this chain. In this paper, the term mapping is used broadly to describe the transformation of range-based exteroceptive sensor data (such as LIDAR or stereo vision) to a fixed navigation frame, so that it can be used to form an internal representation of the environment. The coordinate transformation from the sensor frame to the navigation frame is analyzed to produce a spatial error model that captures the dominant geometric and temporal sources of mapping error. This allows the mapping accuracy to be calculated at run time. A generic extrinsic calibration method for exteroceptive range-based sensors is then presented to determine the sensor location and orientation. This allows systematic errors in individual sensors to be minimized, and when multiple sensors are used, it minimizes the systematic contradiction between them to enable reliable multisensor data fusion. The mathematical derivations at the core of this model are not particularly novel or complicated, but the rigorous analysis and application to field robotics seems to be largely absent from the literature to date. The techniques in this paper are simple to implement, and they offer a significant improvement to the accuracy, precision, and integrity of mapped information. Consequently, they should be employed whenever maps are formed from range-based exteroceptive sensor data. © 2009 Wiley Periodicals, Inc.
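The mapping described above chains two coordinate transformations: the extrinsic calibration (sensor frame to vehicle body frame) and the vehicle pose (body frame to navigation frame). A minimal sketch with assumed calibration and pose values (not taken from the paper), using homogeneous transforms:

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the z-axis (yaw only, for brevity)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Extrinsic calibration: sensor frame -> vehicle body frame (assumed values).
T_body_sensor = homogeneous(rot_z(np.deg2rad(5.0)), np.array([1.2, 0.0, 0.8]))

# Vehicle pose at the measurement timestamp: body frame -> navigation frame.
T_nav_body = homogeneous(rot_z(np.deg2rad(30.0)), np.array([10.0, 4.0, 0.0]))

# A range return expressed in the sensor frame (homogeneous coordinates).
p_sensor = np.array([5.0, 0.0, 0.0, 1.0])

# Chain the transforms to map the point into the fixed navigation frame.
p_nav = T_nav_body @ T_body_sensor @ p_sensor
```

Errors in the calibration transform propagate through this chain into every mapped point, which is why the paper's spatial error model and calibration method matter.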

Relevance: 70.00%

Publisher:

Abstract:

The problem of structural system identification when measurements originate from multiple tests and multiple sensors is considered. An offline solution to this problem using bootstrap particle filtering is proposed. The central idea of the proposed method is the introduction of a dummy independent variable that allows for simultaneous assimilation of multiple measurements in a sequential manner. The method can treat linear/nonlinear structural models and allows for measurements on strains and displacements under static/dynamic loads. Illustrative examples consider measurement data from numerical models and also from laboratory experiments. The results from the proposed method are compared with those from a Kalman filter-based approach and the superior performance of the proposed method is demonstrated. Copyright (C) 2009 John Wiley & Sons, Ltd.
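The sequential assimilation idea above can be sketched with a bootstrap particle filter. The toy model below (a single stiffness parameter identified from repeated static displacement measurements, with invented numbers) illustrates the general technique, not the paper's specific formulation; the measurement counter plays the role of the dummy independent variable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: static displacements u = F/k_true + noise (assumed setup).
k_true, F, sigma = 2.0, 10.0, 0.05
meas = F / k_true + sigma * rng.standard_normal(50)

# Bootstrap particle filter over a static stiffness parameter k.
# Measurements are assimilated one at a time, indexed by a dummy variable
# (the measurement counter) standing in for "time".
n = 1000
particles = rng.uniform(0.5, 5.0, n)                       # prior on k
for y in meas:
    particles += 0.01 * rng.standard_normal(n)             # artificial jitter
    w = np.exp(-0.5 * ((y - F / particles) / sigma) ** 2)  # likelihood
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)                       # resampling
    particles = particles[idx]

k_hat = particles.mean()   # posterior mean estimate of the stiffness
```

The artificial jitter step keeps particle diversity when the underlying parameter is static, a standard device in particle-filter parameter estimation.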

Relevance: 70.00%

Publisher:

Abstract:

During April 8-10, 2008, the Alliance for Coastal Technology (ACT) partner institutions, the University of Alaska Fairbanks (UAF), the Alaska SeaLife Center (ASLC), and the Oil Spill Recovery Institute (OSRI), hosted a workshop entitled "Hydrocarbon sensors for oil spill prevention and response" in Seward, Alaska. The main focus was to bring together 29 participants - representing resource managers, scientists, and technology developers - to discuss current and future hydrocarbon in-situ, laboratory, and remote sensors as they apply to oil spill prevention and response. [PDF contains 28 pages] Hydrocarbons and their derivatives remain among the most important energy sources in the world. To manage these energy sources effectively, proper protocols must be implemented to ensure prevention of and response to oil spills, as spills carry significant economic and environmental costs. Hydrocarbon sensors provide the means to detect and monitor oil spills before, during, and after they occur. Capitalizing on the properties of oil, developers have designed in-situ, laboratory, and remote sensors that absorb or reflect electromagnetic energy at different spectral bands. Workshop participants identified current hydrocarbon sensors (in-situ, laboratory, and remote) and their overall performance. To achieve the most comprehensive understanding of oil spills, multiple sensors are needed to gather oil spill extent, location, movement, thickness, condition, and classification; no single hydrocarbon sensor can collect all this information. Participants therefore suggested developing means to combine sensor equipment to establish an effective and rapid spill response.
As oil exploration continues at polar latitudes, sensor equipment must be developed to withstand harsh arctic climates, detect oil under ice, and reduce the need for ground teams, because the ice extent is far too large an area to cover. Participants also recognized the need for the U.S. to adopt multi-agency cooperation for oil spill response, as the majority of issues surrounding oil spill response concern not the hydrocarbon sensors but an effective contingency plan adopted by all agencies; it was recommended that the U.S. model its contingency planning on other nations such as Germany and Norway. Participants were asked to make recommendations at the conclusion of the workshop; these are summarized below without prioritization: *Outreach materials must be delivered to funding sources and Congressional delegates regarding the importance of oil spill prevention and response and the development of proper sensors to achieve effective response. *Develop protocols for training resource managers as new sensors become available. *Develop or adopt standard instrument specifications and testing protocols to assist manufacturers in further developing new sensor technology. *As oil exploration continues at polar latitudes, more research and development should be allocated to developing a suite of instruments applicable to oil detection under ice.

Relevance: 70.00%

Publisher:

Abstract:

The integration of nanostructured films containing biomolecules with silicon-based technologies is a promising route to miniaturized biosensors that exhibit high sensitivity and selectivity. A challenge, however, is to avoid cross talk among sensing units in an array with multiple sensors located on a small area. In this letter, we describe an array of 16 sensing units of a light-addressable potentiometric sensor (LAPS), made with layer-by-layer (LbL) films of a poly(amidoamine) dendrimer (PAMAM) and single-walled carbon nanotubes (SWNTs), coated with a layer of the enzyme penicillinase. A visual inspection of the data from constant-current measurements with liquid samples containing distinct concentrations of penicillin, glucose, or a buffer indicated possible cross talk between units that contained penicillinase and those that did not. Using multidimensional data projection techniques normally employed in information visualization, we managed to distinguish the results from the modified LAPS, even in cases where the units were adjacent to each other. Furthermore, the plots generated with the interactive document map (IDMAP) projection technique enabled the distinction of different concentrations of penicillin, from 5 mmol L⁻¹ down to 0.5 mmol L⁻¹. Data visualization also confirmed the enhanced performance of the sensing units containing carbon nanotubes, consistent with the analysis of results for LAPS sensors. The use of visual analytics, as with projection methods, may be essential to handle the large amount of data generated in multiple sensor arrays to achieve high performance in miniaturized systems.
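The projection idea above - mapping high-dimensional sensor-array responses to a 2-D plot where classes separate - can be sketched with PCA as a simple stand-in for techniques like IDMAP. The 16-unit responses below are simulated, not the LAPS data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated responses of a 16-unit sensor array to three sample classes
# (hypothetical data standing in for the constant-current measurements).
classes, reps, units = 3, 20, 16
centers = rng.normal(0.0, 5.0, (classes, units))
X = np.vstack([c + rng.normal(0.0, 0.5, (reps, units)) for c in centers])

# PCA as a minimal projection: centre the data, then project each
# 16-dimensional response onto the top two principal axes.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T          # each row: one sample in 2-D
```

In the projected plane, samples of the same class cluster together, which is how adjacent sensing units (and penicillin concentrations) can be told apart visually.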

Relevance: 70.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 70.00%

Publisher:

Abstract:

A year of data from the satellite-borne lidar CALIOP is analyzed and statistics on the occurrence and distribution of bulk properties of cirrus clouds are provided. The relationship between environmental and cloud physical parameters and the shape of the backscatter profile (BSP) is investigated. It is found that the CALIOP BSP is mainly affected by cloud geometrical thickness, while only minor impacts can be attributed to other quantities such as optical depth or temperature. Polynomial functions are provided to fit mean BSPs as functions of geometrical thickness and position within the cloud layer. It is demonstrated that, under realistic hypotheses, the mean BSP is linearly proportional to the IWC profile. The IWC parameterization is included in the RT-RET retrieval algorithm, which is used to analyze infrared radiance measurements in the presence of cirrus clouds during the ECOWAR field campaign. Retrieved microphysical and optical properties of the observed cloud are used as input parameters in a forward RT simulation over the 100-1100 cm⁻¹ spectral interval and compared with interferometric data to test the ability of the current single-scattering-property database for ice crystals to reproduce realistic optical features. Finally, a global-scale investigation of cirrus clouds is performed by developing a collocation algorithm that exploits satellite data from multiple sensors (AIRS, CALIOP, MODIS). The resulting data set is used to test a new infrared hyperspectral retrieval algorithm. Retrieved products are compared to the data; in particular, the cloud top height (CTH) product is considered for this purpose. The retrieval agrees better with the CALIOP CTH than with MODIS, even if some cases of underestimation and overestimation are observed.
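The collocation step above pairs observations from different sensors that view nearly the same scene. A toy sketch of the basic idea (nearest-neighbour matching in time within a tolerance window; timestamps and tolerance are invented, and real collocation also matches in space):

```python
import numpy as np

# Toy collocation of two satellite data sets (e.g. lidar and imager pixels):
# for each observation of sensor A, find the sensor-B observation closest in
# time, accepting the pair only if it falls within a tolerance window.
t_a = np.array([0.0, 10.0, 20.0, 30.0])       # sensor A timestamps (s)
t_b = np.array([0.4, 9.0, 25.0, 29.8, 55.0])  # sensor B timestamps (s)
tol = 2.0                                     # collocation window (s)

pairs = []
for i, ta in enumerate(t_a):
    j = int(np.argmin(np.abs(t_b - ta)))      # nearest B observation in time
    if abs(t_b[j] - ta) <= tol:
        pairs.append((i, j))
# Unmatched observations (here, A's sample at t = 20 s) are simply dropped.
```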

Relevance: 70.00%

Publisher:

Abstract:

The fabrication of in-fibre Bragg gratings (FBGs) and their application as sensors is reported. The strain and temperature characteristics of a number of chirped and uniform gratings written into three different host fibres are presented. The static and dynamic temperature response of a commercially available temperature-compensated grating is reported. A five-sensor wavelength-division-multiplexed fibre Bragg grating strain measurement system with an interrogation rate of 25 Hz and a resolution of 10 µε was constructed, and the results from this system are presented. A novel chirped-FBG interrogation method was implemented in both the 1.3 and 1.5 µm telecommunication windows. Several single and dual strain sensor systems employing this method were constructed, and the results obtained from each are reported and discussed. These systems are particularly suitable for the measurement of large strain; results from a system measuring up to 12 mε, with a potential measurement range of 30 mε, are reported. This technique is also shown to give an obtainable resolution of 20 µε over a measurement range of 5000 µε for a dual-sensor system. These systems are simple, robust, passive and easy to implement. They offer low cost, high speed and, in the case of multiple sensors, truly simultaneous interrogation. These advantages make the technique ideal for strain sensing in SMART structures. Systems based on this method have been installed in the masts of four superyachts, and a system based on the technique is currently being developed for the measurement of acoustic waves in carbon composite panels. The results from an alternative method for interrogating uniform FBG sensors are also discussed. Interrogation of the gratings was facilitated by a specifically written asymmetric grating with a 15 nm long, linearly sloped spectral edge. This technique was employed to interrogate a single sensor over a measurement range of 6 mε and two sensors over a range of 4.5 mε.
The results obtained indicated achievable resolutions of 47 µε and 38 µε respectively.
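All of these interrogation schemes ultimately convert a Bragg-wavelength shift into strain. A minimal sketch of that conversion, using typical silica-fibre values assumed here rather than taken from the thesis:

```python
# Converting a Bragg wavelength shift to strain, the relation underlying
# FBG strain sensing: d_lambda = lambda_b * (1 - p_e) * strain.
# Values below are typical assumptions for silica fibre, not the thesis's.
lambda_b = 1550.0e-9      # Bragg wavelength (m), in the 1.5 um telecom window
p_e = 0.22                # effective photo-elastic coefficient of silica

def strain_from_shift(d_lambda):
    """Strain inferred from a measured wavelength shift d_lambda (m)."""
    return d_lambda / (lambda_b * (1.0 - p_e))

# Example: a 1.2 nm shift on a 1550 nm grating is roughly 990 microstrain.
eps = strain_from_shift(1.2e-9)
```

The near-linear wavelength-strain relation is what makes wavelength-division multiplexing of several gratings on one fibre practical: each sensor occupies its own wavelength band.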

Relevance: 70.00%

Publisher:

Abstract:

Fiber-optic sensors have played an important role in monitoring the health of civil infrastructure such as bridges, oil rigs, and railroads. Owing to the falling cost of fiber-optic components and systems, fiber-optic sensors have been studied extensively for their higher sensitivity, precision, and immunity to electrical interference compared to their electrical counterparts. A fiber Bragg grating (FBG) strain sensor has been employed in this study to detect and distinguish normal and lateral loads on rail tracks. A theoretical analysis of the relationship between strain and displacement under vertical and horizontal strains on an aluminum beam has been performed, and the results are in excellent agreement with the measured strain data. A single-FBG sensor system with an erbium-doped fiber amplifier broadband source was then implemented. Force and temperature applied to the system resulted in changes of 0.05 nm per 50 με and 0.094 nm per 10 °C at the center wavelength of the FBG. Furthermore, a low-cost fiber-optic sensor system with a distributed feedback (DFB) laser as the light source has been implemented. We show that it has superior noise and sensitivity performance compared to strain gauge sensors. The design has been extended to accommodate multiple sensors with negligible cross talk. When two cascaded sensors on a rail track section are tested, strain readings of the sensor 20 inches away from the position of the applied force decay to one seventh of those of the sensor at the applied-force location. The two FBG sensor systems can detect 1 ton of vertical load with a square-wave pattern and 0.1 ton of lateral load (3 tons and 0.5 ton, respectively, for strain gauges). Moreover, a single FBG sensor has been found capable of detecting and distinguishing lateral and normal strains applied at different frequencies.
FBG sensors are promising alternatives to electrical sensors for their high sensitivity, ease of installation, and immunity to electromagnetic interference.

Relevance: 60.00%

Publisher:

Abstract:

This paper outlines a feasible scheme to extract the deck trend when a rotary-wing unmanned aerial vehicle (RUAV) approaches an oscillating deck. An extended Kalman filter (EKF) is developed to fuse measurements from multiple sensors for effective estimation of the unknown deck heave motion. Also, a recursive Prony analysis (PA) procedure is proposed to implement online curve-fitting of the estimated heave motion. The proposed PA constructs an appropriate model with parameters identified using the forgetting-factor recursive least squares (FFRLS) method. The deck trend is then extracted by separating dominant modes. Performance of the proposed procedure is evaluated using real ship motion data, and simulation results justify the suitability of the proposed method for safe landing of RUAVs operating in a maritime environment.
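The FFRLS identification step above can be sketched on a toy problem: tracking the coefficients of a sinusoidal signal from streaming noisy samples. The model and numbers below are illustrative assumptions, not the paper's deck-motion model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Forgetting-factor recursive least squares (FFRLS) sketch: track the
# parameters of y = a*sin(t) + b*cos(t) from streaming noisy samples.
a_true, b_true, lam = 1.5, -0.7, 0.98     # lam: forgetting factor
theta = np.zeros(2)                       # parameter estimate [a, b]
P = 1000.0 * np.eye(2)                    # covariance of the estimate

for t in np.linspace(0.0, 20.0, 400):
    phi = np.array([np.sin(t), np.cos(t)])           # regressor vector
    y = a_true * np.sin(t) + b_true * np.cos(t) + 0.05 * rng.standard_normal()
    k = P @ phi / (lam + phi @ P @ phi)              # gain
    theta = theta + k * (y - phi @ theta)            # estimate update
    P = (P - np.outer(k, phi) @ P) / lam             # covariance update
```

The forgetting factor lam < 1 discounts old samples exponentially, which is what lets the recursion follow slowly drifting parameters such as a changing deck trend.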

Relevance: 60.00%

Publisher:

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be off by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A 'best-practice' design methodology is adopted within the project to ensure the final system operates according to its requirements.
Initially, requirements are generated from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond.
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
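The PTP-style synchronisation that BabelFuse draws on estimates the slave-master clock offset from a two-way timestamp exchange. A minimal sketch with invented timestamps (the real protocol adds filtering, hardware timestamping, and repeated exchanges):

```python
# Clock-offset estimation in the style of IEEE-1588 PTP.
# Timestamps (seconds) are invented for illustration:
#   t1: master sends Sync        (master clock)
#   t2: slave receives Sync      (slave clock)
#   t3: slave sends Delay_Req    (slave clock)
#   t4: master receives Delay_Req (master clock)
t1, t2, t3, t4 = 100.000, 100.013, 100.020, 100.027

# Assuming a symmetric path delay, offset and delay separate as:
offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
delay  = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
```

The slave then steers its clock by the estimated offset; asymmetry in the forward and return paths is the main residual error source, which is why non-deterministic wireless links make this hard.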

Relevance: 60.00%

Publisher:

Abstract:

This paper describes the implementation of the first portable, embedded data acquisition unit (BabelFuse) that is able to acquire and timestamp generic sensor data and trigger General Purpose I/O (GPIO) events against a microsecond-accurate wirelessly-distributed ‘global’ clock. A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment especially if non-deterministic communication hardware (such as IEEE-802.11-based wireless) and inaccurate clock synchronisation protocols are used. The issue of differing timebases makes correlation of data difficult and prevents the units from reliably performing synchronised operations or manoeuvres. By utilising hardware-assisted timestamping, clock synchronisation protocols based on industry standards and firmware designed to minimise indeterminism, an embedded data acquisition unit capable of microsecond-level clock synchronisation is presented.

Relevance: 60.00%

Publisher:

Abstract:

Various intrusion detection systems (IDSs) reported in the literature have shown distinct preferences for detecting a certain class of attack with improved accuracy, while performing moderately on the other classes. In view of the enormous computing power available in present-day processors, deploying multiple IDSs in the same network to obtain best-of-breed solutions has been attempted earlier. The paper presented here addresses the problem of optimizing the performance of IDSs using sensor fusion with multiple sensors. The trade-off between the detection rate and false alarms with multiple sensors is highlighted. It is illustrated that the performance of the detector is better when the fusion threshold is determined according to the Chebyshev inequality. In the proposed data-dependent decision (DD) fusion method, the performance optimization of individual IDSs is first addressed. A neural network supervised learner has been designed to determine the weights of individual IDSs depending on their reliability in detecting a certain attack. The final stage of this DD fusion architecture is a sensor fusion unit which performs the weighted aggregation in order to make an appropriate decision. This paper theoretically models the fusion of IDSs for the purpose of demonstrating the improvement in performance, supplemented with an empirical evaluation.
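The Chebyshev-based thresholding above can be sketched as follows: for any score distribution with mean mu and standard deviation sigma, the Chebyshev inequality bounds P(score > mu + k*sigma) by 1/k², so a distribution-free false-alarm bound can be set without knowing the benign-score distribution. The fused scores below are simulated, not drawn from any IDS:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated fused anomaly scores for benign traffic (assumed distribution).
scores = rng.gamma(2.0, 1.0, 10000)
mu, sigma = scores.mean(), scores.std()

# Chebyshev inequality: P(score > mu + k*sigma) <= 1/k^2, regardless of the
# distribution. Choosing k = 5 bounds the false-alarm rate by 4%.
k = 5.0
threshold = mu + k * sigma

false_alarm = float((scores > threshold).mean())  # empirical rate <= 1/k^2
```

Because the bound holds for any distribution, the actual false-alarm rate is usually far below 1/k²; the price of that robustness is a conservative threshold.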

Relevance: 60.00%

Publisher:

Abstract:

In this paper, we are concerned with energy efficient area monitoring using information coverage in wireless sensor networks, where collaboration among multiple sensors can enable accurate sensing of a point in a given area-to-monitor even if that point falls outside the physical coverage of all the sensors. We refer to any set of sensors that can collectively sense all points in the entire area-to-monitor as a full area information cover. We first propose a low-complexity heuristic algorithm to obtain full area information covers. Using these covers, we then obtain the optimum schedule for activating the sensing activity of various sensors that maximizes the sensing lifetime. The scheduling of sensor activity using the optimum schedules obtained using the proposed algorithm is shown to achieve significantly longer sensing lifetimes compared to those achieved using physical coverage. Relaxing the full area coverage requirement to a partial area coverage (e.g., 95% of area coverage as adequate instead of 100% area coverage) further enhances the lifetime.
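A low-complexity heuristic for building covers, in the spirit of the algorithm above, is greedy selection: repeatedly pick the sensor that covers the most still-uncovered points. The points and coverage sets below are invented, and physical coverage stands in for the paper's information coverage:

```python
# Greedy heuristic for constructing an area cover: repeatedly pick the
# sensor covering the most still-uncovered points. Toy data, invented here.
points = set(range(10))                   # discretised points to monitor
coverage = {                              # sensor id -> points it can sense
    "s1": {0, 1, 2, 3},
    "s2": {3, 4, 5},
    "s3": {5, 6, 7, 8},
    "s4": {8, 9},
    "s5": {0, 4, 9},
}

uncovered, chosen = set(points), []
while uncovered:
    best = max(coverage, key=lambda s: len(coverage[s] & uncovered))
    if not coverage[best] & uncovered:
        break                             # remaining points are uncoverable
    chosen.append(best)
    uncovered -= coverage[best]
```

Disjoint covers found this way can then be activated in rotation, so each sensor sleeps while another cover is on duty, which is the mechanism behind the lifetime gains reported above.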

Relevance: 60.00%

Publisher:

Abstract:

The motivation behind the fusion of intrusion detection systems was the realization that, with increasing traffic and increasingly complex attacks, none of the present-day stand-alone intrusion detection systems can meet the demand for a very high detection rate together with an extremely low false positive rate. Multi-sensor fusion can be used to meet these requirements by refining the combined response of different intrusion detection systems. In this paper, we show the design technique of sensor fusion to best utilize the useful response from multiple sensors by an appropriate adjustment of the fusion threshold. The threshold is generally chosen according to past experience or by an expert system. In this paper, we show that choosing the threshold bounds according to the Chebyshev inequality performs better. This approach also helps to solve the problem of scalability and has the advantage of fail-safe capability. This paper theoretically models the fusion of intrusion detection systems for the purpose of proving the improvement in performance, supplemented with an empirical evaluation. The combination of complementary sensors is shown to detect more attacks than the individual components. Since the individual sensors chosen detect sufficiently different attacks, their results can be merged for improved performance. The combination is done in different ways: (i) taking all the alarms from each system and avoiding duplications, (ii) taking alarms from each system by fixing threshold bounds, and (iii) rule-based fusion with a priori knowledge of the individual sensor performance. A number of evaluation metrics are used, and the results indicate that there is an overall enhancement in the performance of the combined detector using sensor fusion incorporating the threshold bounds, and significantly better performance using simple rule-based fusion.

Relevance: 60.00%

Publisher:

Abstract:

Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.