4 results for Sensor data fusion

in Digital Peer Publishing


Relevance:

90.00%

Publisher:

Abstract:

To meet changing performance demands, autonomous transport vehicles are deployed to make in-house material flow applications more flexible. The so-called cellular transport system consists of a multitude of small-scale transport vehicles which are able to form a swarm. To do so, the vehicles need to detect each other, exchange information with each other, and sense their environment. By sharing peripherally acquired information with other transport entities, better-informed decisions can be made in terms of navigation and collision avoidance. This paper is a contribution to the collective utilization of sensor data in a swarm of cellular transport vehicles.

Relevance:

80.00%

Publisher:

Abstract:

We present a user-supported tracking framework that combines automatic tracking with extended user input to create error-free tracking results suitable for interactive video production. The goal of our approach is to keep the necessary user input as small as possible. In our framework, the user can select between different tracking algorithms, both existing ones and new ones described in this paper. Furthermore, the user can automatically fuse the results of different tracking algorithms with our robust fusion approach. The tracked object can be marked in more than one frame, which can significantly improve the tracking result. After tracking, the user can validate the results easily, thanks to the support of a powerful interpolation technique. The tracking results are iteratively improved until the complete track has been found. After the iterative editing process, the tracking result of each object is stored in an interactive video file that can be loaded by our player for interactive videos.
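As an illustrative sketch only (the paper does not specify its fusion algorithm; the per-frame median used here is one common robust choice), fusing the position estimates of several trackers so that a single drifting tracker cannot corrupt the result might look like this:

```python
import numpy as np

def fuse_tracks(tracks):
    """Fuse per-frame position estimates from several trackers.

    `tracks` is a list of arrays of shape (n_frames, 2), each holding
    the (x, y) centre reported by one tracker. Taking the per-frame
    median is robust against a single tracker drifting off the object.
    """
    stacked = np.stack(tracks)         # (n_trackers, n_frames, 2)
    return np.median(stacked, axis=0)  # (n_frames, 2)
```

With three trackers of which one has drifted, the fused track follows the two agreeing trackers, since the median discards the single outlier coordinate-wise.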

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a new method for analyzing the manual order-picking process, with which, among other things, the proportions of order-picking time can be recorded automatically. The method is based on sensor-supported movement classification, as applied, for example, in sports or in medicine. Mobile sensors are used that continuously record measurements such as the acceleration or the angular velocity of the order picker. From these data, information about the executed movements, and in particular about the movement states passed through, can be derived. In this paper, this approach is transferred to order picking. To this end, classes of relevant movements are first identified and then processed with machine learning methods. Classification follows the principle of supervised learning, achieving average recognition rates of up to 78.94 percent.
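The pipeline described above (windowing the sensor stream, extracting features, then supervised classification) can be sketched minimally as follows. The feature set and the nearest-centroid classifier are illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def window_features(signal, win=50):
    """Split a 1-D sensor stream (e.g. acceleration magnitude) into
    fixed-length windows and compute simple per-window features
    (mean and standard deviation)."""
    n = len(signal) // win
    wins = signal[:n * win].reshape(n, win)
    return np.column_stack([wins.mean(axis=1), wins.std(axis=1)])

class NearestCentroid:
    """Minimal supervised classifier: one centroid per movement class."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Distance of every window to every class centroid.
        d = np.linalg.norm(
            X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]
```

Training on labeled windows of, say, "standing" versus "walking" acceleration data and predicting on new windows follows the usual fit/predict pattern of supervised learning.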

Relevance:

30.00%

Publisher:

Abstract:

Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from several sensors. Tracking sensors typically encountered in Mixed Reality applications, however, are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors using Time Delay Estimation, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we have examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.
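The core of classic Time Delay Estimation is finding the lag that maximizes the cross-correlation between two signals observing the same motion. A minimal sketch of that idea (the paper's concrete variant and preprocessing are not specified here) could look like:

```python
import numpy as np

def estimate_delay(a, b):
    """Estimate the temporal offset, in samples, by which signal `a`
    lags signal `b`, via cross-correlation Time Delay Estimation:
    the lag maximizing the correlation of the zero-mean signals."""
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(a, b, mode="full")
    # Index 0 of the 'full' output corresponds to lag -(len(b) - 1).
    return int(corr.argmax()) - (len(b) - 1)
```

In a tracking context, `a` and `b` would be scalar projections of two sensors' pose streams (e.g. one position coordinate) resampled to a common rate; the recovered lag can then be used to align the streams before fusion.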