Active Visual Features Based on Events to Guide Robot Manipulators in Tracking Tasks


Author(s): Gil, Pablo; García Gómez, Gabriel Jesús; Mateo Agulló, Carlos; Torres Medina, Fernando
Contributor(s)

Universidad de Alicante. Departamento de Física, Ingeniería de Sistemas y Teoría de la Señal

Automática, Robótica y Visión Artificial

Date(s)

04/03/2015

04/03/2015

2014

24/08/2014

Abstract

Traditional visual servoing systems do not address the tracking of moving objects. When these systems are used to track a moving object, visual features can leave the image, depending on the object's velocity, causing the tracking task to fail. This occurs especially when both the object and the robot are initially stopped and the object then starts to move. In this work, we have employed a retina camera based on Address Event Representation (AER) in order to use events as the input to the visual servoing system. The events fired by the camera indicate a pixel-level change caused by movement. Event-based visual information is processed only at the moment it occurs, reducing the response time of visual servoing systems when they are used to track moving objects.
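The event-driven principle described in the abstract (processing visual information only when a pixel change occurs, instead of scanning whole frames) can be illustrated with a minimal sketch. The `Event` structure and the exponential-smoothing tracker below are illustrative assumptions for an AER-style event stream, not the controller described in the paper:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One AER-style event: a single pixel reporting a brightness change."""
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp in seconds
    polarity: int  # +1 brightness increase, -1 decrease

def track_centroid(events, alpha=0.2):
    """Update a tracked feature position event by event.

    Each incoming event nudges the estimate toward the event's pixel
    with exponential smoothing; no full-frame processing is needed,
    and the estimate is refreshed as soon as motion is observed.
    """
    cx = cy = None
    for e in events:
        if cx is None:                 # initialize on the first event
            cx, cy = float(e.x), float(e.y)
        else:                          # smooth toward the new event
            cx += alpha * (e.x - cx)
            cy += alpha * (e.y - cy)
    return cx, cy
```

Because the estimate advances per event, latency is bounded by the sensor's event rate rather than a frame period, which is the property the abstract attributes to AER input.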

The research leading to these results has received funding from the Spanish Government and European FEDER funds (DPI2012-32390), the Valencia Regional Government (GV2012/102 and PROMETEO/2013/085) and the University of Alicante (GRE12-17).

Identifier

Proceedings of the 19th IFAC World Congress, 2014. Boje, Edward; Xia, Xiaohua (Eds.). IFAC, 2014, pp. 11890-11897

978-3-902823-62-5

1474-6670

http://hdl.handle.net/10045/45465

10.3182/20140824-6-ZA-1003.02151

Language(s)

eng

Publisher

IFAC

Relation

http://dx.doi.org/10.3182/20140824-6-ZA-1003.02151

Rights

© 2014 IFAC

info:eu-repo/semantics/openAccess

Keywords #Guidance navigation #Robot manipulators #Autonomous robotic systems #Event control #AER #Asynchronous vision sensor #Visual servoing #Image motion #Robot vision #Ingeniería de Sistemas y Automática
Type

info:eu-repo/semantics/conferenceObject