Omnidirectional visual odometry for a planetary rover


Author(s): Corke, Peter; Strelow, D.; Singh, Sanjiv
Date(s)

2004

Abstract

Position estimation for planetary rovers has typically been limited to odometry based on proprioceptive measurements, such as the integration of distance traveled and measurement of heading change. Here we present and compare two methods of online visual odometry suited for planetary rovers. Both methods use omnidirectional imagery to estimate the motion of the rover. One method is based on robust estimation of optical flow and subsequent integration of the flow. The second method is a full structure-from-motion solution. To make the comparison meaningful, we use the same set of raw corresponding visual features for each method. The dataset is a sequence of 2000 images taken during a field experiment in the Atacama desert, for which high-resolution GPS ground truth is available.
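The first method described above, robust flow estimation followed by integration, can be illustrated with a minimal sketch. The code below is not the paper's implementation: it assumes feature correspondences already projected onto a 2-D ground plane, uses a simple RANSAC rigid-transform fit, and the function names (rigid_fit, robust_motion, integrate) and the synthetic self-test are illustrative only.

import numpy as np

def rigid_fit(p, q):
    """Least-squares 2-D rigid transform (R, t) mapping points p -> q (Kabsch)."""
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    H = (p - cp).T @ (q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # keep a proper rotation, not a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def robust_motion(p, q, iters=200, tol=0.05, rng=None):
    """RANSAC wrapper around rigid_fit to reject bad feature matches."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(p), bool)
    for _ in range(iters):
        idx = rng.choice(len(p), 2, replace=False)   # minimal sample for a 2-D rigid fit
        R, t = rigid_fit(p[idx], q[idx])
        err = np.linalg.norm((p @ R.T + t) - q, axis=1)
        inliers = err < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit on all inliers for the final per-frame estimate
    return rigid_fit(p[best_inliers], q[best_inliers])

def integrate(motions):
    """Chain successive per-frame (R, t) increments into a cumulative path."""
    pose_R, pose_t, path = np.eye(2), np.zeros(2), [np.zeros(2)]
    for R, t in motions:
        pose_t = pose_R @ t + pose_t
        pose_R = pose_R @ R
        path.append(pose_t.copy())
    return np.array(path)

if __name__ == "__main__":
    # Synthetic check: a slow turn plus forward translation, with 20% outlier matches.
    rng = np.random.default_rng(1)
    pts = rng.uniform(-1, 1, (100, 2))
    c, s = np.cos(0.02), np.sin(0.02)
    true_R, true_t = np.array([[c, -s], [s, c]]), np.array([0.10, 0.0])
    motions = []
    for _ in range(50):
        q = pts @ true_R.T + true_t + rng.normal(0, 0.005, pts.shape)
        q[:20] += rng.uniform(-0.5, 0.5, (20, 2))    # corrupt 20 matches
        motions.append(robust_motion(pts, q, rng=rng))
    print(integrate(motions)[-1])    # endpoint of 50 integrated 0.10 m steps along a gentle arc

The same robust per-frame estimates could instead feed a batch structure-from-motion refinement, which is the second approach compared in the paper.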

Format

application/pdf

Identifier

http://eprints.qut.edu.au/32778/

Relation

http://eprints.qut.edu.au/32778/1/32778_corke.pdf

Corke, Peter, Strelow, D., & Singh, Sanjiv (2004) Omnidirectional visual odometry for a planetary rover. In 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Source

Faculty of Built Environment and Engineering; School of Engineering Systems

Keywords: #090602 Control Systems Robotics and Automation #robotics #camera motion

Type

Conference Paper