A sparse hybrid map for vision-guided mobile robots
Date(s) |
2011
Abstract |
This paper introduces a minimalistic approach to producing a visual hybrid map of a mobile robot's working environment. The proposed system uses omnidirectional images together with odometry to build an initial dense pose-graph map, from which a two-level hybrid map is then extracted. The hybrid map consists of a global and a local level. The global level is a sparse topological map extracted from the initial graph by a dual clustering approach. The local level stores a spherical view at each node of the global level. The spherical views provide both an appearance signature for the nodes, which the robot uses to localize itself in the environment, and heading information when the robot uses the map for visual navigation. To demonstrate the usefulness of the map, an experiment was conducted in which the map was used for multiple visual navigation tasks in an office workplace.
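The two-level structure described in the abstract can be sketched as a small data structure: a sparse topological graph at the global level, with each node holding the appearance signature of its spherical view at the local level. This is a minimal illustrative sketch, not the authors' implementation; the class names, the vector signatures, and the cosine-similarity matcher used for localization are all assumptions.

```python
import math

class Node:
    """One node of the global topological level.

    Each node stores its spherical view, reduced here to a plain
    appearance-signature vector (an illustrative assumption).
    """
    def __init__(self, node_id, signature):
        self.node_id = node_id
        self.signature = signature  # appearance signature of the spherical view
        self.neighbors = set()      # topological edges to other node ids

class SparseHybridMap:
    """Global level: sparse topological graph; local level: per-node views."""
    def __init__(self):
        self.nodes = {}

    def add_node(self, node_id, signature):
        self.nodes[node_id] = Node(node_id, signature)

    def add_edge(self, a, b):
        # Undirected topological edge between two places.
        self.nodes[a].neighbors.add(b)
        self.nodes[b].neighbors.add(a)

    def localize(self, query_signature):
        """Return the node whose stored signature best matches the query.

        Cosine similarity is a stand-in for whatever appearance matcher
        the paper actually uses.
        """
        def cosine(u, v):
            dot = sum(x * y for x, y in zip(u, v))
            nu = math.sqrt(sum(x * x for x in u))
            nv = math.sqrt(sum(x * x for x in v))
            return dot / (nu * nv) if nu and nv else 0.0
        return max(self.nodes.values(),
                   key=lambda n: cosine(n.signature, query_signature))

m = SparseHybridMap()
m.add_node("hall", [1.0, 0.0, 0.2])
m.add_node("office", [0.1, 1.0, 0.0])
m.add_edge("hall", "office")
print(m.localize([0.9, 0.1, 0.1]).node_id)  # best-matching node: "hall"
```

In this sketch, navigation would amount to localizing against the signatures, then following topological edges toward a goal node, using the spherical view at each node for heading correction as the abstract describes.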
Format |
application/pdf
Identifier |
Relation |
http://eprints.qut.edu.au/72973/1/ECMR2011_0014_web.pdf http://aass.oru.se/Agora/ECMR2011/proceedings.html Dayoub, Feras, Cielniak, Grzegorz, & Duckett, Tom (2011) A sparse hybrid map for vision-guided mobile robots. In 5th European Conference on Mobile Robots (ECMR 2011), 7-9 September 2011, Örebro, Sweden.
Rights |
Copyright 2011 [please consult the author]
Source |
School of Electrical Engineering & Computer Science; Faculty of Science and Technology
Type |
Conference Paper