Omnidirectional Visual Odometry for a Planetary Rover

Peter Ian Corke, Dennis Strelow and Sanjiv Singh
Conference Paper, Proceedings of IROS 2004, January, 2004




Position estimation for planetary rovers has typically been limited to odometry based on proprioceptive measurements, such as the integration of distance traveled and measurement of heading change. Here we present and compare two methods of online visual odometry suited for planetary rovers. Both methods use omnidirectional imagery to estimate the motion of the rover. One method is based on robust estimation of optical flow and subsequent integration of the flow. The second method is a full structure-from-motion solution. To make the comparison meaningful, we use the same set of raw corresponding visual features for each method. The dataset is a sequence of 2000 images taken during a field experiment in the Atacama desert, for which high-resolution GPS ground truth is available.
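The first method's core idea, estimating an incremental motion from feature correspondences and integrating those increments into a trajectory, can be illustrated with a minimal sketch. The paper works with omnidirectional imagery and robust flow estimation; the snippet below is a simplified stand-in that recovers a planar rigid motion from 2D point correspondences via a least-squares (Procrustes) fit and chains the increments. All function names are illustrative assumptions, not from the paper.

```python
import numpy as np

def estimate_rigid_motion(p, q):
    """Least-squares 2D rotation R and translation t with q ≈ R @ p + t.

    p, q: (N, 2) arrays of corresponding feature positions in two frames.
    (A robust variant, as in the paper, would wrap this in e.g. RANSAC.)
    """
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    P, Q = p - cp, q - cq
    # Rotation from the SVD of the cross-covariance (Kabsch/Procrustes)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t

def integrate_odometry(increments):
    """Chain per-frame (R, t) increments into a cumulative pose."""
    R_g, t_g = np.eye(2), np.zeros(2)
    for R, t in increments:
        t_g = R_g @ t + t_g       # express the new translation in world frame
        R_g = R_g @ R             # accumulate heading
    return R_g, t_g
```

As in any dead-reckoning scheme, small per-frame estimation errors accumulate over the trajectory, which is why the paper compares this integration approach against a full structure-from-motion solution.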

@inproceedings{Corke-2004-IROS,
  author    = {Peter Ian Corke and Dennis Strelow and Sanjiv Singh},
  title     = {Omnidirectional Visual Odometry for a Planetary Rover},
  booktitle = {Proceedings of IROS 2004},
  year      = {2004},
  month     = {January},
}