Omnidirectional Visual Odometry for a Planetary Rover

Peter Ian Corke, Dennis Strelow, and Sanjiv Singh
Conference Paper, Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 4, pp. 4007 - 4012, September, 2004

Abstract

Position estimation for planetary rovers has typically been limited to odometry based on proprioceptive measurements such as the integration of distance traveled and measurement of heading change. Here we present and compare two methods of online visual odometry suited to planetary rovers. Both methods use omnidirectional imagery to estimate the motion of the rover. One method is based on robust estimation of optical flow and subsequent integration of the flow. The second method is a full structure-from-motion solution. To make the comparison meaningful we use the same set of raw corresponding visual features for each method. The dataset is a sequence of 2000 images taken during a field experiment in the Atacama desert, for which high-resolution GPS ground truth is available.
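As a rough illustration of the first method's flow-then-integrate idea, the Python sketch below tracks sparse features with OpenCV, fits an inter-frame similarity transform with RANSAC to reject outlier tracks, and dead-reckons a planar pose by summing the increments. This is a simplified sketch on ordinary perspective images, not the paper's omnidirectional formulation (which relies on a catadioptric camera model); the function names, parameters, and the assumption of grayscale input frames are illustrative only.

import numpy as np
import cv2

def estimate_frame_motion(prev_gray, curr_gray):
    """Robustly estimate a 2D translation + rotation between two frames
    from sparse optical flow; RANSAC rejects outlier tracks."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return None
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good_prev = pts[status.ravel() == 1]
    good_next = nxt[status.ravel() == 1]
    if len(good_prev) < 10:
        return None
    # Similarity transform (rotation + translation) fit with RANSAC.
    M, _ = cv2.estimateAffinePartial2D(good_prev, good_next, method=cv2.RANSAC)
    if M is None:
        return None
    dtheta = np.arctan2(M[1, 0], M[0, 0])   # incremental heading change (rad)
    dx, dy = M[0, 2], M[1, 2]               # incremental translation (pixels)
    return dx, dy, dtheta

def integrate_odometry(frames):
    """Dead-reckon a planar pose by accumulating per-frame increments."""
    x = y = theta = 0.0
    path = [(x, y, theta)]
    prev = frames[0]
    for curr in frames[1:]:
        step = estimate_frame_motion(prev, curr)
        if step is not None:
            dx, dy, dtheta = step
            # Rotate the body-frame increment into the world frame.
            x += dx * np.cos(theta) - dy * np.sin(theta)
            y += dx * np.sin(theta) + dy * np.cos(theta)
            theta += dtheta
            path.append((x, y, theta))
        prev = curr
    return path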

BibTeX

@conference{Corke-2004-16938,
author = {Peter Ian Corke and Dennis Strelow and Sanjiv Singh},
title = {Omnidirectional Visual Odometry for a Planetary Rover},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {2004},
month = {September},
volume = {4},
pages = {4007--4012},
}