Omnidirectional Visual Odometry for a Planetary Rover

Peter Ian Corke, Dennis Strelow, and Sanjiv Singh
Proceedings of IROS 2004, 2004.


Download
  • Adobe portable document format (pdf) (338KB)

Abstract
Position estimation for planetary rovers has typically been limited to odometry based on proprioceptive measurements, such as the integration of distance traveled and measurement of heading change. Here we present and compare two methods of online visual odometry suited for planetary rovers. Both methods use omnidirectional imagery to estimate the motion of the rover. One method is based on robust estimation of optical flow and subsequent integration of the flow. The second method is a full structure-from-motion solution. To make the comparison meaningful, we use the same set of raw corresponding visual features for each method. The dataset is a sequence of 2000 images taken during a field experiment in the Atacama desert, for which high-resolution GPS ground truth is available.
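The Python sketch below is not from the paper; it is a minimal illustration of the flow-integration idea described in the abstract, with all function names and parameters assumed for the example. A per-frame planar motion is estimated robustly from tracked feature correspondences, and the per-frame motions are chained into a trajectory. Feature tracking and the omnidirectional camera geometry are assumed to be handled elsewhere.

    # Illustrative sketch only (not the authors' implementation): robust per-frame
    # planar motion from point correspondences, then integration into a trajectory.
    import numpy as np

    def fit_rigid_2d(p, q):
        """Least-squares 2D rotation + translation mapping points p -> q (Nx2 arrays)."""
        cp, cq = p.mean(axis=0), q.mean(axis=0)
        H = (p - cp).T @ (q - cq)          # 2x2 cross-covariance of centered points
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # guard against a reflection solution
            Vt[1, :] *= -1
            R = Vt.T @ U.T
        t = cq - R @ cp
        return R, t

    def robust_rigid_2d(p, q, iters=200, tol=1.0, rng=None):
        """RANSAC-style robust fit: sample minimal sets, keep the largest inlier set."""
        rng = rng or np.random.default_rng(0)
        best_inliers = None
        for _ in range(iters):
            idx = rng.choice(len(p), size=2, replace=False)
            R, t = fit_rigid_2d(p[idx], q[idx])
            err = np.linalg.norm((p @ R.T + t) - q, axis=1)
            inliers = err < tol
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers = inliers
        return fit_rigid_2d(p[best_inliers], q[best_inliers])   # refit on inliers

    def integrate(frame_motions):
        """Chain per-frame (R, t) estimates into a cumulative planar pose."""
        pose_R, pose_t = np.eye(2), np.zeros(2)
        trajectory = [pose_t.copy()]
        for R, t in frame_motions:
            pose_t = pose_R @ t + pose_t   # compose the new frame-to-frame motion
            pose_R = pose_R @ R
            trajectory.append(pose_t.copy())
        return np.array(trajectory)

The structure-from-motion method compared in the paper differs from this frame-to-frame integration in that it solves jointly for camera motion and scene structure over the image sequence rather than chaining independent per-frame estimates.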

Notes
Associated Center(s) / Consortia: Field Robotics Center
Associated Project(s): Sun Synchronous Navigation and VISTA
Number of pages: 6

Text Reference
Peter Ian Corke, Dennis Strelow, and Sanjiv Singh, "Omnidirectional Visual Odometry for a Planetary Rover," Proceedings of IROS 2004, 2004.

BibTeX Reference
@inproceedings{Corke_2004_4913,
   author = "Peter Ian Corke and Dennis Strelow and Sanjiv Singh",
   title = "Omnidirectional Visual Odometry for a Planetary Rover",
   booktitle = "Proceedings of IROS 2004",
   year = "2004",
}