
A Real-time Method for Depth Enhanced Monocular Odometry

Ji Zhang, Michael Kaess and Sanjiv Singh
Journal Article, Autonomous Robots (AURO), Vol. 41, No. 1, pp. 31-43, January 2017


Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.


Visual odometry can be augmented by depth information such as provided by RGB-D cameras, or from lidars associated with cameras. However, such depth information can be limited by the sensors, leaving large areas in the visual images where depth is unavailable. Here, we propose a method to utilize the depth, even if sparsely available, in recovery of camera motion. In addition, the method utilizes depth by structure from motion using the previously estimated motion, together with salient visual features for which depth is unavailable. The method is therefore able to extend RGB-D visual odometry to large-scale, open environments where depth often cannot be sufficiently acquired. The core of our method is a bundle adjustment step that refines the motion estimates in parallel by processing a sequence of images, in a batch optimization. We have evaluated our method in three sensor setups, one using an RGB-D camera, and two using combinations of a camera and a 3D lidar. Our method is ranked #4 on the KITTI odometry benchmark irrespective of sensing modality, compared to stereo visual odometry methods which retrieve depth by triangulation. The resulting average position error is 1.14% of the distance traveled.
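To make the idea concrete, the following is a minimal NumPy sketch of the depth-known case only: features with known 3D positions in the previous frame are matched to 2D observations in the current frame, and the frame-to-frame motion is refined by Gauss-Newton on the reprojection error. This is a toy illustration, not the authors' implementation; all function names are assumptions, and the paper's handling of depth-unavailable features and its batch bundle adjustment are omitted here.

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix so that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Rodrigues formula: axis-angle vector -> rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3) + skew(w)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def estimate_motion(pts_prev, obs_cur, iters=20):
    """Recover (R, t) mapping previous-frame points into the current frame
    by Gauss-Newton on the reprojection error (hypothetical helper, not the
    paper's code).

    pts_prev: (N, 3) features in the previous frame with known depth.
    obs_cur:  (N, 2) normalized image coordinates (x/z, y/z) in the current frame.
    """
    R, t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        J, r = [], []
        for X, u in zip(pts_prev, obs_cur):
            p = R @ X + t                       # feature in current frame
            x, y, z = p
            res = np.array([x / z, y / z]) - u  # 2-vector reprojection residual
            Jp = np.array([[1/z, 0.0, -x/z**2],  # d(projection)/dp
                           [0.0, 1/z, -y/z**2]])
            # Left perturbation R <- exp(dw) R, t <- t + dt linearizes as
            # dp = -skew(R X) dw + dt
            J.append(np.hstack([Jp @ (-skew(R @ X)), Jp]))
            r.append(res)
        J, r = np.vstack(J), np.hstack(r)
        delta = np.linalg.solve(J.T @ J, -J.T @ r)  # normal equations
        R = exp_so3(delta[:3]) @ R
        t = t + delta[3:]
    return R, t
```

On noise-free synthetic correspondences this converges to the true motion in a few iterations; in the paper's setting, such depth-known constraints are combined with bearing-only constraints from depth-unavailable features and later refined in a batch bundle adjustment.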

@article{zhang2017depthenhanced,
  author  = {Ji Zhang and Michael Kaess and Sanjiv Singh},
  title   = {A Real-time Method for Depth Enhanced Monocular Odometry},
  journal = {Autonomous Robots},
  year    = {2017},
  month   = {January},
  volume  = {41},
  number  = {1},
  pages   = {31-43},
}