Direct Disparity Space: Robust and Real-time Visual Odometry

Hatem Said Alismail and Brett Browning
Tech. Report CMU-RI-TR-14-20, Robotics Institute, Carnegie Mellon University, October 2014

Abstract

We present a direct visual odometry formulation based on a warping function in disparity space. In disparity space, measurement noise is well-modeled by a Gaussian distribution, in contrast to the heteroscedastic noise in 3D space. In addition, the Jacobian of the warp separates the rotation and translation terms, enabling motion to be estimated from all image points, including those located at infinity. Furthermore, we show that direct camera tracking achieves accurate and robust performance using only a fraction of the image pixels, selected by a simple and efficient strategy. Our approach runs faster than real time on a single CPU core with unoptimized code.

Because our approach does not rely on feature extraction, the pixels selected over successive frames are often unique. Hence, triangulating the selected pixels into the world frame produces an accurate and dense 3D reconstruction at minimal computational cost, making the method appealing to robotics and embedded applications.

We evaluate the performance of our approach against state-of-the-art methods on a range of urban and indoor datasets. Our algorithm delivers competitive performance, requires no specialized tuning, and continues to produce competitive results even with low-resolution images on which other techniques fail to operate.
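To make the disparity-space warp concrete, the following is a minimal sketch in our own notation, assuming a rectified stereo pair with focal length f, baseline b, and principal point (c_x, c_y); this is the standard disparity-space parameterization, not necessarily the report's exact formulation. Writing \bar{u} = u - c_x, \bar{v} = v - c_y, and q = (\bar{u}, \bar{v}, f)^\top, a rigid motion (R, t) maps a pixel (u, v) with disparity d to

\begin{pmatrix} u' - c_x \\ v' - c_y \\ d' \end{pmatrix}
  = \frac{f}{w}
    \begin{pmatrix}
      [\, R q + (d/b)\, t \,]_x \\
      [\, R q + (d/b)\, t \,]_y \\
      d
    \end{pmatrix},
  \qquad
  w = [\, R q + (d/b)\, t \,]_z .

The translation enters only through the product (d/b) t, so as d \to 0 the warp reduces to the pure-rotation homography. This is why rotation and translation decouple in the Jacobian, and why even points at infinity still constrain the rotation estimate.

The report describes the pixel selection strategy only as simple and efficient; a common choice in direct methods, sketched below in Python as an assumption rather than the report's actual strategy, is to keep the pixels with the largest image-gradient magnitude, since those constrain photometric alignment the most.

import numpy as np

def select_pixels(image, keep_fraction=0.1):
    # Hypothetical stand-in for the report's (unspecified) selection:
    # keep the keep_fraction of pixels with the largest gradient magnitude.
    gy, gx = np.gradient(image.astype(np.float64))  # per-axis derivatives
    mag = np.abs(gx) + np.abs(gy)                   # L1 gradient magnitude
    k = max(1, int(keep_fraction * mag.size))
    thresh = np.partition(mag.ravel(), -k)[-k]      # k-th largest magnitude
    return np.argwhere(mag >= thresh)               # (row, col) indices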

BibTeX

@techreport{Alismail-2014-7944,
author = {Hatem Said Alismail and Brett Browning},
title = {Direct Disparity Space: Robust and Real-time Visual Odometry},
year = {2014},
month = {October},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-14-20},
keywords = {visual odometry, stereo, disparity space, pose estimation},
}