Visual-lidar Odometry and Mapping: Low-drift, Robust, and Fast

Robotics Institute, Carnegie Mellon University

Conference Paper, Proceedings of the International Conference on Robotics and Automation (ICRA), pp. 2174-2181, May 2015

Abstract

Here we present a general framework for combining visual odometry and lidar odometry that is grounded in first principles. The method improves on the state of the art in performance, particularly in robustness to aggressive motion and to temporary loss of visual features. The proposed online method starts with visual odometry, which estimates ego-motion and registers point clouds from a scanning lidar at high frequency but low fidelity. Scan-matching-based lidar odometry then refines the motion estimate and the point cloud registration simultaneously. We show results on datasets collected in our own experiments as well as on the KITTI odometry benchmark. Our proposed method is ranked #1 on the benchmark in terms of average translation and rotation errors, with a relative position drift of 0.75%. In addition to comparing motion estimation accuracy, we evaluate the robustness of the method when the sensor suite moves at high speed and is subject to significant ambient lighting changes.
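The coarse-to-fine structure described above can be sketched as a toy loop: a fast, low-fidelity visual-odometry stage integrates ego-motion at high rate, and a slower scan-matching stage periodically refines the accumulated estimate. This is a minimal 2-D illustration of that structure only; the pose model, function names, and update rules are our assumptions, not the paper's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Toy 2-D pose (the paper operates on full 6-DOF motion)."""
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0

    def compose(self, dx, dy, dtheta):
        # Apply an incremental motion expressed in the body frame.
        c, s = math.cos(self.theta), math.sin(self.theta)
        return Pose2D(self.x + c * dx - s * dy,
                      self.y + s * dx + c * dy,
                      self.theta + dtheta)

def visual_odometry_step(pose, coarse_motion):
    # Stand-in for the high-frequency, low-fidelity ego-motion estimate.
    return pose.compose(*coarse_motion)

def lidar_refinement(pose, correction):
    # Stand-in for the lower-frequency scan-matching refinement: it nudges
    # the coarse estimate toward the scan-aligned solution.
    dx, dy, dtheta = correction
    return Pose2D(pose.x + dx, pose.y + dy, pose.theta + dtheta)

def run_pipeline(coarse_motions, corrections, refine_every=5):
    """Integrate fast VO steps; apply a lidar correction every few frames."""
    pose = Pose2D()
    trajectory = [pose]
    for i, motion in enumerate(coarse_motions):
        pose = visual_odometry_step(pose, motion)
        if (i + 1) % refine_every == 0 and corrections:
            pose = lidar_refinement(pose, corrections.pop(0))
        trajectory.append(pose)
    return trajectory
```

The key design point the abstract emphasizes is the division of labor: the visual stage keeps the estimate available at high rate even when scan matching is still running, while the lidar stage bounds the drift that the coarse stage accumulates.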

BibTeX

@conference{Zhang-2015-5940,
author = {Ji Zhang and Sanjiv Singh},
title = {Visual-lidar Odometry and Mapping: Low-drift, Robust, and Fast},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2015},
month = {May},
pages = {2174--2181},
}