
Laser-visual-inertial Odometry and Mapping with High Robustness and Low Drift

Journal Article, Journal of Field Robotics, Vol. 35, No. 8, pp. 1242–1264, August 2018

Abstract

We present a data processing pipeline to estimate ego-motion online and build a map of the traversed environment, leveraging data from a 3D laser scanner, a camera, and an inertial measurement unit (IMU). Unlike traditional methods that use a Kalman filter or factor-graph optimization, the proposed method employs a sequential, multilayer processing pipeline that solves for motion from coarse to fine. Starting with IMU mechanization for motion prediction, a visual-inertial coupled method estimates motion; a scan matching method then further refines the motion estimates and registers maps. The resulting system enables high-frequency, low-latency ego-motion estimation, along with dense, accurate 3D map registration. Further, the method handles sensor degradation by automatic reconfiguration that bypasses failed modules, so it can operate in the presence of highly dynamic motion as well as in dark, texture-less, and structure-less environments. In experiments, the method demonstrates relative position drift of 0.22% over 9.3 km of navigation and robustness to running, jumping, and even highway-speed driving (up to 33 m/s).
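The sketch below is a minimal illustration (not the authors' implementation) of the coarse-to-fine structure described in the abstract: IMU mechanization predicts motion, a visual-inertial module refines it, scan matching refines it again, and a degraded module is bypassed so the previous layer's estimate flows through unchanged. All class and method names here are hypothetical placeholders.

```python
# Hypothetical sketch of a sequential, coarse-to-fine odometry pipeline
# with automatic bypass of degraded modules. Not the paper's code.

from dataclasses import dataclass


@dataclass
class Pose:
    """6-DOF pose estimate; placeholder representation."""
    translation: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)  # quaternion (x, y, z, w)


class Module:
    """One layer of the pipeline. `healthy()` stands in for the
    degradation checks (e.g. too few visual features or scan-matching
    constraints) that would trigger reconfiguration."""

    def healthy(self) -> bool:
        return True

    def refine(self, prior: Pose, data) -> Pose:
        raise NotImplementedError


class ImuMechanization(Module):
    def refine(self, prior: Pose, imu_data) -> Pose:
        # Integrate angular rate and acceleration to predict motion.
        return prior  # placeholder


class VisualInertialOdometry(Module):
    def refine(self, prior: Pose, camera_data) -> Pose:
        # Couple visual features with IMU data to estimate motion.
        return prior  # placeholder


class ScanMatching(Module):
    def refine(self, prior: Pose, laser_scan) -> Pose:
        # Match the laser scan to the map and refine the motion estimate.
        return prior  # placeholder


def estimate(modules, prior: Pose, measurements) -> Pose:
    """Run the layers in sequence, coarse to fine. A module that reports
    degradation is skipped, so the previous layer's estimate passes on."""
    pose = prior
    for module, data in zip(modules, measurements):
        if module.healthy():
            pose = module.refine(pose, data)
    return pose


if __name__ == "__main__":
    pipeline = [ImuMechanization(), VisualInertialOdometry(), ScanMatching()]
    pose = estimate(pipeline, Pose(), [None, None, None])
    print(pose)
```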

BibTeX

@article{Zhang-2018-110816,
author = {Ji Zhang and Sanjiv Singh},
title = {Laser-visual-inertial Odometry and Mapping with High Robustness and Low Drift},
journal = {Journal of Field Robotics},
year = {2018},
month = {August},
volume = {35},
number = {8},
pages = {1242--1264},
keywords = {mapping, ego-motion estimation, 3D laser scanner, vision, inertial sensor},
}