MSR Thesis Talk – Weizhao Shao

MSR Speaking Qualifier


Shawn (Weizhao) Shao
MSR Student, Robotics Institute, Carnegie Mellon University
Tuesday, June 18, 10:30 am – 12:00 pm
NSH 4305

Title: Stereo Visual-Inertial-LiDAR Simultaneous Localization and Mapping



Simultaneous Localization and Mapping (SLAM) is a fundamental task in mobile and aerial robotics. The goal of SLAM is to use onboard sensors to estimate the robot's trajectory while reconstructing the surrounding environment (the map) in real time. The algorithm should also perform loop closure: detecting when the robot revisits a previously seen place and using that constraint to eliminate the drift accumulated over the loop. SLAM has been an appealing field of research over the past decades, in part because it is a rich mix of probabilistic estimation, optimization, and geometry, and in part because it is practically useful yet hard, involving everything from sensor calibration to full system integration.
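To make the loop-closure idea concrete, here is a toy 1-D example (illustrative only, not the method presented in the talk): odometry with a small constant bias accumulates drift around a closed loop, and a single loop-closure constraint (the robot ended where it started) lets the accumulated error be distributed back along the trajectory.

```python
def integrate_odometry(deltas, start=0.0):
    """Dead-reckoned poses from relative odometry measurements (1-D)."""
    poses = [start]
    for d in deltas:
        poses.append(poses[-1] + d)
    return poses

def close_loop(poses):
    """Distribute the loop-closure residual evenly along the trajectory.

    Assumes the robot returned to its start pose, so the residual is
    (last pose - first pose); step i absorbs the fraction i/n of it.
    """
    n = len(poses) - 1
    residual = poses[-1] - poses[0]
    return [p - residual * (i / n) for i, p in enumerate(poses)]

# True loop: four 1 m legs out, four 1 m legs back (net displacement zero).
true_deltas = [1.0] * 4 + [-1.0] * 4
# Odometry with a small constant bias, so drift accumulates over the loop.
noisy_deltas = [d + 0.05 for d in true_deltas]

drifted = integrate_odometry(noisy_deltas)
corrected = close_loop(drifted)
print(drifted[-1])    # about 0.4 m of accumulated drift
print(corrected[-1])  # back to the start pose after loop closure
```

Real pose-graph SLAM solves a nonlinear least-squares problem over 6-DOF poses rather than spreading the residual uniformly, but the effect is the same: one loop-closure constraint corrects drift along the whole loop.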

The community has been investigating different sensor modalities and exploiting their complementary benefits. LiDAR-based systems have proven accurate and robust in most scenarios; however, pure LiDAR-based systems fail in degenerate cases such as traveling through featureless tunnels or straight hallways. Vision-based systems are efficient and lightweight, but they depend on good data association to perform well and thus degrade badly in environments with few visual cues. An Inertial Measurement Unit (IMU) produces high-frequency measurements that are accurate over short intervals but drift quickly.
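The claim that IMU measurements drift quickly can be seen in a small sketch (a hypothetical 1-D example, with made-up numbers): a constant accelerometer bias, double-integrated into position, produces an error that grows quadratically with time.

```python
def dead_reckon_position(accels, dt):
    """Double-integrate accelerometer samples into position (1-D)."""
    v, p = 0.0, 0.0
    for a in accels:
        v += a * dt   # integrate acceleration into velocity
        p += v * dt   # integrate velocity into position
    return p

dt = 0.01     # 100 Hz IMU samples
bias = 0.02   # constant accelerometer bias in m/s^2 (assumed value)

# Position error grows roughly as 0.5 * bias * t^2.
for seconds in (1, 10, 100):
    n = int(seconds / dt)
    err = dead_reckon_position([bias] * n, dt)
    print(f"{seconds:>4} s -> {err:.2f} m error")
```

Centimeter-level error after one second becomes on the order of 100 m after a couple of minutes, which is why IMU data is fused with LiDAR or camera measurements rather than integrated on its own.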

In this thesis, I investigate the fusion of LiDAR, camera, and IMU for SLAM. I will begin with my implementation of a stereo visual-inertial odometry (VIO). Then I will discuss two coupling strategies between the VIO and a LiDAR mapping method. I will also present a LiDAR-enhanced visual loop-closure system that fully exploits the benefits of the sensor suite. The complete SLAM pipeline generates loop-closure-corrected 6-DOF LiDAR poses in real time and dense maps with 1 cm voxel resolution in near real time. It demonstrates improved accuracy and robustness compared to state-of-the-art LiDAR methods. Evaluations are performed on representative public datasets and on custom-collected datasets from diverse environments.



George Kantor (Advisor)

Michael Kaess

Eric Westman