AirDOS: Dynamic SLAM benefits from Articulated Objects

Yuheng Qiu, Chen Wang, Wenshan Wang, Mina Henein, and Sebastian Scherer
Conference Paper, Proceedings of (ICRA) International Conference on Robotics and Automation, May, 2022

Abstract

Dynamic Object-aware SLAM (DOS) exploits object-level information to enable robust motion estimation in dynamic environments. Existing methods mainly focus on identifying and excluding dynamic objects from the optimization. In this paper, we show that feature-based visual SLAM systems can also benefit from the presence of dynamic articulated objects by taking advantage of two observations: (1) the 3D structure of each rigid part of an articulated object remains consistent over time; (2) points on the same rigid part follow the same motion. In particular, we present AirDOS, a dynamic object-aware system that introduces rigidity and motion constraints to model articulated objects. By jointly optimizing the camera pose, object motion, and object 3D structure, we can rectify camera pose estimation, prevent tracking loss, and generate 4D spatio-temporal maps for both dynamic objects and static scenes. Experiments show that our algorithm improves the robustness of visual SLAM in challenging crowded urban environments. To the best of our knowledge, AirDOS is the first dynamic object-aware SLAM system demonstrating that camera pose estimation can be improved by incorporating dynamic articulated objects.
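The two observations in the abstract can be expressed as simple residual terms. The sketch below is an illustrative NumPy formulation, not the authors' implementation; function names and the dense pairwise-distance form are assumptions for clarity. The rigidity residual checks that pairwise distances between points on one rigid part are preserved across two frames, and the motion residual checks that a single SE(3) motion (R, t) explains all points on that part.

```python
import numpy as np

def rigidity_residuals(pts_t1, pts_t2):
    """Rigidity constraint (illustrative): pairwise distances between 3D
    points on the same rigid part should match between time t1 and t2.
    pts_t1, pts_t2: (N, 3) arrays of corresponding points."""
    d1 = np.linalg.norm(pts_t1[:, None, :] - pts_t1[None, :, :], axis=-1)
    d2 = np.linalg.norm(pts_t2[:, None, :] - pts_t2[None, :, :], axis=-1)
    return d1 - d2  # (N, N); zero everywhere for a perfectly rigid part

def motion_residuals(pts_t1, pts_t2, R, t):
    """Motion constraint (illustrative): every point on a rigid part
    follows the same rigid-body motion (R, t) between the two frames."""
    return pts_t2 - (pts_t1 @ R.T + t)  # (N, 3); zero if (R, t) fits all points
```

In a factor-graph SLAM back end, residuals of this form would enter the joint optimization alongside the usual reprojection terms, coupling the object's structure and motion with the camera poses.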

BibTeX

@conference{Qiu-2022-131125,
author = {Yuheng Qiu and Chen Wang and Wenshan Wang and Mina Henein and Sebastian Scherer},
title = {AirDOS: Dynamic SLAM benefits from Articulated Objects},
booktitle = {Proceedings of (ICRA) International Conference on Robotics and Automation},
year = {2022},
month = {May},
}