iSAM2: Incremental Smoothing and Mapping Using the Bayes Tree

Michael Kaess, Hordur Johannsson, Richard Roberts, Viorela Ila, John Leonard, and Frank Dellaert
Journal Article, International Journal of Robotics Research, Vol. 31, No. 2, pp. 217-236, February 2012

Abstract

We present a novel data structure, the Bayes tree, that provides an algorithmic foundation enabling a better understanding of existing graphical model inference algorithms and their connection to sparse matrix factorization methods. Similar to a clique tree, a Bayes tree encodes a factored probability density, but unlike the clique tree it is directed and maps more naturally to the square root information matrix of the simultaneous localization and mapping (SLAM) problem. In this paper, we highlight three insights provided by our new data structure. First, the Bayes tree provides a better understanding of the matrix factorization in terms of probability densities. Second, we show how the fairly abstract updates to a matrix factorization translate to a simple editing of the Bayes tree and its conditional densities. Third, we apply the Bayes tree to obtain a completely novel algorithm for sparse nonlinear incremental optimization, named iSAM2, which achieves improvements in efficiency through incremental variable re-ordering and fluid relinearization, eliminating the need for periodic batch steps. We analyze various properties of iSAM2 in detail, and show on a range of real and simulated datasets that our algorithm compares favorably with other recent mapping algorithms in both quality and efficiency.
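The abstract's central construction is variable elimination on a factor graph, which produces the conditional densities that the Bayes tree organizes. The following is a minimal sketch of symbolic elimination only (no numerical factorization, no incremental update): the factor graph, variable names (`x0`..`x2` as poses, `l1` as a landmark), and elimination order are illustrative assumptions, not taken from the paper.

```python
# Symbolic variable elimination on a toy SLAM-style factor graph.
# Each factor is the set of variables it involves; eliminating a variable v
# yields a conditional p(v | separator), and the resulting conditionals are
# what a Bayes net / Bayes tree encodes.

def eliminate(factors, order):
    """Eliminate variables in the given order.

    Returns a list of (variable, separator) pairs, where separator is the
    sorted list of variables that v is conditioned on after elimination.
    """
    factors = [set(f) for f in factors]
    conditionals = []
    for v in order:
        # Gather all factors touching v and form the separator.
        involved = [f for f in factors if v in f]
        separator = set().union(*involved) - {v}
        conditionals.append((v, sorted(separator)))
        # Replace the involved factors by one new factor on the separator.
        factors = [f for f in factors if v not in f]
        if separator:
            factors.append(separator)
    return conditionals

# Odometry chain x0-x1-x2, a prior on x0, and a landmark l1 seen from x0 and x2.
graph = [{"x0"}, {"x0", "x1"}, {"x1", "x2"}, {"x0", "l1"}, {"x2", "l1"}]
print(eliminate(graph, ["l1", "x0", "x1", "x2"]))
# → [('l1', ['x0', 'x2']), ('x0', ['x1', 'x2']), ('x1', ['x2']), ('x2', [])]
```

Grouping these conditionals by shared separators yields the cliques of the Bayes tree; the paper's iSAM2 algorithm then updates only the cliques affected by new measurements, rather than re-eliminating the whole graph.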

BibTeX

@article{Kaess-2012-7440,
author = {Michael Kaess and Hordur Johannsson and Richard Roberts and Viorela Ila and John Leonard and Frank Dellaert},
title = {iSAM2: Incremental Smoothing and Mapping Using the Bayes Tree},
journal = {International Journal of Robotics Research},
year = {2012},
month = {February},
volume = {31},
number = {2},
pages = {217--236},
keywords = {graphical models, clique tree, junction tree, probabilistic inference, sparse linear algebra, nonlinear optimization, smoothing and mapping, SLAM},
}