Information-Theoretic Online Multi-Camera Extrinsic Calibration

Eric Dexheimer, Patrick Peluse, Jianhui Chen, James Pritts, and Michael Kaess
Journal Article, IEEE Robotics and Automation Letters, Vol. 7, No. 2, pp. 4757-4764, April 2022

Abstract

Calibration of multi-camera systems is essential for lifelong use of vision-based headsets and autonomous robots. In this work, we present an information-based framework for online extrinsic calibration of multi-camera systems. While previous work largely focuses on monocular, stereo, or strictly non-overlapping field-of-view (FoV) setups, we allow arbitrary configurations and also exploit overlapping pairwise FoVs when possible. To efficiently solve for the extrinsic calibration parameters, which increase linearly with the number of cameras, we propose a novel entropy-based keyframe measure and bound the backend optimization complexity by selecting informative motion segments that minimize the maximum entropy across all extrinsic parameter partitions. We validate the pipeline on three distinct platforms to demonstrate the generality of the method for resolving the extrinsics and performing downstream tasks. Our code is available at https://github.com/edexheim/info_ext_calib.
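The following is a minimal sketch of the min-max entropy selection criterion described in the abstract, not the authors' implementation. It assumes Gaussian marginal covariances over each camera's 6-DoF extrinsic partition are available (e.g., from the calibration backend) for every candidate motion segment; the names gaussian_entropy, segment_score, and select_segment are illustrative.

import numpy as np

def gaussian_entropy(cov):
    # Differential entropy of a Gaussian with covariance `cov`:
    # h = 0.5 * log((2*pi*e)^d * det(cov))
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + logdet)

def segment_score(extrinsic_covs):
    # Score a candidate segment by the worst-case (maximum) entropy
    # over all per-camera extrinsic parameter partitions.
    return max(gaussian_entropy(cov) for cov in extrinsic_covs)

def select_segment(candidates):
    # candidates: one entry per candidate motion segment, each a list of
    # 6x6 marginal covariances (one per camera's extrinsics) predicted
    # after adding that segment. Keep the segment that minimizes the
    # maximum partition entropy.
    return min(range(len(candidates)), key=lambda i: segment_score(candidates[i]))

Minimizing the maximum per-partition entropy, rather than a summed measure, avoids segments that sharpen well-constrained extrinsics while leaving the most uncertain camera's parameters poorly observed.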

BibTeX

@article{Dexheimer-2022-134133,
author = {Eric Dexheimer and Patrick Peluse and Jianhui Chen and James Pritts and Michael Kaess},
title = {Information-Theoretic Online Multi-Camera Extrinsic Calibration},
journal = {IEEE Robotics and Automation Letters},
year = {2022},
month = {April},
volume = {7},
number = {2},
pages = {4757--4764},
}