
Visual Pipe Mapping with a Fisheye Camera

Peter Hansen, Hatem Said Alismail, Peter Rander, and Brett Browning
Tech. Report, CMU-RI-TR-13-02, Robotics Institute, Carnegie Mellon University, February, 2013

Abstract

We present a vision-based mapping and localization system for operations in pipes such as those found in Liquefied Natural Gas (LNG) production. A forward-facing fisheye camera mounted on a prototype robot collects imagery as it is tele-operated through a pipe network. The images are processed offline to estimate camera pose and sparse scene structure, and the results can be used to generate 3D renderings of the pipe surface. The method extends state-of-the-art visual odometry and mapping for fisheye systems by incorporating geometric constraints, derived from prior knowledge of the pipe components, into a Sparse Bundle Adjustment framework. These constraints significantly reduce inaccuracies resulting from the limited spatial resolution of the fisheye imagery, limited image texture, and visual aliasing. Preliminary results are presented for a dataset collected in a fiberglass pipe network, which demonstrate the validity of the approach.
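The core idea of folding a pipe-geometry prior into bundle adjustment can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it uses a simple pinhole projection in place of the calibrated fisheye model, a single translation-only camera, and illustrative values (PIPE_RADIUS, PIPE_AXIS, PRIOR_WEIGHT, FOCAL) that are assumptions, not parameters from the report. It shows how reprojection residuals can be augmented with a penalty tying 3D points to a cylinder of known radius, so the prior constrains the reconstruction where image texture alone is weak.

```python
# Hedged sketch of a cylinder-constrained bundle adjustment cost (not the report's code).
import numpy as np
from scipy.optimize import least_squares

PIPE_RADIUS = 0.15                       # assumed known pipe radius (m)
PIPE_AXIS = np.array([1.0, 0.0, 0.0])    # assumed pipe axis direction (unit vector)
PRIOR_WEIGHT = 10.0                      # weight of geometric prior vs. reprojection error
FOCAL = 300.0                            # illustrative focal length (px)

def project(points_cam):
    """Simple pinhole projection; the real system uses a calibrated fisheye model."""
    return FOCAL * points_cam[:, :2] / points_cam[:, 2:3]

def residuals(params, n_pts, observations, cam_t):
    """Reprojection residuals plus a radial prior tying points to the pipe wall."""
    pts = params.reshape(n_pts, 3)
    pts_cam = pts - cam_t                # single camera, translation-only for brevity
    reproj = (project(pts_cam) - observations).ravel()
    # Distance of each point from the pipe axis should equal the known radius.
    radial = np.linalg.norm(pts - np.outer(pts @ PIPE_AXIS, PIPE_AXIS), axis=1)
    prior = PRIOR_WEIGHT * (radial - PIPE_RADIUS)
    return np.concatenate([reproj, prior])

# Toy example: points on the pipe wall observed from one camera position.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 20)
x = rng.uniform(0.5, 2.0, 20)
true_pts = np.stack([x, PIPE_RADIUS * np.cos(theta), PIPE_RADIUS * np.sin(theta)], axis=1)
cam_t = np.array([0.0, 0.0, -0.5])
obs = project(true_pts - cam_t) + rng.normal(0.0, 0.5, (20, 2))

x0 = (true_pts + rng.normal(0.0, 0.02, true_pts.shape)).ravel()
result = least_squares(residuals, x0, args=(20, obs, cam_t))
print("final cost:", result.cost)
```

In a full system the optimization would also solve for the camera poses and use the fisheye projection model; the weighted prior residual plays the same role there, pulling wall points toward the known pipe surface and reducing drift caused by limited texture and visual aliasing.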

Notes
Cross listed as CMU-CS-QTR-116

BibTeX

@techreport{Hansen-2013-7666,
author = {Peter Hansen and Hatem Said Alismail and Peter Rander and Brett Browning},
title = {Visual Pipe Mapping with a Fisheye Camera},
year = {2013},
month = {February},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-13-02},
keywords = {Robotics, computer vision, pipe inspection, LNG, 3D mapping, visual mapping, visual odometry, SLAM, sparse bundle adjustment, structure from motion, fisheye},
}