
Bridging Text Spotting and SLAM with Junction Features

Hsueh-Cheng Wang, Chelsea Finn, Liam Paull, Michael Kaess, Ruth Rosenholtz, Seth Teller and John Leonard
Conference Paper, IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), pp. 3701-3708, September 2015



Abstract

Navigating in a previously unknown environment and recognizing naturally occurring text in a scene are two important autonomous capabilities that are typically treated as distinct. However, these two tasks are potentially complementary: (i) scene and pose priors can benefit text spotting, and (ii) the ability to identify and associate text features can benefit navigation accuracy through loop closures. Previous approaches to autonomous text spotting typically require significant training data and are too slow for real-time implementation. In this work, we propose a novel high-level feature descriptor, the “junction”, which is particularly well-suited to text representation and is also fast to compute. We show that we are able to improve SLAM through text spotting on datasets collected with a Google Tango device, illustrating how location priors enable improved loop closure with text features.
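The junction descriptor itself is defined in the paper; as a rough illustration of the underlying idea, the sketch below locates stroke junctions in a binarized text image by skeletonizing the strokes and keeping skeleton pixels with three or more skeleton neighbors. This is a generic branch-point detector written against scikit-image and SciPy, not the authors' descriptor, and the image path is a placeholder.

```python
# Hypothetical sketch: find junction-like branch points in text strokes.
# Assumes dark text on a light background; "sign_crop.png" is a placeholder path.
import numpy as np
from scipy.ndimage import convolve
from skimage.io import imread
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def junction_points(image_path):
    """Return (row, col) coordinates of stroke junction points in a text image."""
    gray = imread(image_path, as_gray=True)
    binary = gray < threshold_otsu(gray)        # foreground = text strokes
    skeleton = skeletonize(binary)              # one-pixel-wide stroke skeleton

    # Count the 8-connected skeleton neighbors of every skeleton pixel.
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]])
    neighbor_count = convolve(skeleton.astype(np.uint8), kernel, mode="constant")

    # A skeleton pixel with three or more neighbors is a branch (junction) point.
    junctions = skeleton & (neighbor_count >= 3)
    return np.argwhere(junctions)

# Example usage:
# coords = junction_points("sign_crop.png")
```

Such branch points are cheap to compute per character, which conveys why a junction-style representation can be attractive for real-time use compared with learned text detectors that need significant training data.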

BibTeX Reference
@conference{Wang-2015-6030,
author = {Hsueh-Cheng Wang and Chelsea Finn and Liam Paull and Michael Kaess and Ruth Rosenholtz and Seth Teller and John Leonard},
title = {Bridging Text Spotting and SLAM with Junction Features},
booktitle = {IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS)},
year = {2015},
month = {September},
pages = {3701--3708},
}