
Online Environment Reconstruction for Biped Navigation

Philipp Michel, Joel Chestnutt, Satoshi Kagami, Koichi Nishiwaki, James Kuffner, and Takeo Kanade
Conference Paper, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 3089–3094, May 2006

Abstract

As navigation autonomy becomes an increasingly important research topic for biped humanoid robots, efficient approaches to perception and mapping that suit the unique characteristics of humanoids and their typical operating environments will be required. This paper presents a system for online environment reconstruction that utilizes both external sensors for global localization and on-body sensors for detailed local mapping. An external optical motion capture system accurately localizes the on-board sensors, which integrate successive views from a calibrated camera and range measurements from a SwissRanger SR-2 time-of-flight sensor to construct global environment maps in real time. Environment obstacle geometry is encoded in 2D occupancy grids and 2.5D height maps for navigation planning. We present an on-body implementation for the HRP-2 humanoid robot that, combined with a footstep planner, enables the robot to autonomously traverse dynamic environments containing unpredictably moving obstacles.
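
The mapping pipeline described in the abstract lends itself to a brief sketch. The Python fragment below is a minimal illustration only, not the authors' implementation: the grid dimensions, cell size, height threshold, and the helper integrate_range_scan are all assumed for the example. It shows how range points, localized in the world frame by an externally measured sensor pose (here taken to come from motion capture), could be fused into a 2.5D height map and then thresholded into a 2D occupancy grid for footstep planning.

import numpy as np

# Grid parameters (assumed values, for illustration only).
CELL_SIZE = 0.05          # 5 cm cells
GRID_DIM = 200            # 10 m x 10 m map centered on the world origin
height_map = np.full((GRID_DIM, GRID_DIM), -np.inf)  # 2.5D height map

def integrate_range_scan(points_sensor, T_world_sensor):
    """Fuse one range scan into the global height map.

    points_sensor  : (N, 3) array of 3D points in the sensor frame,
                     e.g. from a SwissRanger-style time-of-flight camera.
    T_world_sensor : 4x4 homogeneous transform of the sensor in the
                     world frame, here assumed to be supplied by an
                     external motion-capture system localizing the
                     on-body sensor.
    """
    # Transform the points into the world frame.
    homo = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
    points_world = (T_world_sensor @ homo.T).T[:, :3]

    # Project each point into the grid and keep the max height per cell.
    ix = (points_world[:, 0] / CELL_SIZE + GRID_DIM / 2).astype(int)
    iy = (points_world[:, 1] / CELL_SIZE + GRID_DIM / 2).astype(int)
    valid = (ix >= 0) & (ix < GRID_DIM) & (iy >= 0) & (iy < GRID_DIM)
    np.maximum.at(height_map, (ix[valid], iy[valid]), points_world[valid, 2])

# Example call: a single dummy point observed from the world origin.
integrate_range_scan(np.array([[1.0, 0.0, 0.3]]), np.eye(4))

# A 2D occupancy grid for planning can then be thresholded from the
# height map; cells taller than a step-over limit become obstacles
# (the 15 cm limit here is an assumed value).
occupancy = height_map > 0.15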

BibTeX

@conference{Michel-2006-9446,
author = {Philipp Michel and Joel Chestnutt and Satoshi Kagami and Koichi Nishiwaki and James Kuffner and Takeo Kanade},
title = {Online Environment Reconstruction for Biped Navigation},
booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
year = {2006},
month = {May},
pages = {3089--3094},
}