Augmented Reality for Robot Development and Experimentation

Michael Stilman, Philipp Michel, Joel Chestnutt, Koichi Nishiwaki, Satoshi Kagami, and James Kuffner
tech. report CMU-RI-TR-05-55, Robotics Institute, Carnegie Mellon University, December 2005


Download
  • Adobe portable document format (pdf) (7MB)

Abstract
The successful development of autonomous robotic systems requires careful fusion of complex subsystems for perception, planning, and control. Often these subsystems are designed in a modular fashion and tested individually. However, when they are ultimately combined with other components to form a complete system, unexpected interactions between subsystems can occur that make it difficult to isolate the source of problems. This paper presents a novel paradigm for robot experimentation that enables unified testing of individual subsystems while they act as part of a complete whole composed of both virtual and real components. We exploit recent advances in the speed and accuracy of optical motion capture to localize the robot, track environment objects, and extract extrinsic parameters for moving cameras in real time. We construct a world model representation that serves as ground truth for both visual and tactile sensors in the environment. From this data, we build spatial and temporal correspondences between virtual elements, such as motion plans, and real artifacts in the scene. The system enables safe, decoupled testing of component algorithms for vision, motion planning, and control that would normally have to be tested simultaneously on actual hardware. We show results of successful online applications in the development of an autonomous humanoid robot.
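The abstract mentions extracting extrinsic parameters for moving cameras from motion capture. A minimal sketch of that idea: if the mocap system reports the camera rig's pose in the world frame (a camera-to-world rotation and translation), the world-to-camera extrinsic transform is simply its inverse. The function names, signatures, and the pinhole projection step below are illustrative assumptions, not the report's actual implementation.

```python
import numpy as np

def extrinsics_from_mocap(R_wc: np.ndarray, t_wc: np.ndarray):
    """Invert a camera pose reported by motion capture.

    R_wc, t_wc: camera-to-world rotation (3x3) and translation (3,)
    Returns the world-to-camera extrinsics (R, t), i.e. the rigid
    transform that maps world points into the camera frame.
    """
    R = R_wc.T            # inverse of a rotation is its transpose
    t = -R_wc.T @ t_wc    # inverse translation
    return R, t

def project_point(K: np.ndarray, R: np.ndarray, t: np.ndarray,
                  p_world: np.ndarray) -> np.ndarray:
    """Project a 3-D world point to pixel coordinates with a simple
    pinhole model: intrinsics K, extrinsics (R, t)."""
    p_cam = R @ p_world + t      # world frame -> camera frame
    uvw = K @ p_cam              # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]      # perspective divide

# Usage: a camera 5 m behind the world origin, looking along +z.
R_wc = np.eye(3)
t_wc = np.array([0.0, 0.0, -5.0])
R, t = extrinsics_from_mocap(R_wc, t_wc)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
pixel = project_point(K, R, t, np.array([0.0, 0.0, 0.0]))
# A world point on the optical axis lands at the principal point (320, 240).
```

In the paper's setting, (R_wc, t_wc) would come from markers rigidly attached to the camera, updated every mocap frame, so a moving camera stays calibrated against the ground-truth world model without per-frame visual calibration.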

Keywords
augmented reality, robot experimentation, motion capture, real-time motion capture, robot control, robot planning, visualization

Notes
Associated Center(s) / Consortia: Vision and Autonomous Systems Center and Center for the Foundations of Robotics
Associated Lab(s) / Group(s): Planning and Autonomy Lab
Associated Project(s): Perception for Humanoid Robots
Number of pages: 11

Text Reference
Michael Stilman, Philipp Michel, Joel Chestnutt, Koichi Nishiwaki, Satoshi Kagami, and James Kuffner, "Augmented Reality for Robot Development and Experimentation," tech. report CMU-RI-TR-05-55, Robotics Institute, Carnegie Mellon University, December 2005

BibTeX Reference
@techreport{Stilman_2005_5227,
   author      = "Michael Stilman and Philipp Michel and Joel Chestnutt and Koichi Nishiwaki and Satoshi Kagami and James Kuffner",
   title       = "Augmented Reality for Robot Development and Experimentation",
   institution = "Robotics Institute, Carnegie Mellon University",
   month       = "December",
   year        = "2005",
   number      = "CMU-RI-TR-05-55",
   address     = "Pittsburgh, PA",
}