InvisibleRobot: Facilitating Robot Manipulation Through Diminished Reality

Conference Paper, Proceedings of the IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct '19), pp. 165-166, October 2019

Abstract

When controlling robots, users often face the issue of an operating area that is occluded by an element of the environment or by the robot's own body. To gain an unobstructed view of the scene, users have to either adjust the pose of the robot or their own viewpoint. This presents a problem especially for users who rely on assistive robots, as they cannot easily change their point of view. We introduce InvisibleRobot, a diminished reality-based approach that overlays background information onto the robot in the user's view through an Optical See-Through Head-Mounted Display. We consider two visualization modes for InvisibleRobot: removing the robot body from the user's view entirely, or removing the interior of the robot while maintaining its outline. In a preliminary user study, we compare InvisibleRobot with traditional robot manipulation under different occlusion conditions. Our results suggest that InvisibleRobot can support manipulation in occluded conditions and could be an efficient method to simplify control in assistive robotics.
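
The two visualization modes can be illustrated with a minimal compositing sketch. Assuming a per-frame binary mask of the robot (e.g., rendered from the robot model and an estimated head pose) and a reconstructed background image are already available, full removal reduces to replacing the masked pixels with background, and the outline mode additionally redraws the robot's silhouette. The function name, parameters, and OpenCV-based pipeline below are illustrative assumptions, not the paper's implementation or its OST-HMD rendering path.

# Minimal sketch of the two visualization modes described in the abstract.
# Assumes a precomputed background image and a binary robot mask; names and
# parameters are hypothetical, not taken from the paper.
import cv2
import numpy as np

def diminish_robot(frame, background, robot_mask, keep_outline=False,
                   outline_color=(0, 255, 0), outline_thickness=2):
    """Composite background pixels over the robot region of `frame`.

    frame, background: HxWx3 uint8 images from the user's viewpoint.
    robot_mask: HxW uint8 mask, nonzero where the robot occludes the scene.
    keep_outline: if True, remove only the robot's interior and redraw its
                  silhouette so the user retains a cue of the robot's pose.
    """
    mask = robot_mask > 0
    out = frame.copy()
    # Mode 1: remove the robot body entirely by filling it with background.
    out[mask] = background[mask]
    if keep_outline:
        # Mode 2: keep the robot's outline by drawing the mask contours back in.
        contours, _ = cv2.findContours(robot_mask.astype(np.uint8),
                                       cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(out, contours, -1, outline_color, outline_thickness)
    return out

In practice the mask and background would need to be registered to the display for every frame; the sketch only shows the per-pixel compositing choice that distinguishes the two modes.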

BibTeX

@conference{Plopski-2019-122469,
author = {Alexander Plopski and Ada Virginia Taylor and Elizabeth Jeanne Carter and Henny Admoni},
title = {InvisibleRobot: Facilitating Robot Manipulation Through Diminished Reality},
booktitle = {Proceedings of IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct '19)},
year = {2019},
month = {October},
pages = {165--166},
}