Carnegie Mellon University
RoGuE: Robot Gesture Engine

Rachel Holladay and Siddhartha Srinivasa
AAAI Spring Symposium Series, March, 2016.


Download
  • Adobe portable document format (pdf) (2MB)
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Abstract
We present the Robot Gesture Engine (RoGuE), a motion-planning approach to generating gestures. Gestures improve robot communication skills, strengthening robots as partners in a collaborative setting. Previous work maps from environment scenario to gesture selection. This work maps from gesture selection to gesture execution. We create a flexible and common language by parameterizing gestures as task-space constraints on robot trajectories and goals. This allows us to leverage powerful motion planners and to generalize across environments and robot morphologies. We demonstrate RoGuE on four robots: HERB, ADA, CURI and the PR2.
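To make the parameterization concrete, the sketch below illustrates the general idea of encoding a gesture as a task-space goal constraint. This is a hypothetical, simplified example (position-only bounds, an invented `pointing_goal_tsr` helper, and illustrative standoff/slack values), not the paper's implementation, which uses full task-space regions over end-effector poses:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TaskSpaceRegion:
    """Simplified task-space constraint: a frame origin in world coordinates
    plus per-axis position bounds on the end-effector offset from that frame.
    (The full formulation also constrains orientation.)"""
    origin: np.ndarray   # (3,) constraint frame origin in world coordinates
    bounds: np.ndarray   # (3, 2) min/max allowed offset along each axis

    def contains(self, p: np.ndarray) -> bool:
        # A candidate end-effector position satisfies the constraint when its
        # offset from the frame origin lies inside the per-axis bounds.
        d = p - self.origin
        return bool(np.all(d >= self.bounds[:, 0]) and np.all(d <= self.bounds[:, 1]))

def pointing_goal_tsr(target: np.ndarray, standoff: float = 0.15) -> TaskSpaceRegion:
    """Hypothetical gesture parameterization: a 'point' gesture succeeds if the
    end effector ends anywhere in a small box hovering `standoff` metres above
    the indicated object. Any pose in the region is an acceptable goal, so a
    standard motion planner can sample goals from it."""
    slack = 0.05  # illustrative tolerance, in metres
    origin = target + np.array([0.0, 0.0, standoff])
    bounds = np.array([[-slack, slack],   # x tolerance
                       [-slack, slack],   # y tolerance
                       [0.0,   slack]])   # z: at or slightly above the standoff
    return TaskSpaceRegion(origin, bounds)

# Usage: build the goal region for an object at (0.5, 0.0, 0.7) and test poses.
tsr = pointing_goal_tsr(np.array([0.5, 0.0, 0.7]))
print(tsr.contains(np.array([0.52, 0.01, 0.87])))  # hovering above the target
print(tsr.contains(np.array([0.5, 0.0, 0.7])))     # at the object itself
```

Because the constraint is expressed in task space rather than joint space, the same gesture specification can, in principle, be handed to planners for robots with different morphologies.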

Keywords
robot, gesture, human robot interaction, task space region, constraints

Notes
Associated Center(s) / Consortia: Quality of Life Technology Center, National Robotics Engineering Center, and Center for the Foundations of Robotics
Associated Lab(s) / Group(s): Personal Robotics
Number of pages: 8

Text Reference
Rachel Holladay and Siddhartha Srinivasa, "RoGuE: Robot Gesture Engine," AAAI Spring Symposium Series, March, 2016.

BibTeX Reference
@inproceedings{Holladay__2016_8048,
   author = "Rachel Holladay and Siddhartha Srinivasa",
   title = "RoGuE: Robot Gesture Engine",
   booktitle = "AAAI Spring Symposium Series",
   month = "March",
   year = "2016",
}