An Architecture for Gesture Based Control of Mobile Robots

Soshi Iba, J. Michael Vandeweghe, Chris Paredis, and Pradeep Khosla
Conference Paper, Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 2, pp. 851-857, October, 1999

Abstract

Gestures provide a rich and intuitive form of interaction for controlling robots. This paper presents an approach for controlling a mobile robot with hand gestures. The system uses hidden Markov models (HMMs) to spot and recognize gestures captured with a data glove. To spot gestures from a sequence of hand positions that may include non-gestures, we have introduced a "wait state" in the HMM. The system is currently capable of spotting six gestures reliably. These gestures are mapped to robot commands under two different modes of operation: local and global control. In the local control mode, the gestures are interpreted in the robot's local frame of reference, allowing the user to accelerate, decelerate, and turn. In the global control mode, the gestures are interpreted in the world frame, allowing the robot to move to the location at which the user is pointing.
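The wait-state idea can be illustrated with a toy decoder. The sketch below is not the authors' implementation: the states, observation alphabet, and all probabilities are invented for illustration. A "wait" state loops on non-gesture observations (e.g. a resting hand), and a gesture is spotted whenever the most likely state path, computed with Viterbi decoding, leaves the wait state.

```python
import numpy as np

# Hypothetical 3-state HMM. State 0 = "wait" (absorbs non-gesture motion);
# states 1-2 model two phases of a single gesture. Observations are
# discretized hand poses: 0 = rest, 1 = raise, 2 = drop. All numbers
# are illustrative, not from the paper.
A = np.array([[0.8, 0.2, 0.0],   # wait loops on itself or enters the gesture
              [0.0, 0.6, 0.4],   # gesture phase 1 -> phase 2
              [0.3, 0.0, 0.7]])  # gesture end falls back into wait
B = np.array([[0.8, 0.1, 0.1],   # wait mostly emits "rest"
              [0.1, 0.8, 0.1],   # phase 1 mostly emits "raise"
              [0.1, 0.1, 0.8]])  # phase 2 mostly emits "drop"
pi = np.array([1.0, 0.0, 0.0])   # always start in the wait state

def viterbi(obs):
    """Most likely state path for an observation sequence (log-space)."""
    T, N = len(obs), len(pi)
    logA, logB = np.log(A + 1e-12), np.log(B + 1e-12)
    delta = np.log(pi + 1e-12) + logB[:, obs[0]]
    psi = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + logA        # scores[i, j]: best i -> j
        psi[t] = scores.argmax(axis=0)        # best predecessor of each j
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):             # backtrack through psi
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

obs = [0, 0, 1, 1, 2, 2, 0, 0]                # rest, raise..., drop..., rest
path = viterbi(obs)
spotted = any(s != 0 for s in path)           # path left the wait state
```

On this sequence the decoded path is `[0, 0, 1, 1, 2, 2, 0, 0]`: the model sits in the wait state during the resting frames and only enters the gesture states for the raise/drop segment, which is the spotting behavior the wait state is meant to provide.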

BibTeX

@conference{Iba-1999-15033,
author = {Soshi Iba and J. Michael Vandeweghe and Chris Paredis and Pradeep Khosla},
title = {An Architecture for Gesture Based Control of Mobile Robots},
booktitle = {Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {1999},
month = {October},
volume = {2},
pages = {851--857},
}