DigitEyes: Vision-Based Hand Tracking for Human-Computer Interaction

Jim Rehg and Takeo Kanade
Workshop Paper, IEEE Workshop on Motion of Non-Rigid and Articulated Objects, pp. 16 - 22, November, 1994

Abstract

Computer sensing of hand and limb motion is an important problem for applications in human-computer interaction (HCI), virtual reality, and athletic performance measurement. Commercially available sensors are invasive, requiring the user to wear gloves or targets. We have developed a noninvasive vision-based hand tracking system, called DigitEyes. Employing a kinematic hand model, the DigitEyes system has demonstrated tracking performance at speeds of up to 10 Hz, using line and point features extracted from grayscale images of unadorned, unmarked hands. We describe an application of our sensor to a 3D mouse user-interface problem.
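To illustrate the kind of model-based fitting the abstract describes (estimating kinematic hand-model state from image features), below is a minimal Python sketch: a planar three-link finger whose joint angles are recovered from a single observed fingertip point by damped Gauss-Newton. The link lengths, function names, and the use of one point feature are illustrative assumptions; the actual DigitEyes system fits a full articulated hand model to many line and point features over image sequences.

import numpy as np

# Planar 3-link finger: joint angles -> fingertip position (forward kinematics).
# Link lengths are hypothetical, not the paper's calibrated hand model.
LINK_LENGTHS = np.array([4.5, 2.5, 2.0])  # cm

def fingertip(theta):
    """Fingertip (x, y) for joint angles theta (radians), measured cumulatively."""
    angles = np.cumsum(theta)
    return np.array([np.sum(LINK_LENGTHS * np.cos(angles)),
                     np.sum(LINK_LENGTHS * np.sin(angles))])

def jacobian(theta, eps=1e-6):
    """Numerical Jacobian of the fingertip position w.r.t. the joint angles."""
    J = np.zeros((2, len(theta)))
    for i in range(len(theta)):
        d = np.zeros_like(theta)
        d[i] = eps
        J[:, i] = (fingertip(theta + d) - fingertip(theta - d)) / (2 * eps)
    return J

def fit_joint_angles(observed_tip, theta0, iters=20):
    """Damped Gauss-Newton fit of joint angles to one observed point feature."""
    theta = theta0.copy()
    for _ in range(iters):
        r = observed_tip - fingertip(theta)          # image-feature residual
        J = jacobian(theta)
        # Small ridge term keeps J^T J invertible when the pose is underdetermined.
        step = np.linalg.solve(J.T @ J + 1e-6 * np.eye(len(theta)), J.T @ r)
        theta += step
    return theta

if __name__ == "__main__":
    true_theta = np.array([0.3, 0.5, 0.4])
    observed = fingertip(true_theta)                 # stand-in for an extracted image feature
    estimate = fit_joint_angles(observed, np.zeros(3))
    print("recovered fingertip:", fingertip(estimate), "target:", observed)

In the full system this local fit would be run per frame, initialized from the previous frame's state, which is what allows tracking at interactive rates.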

BibTeX

@inproceedings{Rehg-1994-13790,
author = {Jim Rehg and Takeo Kanade},
title = {DigitEyes: Vision-Based Hand Tracking for Human-Computer Interaction},
booktitle = {Proceedings of IEEE Workshop on Motion of Non-Rigid and Articulated Objects},
year = {1994},
month = {November},
pages = {16--22},
}