DigitEyes: Vision-Based Hand Tracking for Human-Computer Interaction

Jim Rehg and Takeo Kanade
IEEE Workshop on Motion of Non-Rigid and Articulated Objects, November, 1994, pp. 16-22.


Download
  • Adobe Portable Document Format (PDF) (233 KB)
Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Abstract
Computer sensing of hand and limb motion is an important problem for applications in human-computer interaction (HCI), virtual reality, and athletic performance measurement. Commercially available sensors are invasive and require the user to wear gloves or targets. We have developed a noninvasive, vision-based hand tracking system called DigitEyes. Employing a kinematic hand model, the DigitEyes system has demonstrated tracking performance at speeds of up to 10 Hz, using line and point features extracted from grayscale images of unadorned, unmarked hands. We describe an application of our sensor to a 3D mouse user-interface problem.
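The core idea of kinematic model-based tracking, as summarized in the abstract, can be illustrated with a toy sketch (this is not the DigitEyes implementation): estimate the joint angles of an articulated chain by minimizing the residual between the model's projected features and the observed image features. Below, a simplified planar two-link "finger" is fit by Gauss-Newton with a finite-difference Jacobian; the model, function names, and link lengths are all illustrative assumptions.

```python
import numpy as np

def forward_kinematics(angles, link_lengths=(1.0, 1.0)):
    """Feature projection of a toy planar two-link finger model.

    Returns the 2D positions of the middle joint and the fingertip,
    stacked into one feature vector (the "point features" being tracked).
    """
    t1, t2 = angles
    l1, l2 = link_lengths
    joint = np.array([l1 * np.cos(t1), l1 * np.sin(t1)])
    tip = joint + np.array([l2 * np.cos(t1 + t2), l2 * np.sin(t1 + t2)])
    return np.concatenate([joint, tip])

def track_step(angles, observed, iters=20, eps=1e-6):
    """One tracking update: Gauss-Newton fit of the joint angles so the
    model's projected features match the observed image features."""
    x = np.asarray(angles, dtype=float)
    for _ in range(iters):
        r = observed - forward_kinematics(x)  # feature residual
        # Finite-difference Jacobian of the features w.r.t. joint angles.
        J = np.zeros((4, 2))
        for j in range(2):
            dx = np.zeros(2)
            dx[j] = eps
            J[:, j] = (forward_kinematics(x + dx) - forward_kinematics(x)) / eps
        # Least-squares Gauss-Newton step.
        x = x + np.linalg.lstsq(J, r, rcond=None)[0]
    return x

# Synthesize features from a "true" pose, then recover it from a nearby
# initial guess, as a tracker would do frame to frame.
true_angles = np.array([0.4, 0.6])
features = forward_kinematics(true_angles)
estimate = track_step(np.array([0.3, 0.5]), features)
```

In a real tracker of this style, the previous frame's pose initializes the fit for the current frame, which is why the estimate only needs to converge from a nearby starting guess.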

Text Reference
Jim Rehg and Takeo Kanade, "DigitEyes: Vision-Based Hand Tracking for Human-Computer Interaction," IEEE Workshop on Motion of Non-Rigid and Articulated Objects, November, 1994, pp. 16-22.

BibTeX Reference
@inproceedings{Rehg_1994_1732,
   author = "Jim Rehg and Takeo Kanade",
   title = "DigitEyes: Vision-Based Hand Tracking for Human-Computer Interaction",
   booktitle = "IEEE Workshop on Motion of Non-Rigid and Articulated Objects",
   pages = "16-22",
   month = "November",
   year = "1994",
}