Model-Based Tracking of Self-Occluding Articulated Objects

Jim Rehg and Takeo Kanade
Proceedings of the Fifth International Conference on Computer Vision (ICCV '95), July, 1995, pp. 612-617.


Download
  • Adobe portable document format (pdf) (253KB)

Abstract
Computer sensing of hand and limb motion is an important problem for applications in human-computer interaction and computer graphics. We describe a framework for local tracking of self-occluding motion, in which one part of an object obstructs the visibility of another. Our approach uses a kinematic model to predict occlusions and windowed templates to track partially occluded objects. We present offline 3D tracking results for hand motion with significant self-occlusion.
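
As a rough illustration of the general idea only (not the authors' implementation), the sketch below shows template matching restricted to pixels that an occlusion prediction marks as visible. Here the occlusion mask is set by hand, whereas the paper derives occlusion predictions from a kinematic model; the function names (masked_ssd, track_window), the toy image, and all parameter values are illustrative assumptions.

    # Toy 2D sketch: windowed template tracking with an occlusion mask.
    # Assumption: the mask would normally come from a kinematic model's
    # occlusion prediction; here it is fixed for illustration.
    import numpy as np

    def masked_ssd(window, template, mask):
        """Sum of squared differences over unoccluded (mask == True) pixels."""
        diff = (window - template) * mask
        n = max(mask.sum(), 1)
        return float((diff ** 2).sum()) / n

    def track_window(image, template, mask, center, search_radius=5):
        """Search a small neighborhood of `center` for the best masked-SSD match."""
        h, w = template.shape
        best, best_pos = np.inf, tuple(center)
        for dy in range(-search_radius, search_radius + 1):
            for dx in range(-search_radius, search_radius + 1):
                y, x = center[0] + dy, center[1] + dx
                if y < 0 or x < 0 or y + h > image.shape[0] or x + w > image.shape[1]:
                    continue
                score = masked_ssd(image[y:y+h, x:x+w], template, mask)
                if score < best:
                    best, best_pos = score, (y, x)
        return best_pos, best

    if __name__ == "__main__":
        # Synthetic image with a bright 8x8 "finger segment" at (30, 30).
        image = np.zeros((64, 64))
        image[30:38, 30:38] = 1.0
        template = np.ones((8, 8))
        mask = np.ones((8, 8), dtype=bool)
        mask[:, 4:] = False   # pretend the right half is predicted occluded
        pos, score = track_window(image, template, mask, center=(28, 28))
        print("matched at", pos, "score", score)

Running the sketch matches the window at (30, 30) using only the unoccluded left half of the template, which is the sense in which masking lets a partially occluded feature still be tracked.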

Notes
Associated Center(s) / Consortia: Vision and Autonomous Systems Center

Text Reference
Jim Rehg and Takeo Kanade, "Model-Based Tracking of Self-Occluding Articulated Objects," Proceedings of the Fifth International Conference on Computer Vision (ICCV '95), July, 1995, pp. 612-617.

BibTeX Reference
@inproceedings{Rehg_1995_1762,
   author = "Jim Rehg and Takeo Kanade",
   title = "Model-Based Tracking of Self-Occluding Articulated Objects",
   booktitle = "Proceedings of the Fifth International Conference on Computer Vision (ICCV '95)",
   pages = "612-617",
   month = "July",
   year = "1995",
}