A Probabilistic Framework for Rigid and Non-Rigid Appearance based Tracking and Recognition

F. De la Torre, Y. Yacoob, and L. Davis
Conference Paper, Proceedings of the 4th IEEE International Conference on Automatic Face and Gesture Recognition (FG '00), pp. 491–498, March 2000

Abstract

This paper describes a unified probabilistic framework for appearance-based tracking of rigid and non-rigid objects. A spatio-temporal dependent shape-texture eigenspace and a mixture of diagonal Gaussians are learned in a hidden Markov model (HMM)-like structure to better constrain the model and to support recognition. Particle filtering is used to track the object while switching between different shape/texture models. This framework allows recognition and temporal segmentation of activities. Additionally, an automatic stochastic initialization is proposed, the number of states in the HMM is selected based on the Akaike information criterion, and a comparison with deterministic tracking for 2D models is discussed. Preliminary results on eye tracking, lip tracking, and temporal segmentation of mouth events are presented.
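To make the tracking idea concrete, below is a minimal sketch (not the authors' implementation) of a sampling-importance-resampling particle filter whose state carries a position plus a discrete index that switches between appearance models, in the spirit of tracking while switching between shape/texture models. The 1-D "image", the two toy templates, the noise levels, and the switching probability are all illustrative assumptions; the paper's actual observation model is a learned shape-texture eigenspace with a mixture of diagonal Gaussians.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PARTICLES = 500
SWITCH_PROB = 0.05   # chance a particle jumps to the other appearance model
MOTION_STD = 2.0     # pixels, random-walk motion model (assumed)
OBS_STD = 0.1        # appearance (texture) noise (assumed)

# Two toy 1-D "texture" templates standing in for learned shape/texture models.
templates = np.array([[0.2, 0.8, 0.2],     # model 0
                      [0.9, 0.1, 0.9]])    # model 1


def observe(image, pos):
    """Extract a 3-pixel patch around the (clamped, rounded) position."""
    c = int(np.clip(round(pos), 1, image.size - 2))
    return image[c - 1:c + 2]


def particle_filter_step(particles, models, image):
    # 1. Propagate: random-walk motion plus occasional model switch.
    particles = particles + rng.normal(0.0, MOTION_STD, size=particles.shape)
    flip = rng.random(models.shape) < SWITCH_PROB
    models = np.where(flip, 1 - models, models)

    # 2. Weight each particle by the appearance likelihood of its template.
    weights = np.empty(len(particles))
    for i, (p, m) in enumerate(zip(particles, models)):
        err = observe(image, p) - templates[m]
        weights[i] = np.exp(-0.5 * np.dot(err, err) / OBS_STD ** 2)
    weights /= weights.sum()

    # 3. Resample to concentrate particles on likely position/model hypotheses.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], models[idx]


# Tiny synthetic example: a 1-D "image" matching template 1 around x = 12.
image = np.full(32, 0.5)
image[11:14] = templates[1]

particles = rng.uniform(0, 32, N_PARTICLES)
models = rng.integers(0, 2, N_PARTICLES)

for _ in range(10):
    particles, models = particle_filter_step(particles, models, image)

print("estimated position:", particles.mean())          # approx. 12
print("dominant model:", np.bincount(models).argmax())  # expect model 1
```

The discrete model index plays the role of the hidden state: reading off which model dominates the particle set over time gives a simple form of temporal segmentation alongside the position estimate.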

BibTeX

@conference{De-2000-120965,
author = {F. De la Torre and Y. Yacoob and L. Davis},
title = {A Probabilistic Framework for Rigid and Non-Rigid Appearance based Tracking and Recognition},
booktitle = {Proceedings of 4th IEEE International Conference on Automatic Face and Gesture Recognition (FG '00)},
year = {2000},
month = {March},
pages = {491--498},
}