Real-Time Face and Facial Feature Tracking and Applications

Jie Yang, Rainer Stiefelhagen, Uwe Meier, and Alex Waibel
Conference Paper, Proceedings of Auditory-Visual Speech Processing (AVSP '98), pp. 79 - 84, December, 1998

Abstract

A human face provides a variety of communicative functions. In this paper, we present approaches for real-time face and facial-feature tracking and their applications. First, we present techniques for tracking human faces. We show that human skin color can be used as a major feature for tracking human faces. An adaptive stochastic model has been developed to characterize the skin-color distribution. Based on the maximum likelihood method, the model parameters can be adapted to different people and different lighting conditions. The feasibility of the model has been demonstrated by the development of a real-time face tracker. We then present a top-down approach for tracking facial features such as eyes, nostrils, and lip corners. These real-time tracking techniques have been successfully applied to many applications such as eye-gaze monitoring, head-pose tracking, and lip-reading.
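The abstract does not specify the form of the adaptive stochastic skin-color model; in the authors' related face-tracking work, skin color is typically modeled as a 2D Gaussian in normalized (r, g) chromaticity space, with the mean and covariance estimated by maximum likelihood from sampled skin pixels. The sketch below illustrates that general idea only; the function names and parameters are our own, not the paper's API.

```python
import numpy as np

def _to_chromaticity(pixels_rgb):
    """Project RGB pixels onto normalized (r, g) chromaticity,
    which largely removes brightness variation."""
    pixels = np.asarray(pixels_rgb, dtype=float)
    s = pixels.sum(axis=1, keepdims=True)
    s[s == 0] = 1.0  # guard against all-black pixels
    return pixels[:, :2] / s  # r = R/(R+G+B), g = G/(R+G+B)

def fit_skin_color_model(skin_pixels_rgb):
    """Maximum-likelihood fit of a 2D Gaussian to skin samples.

    skin_pixels_rgb: (N, 3) array of RGB values sampled from skin regions.
    Returns (mean, covariance) of the chromaticity distribution.
    """
    rg = _to_chromaticity(skin_pixels_rgb)
    mean = rg.mean(axis=0)              # ML estimate of the mean
    cov = np.cov(rg, rowvar=False)      # covariance of the samples
    return mean, cov

def skin_likelihood(pixels_rgb, mean, cov):
    """Gaussian density of each pixel's chromaticity under the model.
    High values indicate skin-like color; thresholding these values
    yields a candidate face region."""
    diff = _to_chromaticity(pixels_rgb) - mean
    inv = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    mahal = np.einsum('ij,jk,ik->i', diff, inv, diff)  # Mahalanobis distance
    return norm * np.exp(-0.5 * mahal)
```

Adaptation to a new person or lighting condition then amounts to re-estimating `mean` and `cov` from freshly sampled skin pixels (e.g., blended with the previous parameters), which is what makes the model usable in real time.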

BibTeX

@conference{Yang-1998-16598,
author = {Jie Yang and Rainer Stiefelhagen and Uwe Meier and Alex Waibel},
title = {Real-Time Face and Facial Feature Tracking and Applications},
booktitle = {Proceedings of Auditory-Visual Speech Processing (AVSP '98)},
year = {1998},
month = {December},
pages = {79--84},
}