Recognizing Action Units for Facial Expression Analysis

Ying-Li Tian, Takeo Kanade, and Jeffrey Cohn
Tech. Report CMU-RI-TR-99-40, Robotics Institute, Carnegie Mellon University, December 1999

Abstract

Most automatic expression analysis systems attempt to recognize a small set of prototypic expressions (e.g., happiness and anger). Such prototypic expressions, however, occur infrequently; human emotions and intentions are more often communicated by changes in one or two discrete facial features. We develop an automatic system to analyze subtle changes in facial expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal image sequence. Unlike most existing systems, ours recognizes fine-grained changes in facial expression in terms of Facial Action Coding System (FACS) action units (AUs) rather than six basic expressions (e.g., happiness and anger). Multi-state face and facial component models are proposed for tracking and modeling different facial features, including lips, eyes, brows, cheeks, and their related wrinkles and facial furrows. The tracking results are then converted into detailed parametric descriptions of the facial features. With these parameters as inputs, 11 lower face AUs and 7 upper face AUs are recognized by a neural network, achieving recognition rates of 96.7% for lower face AUs and 95% for upper face AUs. The recognition results indicate that our system can identify action units whether they occur singly or in combination.
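The pipeline the abstract describes (parametric feature descriptions fed to a neural network that outputs AU labels) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature names, the single-layer classifier, and all weights are made-up stand-ins for the trained multi-layer network used in the report.

```python
import math

# Hypothetical parametric feature vector (names are illustrative, not the
# paper's exact parameters): normalized lip height/width, brow raise amount,
# and a binary nasolabial-furrow flag from transient-feature detection.
def extract_features(lip_h, lip_w, brow_raise, furrow):
    return [lip_h, lip_w, brow_raise, 1.0 if furrow else 0.0]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy single-layer classifier standing in for the paper's neural network;
# one (weights, bias) pair per AU output unit, values invented for the sketch.
AU_WEIGHTS = {
    "AU12 (lip corner pull)": ([0.5, 2.0, 0.0, 0.0], -1.0),
    "AU1+2 (brow raise)":     ([0.0, 0.0, 3.0, 0.0], -1.5),
    "AU9 (nose wrinkle)":     ([0.0, 0.0, 0.0, 2.5], -1.0),
}

def recognize_aus(features, threshold=0.5):
    """Return every AU whose output unit fires above the threshold,
    so single AUs and AU combinations are handled the same way."""
    active = []
    for au, (weights, bias) in AU_WEIGHTS.items():
        score = sigmoid(sum(w * f for w, f in zip(weights, features)) + bias)
        if score >= threshold:
            active.append(au)
    return active

feats = extract_features(lip_h=0.2, lip_w=0.9, brow_raise=0.8, furrow=True)
print(recognize_aus(feats))
```

Because each AU has its own output unit, the classifier naturally reports combinations (several units firing at once) as well as single AUs, which mirrors the abstract's claim that AUs are identified whether they occur singly or in combination.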

BibTeX

@techreport{Tian-1999-15070,
author = {Ying-Li Tian and Takeo Kanade and Jeffrey Cohn},
title = {Recognizing Action Units for Facial Expression Analysis},
year = {1999},
month = {December},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-99-40},
keywords = {Facial expression analysis, Action units, Neural network},
}