Affect Analysis Group
This lab is no longer active.
Head: Jeffrey Cohn
Lab Homepage
Overview
The face is a rich source of information about human behavior. Facial displays indicate emotion, pain, brain function, and pathology, and they regulate social behavior. Manual methods of coding facial behavior are labor intensive, only semi-quantitative, and difficult to standardize across laboratories or over time. With few exceptions, current approaches to automated analysis focus on a small set of prototypic expressions (e.g., anger or joy), which simplifies analysis. In daily life, however, prototypic expressions occur relatively infrequently, and emotion is more often communicated by changes in one or two discrete features, such as tightening the lips in anger.

To capture the subtlety of human emotion and non-verbal communication, our interdisciplinary team of computer scientists and psychologists developed the first version of Automated Face Analysis. Automated Face Analysis quantifies subtle changes in facial motion and demonstrates concurrent validity with human observers using the Facial Action Coding System. Continuing system development is part of a larger goal of developing computer systems that can detect human activity, recognize the people involved, understand their behavior, and respond appropriately.