
Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation

Simon Baker, Iain Matthews, Jing Xiao, Ralph Gross, Takahiro Ishikawa and Takeo Kanade
Tech. Report, CMU-RI-TR-04-10, Robotics Institute, Carnegie Mellon University, February, 2004

Abstract

The non-rigid motion of a driver's head (i.e., the motion of their mouth, eyebrows, cheeks, etc.) can tell us a lot about their mental state; e.g., whether they are drowsy, alert, aggressive, comfortable, tense, or distracted. In this paper, we describe our recent research on non-rigid face tracking. In particular, we present both 2D and 3D algorithms for tracking the non-rigid motion of the driver's head using an Active Appearance Model. Both algorithms operate at over 200 frames per second. We also present algorithms for converting a 2D model into a 3D model and for fitting with occlusion and large pose variation.
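At the core of the Active Appearance Model mentioned in the abstract are two linear models: shape as a mean plus a weighted sum of shape basis vectors, and appearance as a mean plus a weighted sum of appearance basis vectors. The sketch below illustrates just this linear-model structure with synthetic data; all arrays, dimensions, and parameter values are invented for illustration. A real AAM learns its bases via PCA over hand-labelled training images and fits them with an iterative warp-based algorithm, not the one-shot projection shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shape model: s = s0 + sum_i p_i * s_i (stacked 2D landmark coordinates).
# 68 landmarks and 4 basis vectors are arbitrary choices for this sketch.
n_landmarks = 68
s0 = rng.standard_normal(2 * n_landmarks)                        # mean shape
S = np.linalg.qr(rng.standard_normal((2 * n_landmarks, 4)))[0]   # orthonormal shape basis

# Appearance model: A = A0 + sum_i lam_i * A_i (pixels in the mean-shape frame).
n_pixels = 500
A0 = rng.standard_normal(n_pixels)                               # mean appearance
A = np.linalg.qr(rng.standard_normal((n_pixels, 3)))[0]          # orthonormal appearance basis

# Synthesize a "target" from known parameters, then recover them.
p_true = np.array([0.5, -1.0, 0.2, 0.8])
lam_true = np.array([1.5, -0.3, 0.7])
target_shape = s0 + S @ p_true
target_appearance = A0 + A @ lam_true

# With orthonormal bases and a target lying in the model subspace,
# recovering the parameters reduces to a projection. A full AAM fit
# additionally composes a piecewise-affine warp between the image and
# the mean-shape frame and iterates to convergence.
p_hat = S.T @ (target_shape - s0)
lam_hat = A.T @ (target_appearance - A0)
```

Because the bases here are orthonormal and the target is generated from the model, the projection recovers the true parameters exactly; real image data adds warping, noise, and occlusion, which is what the iterative fitting algorithms in the paper address.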

BibTeX

@techreport{Baker-2004-8852,
author = {Simon Baker and Iain Matthews and Jing Xiao and Ralph Gross and Takahiro Ishikawa and Takeo Kanade},
title = {Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation},
year = {2004},
month = {February},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-04-10},
keywords = {Active Appearance Models, Real-Time Driver Non-Rigid Head Tracking},
}
