Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation

Simon Baker, Iain Matthews, Jing Xiao, Ralph Gross, Takahiro Ishikawa and Takeo Kanade
Tech. Report, CMU-RI-TR-04-10, Robotics Institute, Carnegie Mellon University, February, 2004

Abstract

The non-rigid motion of a driver’s head (i.e., the motion of the mouth, eyebrows, cheeks, etc.) can tell us a lot about the driver’s mental state, e.g., whether they are drowsy, alert, aggressive, comfortable, tense, or distracted. In this paper, we describe our recent research on non-rigid face tracking. In particular, we present both 2D and 3D algorithms for tracking the non-rigid motion of the driver’s head using an Active Appearance Model. Both algorithms operate at over 200 frames per second. We also present algorithms for converting a 2D model into a 3D model and for fitting with occlusion and large pose variation.
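To illustrate the kind of model behind the tracking described above, the sketch below shows the linear shape component of an Active Appearance Model: a face shape is a mean shape plus a weighted sum of basis shapes, s = s0 + Σᵢ pᵢ sᵢ. This is a hypothetical toy example, not the authors' implementation; fitting is done here by plain least squares, whereas the paper's real-time algorithms use far more efficient fitting schemes.

```python
import numpy as np

def fit_shape_params(observed, s0, basis):
    """Recover parameters p minimizing ||observed - (s0 + basis.T @ p)||.

    observed : (2n,) flattened landmark coordinates of the observed shape
    s0       : (2n,) mean shape
    basis    : (k, 2n) rows are the basis shapes s_i
    """
    residual = observed - s0
    p, *_ = np.linalg.lstsq(basis.T, residual, rcond=None)
    return p

# Toy model: 4 landmarks (8 coordinates), 2 basis shapes.
rng = np.random.default_rng(0)
s0 = rng.normal(size=8)
basis = rng.normal(size=(2, 8))
p_true = np.array([0.5, -1.2])
observed = s0 + basis.T @ p_true        # synthesize a noise-free shape

p_est = fit_shape_params(observed, s0, basis)
print(np.allclose(p_est, p_true))       # parameters are recovered exactly
```

In a real AAM the mean shape and basis are learned by PCA from hand-labeled training images, and an analogous linear model is fit to the appearance (pixel values) inside the shape.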

BibTeX Reference
@techreport{Baker-2004-8852,
author = {Simon Baker and Iain Matthews and Jing Xiao and Ralph Gross and Takahiro Ishikawa and Takeo Kanade},
title = {Real-Time Non-Rigid Driver Head Tracking for Driver Mental State Estimation},
year = {2004},
month = {February},
institution = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-04-10},
keywords = {Active Appearance Models, Real-Time Driver Non-Rigid Head Tracking},
}