
Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition

K. Ogaki, Kris M. Kitani, Y. Sugano, and Y. Sato
Workshop Paper, CVPR '12 Workshop on Egocentric Vision, June, 2012

Abstract

We focus on the use of first-person eye movement and ego-motion as a means of understanding and recognizing indoor activities from an “inside-out” camera system. We show that when eye movement captured by an inside-looking camera is used in tandem with ego-motion features extracted from an outside-looking camera, the classification accuracy of first-person actions can be improved. We also present a dataset of over two hours of realistic indoor desktop actions, including both eye-tracking information and high-quality outside camera video. We run experiments and show that our joint feature is effective and robust over multiple users.
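The core idea of the joint feature is simple: summarize eye motion and ego-motion over a temporal window and combine the two summaries into a single descriptor for classification. The sketch below illustrates one possible reading of that idea; the direction-histogram features, window length, and linear SVM are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch: couple eye-motion and ego-motion features by concatenating
# per-window motion-direction histograms and classifying with an SVM.
# Binning, window length, and classifier choice are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def direction_histogram(displacements, n_bins=8):
    """Normalized histogram of motion directions for one temporal window.

    displacements: (T, 2) array of per-frame (dx, dy) motion vectors,
    e.g. gaze-point shifts (eye motion) or mean optical flow (ego-motion).
    """
    angles = np.arctan2(displacements[:, 1], displacements[:, 0])
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    total = hist.sum()
    return hist / total if total > 0 else hist.astype(float)

def joint_feature(eye_disp, ego_disp):
    """Concatenate eye-motion and ego-motion histograms into one descriptor."""
    return np.concatenate([direction_histogram(eye_disp),
                           direction_histogram(ego_disp)])

# Toy usage with random windows standing in for tracked gaze and flow data.
rng = np.random.default_rng(0)
X = np.stack([joint_feature(rng.normal(size=(30, 2)),
                            rng.normal(size=(30, 2))) for _ in range(40)])
y = rng.integers(0, 2, size=40)          # two toy activity labels

clf = SVC(kernel="linear").fit(X, y)     # linear SVM over the joint feature
print(clf.score(X, y))                   # training accuracy on the toy data
```

In this reading, the concatenation step is what "couples" the two modalities: the classifier sees eye-motion and ego-motion statistics jointly rather than as separate decisions.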

BibTeX

@inproceedings{Ogaki-2012-109821,
  author    = {K. Ogaki and Kris M. Kitani and Y. Sugano and Y. Sato},
  title     = {Coupling Eye-Motion and Ego-Motion Features for First-Person Activity Recognition},
  booktitle = {Proceedings of CVPR '12 Workshop on Egocentric Vision},
  year      = {2012},
  month     = {June},
}