
EigenFiltering for Flexible Eigentracking

F. De la Torre, J. Vitria, P. Radeva, and J. Melenchon
Conference Paper, Proceedings of the 15th International Conference on Pattern Recognition (ICPR '00), Vol. 3, pp. 1106-1109, September 2000

Abstract

Traditional techniques for tracking nonrigid objects, such as optical flow, correlation, active contours, or color, cannot deal with situations where image changes are due not to motion but to appearance (e.g., tracking the lips when the teeth appear). Two main contributions to appearance tracking of flexible objects are proposed. The first is a flexible generalization of eigentracking within the same robust continuous optimization framework. The second is a generalization of traditional gray-level eigenspaces that constructs a multiple-channel "eigenspace" from filter responses, giving robustness against variations in the training conditions such as illumination changes. Additionally, 3D geometric transformations are incorporated, a regularization term is added for numerical stability, and the optimization problem is solved in closed form. Experiments on lip tracking are reported.
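The multiple-channel eigenspace idea can be made concrete with a small sketch: each training image is passed through a filter bank, the responses are stacked into one vector, PCA yields the basis, and at tracking time the coefficients of an observed patch are recovered by a regularized least-squares solve in closed form. The code below is a minimal illustration under assumed choices (NumPy/SciPy, a Gaussian-derivative filter bank, Tikhonov regularization); it is not the paper's implementation, the filter bank and function names are hypothetical, and it omits the robust error norm and the estimation of the geometric transformation described in the abstract.

import numpy as np
from scipy import ndimage

def filter_channels(image):
    # Stack filter responses into one multi-channel vector per image.
    # This particular bank (Gaussian blur plus first derivatives) is an
    # illustrative assumption, not the bank used in the paper.
    channels = [
        ndimage.gaussian_filter(image, sigma=1.0),                 # smoothed intensity
        ndimage.gaussian_filter(image, sigma=1.0, order=(0, 1)),   # horizontal derivative
        ndimage.gaussian_filter(image, sigma=1.0, order=(1, 0)),   # vertical derivative
    ]
    return np.concatenate([c.ravel() for c in channels])

def build_eigenspace(training_images, n_components=10):
    # PCA on the stacked filter responses of the training set
    # (assumes n_components <= number of training images).
    X = np.stack([filter_channels(img) for img in training_images])   # (n_images, d)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:n_components].T                                       # (d, k) basis
    return mean, basis

def fit_coefficients(observation, mean, basis, reg=1e-3):
    # Closed-form, Tikhonov-regularized least-squares fit of the
    # eigenspace coefficients to one observed (already warped) patch.
    y = filter_channels(observation) - mean
    A = basis.T @ basis + reg * np.eye(basis.shape[1])   # regularized normal equations
    c = np.linalg.solve(A, basis.T @ y)
    reconstruction = mean + basis @ c
    return c, reconstruction

In use, build_eigenspace would be run once on registered training patches, and fit_coefficients would be called at each frame on the candidate region; the regularization term plays the numerical-stability role mentioned in the abstract when the basis is nearly rank-deficient.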

BibTeX

@conference{De-2000-120966,
  author = {F. De la Torre and J. Vitria and P. Radeva and J. Melenchon},
  title = {EigenFiltering for Flexible Eigentracking},
  booktitle = {Proceedings of the 15th International Conference on Pattern Recognition (ICPR '00)},
  year = {2000},
  month = {September},
  volume = {3},
  pages = {1106--1109},
}