
Efficient Feature Group Sequencing for Anytime Linear Prediction

Hanzhang Hu, Alexander Grubb, J. Andrew (Drew) Bagnell, and Martial Hebert
Conference Paper, Proceedings of the 32nd Conference on Uncertainty in Artificial Intelligence (UAI '16), pp. 279-288, June 2016

Abstract

We consider anytime linear prediction in the common machine learning setting where features come in groups that have costs. We achieve anytime (or interruptible) predictions by sequencing the computation of feature groups and reporting results using the computed features at interruption. We extend Orthogonal Matching Pursuit (OMP) and Forward Regression (FR) to learn the sequencing greedily under this group setting with costs. We theoretically guarantee that our algorithms achieve near-optimal linear predictions at each budget when a feature group is chosen. With a novel analysis of OMP, we improve its theoretical bound to the same strength as that of FR. In addition, we develop a novel algorithm that consumes cost 4B to approximate the optimal performance of any cost B, and prove that with cost less than 4B, such an approximation is impossible. To our knowledge, these are the first anytime bounds at all budgets. We test our algorithms on two real-world datasets and evaluate them in terms of anytime linear prediction performance against cost-weighted Group Lasso and alternative greedy algorithms.
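To make the idea of cost-aware greedy feature group sequencing concrete, the sketch below orders groups by reduction in squared error per unit cost and records the prediction error after each group is added, which is what an anytime predictor would report at interruption. This is a minimal illustrative sketch, not the paper's exact OMP or FR criterion; the gain-per-cost rule, the `greedy_group_sequence` function, and its arguments are assumptions made here for illustration.

```python
# Illustrative sketch (assumed, not the authors' algorithm): greedily sequence
# feature groups by squared-error reduction per unit cost, refitting a linear
# model after each addition so a prediction is available at any interruption.
import numpy as np

def greedy_group_sequence(X, y, groups, costs):
    """Return a group ordering and the squared error after each step.

    X      : (n, d) design matrix
    y      : (n,) target vector
    groups : list of arrays of column indices, one per feature group
    costs  : list of positive acquisition costs, one per feature group
    """
    remaining = set(range(len(groups)))
    selected_cols = []   # feature columns acquired so far
    order, errors = [], []

    def sq_error(cols):
        # Squared error of the least-squares fit on the given columns.
        if not cols:
            return float(y @ y)
        w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        r = y - X[:, cols] @ w
        return float(r @ r)

    current_err = sq_error(selected_cols)
    while remaining:
        # Choose the group with the largest error reduction per unit cost.
        def gain_per_cost(g):
            err = sq_error(selected_cols + list(groups[g]))
            return (current_err - err) / costs[g]

        best = max(remaining, key=gain_per_cost)
        remaining.remove(best)
        order.append(best)
        selected_cols += list(groups[best])
        current_err = sq_error(selected_cols)
        errors.append(current_err)

    return order, errors
```

In use, the returned ordering fixes which feature groups to compute first at test time; at interruption, the model fit on the groups computed so far produces the prediction, mirroring the anytime evaluation described in the abstract.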

BibTeX

@conference{Hu-2016-104254,
author = {Hanzhang Hu and Alexander Grubb and J. Andrew (Drew) Bagnell and Martial Hebert},
title = {Efficient Feature Group Sequencing for Anytime Linear Prediction},
booktitle = {Proceedings of the 32nd Conference on Uncertainty in Artificial Intelligence (UAI '16)},
year = {2016},
month = {June},
pages = {279--288},
keywords = {Anytime prediction, Feature selection, Budgeted prediction},
}