VASC Seminar: Alex Grubb
Anytime Structured Prediction

Alex Grubb
Ph.D. Candidate, CMU

December 03, 2012, 3pm - 4pm, NSH 1305
Abstract

In many prediction problems, finding the right balance between accuracy and computational cost is critical. In computationally constrained settings, an ideal algorithm uses as much of the available resources as possible to produce the most accurate result. Selecting such an algorithm is difficult, however, when the computational constraints are not fixed in advance. For example, many applications require predictions in time to support adaptive behaviors that respond to real-time events; the resulting constraints depend on factors known only at prediction time, making it difficult to select a fixed algorithm a priori. We address this gap by studying anytime prediction algorithms, which rapidly produce an initial prediction and then continue to refine the result as time allows, automatically scaling to fit any computational budget.

I will present our framework for learning and analyzing anytime predictors, which builds on the traditional boosting approach to machine learning. These modified boosting algorithms select weak predictors based not only on their predictive performance but also on their computational cost, allowing the learning algorithm to automatically trade off accuracy against computation. I will also show how our work applies to typical prediction problems, such as object detection and classification, as well as more complex structured prediction problems, focusing on the scene understanding domain. In this domain I will demonstrate our ongoing work on applying anytime prediction to a hierarchical scene understanding approach, selecting both the most computationally efficient structural representations of the problem and the computations to run on those representations.
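To illustrate the idea of cost-aware weak learner selection described above, the sketch below shows a minimal greedy boosting loop that, at each round, picks the candidate learner with the largest loss reduction per unit of computational cost, so that a prefix of the ensemble serves as an anytime predictor. All names, the squared-loss setting, and the gain-per-cost selection rule are illustrative assumptions, not the speaker's actual algorithm.

```python
import numpy as np

def anytime_boost(candidates, X, y, rounds=3, lr=1.0):
    """Hypothetical sketch of cost-aware anytime boosting (squared loss).

    candidates: list of (fit_fn, cost) pairs, where fit_fn(X, residual)
                returns a predictor h with h(X) -> predictions.
    Returns the ensemble as (weight, predictor) pairs, in selection
    order, plus the final training predictions. Evaluating only a
    prefix of the ensemble gives an (assumed) anytime prediction.
    """
    F = np.zeros(len(y))            # current ensemble output
    ensemble = []                   # (weight, predictor) pairs
    for _ in range(rounds):
        residual = y - F            # negative gradient of squared loss
        best = None
        for fit, cost in candidates:
            h = fit(X, residual)
            pred = h(X)
            # Loss reduction from an optimal step along this learner.
            gain = np.sum(residual * pred) ** 2 / max(np.sum(pred**2), 1e-12)
            score = gain / cost     # accuracy gained per unit compute
            if best is None or score > best[0]:
                best = (score, pred, h)
        _, pred, h = best
        w = lr * np.sum(residual * pred) / max(np.sum(pred**2), 1e-12)
        F += w * pred
        ensemble.append((w, h))
    return ensemble, F

# Toy candidates: a cheap constant learner and a costlier linear learner.
def fit_const(X, r):
    c = r.mean()
    return lambda X: np.full(len(X), c)

def fit_linear(X, r):
    w = np.linalg.lstsq(X, r, rcond=None)[0]
    return lambda X: X @ w
```

With costs attached to each candidate, cheap learners win early rounds and expensive ones are chosen only once they offer enough extra accuracy per unit of compute, which is the trade-off the framework above automates.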


Additional Information

Host: Internal

Speaker Biography

Alexander Grubb is a PhD candidate at Carnegie Mellon University, where he also received his BS in 2007. His research interests include machine learning, computer vision, and robotics. His current work focuses on ensemble methods for machine learning, specifically extending function space optimization, or gradient boosting, to a variety of domains.