
Efficient Locally Weighted Polynomial Regression Predictions

Andrew Moore, Jeff Schneider, and Kan Deng
Conference Paper, Proceedings of the International Conference on Machine Learning (ICML), pp. 236-244, July 1997

Abstract

Locally weighted polynomial regression (LWPR) is a popular instance-based algorithm for learning continuous non-linear mappings. For more than two or three inputs and for more than a few thousand datapoints, the computational expense of predictions is daunting. We discuss drawbacks of previous approaches to this problem, and present a new algorithm based on a multiresolution search of a quickly constructible augmented kd-tree. Without needing to rebuild the tree, we can make fast predictions with arbitrary local weighting functions, arbitrary kernel widths, and arbitrary queries. The paper begins with a new, faster algorithm for exact LWPR predictions. Next, we introduce an approximation that achieves up to a two-orders-of-magnitude speedup with negligible accuracy losses. Increasing a certain approximation parameter achieves greater speedups still, but with a correspondingly larger accuracy degradation. This is nevertheless useful during operations such as the early stages of model selection and locating optima of a fitted surface. We also show how the approximations can permit real-time, query-specific optimization of the kernel width. We conclude with a brief discussion of potential extensions for tractable instance-based learning on datasets that are too large to fit in a computer's main memory.
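
To make the computational challenge concrete, the following is a minimal sketch of the brute-force prediction that the paper's kd-tree method is designed to accelerate: an exact locally weighted linear regression prediction that must visit all N datapoints for every query. The function name lwr_predict, the Gaussian kernel, and the degree-1 local polynomial are illustrative assumptions, not the authors' implementation; the multiresolution kd-tree search itself is not shown.

import numpy as np

def lwr_predict(X, y, x_query, bandwidth=1.0):
    """Brute-force, O(N)-per-query locally weighted linear regression.

    A naive baseline sketch: the kernel choice and polynomial degree
    here are illustrative, not taken from the paper.
    """
    # Gaussian kernel weight for every datapoint -- this full pass
    # over the data is the per-query expense the paper attacks.
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Weighted least-squares fit of a local linear (degree-1) model:
    # scale each row by sqrt(w_i), then solve ordinary least squares.
    Xa = np.hstack([np.ones((X.shape[0], 1)), X])  # intercept column
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(Xa * sw[:, None], y * sw, rcond=None)

    # Evaluate the fitted local model at the query point.
    return np.concatenate(([1.0], x_query)) @ beta

# Example: one query against 10,000 points; every point is touched,
# which is exactly the cost the paper's approximation avoids.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(10000, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0.0, 0.1, size=10000)
print(lwr_predict(X, y, np.array([0.5, -1.0]), bandwidth=0.5))

Because the local fit must be redone from scratch at each query point, cost grows linearly in both the number of queries and the dataset size, which is why the paper's tree-based exact and approximate schemes matter in practice.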

BibTeX

@conference{Moore-1997-16400,
author = {Andrew Moore and Jeff Schneider and Kan Deng},
title = {Efficient Locally Weighted Polynomial Regression Predictions},
booktitle = {Proceedings of the International Conference on Machine Learning (ICML)},
year = {1997},
month = {July},
pages = {236--244},
}