Fast Inference and Learning in Large-State-Space HMMs

Sajid Siddiqi and Andrew Moore
Conference Paper, Proceedings of the International Conference on Machine Learning (ICML), pp. 800-807, August 2005

Abstract

For Hidden Markov Models (HMMs) with fully connected transition models, the three fundamental problems of evaluating the likelihood of an observation sequence, estimating an optimal state sequence for the observations, and learning the model parameters, all have quadratic time complexity in the number of states. We introduce a novel class of non-sparse Markov transition matrices called Dense-Mostly-Constant (DMC) transition matrices that allow us to derive new algorithms for solving the basic HMM problems in sub-quadratic time. We describe the DMC HMM model and algorithms and attempt to convey some intuition for their usage. Empirical results for these algorithms show dramatic speedups for all three problems. In terms of accuracy, the DMC model yields strong results and outperforms the baseline algorithms even in domains known to violate the DMC assumption.
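The sub-quadratic inference follows from the DMC structure: if, in each row i of the transition matrix, only K entries are parameterized freely and every remaining entry shares a single per-row constant, then the sum over source states in the forward recursion splits into one shared term plus K corrections per row, giving O(TNK) rather than O(TN^2) work. The following is a minimal sketch of one forward-pass step under that assumed parameterization (it is not the authors' code; the per-row constant c, index array nz, and function name are illustrative):

import numpy as np

def dmc_forward_step(alpha, nz, A_nz, c, b_next):
    # alpha  : (N,)   forward probabilities at time t
    # nz     : (N, K) column indices of the K non-constant entries in each row
    # A_nz   : (N, K) transition probabilities for those entries
    # c      : (N,)   per-row constant shared by the other N-K entries (assumed DMC form)
    # b_next : (N,)   observation likelihoods b_j(o_{t+1})
    N, K = nz.shape
    # Shared term: sum_i alpha[i] * c[i], identical for every destination state -- O(N)
    base = np.dot(alpha, c)
    new_alpha = np.full(N, base)
    # Corrections for the K explicitly modeled transitions per row -- O(NK) total
    for i in range(N):
        new_alpha[nz[i]] += alpha[i] * (A_nz[i] - c[i])
    return new_alpha * b_next

Iterating this step over T observations costs O(TNK), and the same decomposition applies to the backward and Viterbi recursions; for a fully connected matrix the corresponding update would instead be a dense matrix-vector product costing O(N^2) per step.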

BibTeX

@conference{Siddiqi-2005-9248,
author = {Sajid Siddiqi and Andrew Moore},
title = {Fast Inference and Learning in Large-State-Space HMMs},
booktitle = {Proceedings of the International Conference on Machine Learning (ICML)},
year = {2005},
month = {August},
pages = {800--807},
}