Doctoral dissertation, tech. report CMU-RI-TR-11-28, Robotics Institute, Carnegie Mellon University, July 2011
Probabilistic reasoning and learning with permutation data arises as a fundamental problem in myriad applications such as modeling preference rankings over objects (such as webpages), tracking multiple moving objects, reconstructing the temporal ordering of events from multiple imperfect accounts, and more. Since the number of permutations scales factorially with the number of objects being ranked or tracked, however, it is not feasible to represent and reason with arbitrary probability distributions on permutations. Consequently, many approaches to probabilistic reasoning problems on the group of permutations have been ad hoc, unscalable, or reliant on rigid and unrealistic assumptions. For example, common factorized probability distribution representations, such as graphical models, are inefficient due to the mutual exclusivity constraints that are typically associated with permutations.
This thesis addresses problems of scalability for probabilistic reasoning with permutations by exploiting a number of methods for decomposing complex distributions over permutations into simpler, smaller component parts. In particular, we explore two general and complementary approaches for decomposing distributions over permutations: (1) additive decompositions and (2) multiplicative decompositions. Our additive approach is based on the idea of projecting a distribution onto a group theoretic generalization of the Fourier basis. Our multiplicative approach assumes a factored form for the underlying probability distribution based on a generalization of the notion of probabilistic independence which we call riffled independence.
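The multiplicative idea can be illustrated concretely. The sketch below (an illustrative toy, not the thesis implementation; the split into sets, the distributions `f`, `g`, and `m`, and the function names are all assumptions made for this example) builds a riffled-independent distribution over permutations of four items: the relative order within each of two item sets is drawn independently, and a separate interleaving distribution decides how the two sets are shuffled together.

```python
import itertools

# Toy riffled-independent distribution over permutations of n = 4 items,
# split into two sets {0, 1} and {2, 3}. All numbers are illustrative.
set_a, set_b = [0, 1], [2, 3]

# Relative-order distributions within each set (each sums to 1).
f = {(0, 1): 0.7, (1, 0): 0.3}   # relative order of set_a
g = {(2, 3): 0.6, (3, 2): 0.4}   # relative order of set_b

# Interleaving distribution: which of the 4 positions hold set_a items.
# There are C(4, 2) = 6 interleavings; use the uniform riffle for simplicity.
interleavings = list(itertools.combinations(range(4), 2))
m = {s: 1.0 / len(interleavings) for s in interleavings}

def riffled_prob(perm):
    """p(perm) = m(interleaving) * f(order within set_a) * g(order within set_b)."""
    positions_a = tuple(i for i, x in enumerate(perm) if x in set_a)
    order_a = tuple(x for x in perm if x in set_a)
    order_b = tuple(x for x in perm if x in set_b)
    return m[positions_a] * f[order_a] * g[order_b]

# The factored form stores 2 + 2 + 6 numbers yet defines a proper
# distribution over all 4! = 24 permutations.
total = sum(riffled_prob(p) for p in itertools.permutations(range(4)))
print(round(total, 10))  # → 1.0
```

The compactness argument is visible in the parameter counts: the three factors use O(k! + (n-k)! + C(n, k)) numbers rather than n!, which is exactly the kind of saving the multiplicative decomposition is designed to deliver.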
We show that both probabilistic decompositions lead to compact representations for distributions over permutations and that one can formulate efficient probabilistic inference algorithms by taking advantage of the combinatorial structure of each representation. An underlying theme throughout is the idea that both kinds of structural decompositions can be employed in tandem to relax the apparent intractability of probabilistic reasoning over the space of permutations.
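The additive decomposition also admits a simple concrete entry point. The lowest-order term of the Fourier expansion on the symmetric group corresponds to the first-order marginal matrix M[i, j] = P(item j lands in position i), a doubly stochastic n × n summary that compresses n! probabilities down to n². The sketch below (an illustrative computation under that interpretation, not the thesis code; the random distribution is an assumption) builds this matrix for an arbitrary distribution and checks its doubly stochastic structure.

```python
import itertools
import numpy as np

n = 4
rng = np.random.default_rng(0)
perms = list(itertools.permutations(range(n)))

# An arbitrary (normalized) distribution over all n! permutations.
p = rng.random(len(perms))
p /= p.sum()

# First-order marginals: M[pos, item] = P(item is placed at pos).
M = np.zeros((n, n))
for prob, perm in zip(p, perms):
    for pos, item in enumerate(perm):
        M[pos, item] += prob

# M is doubly stochastic: every row and every column sums to 1,
# reflecting the mutual exclusivity constraints of permutations.
print(np.allclose(M.sum(axis=0), 1.0), np.allclose(M.sum(axis=1), 1.0))  # → True True
```

Higher-order Fourier terms refine this summary with marginals over pairs, triples, and so on of positions, trading storage for fidelity; truncating the expansion gives the compact approximate representations the abstract describes.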
From the theoretical side, we address a number of problems in understanding the consequences of our approximations. For example, we present results in this thesis which illuminate the nature of error propagation in the Fourier domain and propose methods for mitigating its effects.
Finally, we apply our decompositions to multiple application domains. For example, we show how the methods in the thesis can be used to solve challenging camera tracking scenarios as well as to reveal latent voting patterns and structure in Irish political elections and food preference surveys.
To summarize, the main contributions of this thesis fall into three broad categories:
• Principled and compactly representable approximations of probability distributions over the group of permutations,
• Flexible probabilistic reasoning and learning algorithms which exploit the structure of these compact representations for running time efficiency, and
• Theoretical analyses of approximation quality as well as algorithmic and sample complexity.
Number of pages: 303
Jonathan Huang, "Probabilistic Reasoning and Learning on Permutations: Exploiting Structural Decompositions of the Symmetric Group," doctoral dissertation, tech. report CMU-RI-TR-11-28, Robotics Institute, Carnegie Mellon University, July 2011
author = "Jonathan Huang",
title = "Probabilistic Reasoning and Learning on Permutations: Exploiting Structural Decompositions of the Symmetric Group",
school = "Robotics Institute, Carnegie Mellon University",
month = "July",
year = "2011",
address = "Pittsburgh, PA",
The Robotics Institute is part of the School of Computer Science, Carnegie Mellon University.