Computing Conditional Probabilities in Large Domains by Maximizing Rényi's Quadratic Entropy

Charles Zitnick
doctoral dissertation, tech. report CMU-RI-TR-03-20, Robotics Institute, Carnegie Mellon University, May, 2003



Abstract
In this dissertation we discuss methods for efficiently approximating conditional probabilities in large domains by maximizing the entropy of the distribution given a set of constraints. The constraints are constructed from conditional probabilities, typically of low-order, that can be accurately computed from the training data. By appropriately choosing the constraints, maximum entropy methods can balance the tradeoffs in errors due to bias and variance.

Standard maximum entropy techniques are too computationally expensive for use in large domains in which the set of conditioning variables varies. Instead of using the standard measure of entropy first proposed by Shannon, we use a measure from the family of Rényi's entropies. If we allow our probability estimates to occasionally lie outside the range from 0 to 1, we can efficiently maximize Rényi's quadratic entropy subject to the constraints by solving a set of linear equations.
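The dissertation's own algorithms are described below; as a minimal sketch of the key observation, note that maximizing Rényi's quadratic entropy, H₂(p) = −log Σᵢ pᵢ², subject to linear constraints Ap = b is equivalent to minimizing ‖p‖² under those constraints, whose solution is the minimum-norm point p = Aᵀ(AAᵀ)⁻¹b. The function and example below are illustrative (the matrix A, vector b, and function name are not from the thesis), but they show how the unbounded problem reduces to linear equations:

```python
import numpy as np

def max_quadratic_entropy(A, b):
    """Maximize Renyi's quadratic entropy H2(p) = -log(sum p_i^2)
    subject to A p = b, with no bounds on the entries of p.

    Maximizing H2 is equivalent to minimizing ||p||^2, so the answer
    is the minimum-norm solution p = A^T (A A^T)^{-1} b, obtained by
    solving one linear system (no iterative optimization needed).
    """
    y = np.linalg.solve(A @ A.T, b)  # solve (A A^T) y = b
    return A.T @ y                   # p = A^T y

# Toy example: a 3-outcome "distribution" constrained to sum to 1
# and to satisfy p[0] + p[1] = 0.9.  Nothing forces p into [0, 1];
# it simply happens to land there here.
A = np.array([[1.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([1.0, 0.9])
p = max_quadratic_entropy(A, b)  # -> [0.45, 0.45, 0.1]
```

Because the objective is quadratic and the constraints are linear, the optimum is characterized entirely by linear equations; this is the property that standard (Shannon) maximum entropy lacks, since its optimality conditions are nonlinear.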

We develop two algorithms, the inverse probability method and the recurrent linear network, for maximizing Rényi's quadratic entropy without bounds. The algorithms produce identical results; however, depending on the type of problem, one method may be more computationally efficient than the other. We also propose an extension to the algorithms for partially enforcing the constraints based on our confidence in them.

Our algorithms are tested on several applications, including collaborative filtering, image retrieval, and language modeling.


Keywords
maximum entropy, collaborative filtering, image retrieval, language modeling, conditional probabilities

Notes
Associated Center(s) / Consortia: Vision and Autonomous Systems Center

Text Reference
Charles Zitnick, "Computing Conditional Probabilities in Large Domains by Maximizing Rényi's Quadratic Entropy," doctoral dissertation, tech. report CMU-RI-TR-03-20, Robotics Institute, Carnegie Mellon University, May 2003.

BibTeX Reference
@phdthesis{Zitnick_2003_4408,
   author = "Charles Zitnick",
   title = "Computing Conditional Probabilities in Large Domains by Maximizing {R}{\'e}nyi's Quadratic Entropy",
   school = "Robotics Institute, Carnegie Mellon University",
   month = "May",
   year = "2003",
   number = "CMU-RI-TR-03-20",
   address = "Pittsburgh, PA",
}