Approximate inference using conditional entropy decompositions

Amir Globerson*, Tommi Jaakkola

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


Abstract

We introduce a novel method for estimating the partition function and marginals of distributions defined using graphical models. The method uses the entropy chain rule to obtain an upper bound on the entropy of a distribution given marginal distributions of variable subsets. The structure of the bound is determined by a permutation, or elimination order, of the model variables. Optimizing this bound results in an upper bound on the log partition function, and also yields an approximation to the model marginals. The optimization problem is convex, and is in fact a dual of a geometric program. We evaluate the method on a 2D Ising model with a wide range of parameters, and show that it compares favorably with previous methods in terms of both the partition function bound and the accuracy of marginals.
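The key fact behind the bound described in the abstract is standard: the chain rule writes the joint entropy as a sum of conditional entropies, and since conditioning never increases entropy, replacing each conditioning set by a subset of the preceding variables can only increase the sum, giving an upper bound on the joint entropy. The sketch below (not the paper's algorithm, just a numerical check of this inequality on a hypothetical random distribution over three binary variables) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy distribution over three binary variables X1, X2, X3.
p = rng.random((2, 2, 2))
p /= p.sum()

def entropy(q):
    """Shannon entropy (in nats) of a probability array."""
    q = q[q > 0]
    return -np.sum(q * np.log(q))

# Exact joint entropy H(X1, X2, X3).
H_joint = entropy(p)

# Chain rule: H(X1, X2, X3) = H(X1) + H(X2|X1) + H(X3|X1, X2).
p1 = p.sum(axis=(1, 2))          # marginal of X1
p12 = p.sum(axis=2)              # marginal of (X1, X2)
H1 = entropy(p1)
H2_given_1 = entropy(p12) - H1                 # H(X2|X1) = H(X1,X2) - H(X1)
H3_given_12 = entropy(p) - entropy(p12)        # H(X3|X1,X2)
chain = H1 + H2_given_1 + H3_given_12
assert np.isclose(chain, H_joint)              # chain rule is exact

# Upper bound: replace H(X3|X1,X2) by H(X3|X2), which conditions on a
# subset of the preceding variables and therefore can only be larger.
p23 = p.sum(axis=0)              # marginal of (X2, X3)
p2 = p.sum(axis=(0, 2))          # marginal of X2
H3_given_2 = entropy(p23) - entropy(p2)
bound = H1 + H2_given_1 + H3_given_2
assert bound >= H_joint - 1e-12  # dropping conditioning gives an upper bound
```

In the paper's setting, such entropy upper bounds are combined with the variational representation of the log partition function, so that optimizing over the subset marginals yields an upper bound on log Z; the structure of the dropped conditioning sets is what the elimination order controls.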

Original language: English
Pages (from-to): 130-138
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 2
State: Published - 2007
Externally published: Yes
Event: 11th International Conference on Artificial Intelligence and Statistics, AISTATS 2007 - San Juan, Puerto Rico
Duration: 21 Mar 2007 - 24 Mar 2007

