

Poster

Approximate maximum entropy principles via Goemans-Williamson with applications to provable variational methods

Andrej Risteski · Yuanzhi Li

Area 5+6+7+8 #72

Keywords: [ Graphical Models ] [ Variational Inference ] [ Combinatorial Optimization ] [ Learning Theory ]


Abstract:

The well-known maximum-entropy principle due to Jaynes, which states that, given mean parameters, the maximum-entropy distribution matching them lies in an exponential family, has been very popular in machine learning due to its “Occam’s razor” interpretation. Unfortunately, calculating the potentials of the maximum-entropy distribution is intractable [BGS14]. We provide computationally efficient versions of this principle when the mean parameters are pairwise moments: we design distributions that approximately match given pairwise moments while having entropy comparable to that of the maximum-entropy distribution matching those moments. We additionally provide surprising applications of the approximate maximum-entropy principle to designing provable variational methods for partition-function calculations for Ising models, without any assumptions on the potentials of the model. More precisely, we show that we can obtain approximation guarantees for the log-partition function comparable to those in the low-temperature limit, which is the setting of optimization of quadratic forms over the hypercube ([AN06]).
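
For concreteness, the two objects the abstract refers to can be written out as follows; this is a sketch in our own notation, not taken from the paper. The first display is the maximum-entropy program over the hypercube given pairwise moments, and the second is the Ising log-partition function, whose low-temperature limit is exactly optimization of a quadratic form over the hypercube.

% Maximum-entropy distribution on {-1,1}^n matching pairwise moments mu_{ij}
% (notation ours):
\[
  \max_{p \,:\, \operatorname{supp}(p) \subseteq \{-1,1\}^n} H(p)
  \quad \text{s.t.} \quad
  \mathbb{E}_{x \sim p}[x_i x_j] = \mu_{ij} \;\; \forall\, i < j .
\]
% Log-partition function of an Ising model with pairwise potentials J_{ij},
% at inverse temperature beta:
\[
  \log Z(\beta J) = \log \sum_{x \in \{-1,1\}^n} \exp\Big( \beta \sum_{i<j} J_{ij}\, x_i x_j \Big),
  \qquad
  \frac{1}{\beta}\log Z(\beta J) \xrightarrow{\;\beta \to \infty\;} \max_{x \in \{-1,1\}^n} \sum_{i<j} J_{ij}\, x_i x_j .
\]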
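
Since the title invokes Goemans-Williamson, a minimal sketch of the standard GW random-hyperplane rounding may also help: given a PSD relaxed moment matrix M with unit diagonal, it yields a distribution over {-1,1}^n whose pairwise moments are (2/pi)*arcsin(M_ij), i.e., a distribution that approximately matches the given moments. This is the textbook rounding, not necessarily the paper's exact construction; the function name gw_round and the example matrix are ours.

import numpy as np

def gw_round(M, num_samples=1000, seed=0):
    """Standard Goemans-Williamson random-hyperplane rounding (a sketch,
    not the paper's construction).

    Given a PSD matrix M with M[i][i] = 1 (a relaxed pairwise-moment
    matrix), returns +/-1 samples x with E[x_i x_j] = (2/pi) * arcsin(M[i][j]).
    """
    n = M.shape[0]
    # Factor M = V V^T; row i of V is the unit vector for variable i.
    # A small ridge keeps the Cholesky factorization numerically stable.
    V = np.linalg.cholesky(M + 1e-9 * np.eye(n))
    rng = np.random.default_rng(seed)
    # Each Gaussian vector g defines a random hyperplane; the sign of
    # <v_i, g> gives the +/-1 assignment for coordinate i.
    g = rng.standard_normal((num_samples, n))
    return np.sign(g @ V.T)

# Example: a small PSD correlation matrix as the target pairwise moments.
M = np.array([[ 1.0,  0.5, -0.3],
              [ 0.5,  1.0,  0.2],
              [-0.3,  0.2,  1.0]])
samples = gw_round(M, num_samples=200000)
print("empirical moments:\n", samples.T @ samples / len(samples))
print("(2/pi) arcsin(M):\n", (2 / np.pi) * np.arcsin(M))

Running this, the empirical moment matrix of the samples is close to (2/pi)*arcsin(M) entrywise, which is one concrete sense in which a rounding-induced distribution "approximately matches" given pairwise moments while remaining far from deterministic.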
