
Sampling from Probabilistic Submodular Models
Alkis Gotovos · Hamed Hassani · Andreas Krause

Tue Dec 08 04:00 PM -- 08:59 PM (PST) @ 210 C #70

Submodular and supermodular functions have found wide applicability in machine learning, capturing notions such as diversity and regularity, respectively. These notions have deep consequences for optimization, and the problem of (approximately) optimizing submodular functions has received much attention. However, beyond optimization, these notions allow specifying expressive probabilistic models that can be used to quantify predictive uncertainty via marginal inference. Prominent, well-studied special cases include Ising models and determinantal point processes, but the general class of log-submodular and log-supermodular models is much richer and little studied. In this paper, we investigate the use of Markov chain Monte Carlo sampling to perform approximate inference in general log-submodular and log-supermodular models. In particular, we consider a simple Gibbs sampling procedure, and establish two sufficient conditions, the first guaranteeing polynomial-time, and the second fast (O(n log n)) mixing. We also evaluate the efficiency of the Gibbs sampler on three examples of such models, and compare against a recently proposed variational approach.
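To make the sampling procedure concrete, here is a minimal sketch of a single-site Gibbs sampler for a distribution p(S) ∝ exp(F(S)) over subsets of a ground set, as described in the abstract. All function and variable names are illustrative, not taken from the paper: at each step, one element is chosen uniformly at random and its membership is resampled from its conditional distribution, which reduces to a sigmoid of the element's marginal gain F(S ∪ {i}) − F(S \ {i}).

```python
import math
import random

def gibbs_sample_subset(F, n, num_steps, rng=None):
    """Run a single-site Gibbs chain over subsets of {0, ..., n-1}.

    F: a set function mapping a frozenset S to a real value; the target
       distribution is p(S) proportional to exp(F(S)).
    Returns the final state of the chain (a set of element indices).
    """
    rng = rng or random.Random()
    S = set()  # start from the empty set (any initial state is valid)
    for _ in range(num_steps):
        i = rng.randrange(n)  # pick a coordinate uniformly at random
        # Conditional probability of including i given all other coordinates:
        # p(i in S | rest) = sigmoid(F(S with i) - F(S without i)).
        delta = F(frozenset(S | {i})) - F(frozenset(S - {i}))
        p_include = 1.0 / (1.0 + math.exp(-delta))
        if rng.random() < p_include:
            S.add(i)
        else:
            S.discard(i)
    return S
```

For a sanity check with a modular (hence submodular) F, strongly positive per-element weights should drive the chain toward the full set, and strongly negative ones toward the empty set; for non-modular F, the same conditional-flip update applies with whatever marginal gain F assigns.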

Author Information

Alkis Gotovos (ETH Zurich)
Hamed Hassani (ETH Zurich)
Andreas Krause (ETH Zurich)
