Poster
Relative Performance Guarantees for Approximate Inference in Latent Dirichlet Allocation
Indraneel Mukherjee · David Blei

Mon Dec 08 08:45 PM -- 12:00 AM (PST)
Hierarchical probabilistic modeling of discrete data has emerged as a powerful tool for text analysis. Posterior inference in such models is intractable, and practitioners rely on approximate posterior inference methods such as variational inference or Gibbs sampling. There has been much research into designing better approximations, but as yet little theoretical understanding of which of the available techniques is appropriate, and in which data analysis settings. In this paper we provide the beginnings of such an understanding. We analyze the improvement that the recently proposed collapsed variational inference (CVB) provides over mean-field variational inference (VB) in latent Dirichlet allocation. We prove that the difference in the tightness of the bound on the likelihood of a document decreases as $O\!\left((k-1)\frac{\log m}{m}\right)$, where $k$ is the number of topics in the model and $m$ is the number of words in a document. As a consequence, the advantage of CVB over VB is lost for long documents but increases with the number of topics. We demonstrate empirically that the theory holds, using simulated text data and two text corpora, and we provide practical guidelines for choosing an approximation.
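
To make the result concrete, here is a minimal sketch in the abstract's notation (the ELBO symbols $\mathcal{L}_{\mathrm{VB}}$ and $\mathcal{L}_{\mathrm{CVB}}$ are shorthand introduced here, not the paper's exact statement). CVB marginalizes out the topic proportions before applying variational inference, so its bound on the document likelihood is never looser than the mean-field bound, and the theorem controls how much tighter it can be:

$$
\log p(\mathbf{w}) \;\ge\; \mathcal{L}_{\mathrm{CVB}} \;\ge\; \mathcal{L}_{\mathrm{VB}},
\qquad
0 \;\le\; \mathcal{L}_{\mathrm{CVB}} - \mathcal{L}_{\mathrm{VB}} \;=\; O\!\left((k-1)\,\frac{\log m}{m}\right).
$$

Since $\frac{\log m}{m} \to 0$ as $m \to \infty$ while the factor $(k-1)$ grows with the number of topics, the gap between the two bounds behaves exactly as the abstract describes.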

Author Information

Indraneel Mukherjee (Princeton University)
David Blei (Columbia University)
