Poster

Analysis of Variational Bayesian Latent Dirichlet Allocation: Weaker Sparsity Than MAP

Shinichi Nakajima · Issei Sato · Masashi Sugiyama · Kazuho Watanabe · Hiroko Kobayashi

Level 2, room 210D

Abstract:

Latent Dirichlet allocation (LDA) is a popular generative model for various types of objects, such as texts and images, in which an object is expressed as a mixture of latent topics. In this paper, we theoretically investigate variational Bayesian (VB) learning in LDA. More specifically, we analytically derive the leading term of the VB free energy under an asymptotic setup, and show that there exist transition thresholds in the Dirichlet hyperparameters around which the sparsity-inducing behavior changes drastically. We then theoretically reveal the notable phenomenon that VB tends to induce weaker sparsity than MAP in the LDA model, the opposite of its behavior in other models. We experimentally demonstrate the practical validity of our asymptotic theory on real-world Last.FM music data.
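As a rough illustration of the hyperparameter-dependent sparsity the abstract describes (not the authors' code), the sketch below fits variational-Bayes LDA at several values of the document-topic Dirichlet hyperparameter and reports how many topics each document effectively uses. The synthetic corpus, the choice of scikit-learn's VB implementation, and the 1% sparsity cutoff are all illustrative assumptions.

```python
# Minimal sketch: how the Dirichlet hyperparameter alpha affects the
# sparsity of VB-LDA topic proportions. Uses scikit-learn's batch
# variational-Bayes LDA; data and thresholds are illustrative.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
# Synthetic bag-of-words counts: 200 documents over a 500-word vocabulary.
X = rng.poisson(lam=0.2, size=(200, 500))

for alpha in (0.01, 0.5, 1.0):  # document-topic Dirichlet hyperparameter
    lda = LatentDirichletAllocation(
        n_components=10,
        doc_topic_prior=alpha,    # per-document Dirichlet prior
        learning_method="batch",  # batch variational Bayes
        max_iter=50,
        random_state=0,
    )
    theta = lda.fit_transform(X)  # normalized document-topic proportions
    # Effective number of topics per document: proportions above 1%.
    eff = (theta > 0.01).sum(axis=1).mean()
    print(f"alpha={alpha:4.2f}  mean effective topics/doc = {eff:.2f}")
```

Smaller alpha should yield fewer effective topics per document; the paper's result concerns where, as a function of such hyperparameters, this sparsity-inducing behavior switches on, and how VB's thresholds compare with MAP's.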
