Invited talk in Workshop: Adaptive and Scalable Nonparametric Methods in Machine Learning
Richard (Fangjian) Guo. Boosting Variational Inference.
Modern Bayesian inference typically requires some form of posterior approximation, and mean-field variational inference (MFVI) is an increasingly popular choice due to its speed. But MFVI is inaccurate in several respects, including an inability to capture multimodality in the posterior and underestimation of the posterior covariance. These issues arise because MFVI restricts the approximation to a family of factorized parametric distributions. We instead consider a much more flexible approximating family consisting of all possible mixtures of a parametric base distribution (e.g., Gaussians), without constraining the number of mixture components. In order to efficiently find a high-quality posterior approximation within this family, we borrow ideas from gradient boosting and propose the boosting variational inference (BVI) method, which iteratively improves the current approximation by mixing it with a new component from the base distribution family. We develop practical algorithms for BVI and demonstrate their performance on both real and simulated data. Joint work with Xiangyu Wang, Kai Fan, Tamara Broderick and David Dunson.
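To make the boosting idea concrete, here is a minimal sketch (not the authors' implementation) of greedily building a Gaussian mixture approximation one component at a time. The setup is assumed for illustration: a 1-D multimodal target, a Monte Carlo estimate of KL(q || p) as the objective, a simple decaying mixing weight per step, and derivative-free optimization; the function names and the step-size schedule are illustrative choices, not from the talk.

```python
# Sketch of a boosting-style variational inference loop: at each step, fit one
# new Gaussian component and mix it into the current approximation.
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

rng = np.random.default_rng(0)
eps = rng.standard_normal(2000)   # fixed Gaussian noise (common random numbers)
u = rng.uniform(size=2000)        # fixed uniforms for picking mixture components

def log_p(x):
    # Illustrative multimodal target: 0.5*N(-2, 0.5^2) + 0.5*N(2, 0.5^2).
    return logsumexp(np.stack([
        -0.5 * ((x + 2.0) / 0.5) ** 2,
        -0.5 * ((x - 2.0) / 0.5) ** 2,
    ]), axis=0) - np.log(2 * 0.5 * np.sqrt(2 * np.pi))

def log_q(x, params):
    # Mixture of Gaussians; params is a list of (weight, mean, std).
    ws = np.array([w for w, _, _ in params])
    comps = np.stack([
        -0.5 * ((x - m) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))
        for _, m, s in params
    ])
    return logsumexp(np.log(ws)[:, None] + comps, axis=0)

def kl_estimate(new, old_params, alpha):
    # Monte Carlo KL(q_new || p) with q_new = (1 - alpha)*q_old + alpha*N(m, s^2).
    # Fixed noise (eps, u) keeps the objective deterministic for the optimizer.
    m, log_s = new
    s = np.exp(log_s)
    params = [(w * (1 - alpha), mu, sd) for w, mu, sd in old_params] + [(alpha, m, s)]
    ws = np.array([w for w, _, _ in params])
    means = np.array([mu for _, mu, _ in params])
    stds = np.array([sd for _, _, sd in params])
    idx = np.minimum(np.searchsorted(np.cumsum(ws), u), len(params) - 1)
    x = means[idx] + stds[idx] * eps          # reparameterized mixture samples
    return np.mean(log_q(x, params) - log_p(x))

# Greedy boosting loop: add one component per iteration.
params = []
for t in range(4):
    alpha = 1.0 if not params else 2.0 / (t + 2)      # decaying mixing weight
    init = np.array([rng.normal(scale=3.0), 0.0])     # random init for (mean, log-std)
    res = minimize(kl_estimate, init, args=(params, alpha), method="Nelder-Mead")
    m, log_s = res.x
    params = [(w * (1 - alpha), mu, sd) for w, mu, sd in params] + [(alpha, m, np.exp(log_s))]
    print(f"step {t}: mean={m:.2f}, std={np.exp(log_s):.2f}, KL estimate={res.fun:.3f}")
```

After a few iterations the mixture can place components near both posterior modes, which a single factorized Gaussian cannot do; the actual BVI algorithms presented in the talk use more careful component and weight optimization than this toy loop.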