Poster

Advances in Black-Box VI: Normalizing Flows, Importance Weighting, and Optimization

Abhinav Agrawal · Daniel Sheldon · Justin Domke

Poster Session 5 #1478

Keywords: [ Algorithms ] [ Regression ] [ Non-Convex Optimization ] [ Algorithms -> Large Scale Learning; Optimization ]


Abstract:

Recent research has seen several advances relevant to black-box VI, but the current state of automatic posterior inference is unclear. One such advance is the use of normalizing flows to define flexible posterior densities for deep latent variable models. Another direction is the integration of Monte-Carlo methods to serve two purposes: first, to obtain tighter variational objectives for optimization, and second, to define enriched variational families through sampling. However, both flows and variational Monte-Carlo methods remain relatively unexplored for black-box VI. Moreover, on a pragmatic front, there are several optimization considerations, such as the step-size scheme, parameter initialization, and choice of gradient estimator, for which there is no clear guidance in the existing literature. In this paper, we postulate that black-box VI is best addressed through a careful combination of numerous algorithmic components. We evaluate components relating to optimization, flows, and Monte-Carlo methods on a benchmark of 30 models from the Stan model library. The combination of these algorithmic components significantly advances the state of the art in "out of the box" variational inference.
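The importance-weighted objective referenced in the abstract has a compact form: averaging K importance weights inside the logarithm gives a bound on the log marginal likelihood that tightens as K grows and reduces to the standard ELBO at K = 1. Below is a minimal sketch in PyTorch, assuming a user-supplied log-joint function and a reparameterized variational distribution; the names iw_elbo and log_joint are illustrative, not the authors' implementation.

    import math
    import torch

    def iw_elbo(log_joint, q, K=10):
        # K reparameterized draws from the variational distribution q;
        # rsample (not sample) keeps the estimator differentiable in q's parameters
        z = q.rsample((K,))
        # log importance weights: log p(x, z) - log q(z), one per sample
        log_w = log_joint(z) - q.log_prob(z)
        # average the K weights inside the log; logsumexp is numerically stable
        return torch.logsumexp(log_w, dim=0) - math.log(K)

With q built from learnable parameters (e.g., a torch.distributions.Independent Normal over the latent dimensions), the negated return value can be minimized directly with a stochastic gradient optimizer, which is the black-box setting the paper studies.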
