Poster
NAS-X: Neural Adaptive Smoothing via Twisting
Dieterich Lawson · Michael Li · Scott Linderman

Tue Dec 12 03:15 PM -- 05:15 PM (PST) @ Great Hall & Hall B1+B2 #1219

Sequential latent variable models (SLVMs) are essential tools in statistics and machine learning, with applications ranging from healthcare to neuroscience. As their flexibility increases, analytic inference and model learning can become challenging, necessitating approximate methods. Here we introduce neural adaptive smoothing via twisting (NAS-X), a method that extends reweighted wake-sleep (RWS) to the sequential setting by using smoothing sequential Monte Carlo (SMC) to estimate intractable posterior expectations. Combining RWS and smoothing SMC allows NAS-X to provide low-bias and low-variance gradient estimates, and fit both discrete and continuous latent variable models. We illustrate the theoretical advantages of NAS-X over previous methods and explore these advantages empirically in a variety of tasks, including a challenging application to mechanistic models of neuronal dynamics. These experiments show that NAS-X substantially outperforms previous VI- and RWS-based methods in inference and model learning, achieving lower parameter error and tighter likelihood bounds.
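As a rough illustration of the mechanism the abstract describes, the sketch below runs a twisted (smoothing-style) SMC pass on a toy one-dimensional linear-Gaussian state space model and uses the resulting self-normalized particle weights to estimate a posterior expectation, the kind of quantity an RWS-style update would plug into its gradient estimates. The toy model, the one-step-lookahead twisting function, and all variable names are illustrative assumptions, not the paper's implementation; NAS-X learns its twisting functions with neural networks.

import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D linear-Gaussian SSM: x_1 ~ N(0, 1), x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r).
a, q, r, T, N = 0.9, 0.5, 0.5, 25, 512
xs_true, ys = np.zeros(T), np.zeros(T)
x = rng.normal()
xs_true[0], ys[0] = x, x + np.sqrt(r) * rng.normal()
for t in range(1, T):
    x = a * x + np.sqrt(q) * rng.normal()
    xs_true[t], ys[t] = x, x + np.sqrt(r) * rng.normal()

def obs_logpdf(x, y):
    # log N(y | x, r), vectorized over particles x
    return -0.5 * ((y - x) ** 2 / r + np.log(2.0 * np.pi * r))

def log_twist(t, x):
    # Hypothetical twisting function: a one-step-lookahead approximation to
    # log p(y_{t+1:T} | x_t). NAS-X instead learns these functions.
    return obs_logpdf(a * x, ys[t + 1]) if t + 1 < T else np.zeros_like(x)

# Twisted SMC with bootstrap proposals: the twist ratio in the weights steers
# particles toward states with high smoothing (not just filtering) probability.
particles = rng.normal(size=N)                         # propose x_1 from the prior
logw = obs_logpdf(particles, ys[0]) + log_twist(0, particles)
for t in range(1, T):
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(N, size=N, p=w)                   # resample
    prev = particles[idx]
    particles = a * prev + np.sqrt(q) * rng.normal(size=N)   # propose from dynamics
    logw = (obs_logpdf(particles, ys[t])               # incremental weight
            + log_twist(t, particles) - log_twist(t - 1, prev))

w = np.exp(logw - logw.max()); w /= w.sum()
# Self-normalized estimate of a posterior expectation, here E[x_T | y_{1:T}]; in an
# RWS-style wake update, estimates of this form stand in for the intractable
# expectations in the model and proposal gradients.
print("estimated E[x_T | y_1:T]:", np.sum(w * particles), "  true x_T:", xs_true[-1])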

Author Information

Dieterich Lawson (Google Research)
Michael Li (Stanford University)
Scott Linderman (Stanford University)
