Poster
Belief Propagation Neural Networks
Jonathan Kuck · Shuvam Chakraborty · Hao Tang · Rachel Luo · Jiaming Song · Ashish Sabharwal · Stefano Ermon

Thu Dec 10 09:00 AM -- 11:00 AM (PST) @ Poster Session 5 #1521

Learned neural solvers have successfully been used to solve combinatorial optimization and decision problems. More general counting variants of these problems, however, are still largely solved with hand-crafted solvers. To bridge this gap, we introduce belief propagation neural networks (BPNNs), a class of parameterized operators that operate on factor graphs and generalize Belief Propagation (BP). In its strictest form, a BPNN layer (BPNN-D) is a learned iterative operator that provably maintains many of the desirable properties of BP for any choice of the parameters. Empirically, we show that through training, BPNN-D learns to perform the task better than the original BP: it converges 1.7x faster on Ising models while providing tighter bounds. On challenging model counting problems, BPNNs compute estimates hundreds of times faster than state-of-the-art hand-crafted methods, while returning estimates of comparable quality.
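To make the message-passing structure concrete, below is a minimal sketch (in NumPy) of the idea the abstract describes: standard sum-product BP on a toy binary Ising chain, with the plain message update replaced by a BPNN-D-style damped step m_new = m_old + H(BP(m_old) - m_old). Everything in the sketch is illustrative rather than taken from the authors' code: the toy model, the names (theta, J, H), and especially H itself, which here is a fixed linear map standing in for the trained neural operator. The one property carried over from the abstract's claim is that H maps zero to zero, so any fixed point of standard BP remains a fixed point of the damped iteration.

import numpy as np

# Toy setup (illustrative only): a binary Ising chain with log unary
# potentials theta_i(x_i) and a shared log pairwise potential psi(x_i, x_j).
rng = np.random.default_rng(0)
n = 4
theta = rng.normal(size=(n, 2))
J = 0.5
pairwise = np.array([[J, -J], [-J, J]])
neighbors = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}

def logsumexp(a, axis):
    # Numerically stable log-sum-exp along the given axis.
    m = a.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(a - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def bp_step(msgs):
    # One round of standard sum-product updates for messages m_{i->j}(x_j),
    # computed in log space and normalized for stability.
    new = {}
    for i in range(n):
        for j in neighbors[i]:
            incoming = sum((msgs[(k, i)] for k in neighbors[i] if k != j),
                           np.zeros(2))
            score = theta[i][:, None] + pairwise + incoming[:, None]
            m = logsumexp(score, axis=0)          # marginalize over x_i
            new[(i, j)] = m - logsumexp(m, axis=0)
    return new

# Placeholder for the learned operator: a real BPNN-D layer would apply a
# trained neural network here. The key property kept from the paper's claim
# is H(0) = 0, so fixed points of plain BP are preserved; a constant damping
# factor is the simplest operator with that property.
H = lambda diff: 0.8 * diff

msgs = {(i, j): np.zeros(2) for i in range(n) for j in neighbors[i]}
for _ in range(50):
    bp_msgs = bp_step(msgs)
    # BPNN-D-style update: m_new = m_old + H(BP(m_old) - m_old).
    msgs = {e: msgs[e] + H(bp_msgs[e] - msgs[e]) for e in msgs}

# Approximate marginals from the (converged) messages.
for i in range(n):
    b = theta[i] + sum(msgs[(k, i)] for k in neighbors[i])
    p = np.exp(b - logsumexp(b, axis=0))
    print(f"P(x_{i}) = {np.round(p, 3)}")

Since the chain is a tree, BP is exact here and the damped iteration converges to the same marginals; the interest of a learned operator lies on loopy graphs such as Ising grids, where the abstract reports faster convergence and tighter bounds.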

Author Information

Jonathan Kuck (Stanford)
Shuvam Chakraborty (Stanford University)

I am a student at Stanford University. I am completing my B.S. in Mathematical and Computational Science and will graduate this quarter. I am also completing an M.S. in Statistics and plan to graduate next winter.

Hao Tang (Shanghai Jiao Tong University)
Rachel Luo (Stanford University)
Jiaming Song (Stanford University)

I am a first-year Ph.D. student at Stanford University. I think about problems in machine learning and deep learning under the supervision of Stefano Ermon. I did my undergrad at Tsinghua University, where I was lucky enough to collaborate with Jun Zhu and Lawrence Carin on scalable Bayesian machine learning.

Ashish Sabharwal (Allen Institute for AI)
Stefano Ermon (Stanford)
