
Oral Poster

Exact Bayesian Inference on Discrete Models via Probability Generating Functions: A Probabilistic Programming Approach

Fabian Zaiser · Andrzej Murawski · Chih-Hao Luke Ong

Great Hall & Hall B1+B2 (level 1) #1209
Tue 12 Dec 8:45 a.m. PST — 10:45 a.m. PST
Oral presentation: Oral 1C Tractable models
Tue 12 Dec 8 a.m. PST — 8:45 a.m. PST


We present an exact Bayesian inference method for discrete statistical models, which can find exact solutions to a large class of discrete inference problems, even with infinite support and continuous priors. To express such models, we introduce a probabilistic programming language that supports discrete and continuous sampling, discrete observations, affine functions, (stochastic) branching, and conditioning on discrete events. Our key tool is probability generating functions: they provide a compact closed-form representation of distributions that are definable by programs, thus enabling the exact computation of posterior probabilities, expectation, variance, and higher moments. Our inference method is provably correct and fully automated in a tool called Genfer, which uses automatic differentiation (specifically, Taylor polynomials), but does not require computer algebra. Our experiments show that Genfer is often faster than the existing exact inference tools PSI, Dice, and Prodigy. On a range of real-world inference problems that none of these exact tools can solve, Genfer's performance is competitive with approximate Monte Carlo methods, while avoiding approximation errors.
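To give a flavor of the PGF idea (this is an illustrative sketch, not Genfer's implementation, and the model below is a standard textbook example rather than one from the paper): for a Poisson prior with a binomial "thinning" observation, the joint PGF has a closed form, and conditioning on the observed count amounts to extracting a Taylor coefficient, after which posterior probabilities and moments are read off exactly.

```python
from math import exp, factorial

# Illustrative model (assumed for this sketch, not taken from the paper):
#   X ~ Poisson(lam);  Y | X ~ Binomial(X, p);  observe Y = k.
# Joint PGF: G(s, t) = E[s^X t^Y] = exp(lam * (s*(p*t + 1 - p) - 1)).
# Conditioning on Y = k keeps the t^k Taylor coefficient; after
# normalization the posterior PGF is
#   H(s) = s^k * exp(lam*(1-p)*(s - 1)),
# i.e. X | Y=k is distributed as k + Poisson(lam*(1-p)).

def posterior_pmf(lam: float, p: float, k: int, n: int) -> float:
    """P(X = n | Y = k): the n-th Taylor coefficient of H at s = 0."""
    if n < k:
        return 0.0
    m = lam * (1 - p)
    # Coefficient of s^n in s^k * exp(m*(s-1)) is the Poisson(m) pmf at n-k.
    return exp(-m) * m ** (n - k) / factorial(n - k)

def posterior_mean(lam: float, p: float, k: int) -> float:
    """E[X | Y = k] = H'(1) = k + lam*(1-p)."""
    return k + lam * (1 - p)

lam, p, k = 4.0, 0.5, 3
support = range(k, 60)
probs = [posterior_pmf(lam, p, k, n) for n in support]
mean = sum(n * q for n, q in zip(support, probs))
print(abs(sum(probs) - 1.0) < 1e-9)                      # pmf sums to 1
print(abs(mean - posterior_mean(lam, p, k)) < 1e-9)      # moments match
```

Here the coefficient extraction was done symbolically by hand; the paper's contribution is to automate exactly this step for programs, using Taylor-polynomial arithmetic (a form of automatic differentiation) instead of computer algebra.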
