Poster
Sparse Probabilistic Circuits via Pruning and Growing
Meihua Dang · Anji Liu · Guy Van den Broeck

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #904

Probabilistic circuits (PCs) are a tractable representation of probability distributions, allowing for exact and efficient computation of likelihoods and marginals. There has been significant recent progress on improving the scale and expressiveness of PCs. However, PC training performance plateaus as model size increases. We discover that most capacity in existing large PC structures is wasted: fully-connected parameter layers are only sparsely used. We propose two operations, pruning and growing, that exploit the sparsity of PC structures. Specifically, the pruning operation removes unimportant sub-networks of the PC for model compression and comes with theoretical guarantees. The growing operation increases model capacity by increasing the dimensions of latent states. By alternately applying pruning and growing, we increase the capacity that is meaningfully used, allowing us to significantly scale up PC learning. Empirically, our learner achieves state-of-the-art likelihoods on MNIST-family image datasets and the Penn Tree Bank language dataset compared to other PC learners and less tractable deep generative models such as flow-based models and variational autoencoders (VAEs).
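As a rough illustration of the alternating prune/grow loop described in the abstract, the following Python sketch shows one way such a loop could be organized. The ProbCircuit class, its prune/grow/fit methods, and all numeric settings are hypothetical placeholders for illustration, not the authors' implementation or any existing library API.

    # Hypothetical sketch of an alternating prune/grow loop for a probabilistic circuit.
    # ProbCircuit and its methods are illustrative stand-ins, not a real PC library.

    import random


    class ProbCircuit:
        """Toy stand-in for a PC with per-edge parameters and a latent-state dimension."""

        def __init__(self, num_edges, latent_dim):
            self.latent_dim = latent_dim
            # Edge weights standing in for sum-node parameters.
            self.weights = [random.random() for _ in range(num_edges)]

        def prune(self, keep_fraction):
            """Drop the lowest-weight edges (a proxy for removing unimportant sub-networks)."""
            k = max(1, int(len(self.weights) * keep_fraction))
            self.weights = sorted(self.weights, reverse=True)[:k]

        def grow(self, factor):
            """Enlarge the latent-state dimension and duplicate edges with small noise."""
            self.latent_dim *= factor
            self.weights = [w + random.gauss(0.0, 0.01)
                            for w in self.weights for _ in range(factor)]

        def fit(self, data):
            """Placeholder for parameter learning (e.g. EM); omitted in this sketch."""
            pass


    def prune_and_grow(circuit, data, rounds=5, keep_fraction=0.5, grow_factor=2):
        """Alternate pruning and growing, re-fitting parameters after each round."""
        for _ in range(rounds):
            circuit.prune(keep_fraction)   # compress: remove low-importance parameters
            circuit.grow(grow_factor)      # expand: increase latent-state dimension
            circuit.fit(data)              # re-estimate parameters on the data
        return circuit


    if __name__ == "__main__":
        pc = ProbCircuit(num_edges=1024, latent_dim=8)
        prune_and_grow(pc, data=None)
        print(f"edges={len(pc.weights)}, latent_dim={pc.latent_dim}")

In this toy version, each round keeps roughly the same number of edges (half are pruned, then each survivor is duplicated), while the latent-state dimension grows; the actual method's pruning criterion and growth rule differ and come with the guarantees mentioned in the abstract.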

Author Information

Meihua Dang (University of California, Los Angeles)
Anji Liu (University of California, Los Angeles)
Guy Van den Broeck (UCLA)

I am an Assistant Professor and Samueli Fellow at UCLA, in the Computer Science Department, where I direct the Statistical and Relational Artificial Intelligence (StarAI) lab. My research interests are in Machine Learning (Statistical Relational Learning, Tractable Learning), Knowledge Representation and Reasoning (Graphical Models, Lifted Probabilistic Inference, Knowledge Compilation), Applications of Probabilistic Reasoning and Learning (Probabilistic Programming, Probabilistic Databases), and Artificial Intelligence in general.
