Probabilistic Circuits (PCs) are a promising avenue for probabilistic modeling. They combine advantages of probabilistic graphical models (PGMs) with those of neural networks (NNs). Crucially, however, they are tractable probabilistic models, supporting efficient and exact computation of many probabilistic inference queries, such as marginals and MAP. Further, since PCs are structured computation graphs, they can take advantage of deep-learning-style parameter updates, which greatly improves their scalability. However, this innovation also makes PCs prone to overfitting, which has been observed on many standard benchmarks. Although abundant regularization techniques exist for both PGMs and NNs, they are not effective enough when applied to PCs. Instead, we re-think regularization for PCs and propose two intuitive techniques, data softening and entropy regularization, that both take advantage of PCs' tractability and still admit an efficient implementation as a computation graph. Specifically, data softening provides a principled way to add uncertainty to datasets in closed form, which implicitly regularizes PC parameters. Thanks to their tractability, PCs need only linear time to learn parameters from a softened dataset. In entropy regularization, the exact entropy of the distribution encoded by a PC can be regularized directly, which is again infeasible for most other density estimation models. We show that both methods consistently improve the generalization performance of a wide variety of PCs. Moreover, when paired with a simple PC structure, we achieve state-of-the-art results on 10 out of 20 standard discrete density estimation benchmarks. Open-source code and experiments are available at https://github.com/UCLA-StarAI/Tractable-PC-Regularization.
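To make the data-softening idea concrete, here is a minimal sketch, assuming binary data and a fully factorized product-of-Bernoullis circuit (the simplest possible PC); the helpers `soften` and `softened_likelihood` are hypothetical illustrations, not functions from the paper's codebase. With a softening parameter β ∈ (0.5, 1], each observed value keeps weight β while its flipped value receives weight 1 − β, and the PC evaluates this soft evidence in a single linear-time bottom-up pass.

```python
import numpy as np

def soften(data, beta):
    """Soft evidence for each literal X_i = 1 (hypothetical helper).

    An observed 1 keeps weight beta; an observed 0 gives the literal
    X_i = 1 the complementary weight 1 - beta. beta = 1.0 recovers
    the original hard data.
    """
    return np.where(data == 1, beta, 1.0 - beta)

def softened_likelihood(data, theta, beta):
    """Likelihood of softened data under a fully factorized PC.

    theta[i] is the Bernoulli parameter P(X_i = 1). Each leaf mixes
    its two literal weights, so the evaluation is one linear-time
    pass over the circuit -- no explicit enumeration of perturbed
    datasets is needed.
    """
    s1 = soften(data, beta)        # weight of literal X_i = 1
    s0 = 1.0 - s1                  # weight of literal X_i = 0
    per_leaf = theta * s1 + (1.0 - theta) * s0
    return per_leaf.prod(axis=1)   # one likelihood per example

data = np.array([[1, 0, 1],
                 [0, 0, 1]])
theta = np.array([0.7, 0.2, 0.9])

hard = softened_likelihood(data, theta, beta=1.0)  # ordinary likelihood
soft = softened_likelihood(data, theta, beta=0.9)  # softened likelihood
```

The same soft-evidence evaluation would extend to any smooth, decomposable PC by feeding `s1`/`s0` into the corresponding literal leaves; β plays the role of the regularization strength, with β = 1 recovering standard maximum-likelihood training on the hard data.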
Author Information
Anji Liu (University of California, Los Angeles)
Guy Van den Broeck (UCLA)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: Tractable Regularization of Probabilistic Circuits
More from the Same Authors
- 2022 Poster: Efficient Meta Reinforcement Learning for Preference-based Fast Adaptation
  Zhizhou Ren · Anji Liu · Yitao Liang · Jian Peng · Jianzhu Ma
- 2022 Poster: Sparse Probabilistic Circuits via Pruning and Growing
  Meihua Dang · Anji Liu · Guy Van den Broeck
- 2021 Poster: A Compositional Atlas of Tractable Circuit Operations for Probabilistic Inference
  Antonio Vergari · YooJung Choi · Anji Liu · Stefano Teso · Guy Van den Broeck
- 2021 Oral: A Compositional Atlas of Tractable Circuit Operations for Probabilistic Inference
  Antonio Vergari · YooJung Choi · Anji Liu · Stefano Teso · Guy Van den Broeck
- 2016 Poster: New Liftable Classes for First-Order Probabilistic Inference
  Seyed Mehran Kazemi · Angelika Kimmig · Guy Van den Broeck · David Poole
- 2015 Poster: Tractable Learning for Complex Probability Queries
  Jessa Bekker · Jesse Davis · Arthur Choi · Adnan Darwiche · Guy Van den Broeck
- 2013 Poster: On the Complexity and Approximation of Binary Evidence in Lifted Inference
  Guy Van den Broeck · Adnan Darwiche
- 2013 Spotlight: On the Complexity and Approximation of Binary Evidence in Lifted Inference
  Guy Van den Broeck · Adnan Darwiche
- 2011 Poster: On the Completeness of First-Order Knowledge Compilation for Lifted Probabilistic Inference
  Guy Van den Broeck
- 2011 Oral: On the Completeness of First-Order Knowledge Compilation for Lifted Probabilistic Inference
  Guy Van den Broeck