In large-scale applications of undirected graphical models, such as social networks and biological networks, similar patterns occur frequently and give rise to similar parameters. In this situation, it is beneficial to group the parameters for more efficient learning. We show that even when the grouping is unknown, we can infer these parameter groups during learning via a Bayesian approach. We impose a Dirichlet process prior on the parameters. Posterior inference involves intractable terms, and we propose two approximation algorithms: a Metropolis-Hastings algorithm with auxiliary variables and a Gibbs sampling algorithm with a stripped Beta approximation (Gibbs_SBA). Simulations show that both algorithms outperform conventional maximum likelihood estimation (MLE). The performance of Gibbs_SBA is close to that of Gibbs sampling with exact likelihood calculation. Models learned with Gibbs_SBA also generalize better than models learned by MLE on real-world Senate voting data.
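As an illustrative sketch (not the paper's code), the random grouping of parameters implied by a Dirichlet process prior can be simulated through its Chinese restaurant process representation: each parameter joins an existing group with probability proportional to that group's size, or starts a new group with probability proportional to the concentration parameter `alpha`. The function name and interface below are assumptions for illustration only.

```python
import random

def crp_partition(n_params, alpha, seed=0):
    """Sample a partition of n_params parameters from a Chinese
    restaurant process with concentration alpha -- the grouping
    structure induced by a Dirichlet process prior.

    Returns a list of group indices, one per parameter."""
    rng = random.Random(seed)
    assignments = []  # group index assigned to each parameter
    counts = []       # current size of each group
    for i in range(n_params):
        # Existing group j is chosen with probability counts[j] / (i + alpha);
        # a brand-new group is opened with probability alpha / (i + alpha).
        r = rng.random() * (i + alpha)
        acc = 0.0
        for j, c in enumerate(counts):
            acc += c
            if r < acc:
                assignments.append(j)
                counts[j] += 1
                break
        else:
            assignments.append(len(counts))
            counts.append(1)
    return assignments
```

With small `alpha`, most parameters collapse into a few large groups (strong sharing); with large `alpha`, many singleton groups appear, so `alpha` controls how aggressively parameters are tied together.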
Jie Liu (UW-Madison)
David Page (UW-Madison)