
Poster
On Tracking The Partition Function
Guillaume Desjardins · Aaron Courville · Yoshua Bengio

Mon Dec 12 10:00 AM -- 02:59 PM (PST)
Markov Random Fields (MRFs) have proven very powerful both as density estimators and feature extractors for classification. However, their use is often limited by an inability to estimate the partition function $Z$. In this paper, we exploit the gradient descent training procedure of restricted Boltzmann machines (a type of MRF) to **track** the log partition function during learning. Our method relies on two distinct sources of information: (1) estimating the change $\Delta Z$ incurred by each gradient update, (2) estimating the difference in $Z$ over a small set of tempered distributions using bridge sampling. The two sources of information are then combined using an inference procedure similar to Kalman filtering. Learning MRFs through Tempered Stochastic Maximum Likelihood, we can estimate $Z$ using no more temperatures than are required for learning. Comparing to both exact values and estimates using annealed importance sampling (AIS), we show on several datasets that our method is able to accurately track the log partition function. In contrast to AIS, our method provides this estimate at each time-step, at a computational cost similar to that required for training alone.
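
To make the mechanics concrete, here is a minimal NumPy sketch of the two ingredients the abstract describes. This is an illustration under simplifying assumptions, not the authors' implementation: `free_energy` assumes a standard binary RBM parameterization, `delta_log_z` estimates the change in $\log Z$ across one parameter update by importance sampling under the old model, and `kalman_step` fuses that prediction with a noisy direct observation of $\log Z$ (such as a bridge-sampling estimate) in a scalar Kalman update. All function names, parameterizations, and constants below are hypothetical.

```python
import numpy as np

def free_energy(v, W, b, c):
    """Free energy of a binary RBM: F(v) = -v.b - sum_j softplus(c_j + v.W_j)."""
    return -v @ b - np.logaddexp(0.0, v @ W + c).sum(axis=-1)

def delta_log_z(samples, old_params, new_params):
    """Importance-sampling estimate of log Z_new - log Z_old, using samples
    drawn (approximately) from the old model, based on the identity
        Z_new / Z_old = E_{v ~ p_old}[ exp(F_old(v) - F_new(v)) ].
    """
    log_w = free_energy(samples, *old_params) - free_energy(samples, *new_params)
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))  # numerically stable log-mean-exp

def kalman_step(mean, var, pred_delta, pred_var, obs, obs_var):
    """One scalar Kalman update of the log-Z belief: predict by applying the
    estimated change from the gradient step, then correct with a noisy
    direct observation (e.g. a bridge-sampling estimate)."""
    mean, var = mean + pred_delta, var + pred_var   # predict
    gain = var / (var + obs_var)                    # correct
    return mean + gain * (obs - mean), (1.0 - gain) * var

# Toy usage with made-up numbers: fuse the predicted per-step change in
# log Z with a noisier direct reading of log Z.
log_z, var = 20.0, 1.0        # current belief about log Z (illustrative)
log_z, var = kalman_step(log_z, var, pred_delta=0.05, pred_var=1e-3,
                         obs=20.1, obs_var=0.5)
```

In this sketch the per-step prediction is cheap (it reuses the negative-phase samples that Stochastic Maximum Likelihood training maintains anyway) but its errors accumulate over time, while the direct observation is unbiased but noisy; the Kalman-style fusion is what keeps the running estimate anchored.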

#### Author Information

##### Yoshua Bengio (Mila / U. Montreal)

Yoshua Bengio (PhD in Computer Science, McGill University, 1991). After two post-doctoral years, one at MIT with Michael Jordan and one at AT&T Bell Laboratories with Yann LeCun, he became professor at the Department of Computer Science and Operations Research at Université de Montréal. Author of two books (a third is in preparation) and more than 200 publications, he is among the most cited Canadian computer scientists and is or has been associate editor of the top journals in machine learning and neural networks. Since 2000 he has held a Canada Research Chair in Statistical Learning Algorithms, and since 2006 an NSERC Chair; since 2005 he has been a Senior Fellow of the Canadian Institute for Advanced Research, and since 2014 he has co-directed its program focused on deep learning. He is on the board of the NIPS foundation and has been program chair and general chair for NIPS. He has co-organized the Learning Workshop for 14 years and co-created the International Conference on Learning Representations. His interests are centered around a quest for AI through machine learning, and include fundamental questions on deep learning, representation learning, the geometry of generalization in high-dimensional spaces, manifold learning and biologically inspired learning algorithms.