Natural-Parameter Networks: A Class of Probabilistic Neural Networks
Hao Wang · Xingjian SHI · Dit-Yan Yeung

Wed Dec 07 09:00 AM -- 12:30 PM (PST) @ Area 5+6+7+8 #40

Neural networks (NN) have achieved state-of-the-art performance in various applications. Unfortunately, in applications where training data is insufficient, they are often prone to overfitting. One effective way to alleviate this problem is to adopt a Bayesian approach by using Bayesian neural networks (BNN). Another shortcoming of NN is the lack of flexibility to customize different distributions for the weights and neurons according to the data, as is often done in probabilistic graphical models. To address these problems, we propose a class of probabilistic neural networks, dubbed natural-parameter networks (NPN), as a novel and lightweight Bayesian treatment of NN. NPN allows arbitrary exponential-family distributions to model the weights and neurons. Unlike traditional NN and BNN, NPN takes distributions as input and passes them through layers of transformation before producing distributions to match the target output distributions. As a Bayesian treatment, efficient backpropagation (BP) is performed to learn the natural parameters of the distributions over both the weights and neurons. The output distributions of each layer, as byproducts, may be used as second-order representations for associated tasks such as link prediction. Experiments on real-world datasets show that NPN can achieve state-of-the-art performance.
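To make the idea of propagating distributions concrete, here is a minimal sketch of the Gaussian special case: a linear layer whose inputs and weights are independent Gaussians, represented by their means and variances, with the output moments computed by moment matching. The function name and shapes are illustrative, not the paper's actual API.

```python
import numpy as np

def gaussian_npn_linear(x_mean, x_var, w_mean, w_var, b_mean, b_var):
    """Sketch of a Gaussian NPN linear layer via moment matching.

    Assumes inputs x and weights w are element-wise independent Gaussians.
    For y = sum_i w_i * x_i + b, the exact first two moments are:
      E[y]   = sum_i E[w_i] E[x_i] + E[b]
      Var[y] = sum_i (Var[w_i]Var[x_i] + Var[w_i]E[x_i]^2
                      + E[w_i]^2 Var[x_i]) + Var[b]
    """
    out_mean = x_mean @ w_mean + b_mean
    out_var = (x_var @ w_var
               + x_var @ (w_mean ** 2)
               + (x_mean ** 2) @ w_var
               + b_var)
    return out_mean, out_var

# Illustrative usage: a batch of 2 input distributions over 3 features,
# mapped to distributions over 4 output units.
rng = np.random.default_rng(0)
x_mean = rng.standard_normal((2, 3))
x_var = rng.random((2, 3))            # variances must be non-negative
w_mean = rng.standard_normal((3, 4))
w_var = rng.random((3, 4))
b_mean = np.zeros(4)
b_var = 0.1 * np.ones(4)
y_mean, y_var = gaussian_npn_linear(x_mean, x_var, w_mean, w_var,
                                    b_mean, b_var)
```

Because every term in the variance expression is non-negative, the output variances stay valid, and both outputs can be fed to the next layer in the same mean/variance form; a full NPN would also push these moments through a nonlinearity before the next linear layer.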

Author Information

Hao Wang (HKUST)
Xingjian SHI (Hong Kong University of Science and Technology)
Dit-Yan Yeung (Hong Kong University of Science and Technology)