

Poster

Natural-Parameter Networks: A Class of Probabilistic Neural Networks

Hao Wang · Xingjian SHI · Dit-Yan Yeung

Area 5+6+7+8 #40

Keywords: [ Deep Learning or Neural Networks ] [ (Other) Unsupervised Learning Methods ] [ (Other) Classification ] [ (Other) Probabilistic Models and Methods ] [ (Other) Regression ]


Abstract:

Neural networks (NN) have achieved state-of-the-art performance in various applications. Unfortunately, in applications where training data is insufficient, they are often prone to overfitting. One effective way to alleviate this problem is to adopt a Bayesian approach by using Bayesian neural networks (BNN). Another shortcoming of NN is the lack of flexibility to customize different distributions for the weights and neurons according to the data, as is often done in probabilistic graphical models. To address these problems, we propose a class of probabilistic neural networks, dubbed natural-parameter networks (NPN), as a novel and lightweight Bayesian treatment of NN. NPN allows the use of arbitrary exponential-family distributions to model the weights and neurons. Different from traditional NN and BNN, NPN takes distributions as input and passes them through layers of transformation before producing distributions to match the target output distributions. As a Bayesian treatment, efficient backpropagation (BP) is performed to learn the natural parameters of the distributions over both the weights and neurons. The output distributions of each layer, as byproducts, may be used as second-order representations for associated tasks such as link prediction. Experiments on real-world datasets show that NPN can achieve state-of-the-art performance.
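To make the idea of propagating distributions through layers concrete, below is a minimal NumPy sketch of a single linear layer in the Gaussian instance of NPN, where weights, biases, and neurons are modeled as independent Gaussians and the layer propagates means and variances by moment matching. The function name, variable names, and shapes are illustrative assumptions, not the authors' code; the full model also propagates moments through nonlinearities via closed-form approximations and learns the natural parameters (here, means and variances) by backpropagation.

```python
import numpy as np

def gaussian_npn_linear(x_m, x_s, W_m, W_s, b_m, b_s):
    """Sketch of one Gaussian NPN linear layer (assumed formulation).

    Input distribution:  x ~ N(x_m, diag(x_s))
    Weight distribution: W_ij ~ N(W_m_ij, W_s_ij), b_k ~ N(b_m_k, b_s_k)
    Assuming x and W are independent, the pre-activation o = x W + b has
      E[o]   = x_m W_m + b_m
      Var[o] = x_s W_s + x_s (W_m**2) + (x_m**2) W_s + b_s
    Returns the mean and variance of o.
    """
    o_m = x_m @ W_m + b_m
    # Var[x_j W_jk] = x_s_j W_s_jk + x_s_j W_m_jk^2 + x_m_j^2 W_s_jk
    o_s = (x_s + x_m ** 2) @ W_s + x_s @ (W_m ** 2) + b_s
    return o_m, o_s

# Illustrative usage: batch of 4 inputs with 3 features, 2 output units.
rng = np.random.default_rng(0)
x_m, x_s = rng.normal(size=(4, 3)), np.abs(rng.normal(size=(4, 3)))
W_m, W_s = rng.normal(size=(3, 2)), np.abs(rng.normal(size=(3, 2)))
b_m, b_s = rng.normal(size=2), np.abs(rng.normal(size=2))
o_m, o_s = gaussian_npn_linear(x_m, x_s, W_m, W_s, b_m, b_s)
print(o_m.shape, o_s.shape)  # (4, 2) (4, 2)
```

Because the layer outputs a mean and a variance rather than a point estimate, the per-layer variances are the "second-order representations" mentioned in the abstract, and stacking such layers yields output distributions that can be matched to the target distribution during training.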
