Poster

Approximate Inference Turns Deep Networks into Gaussian Processes

Mohammad Emtiyaz Khan · Alexander Immer · Ehsan Abedi · Maciej Korzepa

East Exhibition Hall B + C #164

Keywords: [ Gaussian Processes ] [ Variational Inference ] [ Probabilistic Methods ]


Abstract:

Deep neural networks (DNNs) and Gaussian processes (GPs) are two powerful models with several theoretical connections, but the relationship between their training methods is not well understood. In this paper, we show that certain Gaussian posterior approximations for Bayesian DNNs are equivalent to GP posteriors. This enables us to relate solutions and iterations of a deep-learning algorithm to GP inference. As a result, we can obtain a GP kernel and a nonlinear feature map while training a DNN. Surprisingly, the resulting kernel is the neural tangent kernel. We show kernels obtained on real datasets and demonstrate the use of the GP marginal likelihood to tune hyperparameters of DNNs. Our work aims to facilitate further research on combining DNNs and GPs in practical settings.
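The kernel the abstract refers to can be made concrete: at a given parameter setting, the Jacobian of the network output with respect to the parameters acts as a feature map, and inner products of these Jacobians give the empirical neural tangent kernel. Below is a minimal sketch of that computation for a small MLP in JAX; it is not the authors' code, and the function names (`mlp`, `flat_jacobian`, `ntk_kernel`) are ours for illustration.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    # Initialize a list of (W, b) layer parameters.
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (dout, din)) / jnp.sqrt(din)
        params.append((W, jnp.zeros(dout)))
    return params

def mlp(params, x):
    # Scalar-output MLP with tanh hidden layers.
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return (W @ x + b)[0]

def flat_jacobian(params, x):
    # Feature map phi(x): gradient of the output w.r.t. all
    # parameters, flattened into a single vector.
    grads = jax.grad(mlp)(params, x)
    return jnp.concatenate(
        [g.ravel() for g in jax.tree_util.tree_leaves(grads)]
    )

def ntk_kernel(params, X):
    # Empirical NTK Gram matrix: K[i, j] = phi(x_i) . phi(x_j).
    Phi = jax.vmap(lambda x: flat_jacobian(params, x))(X)
    return Phi @ Phi.T

key = jax.random.PRNGKey(0)
params = init_mlp(key, [2, 16, 1])
X = jax.random.normal(jax.random.PRNGKey(1), (5, 2))
K = ntk_kernel(params, X)
print(K.shape)  # (5, 5)
```

Evaluated at the parameters reached during training, this Gram matrix is the kind of kernel the paper extracts; the paper's contribution is showing that certain Gaussian posterior approximations for the DNN correspond exactly to GP posteriors built from it.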
