

Poster in Workshop: OPT 2021: Optimization for Machine Learning

The Geometric Occam's Razor Implicit in Deep Learning

Benoit Dherin · Michael Munn · David Barrett


Abstract:

In over-parameterized deep neural networks, there can be many possible parameter configurations that fit the training data exactly. However, the properties of these interpolating solutions are poorly understood. We argue that over-parameterized neural networks trained with stochastic gradient descent are subject to a Geometric Occam's Razor: these networks are implicitly regularized by the geometric model complexity. For one-dimensional regression, the geometric model complexity is simply given by the arc length of the function. For higher-dimensional settings, the geometric model complexity depends on the Dirichlet energy of the function. We explore the relationship between the Geometric Occam's Razor, Dirichlet energy, and known forms of implicit regularization. Finally, for ResNets trained on CIFAR-10, we observe that Dirichlet energy measurements are consistent with the action of this implicit Geometric Occam's Razor.
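For reference, the standard definitions of the two complexity measures mentioned in the abstract are given below; the paper's exact normalization or domain of integration may differ. For a scalar function $f$ on an interval $[a, b]$, the arc length, and for a function $f$ on a domain $\Omega$, the Dirichlet energy, are

\[
\mathrm{length}(f) = \int_a^b \sqrt{1 + f'(x)^2}\, dx,
\qquad
E(f) = \frac{1}{2} \int_\Omega \lVert \nabla f(x) \rVert^2 \, dx .
\]

As a rough illustration of how a Dirichlet energy measurement might be carried out on a trained model, the sketch below estimates the average squared input-gradient norm over a batch of sample points. This is a minimal sketch, not the authors' implementation; the function and parameter names are hypothetical, and a scalar-output model is assumed for simplicity.

```python
# Minimal sketch (assumed approach, not the authors' code): Monte Carlo estimate
# of the Dirichlet energy of a scalar-output model f(params, x) by averaging the
# squared norm of its gradient with respect to the input over a batch of points.
import jax
import jax.numpy as jnp

def dirichlet_energy_estimate(f, params, xs):
    """Estimate 1/2 * E[ ||grad_x f(params, x)||^2 ] over the batch xs."""
    grad_fn = jax.grad(lambda x: f(params, x))  # gradient with respect to the input x
    sq_norms = jax.vmap(lambda x: jnp.sum(grad_fn(x) ** 2))(xs)
    return 0.5 * jnp.mean(sq_norms)

# Toy usage with a linear "model"; any scalar-output network works the same way.
f = lambda params, x: jnp.dot(params, x)
params = jnp.array([1.0, -2.0, 0.5])
xs = jax.random.normal(jax.random.PRNGKey(0), (128, 3))
print(dirichlet_energy_estimate(f, params, xs))  # equals 0.5 * ||params||^2 for a linear map
```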
