

Poster

Minimum norm interpolation by perceptra: Explicit regularization and implicit bias

Jiyoung Park · Ian Pelakh · Stephan Wojtowytsch

Great Hall & Hall B1+B2 (level 1) #814
[ Paper ] [ Poster ] [ OpenReview ]
Wed 13 Dec 8:45 a.m. PST — 10:45 a.m. PST

Abstract:

We investigate how shallow ReLU networks interpolate between known regions. Our analysis shows that empirical risk minimizers converge to a minimum norm interpolant as the number of data points and parameters tends to infinity, provided the empirical risk is penalized by a weight decay regularizer whose coefficient vanishes at a precise rate as the network width and the number of data points grow. Both with and without explicit regularization, we numerically study the implicit bias of common optimization algorithms towards known minimum norm interpolants.
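As a concrete illustration of the regularized training problem the abstract refers to, here is a minimal PyTorch sketch (not the authors' code): a shallow ReLU network fit by minimizing mean squared error plus an explicit weight decay penalty. The schedule `lam(width, n_data)` is purely hypothetical; the paper's precise vanishing rate is not stated on this page, and the only property used here is that the coefficient shrinks as width and data grow.

```python
import torch

torch.manual_seed(0)

n_data, width = 64, 512
x = torch.rand(n_data, 1) * 2 - 1          # inputs on [-1, 1]
y = torch.sin(3 * x)                       # target values to interpolate

# Shallow (one hidden layer) ReLU network.
model = torch.nn.Sequential(
    torch.nn.Linear(1, width),
    torch.nn.ReLU(),
    torch.nn.Linear(width, 1),
)

def lam(width, n_data):
    # Hypothetical schedule: stands in for the paper's precise
    # vanishing rate, which is not given on this page.
    return 1.0 / (width * n_data)

opt = torch.optim.SGD(model.parameters(), lr=1e-2)
for step in range(5000):
    opt.zero_grad()
    mse = ((model(x) - y) ** 2).mean()
    # Explicit weight decay: penalize the squared norm of all parameters.
    reg = sum((p ** 2).sum() for p in model.parameters())
    loss = mse + lam(width, n_data) * reg
    loss.backward()
    opt.step()
```

Dropping the `reg` term from `loss` recovers the unregularized setting in which the abstract studies the implicit bias of the optimizer alone.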
