Poster
Provable Tempered Overfitting of Minimal Nets and Typical Nets
Itamar Harel · William Hoza · Gal Vardi · Itay Evron · Nati Srebro · Daniel Soudry
East Exhibit Hall A-C #2406
Wed 11 Dec 4:30 p.m. – 7:30 p.m. PST
Abstract:
We study the overfitting behavior of fully connected deep neural networks (NNs) with binary weights fitted to perfectly classify a noisy training set. We consider interpolation using both the smallest NN (having the minimal number of weights) and a random interpolating NN. For both learning rules, we prove that overfitting is tempered. Our analysis rests on a new bound on the size of a threshold circuit consistent with a partial function. To the best of our knowledge, ours are the first theoretical results on benign or tempered overfitting that: (1) apply to deep NNs, and (2) do not require a very high or very low input dimension.
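The two learning rules can be illustrated on a toy problem. The sketch below (an illustration only, not the paper's construction) enumerates one-hidden-layer threshold networks with binary weights in {-1, +1} over a four-point training set whose last label is flipped (the "noise"), growing the hidden layer until interpolators exist; the smallest such architecture plays the role of the minimal NN, and a uniform draw from the interpolators plays the role of the random interpolating NN. The dataset, architecture, and sign conventions are all hypothetical choices for the example.

```python
import itertools
import random

def predict(x, W, v):
    """One-hidden-layer sign network: x -> sign(v . sign(W x)), with sign(0) = +1."""
    hidden = [1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1 for w in W]
    return 1 if sum(vi * hi for vi, hi in zip(v, hidden)) >= 0 else -1

# Tiny "noisy" training set: labels follow sign(x1) except the last point,
# whose label is flipped -- the noise an interpolator must memorize.
X = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
y = [-1, -1, 1, -1]

d = 2
for h in range(1, 4):  # grow the hidden width until an interpolator exists
    nets = []
    for bits in itertools.product([-1, 1], repeat=h * d + h):
        W = [bits[i * d:(i + 1) * d] for i in range(h)]  # hidden weights
        v = bits[h * d:]                                 # output weights
        if all(predict(x, W, v) == t for x, t in zip(X, y)):
            nets.append((W, v))
    if nets:
        break  # h is now the minimal hidden width that interpolates

minimal_net = nets[0]             # a smallest interpolating net ("minimal NN")
random_net = random.choice(nets)  # uniform over interpolators ("typical NN")
print(f"minimal hidden width: {h}, interpolators found: {len(nets)}")
```

Even at this scale the contrast is visible: the minimal rule commits to one smallest consistent net, while the random rule samples from the (typically much larger) set of all interpolators of a given size.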