

Poster

Provable Tempered Overfitting of Minimal Nets and Typical Nets

Itamar Harel · William Hoza · Gal Vardi · Itay Evron · Nati Srebro · Daniel Soudry

East Exhibit Hall A-C #2406
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

We study the overfitting behavior of fully connected Deep Neural Networks (DNNs) with binary weights fitted to perfectly classify a noisy training set. We consider interpolation using both the smallest DNN (having the minimal number of weights) and a random interpolating DNN. For both learning rules, we prove overfitting is tempered. Our analysis rests on a new bound on the size of a threshold circuit consistent with a partial function.
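
For context, "tempered" refers to the overfitting taxonomy of Mallinar et al. (2022), which classifies an interpolating learner by the test error it attains as the training set grows. The sketch below states that taxonomy informally for binary classification under independent label noise at rate p; the symbols err, \hat{f}_n, and p are notation introduced here for illustration, and the paper's precise quantitative statements may differ.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Informal taxonomy of overfitting for an interpolating classifier
% \hat{f}_n trained on n samples whose labels are flipped independently
% with probability p (so the Bayes error is p). A sketch of the standard
% definition, not the paper's exact theorem statements.
\[
\lim_{n \to \infty} \operatorname{err}\bigl(\hat{f}_n\bigr) =
\begin{cases}
p & \text{benign: matches the Bayes error,}\\
\text{a value in } \bigl(p, \tfrac{1}{2}\bigr) & \text{tempered: worse than Bayes, yet non-trivial,}\\
\tfrac{1}{2} & \text{catastrophic: no better than random guessing.}
\end{cases}
\]
\end{document}

In this reading, the paper's result places both learning rules, the minimal interpolating DNN and the random interpolating DNN, in the middle regime: their test error grows with the noise level but remains bounded away from 1/2.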
