Spotlight
Spectrally-normalized margin bounds for neural networks
Peter Bartlett · Dylan J Foster · Matus Telgarsky

Wed Dec 06 11:50 AM -- 11:55 AM (PST) @ Hall A

We show that the margin distribution, normalized by a spectral complexity parameter, is strongly predictive of neural network generalization performance. Namely, we use the margin distribution to correctly predict whether deep neural networks generalize under changes to the label distribution, such as randomization; that is, the margin distribution accurately predicts the difficulty of deep learning tasks. We further show that normalizing the margin by the network's spectral complexity is critical to obtaining this predictive power, and finally we use the margin distribution to compare the generalization performance of multiple networks across different datasets on even terms. Our corresponding generalization bound places these results on rigorous theoretical footing.
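The normalization described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact construction: it uses the product of the layers' spectral norms (largest singular values) as a simplified proxy for the spectral complexity parameter, whereas the bound in the paper also includes an additional correction term per layer. The function names and the toy inputs are hypothetical.

```python
import numpy as np

def spectral_complexity(weights):
    # Simplified proxy: the product of layer spectral norms
    # (largest singular value of each weight matrix).
    return float(np.prod([np.linalg.norm(W, ord=2) for W in weights]))

def normalized_margins(logits, labels, weights):
    # Margin of each example: score of the true class minus the best
    # competing class score, normalized by the spectral complexity.
    n = len(labels)
    true_scores = logits[np.arange(n), labels]
    masked = logits.copy()
    masked[np.arange(n), labels] = -np.inf  # exclude the true class
    competitors = masked.max(axis=1)
    return (true_scores - competitors) / spectral_complexity(weights)

# Toy example: two diagonal "layers" with spectral norms 2 and 3,
# so the complexity proxy is 6.
weights = [2.0 * np.eye(2), 3.0 * np.eye(2)]
logits = np.array([[4.0, 1.0],
                   [0.0, 2.0]])
labels = np.array([0, 1])
print(normalized_margins(logits, labels, weights))  # raw margins [3, 2] scaled by 1/6
```

Plotting the distribution of these normalized margins over a dataset (rather than looking at any single margin) is what the abstract refers to as the margin distribution.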

Author Information

Peter Bartlett (UC Berkeley)

Peter Bartlett is professor of Computer Science and Statistics at the University of California at Berkeley, Associate Director of the Simons Institute for the Theory of Computing, and Director of the Foundations of Data Science Institute. He has previously held positions at the Queensland University of Technology, the Australian National University and the University of Queensland. His research interests include machine learning and statistical learning theory, and he is the co-author of the book Neural Network Learning: Theoretical Foundations. He has been Institute of Mathematical Statistics Medallion Lecturer, winner of the Malcolm McIntosh Prize for Physical Scientist of the Year, and Australian Laureate Fellow, and he is a Fellow of the IMS, Fellow of the ACM, and Fellow of the Australian Academy of Science.

Dylan J Foster (Cornell University)
Matus Telgarsky (UIUC)
