Poster
Spectrally-normalized margin bounds for neural networks
Peter Bartlett · Dylan J Foster · Matus Telgarsky

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #206

This paper presents a margin-based multiclass generalization bound for neural networks that scales with their margin-normalized "spectral complexity": their Lipschitz constant, meaning the product of the spectral norms of the weight matrices, times a certain correction factor. This bound is empirically investigated for a standard AlexNet trained with SGD on the MNIST and CIFAR-10 datasets, with both original and random labels; the bound, the Lipschitz constants, and the excess risks are all in direct correlation, suggesting both that SGD selects predictors whose complexity scales with the difficulty of the learning task, and that the presented bound is sensitive to this complexity.
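The margin-normalized spectral complexity described above can be made concrete with a short numerical sketch. The NumPy code below is illustrative only: the function name spectral_complexity, the toy layer shapes, and the margin value are invented for this example, and the specific (2,1)-norm form of the correction factor is an assumption based on the paper's definition rather than something stated on this page.

import numpy as np

def spectral_complexity(weights, margin):
    # Spectral norm (largest singular value) of each weight matrix.
    spec_norms = [np.linalg.norm(W, ord=2) for W in weights]
    # Lipschitz constant of the network: product of the spectral norms.
    lipschitz = np.prod(spec_norms)
    # Assumed correction factor: sum over layers of
    # (||W^T||_{2,1} / ||W||_spectral)^(2/3), all raised to the power 3/2,
    # where ||W^T||_{2,1} is the sum of the Euclidean norms of W's rows.
    ratios = [np.linalg.norm(W, ord=2, axis=1).sum() / s
              for W, s in zip(weights, spec_norms)]
    correction = sum(r ** (2.0 / 3.0) for r in ratios) ** 1.5
    # Margin normalization: a larger margin yields a smaller complexity.
    return lipschitz * correction / margin

# Toy usage: a random three-layer network and a margin of 1.0.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 32)),
          rng.standard_normal((32, 64)),
          rng.standard_normal((10, 32))]
print(spectral_complexity(layers, margin=1.0))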

Author Information

Peter Bartlett (UC Berkeley)

Peter Bartlett is professor of Computer Science and Statistics at the University of California at Berkeley, Associate Director of the Simons Institute for the Theory of Computing, and Director of the Foundations of Data Science Institute. He has previously held positions at the Queensland University of Technology, the Australian National University and the University of Queensland. His research interests include machine learning and statistical learning theory, and he is the co-author of the book Neural Network Learning: Theoretical Foundations. He has been Institute of Mathematical Statistics Medallion Lecturer, winner of the Malcolm McIntosh Prize for Physical Scientist of the Year, and Australian Laureate Fellow, and he is a Fellow of the IMS, Fellow of the ACM, and Fellow of the Australian Academy of Science.

Dylan J Foster (Cornell University)
Matus Telgarsky (UIUC)
