## A Universal Law of Robustness via Isoperimetry

### Sébastien Bubeck · Mark Sellke

##### Virtual

Keywords: [ Theory ] [ Deep Learning ] [ Adversarial Robustness and Security ] [ Robustness ]

Outstanding Paper
Tue 7 Dec 8:30 a.m. PST — 10 a.m. PST

Oral presentation: Oral Session 1: Deep Learning Theory and Causality
Tue 7 Dec midnight PST — 1 a.m. PST

Abstract: Classically, data interpolation with a parametrized model class is possible as long as the number of parameters exceeds the number of equations to be satisfied. A puzzling phenomenon in the current practice of deep learning is that models are trained with many more parameters than this classical theory would suggest. We propose a theoretical explanation for this phenomenon. We prove that for a broad class of data distributions and model classes, overparametrization is *necessary* if one wants to interpolate the data *smoothly*. Namely, we show that *smooth* interpolation requires $d$ times more parameters than mere interpolation, where $d$ is the ambient data dimension. We prove this universal law of robustness for any smoothly parametrized function class with polynomial-size weights and any covariate distribution verifying isoperimetry. In the case of two-layer neural networks and Gaussian covariates, this law was conjectured in prior work by Bubeck, Li, and Nagaraj. We also give an interpretation of our result as an improved generalization bound for model classes consisting of smooth functions.
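Quantitatively, the law says that, under the paper's assumptions, any model with $p$ parameters that fits $n$ noisy labels below the noise level must have Lipschitz constant of order at least $\sqrt{nd/p}$ (up to constants and logarithmic factors), so $p \gtrsim nd$ parameters are needed for an $O(1)$-Lipschitz interpolator. The snippet below is a minimal numerical sketch of this tradeoff, not taken from the paper: all sizes and the model family are illustrative assumptions. It fits random $\pm 1$ labels with a random-features ReLU model (frozen first layer, so the $p$ trainable parameters are the output weights) and reports the largest gradient norm over the training points, an empirical lower bound on the Lipschitz constant:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 30                                   # data points, ambient dimension
X = rng.standard_normal((n, d)) / np.sqrt(d)     # covariates concentrated near the unit sphere
y = rng.choice([-1.0, 1.0], size=n)              # random +/-1 labels (pure noise to fit)

def lipschitz_estimate(p):
    """Fit f(x) = a . relu(Wx) with p random ReLU features to (X, y) via
    minimum-norm least squares, then estimate the Lipschitz constant from
    gradient norms at the training points (a lower bound on the true constant)."""
    W = rng.standard_normal((p, d))              # fixed random first layer
    Phi = np.maximum(X @ W.T, 0.0)               # (n, p) feature matrix
    a, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # for p >= n this (near-)interpolates
    grads = ((X @ W.T > 0.0) * a) @ W            # gradient of f at each training point
    return np.linalg.norm(grads, axis=1).max()

for p in (n, 4 * n, 16 * n, 64 * n):
    print(f"p = {p:6d}   empirical Lipschitz constant ~ {lipschitz_estimate(p):.2f}")
```

As $p$ grows past $n$, the minimum-norm interpolator can spread the fit across many features, and the measured gradient norms should shrink, consistent with the $\sqrt{nd/p}$ scaling of the lower bound.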
