Poster
Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds and Benign Overfitting
Frederic Koehler · Lijia Zhou · Danica J. Sutherland · Nathan Srebro
We consider interpolation learning in high-dimensional linear regression with Gaussian data, and prove a generic uniform convergence guarantee on the generalization error of interpolators in an arbitrary hypothesis class in terms of the class’s Gaussian width. Applying the generic bound to Euclidean norm balls recovers the consistency result of Bartlett et al. (2020) for minimum-norm interpolators, and confirms a prediction of Zhou et al. (2020) for near-minimal-norm interpolators in the special case of Gaussian data. We demonstrate the generality of the bound by applying it to the simplex, obtaining a novel consistency result for minimum $\ell_1$-norm interpolators (basis pursuit). Our results show how norm-based generalization bounds can explain and be used to analyze benign overfitting, at least in some settings.
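The abstract refers to two specific interpolators: the minimum Euclidean-norm interpolator studied by Bartlett et al. (2020) and the minimum $\ell_1$-norm interpolator (basis pursuit), with risk controlled through the Gaussian width $w(K) = \mathbb{E}_{g \sim N(0, I_d)} \sup_{x \in K} \langle g, x \rangle$ of the hypothesis class $K$. The snippet below is a minimal illustrative sketch (not code from the paper) that computes both interpolators on synthetic isotropic Gaussian data; the sample size, dimension, sparsity of the planted signal, and noise level are all assumptions chosen for illustration, and SciPy's linprog is used to solve the basis-pursuit linear program.

```python
# Sketch: minimum l2-norm vs. minimum l1-norm (basis pursuit) interpolation
# in an overparameterized linear regression with Gaussian data (n < d).
# All problem sizes and noise levels below are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 50, 500                      # more features than samples
w_star = np.zeros(d)
w_star[:5] = 1.0                    # sparse planted signal (assumption)
X = rng.standard_normal((n, d))     # isotropic Gaussian design
y = X @ w_star + 0.1 * rng.standard_normal(n)

# Minimum l2-norm interpolator: w = X^+ y, the pseudoinverse solution of X w = y.
w_l2 = np.linalg.pinv(X) @ y

# Minimum l1-norm interpolator (basis pursuit): min ||w||_1 s.t. X w = y,
# written as a linear program over w = w_plus - w_minus with w_plus, w_minus >= 0.
c = np.ones(2 * d)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d), method="highs")
w_l1 = res.x[:d] - res.x[d:]

# Under isotropic features, the excess risk of w equals ||w - w_star||^2.
for name, w in [("min l2-norm", w_l2), ("min l1-norm", w_l1)]:
    print(f"{name}: train residual {np.linalg.norm(X @ w - y):.2e}, "
          f"excess risk {np.linalg.norm(w - w_star) ** 2:.3f}")
```

With an isotropic design as above, one would expect the basis-pursuit solution to recover the sparse signal much more accurately than the minimum $\ell_2$-norm interpolator; the benign-overfitting regime for the latter studied in the paper involves different (effectively spiked) covariance structures.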
Author Information
Frederic Koehler (MIT)
Lijia Zhou (University of Chicago)
Danica J. Sutherland (University of British Columbia)
Nathan Srebro (Toyota Technological Institute at Chicago)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Oral: Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds and Benign Overfitting
  Fri. Dec 10th 08:20 -- 08:35 AM
More from the Same Authors
- 2021 Spotlight: On the Power of Differentiable Learning versus PAC and SQ Learning
  Emmanuel Abbe · Pritish Kamath · Eran Malach · Colin Sandon · Nathan Srebro
- 2021: Exponential Family Model-Based Reinforcement Learning via Score Matching
  Gene Li · Junbo Li · Nathan Srebro · Zhaoran Wang · Zhuoran Yang
- 2022: Statistical Efficiency of Score Matching: The View from Isoperimetry
  Frederic Koehler · Alexander Heckett · Andrej Risteski
- 2022 Panel: Panel 2C-7: Optimal Rates for… & Reconstruction on Trees…
  Frederic Koehler · Zhu Li
- 2022 Poster: A Non-Asymptotic Moreau Envelope Theory for High-Dimensional Generalized Linear Models
  Lijia Zhou · Frederic Koehler · Pragya Sur · Danica J. Sutherland · Nati Srebro
- 2022 Poster: Reconstruction on Trees and Low-Degree Polynomials
  Frederic Koehler · Elchanan Mossel
- 2022 Poster: Lower Bounds on Randomly Preconditioned Lasso via Robust Sparse Designs
  Jonathan Kelner · Frederic Koehler · Raghu Meka · Dhruv Rohatgi
- 2021 Poster: On the Power of Differentiable Learning versus PAC and SQ Learning
  Emmanuel Abbe · Pritish Kamath · Eran Malach · Colin Sandon · Nathan Srebro
- 2021 Poster: Representation Costs of Linear Neural Networks: Analysis and Design
  Zhen Dai · Mina Karzand · Nathan Srebro
- 2021 Poster: An Even More Optimal Stochastic Optimization Algorithm: Minibatching and Interpolation Learning
  Blake Woodworth · Nathan Srebro
- 2021 Poster: A Stochastic Newton Algorithm for Distributed Convex Optimization
  Brian Bullins · Kshitij Patel · Ohad Shamir · Nathan Srebro · Blake Woodworth
- 2021 Poster: Self-Supervised Learning with Kernel Dependence Maximization
  Yazhe Li · Roman Pogodin · Danica J. Sutherland · Arthur Gretton
- 2021 Poster: Meta Two-Sample Testing: Learning Kernels for Testing with Limited Data
  Feng Liu · Wenkai Xu · Jie Lu · Danica J. Sutherland
- 2020 Poster: Learning Some Popular Gaussian Graphical Models without Condition Number Bounds
  Jonathan Kelner · Frederic Koehler · Raghu Meka · Ankur Moitra
- 2020 Poster: From Boltzmann Machines to Neural Networks and Back Again
  Surbhi Goel · Adam Klivans · Frederic Koehler
- 2020 Spotlight: Learning Some Popular Gaussian Graphical Models without Condition Number Bounds
  Jonathan Kelner · Frederic Koehler · Raghu Meka · Ankur Moitra
- 2020 Poster: On Uniform Convergence and Low-Norm Interpolation Learning
  Lijia Zhou · Danica J. Sutherland · Nati Srebro
- 2020 Spotlight: On Uniform Convergence and Low-Norm Interpolation Learning
  Lijia Zhou · Danica J. Sutherland · Nati Srebro
- 2020 Poster: Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Evolvability
  Sitan Chen · Frederic Koehler · Ankur Moitra · Morris Yau
- 2020 Spotlight: Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Evolvability
  Sitan Chen · Frederic Koehler · Ankur Moitra · Morris Yau
- 2019 Poster: Fast Convergence of Belief Propagation to Global Optima: Beyond Correlation Decay
  Frederic Koehler
- 2019 Spotlight: Fast Convergence of Belief Propagation to Global Optima: Beyond Correlation Decay
  Frederic Koehler
- 2019 Tutorial: Interpretable Comparison of Distributions and Models
  Wittawat Jitkrittum · Danica J. Sutherland · Arthur Gretton
- 2018 Poster: On gradient regularizers for MMD GANs
  Michael Arbel · Danica J. Sutherland · Mikołaj Bińkowski · Arthur Gretton
- 2017 Poster: Information Theoretic Properties of Markov Random Fields, and their Algorithmic Applications
  Linus Hamilton · Frederic Koehler · Ankur Moitra