Poster
Towards a Unified Information-Theoretic Framework for Generalization
Mahdi Haghifam · Gintare Karolina Dziugaite · Shay Moran · Dan Roy
In this work, we investigate the expressiveness of the "conditional mutual information" (CMI) framework of Steinke and Zakynthinou (2020) and the prospect of using it to provide a unified framework for proving generalization bounds in the realizable setting. We first demonstrate that one can use this framework to express non-trivial (but sub-optimal) bounds for any learning algorithm that outputs hypotheses from a class of bounded VC dimension. We then explore two directions of strengthening this bound: (i) Can the CMI framework express optimal bounds for VC classes? (ii) Can the CMI framework be used to analyze algorithms whose output hypothesis space is unrestricted (i.e., has an unbounded VC dimension)? With respect to Item (i), we prove that the CMI framework yields the optimal bound on the expected risk of Support Vector Machines (SVMs) for learning halfspaces. This result is an application of our general result showing that stable compression schemes (Bousquet et al., 2020) of size $k$ have uniformly bounded CMI of order $O(k)$. We further show that an inherent limitation of proper learning of VC classes contradicts the existence of a proper learner with constant CMI, which implies a negative resolution to an open problem of Steinke and Zakynthinou (2020). We further study the CMI of empirical risk minimizers (ERMs) of a class $H$ and show that it is possible to output all consistent classifiers (the version space) with bounded CMI if and only if $H$ has a bounded star number (Hanneke and Yang, 2015). With respect to Item (ii), we prove a general reduction showing that "leave-one-out" analysis is expressible via the CMI framework. As a corollary, we investigate the CMI of the one-inclusion-graph algorithm proposed by Haussler et al. (1994).
More generally, we show that the CMI framework is universal in the sense that for every consistent algorithm and data distribution, the expected risk vanishes as the number of samples diverges if and only if its evaluated CMI has sublinear growth with the number of samples.
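For reference, a sketch of the CMI quantity the abstract builds on, in the notation of Steinke and Zakynthinou (2020); the symbols below are a paraphrase for orientation, not the poster's own statement:

```latex
% Supersample: \tilde{Z} \in \mathcal{Z}^{n \times 2} holds 2n i.i.d. draws
% from the data distribution \mathcal{D}; the selector
% J \sim \mathrm{Unif}(\{0,1\}^n) picks one entry per row to form the
% training set S. The CMI of algorithm A is
\mathrm{CMI}_{\mathcal{D}}(A) \;=\; I\bigl(A(S);\, J \,\big|\, \tilde{Z}\bigr).

% A representative bound from that framework: the expected generalization
% gap is controlled by the CMI per sample,
\mathbb{E}\bigl[\, L_{\mathcal{D}}(A(S)) - L_{S}(A(S)) \,\bigr]
  \;\le\; \sqrt{\frac{2\,\mathrm{CMI}_{\mathcal{D}}(A)}{n}}\, .
```

Under this bound, sublinear growth of the CMI in $n$ forces the expected gap to vanish, which is the direction formalized by the universality claim above.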
Author Information
Mahdi Haghifam (University of Toronto/Vector Institute)
Gintare Karolina Dziugaite (Google Research, Brain Team)
Shay Moran (Technion)
Dan Roy (University of Toronto)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: Towards a Unified Information-Theoretic Framework for Generalization
More from the Same Authors
- 2021 : Stochastic Pruning: Fine-Tuning, and PAC-Bayes bound optimization
  Soufiane Hayou · Bobby He · Gintare Karolina Dziugaite
- 2021 : The Dynamics of Functional Diversity throughout Neural Network Training
  Lee Zamparo · Marc-Etienne Brunet · Thomas George · Sepideh Kharaghani · Gintare Karolina Dziugaite
- 2022 Poster: Integral Probability Metrics PAC-Bayes Bounds
  Ron Amit · Baruch Epstein · Shay Moran · Ron Meir
- 2022 Poster: Universal Rates for Interactive Learning
  Steve Hanneke · Amin Karbasi · Shay Moran · Grigoris Velegkas
- 2022 Poster: On Optimal Learning Under Targeted Data Poisoning
  Steve Hanneke · Amin Karbasi · Mohammad Mahmoody · Idan Mehalel · Shay Moran
- 2021 Poster: The future is log-Gaussian: ResNets and their infinite-depth-and-width limit at initialization
  Mufan Li · Mihai Nica · Dan Roy
- 2021 Poster: Minimax Optimal Quantile and Semi-Adversarial Regret via Root-Logarithmic Regularizers
  Jeffrey Negrea · Blair Bilodeau · Nicolò Campolongo · Francesco Orabona · Dan Roy
- 2021 Poster: Deep Learning on a Data Diet: Finding Important Examples Early in Training
  Mansheej Paul · Surya Ganguli · Gintare Karolina Dziugaite
- 2021 Poster: Multiclass Boosting and the Cost of Weak Learning
  Nataly Brukhim · Elad Hazan · Shay Moran · Indraneel Mukherjee · Robert Schapire
- 2020 Poster: Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms
  Mahdi Haghifam · Jeffrey Negrea · Ashish Khisti · Daniel Roy · Gintare Karolina Dziugaite
- 2020 Poster: Synthetic Data Generators -- Sequential and Private
  Olivier Bousquet · Roi Livni · Shay Moran
- 2020 Poster: Learning from Mixtures of Private and Public Populations
  Raef Bassily · Shay Moran · Anupama Nandi
- 2020 Poster: Online Agnostic Boosting via Regret Minimization
  Nataly Brukhim · Xinyi Chen · Elad Hazan · Shay Moran
- 2020 Poster: A Limitation of the PAC-Bayes Framework
  Roi Livni · Shay Moran
- 2019 Poster: Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates
  Jeffrey Negrea · Mahdi Haghifam · Gintare Karolina Dziugaite · Ashish Khisti · Daniel Roy
- 2019 Poster: Private Learning Implies Online Learning: An Efficient Reduction
  Alon Gonen · Elad Hazan · Shay Moran
- 2019 Spotlight: Private Learning Implies Online Learning: An Efficient Reduction
  Alon Gonen · Elad Hazan · Shay Moran
- 2019 Poster: An adaptive nearest neighbor rule for classification
  Akshay Balsubramani · Sanjoy Dasgupta · Yoav Freund · Shay Moran
- 2019 Spotlight: An adaptive nearest neighbor rule for classification
  Akshay Balsubramani · Sanjoy Dasgupta · Yoav Freund · Shay Moran
- 2019 Poster: Learning to Screen
  Alon Cohen · Avinatan Hassidim · Haim Kaplan · Yishay Mansour · Shay Moran
- 2019 Poster: Limits of Private Learning with Access to Public Data
  Raef Bassily · Shay Moran · Noga Alon
- 2018 Poster: Data-dependent PAC-Bayes priors via differential privacy
  Gintare Karolina Dziugaite · Daniel Roy
- 2017 Poster: Submultiplicative Glivenko-Cantelli and Uniform Convergence of Revenues
  Noga Alon · Moshe Babaioff · Yannai A. Gonczarowski · Yishay Mansour · Shay Moran · Amir Yehudayoff
- 2017 Spotlight: Submultiplicative Glivenko-Cantelli and Uniform Convergence of Revenues
  Noga Alon · Moshe Babaioff · Yannai A. Gonczarowski · Yishay Mansour · Shay Moran · Amir Yehudayoff
- 2016 Poster: Supervised learning through the lens of compression
  Ofir David · Shay Moran · Amir Yehudayoff
- 2016 Oral: Supervised learning through the lens of compression
  Ofir David · Shay Moran · Amir Yehudayoff