This work continues the study of the relationship between sample compression schemes and statistical learning, which has so far been investigated mostly within the framework of binary classification. We first extend the investigation to multiclass categorization: we prove that in this setting learnability is equivalent to compression of logarithmic sample size, and that the uniform convergence property implies compression of constant size. We use the compressibility-learnability equivalence to (i) show that, for multiclass categorization, PAC and agnostic PAC learnability are equivalent, and (ii) derive a compactness theorem for learnability. We then consider supervised learning under general loss functions: we show that in this case, in order to maintain the compressibility-learnability equivalence, it is necessary to consider an approximate variant of compression. We use this variant to show that PAC and agnostic PAC learnability are no longer equivalent, even when the loss function takes only three values.
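For readers unfamiliar with the central notion in the abstract, the sketch below (not from the paper) illustrates a sample compression scheme in the standard Littlestone-Warmuth sense for the textbook class of one-dimensional thresholds: a compression map keeps at most one labeled example, and a reconstruction map recovers, from that kept subsample alone, a hypothesis consistent with the entire realizable sample. The function names and the threshold class are chosen only for illustration.

```python
# A minimal sketch (illustration only, not the paper's construction) of a
# sample compression scheme of size 1 for one-dimensional threshold
# classifiers h_t(x) = 1 iff x >= t: compress keeps a small subsample,
# reconstruct returns a hypothesis depending only on that subsample.

def compress(sample):
    """Keep at most one labeled point: the smallest positively labeled x."""
    positives = [x for x, y in sample if y == 1]
    return [(min(positives), 1)] if positives else []

def reconstruct(subsample):
    """Return a hypothesis determined only by the kept subsample."""
    if not subsample:
        return lambda x: 0            # no positives kept: all-negative hypothesis
    threshold = subsample[0][0]
    return lambda x: int(x >= threshold)

# Usage: a sample realizable by the threshold t = 2.5.
sample = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)   # consistent with the full sample
```

The paper's results concern how the minimal size of such schemes relates to learnability (logarithmic in the sample size for multiclass learnability, constant under uniform convergence); the constant-size scheme above is only the warm-up example.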
Author Information
Ofir David (Technion - Israel Institute of Technology)
Shay Moran (Technion - Israel Institute of Technology)
Amir Yehudayoff (Technion - Israel Institute of Technology)
More from the Same Authors
- 2021 Spotlight: Towards a Unified Information-Theoretic Framework for Generalization »
  Mahdi Haghifam · Gintare Karolina Dziugaite · Shay Moran · Dan Roy
- 2022 Poster: Integral Probability Metrics PAC-Bayes Bounds »
  Ron Amit · Baruch Epstein · Shay Moran · Ron Meir
- 2022 Poster: Universal Rates for Interactive Learning »
  Steve Hanneke · Amin Karbasi · Shay Moran · Grigoris Velegkas
- 2022 Poster: On Optimal Learning Under Targeted Data Poisoning »
  Steve Hanneke · Amin Karbasi · Mohammad Mahmoody · Idan Mehalel · Shay Moran
- 2021 Poster: Multiclass Boosting and the Cost of Weak Learning »
  Nataly Brukhim · Elad Hazan · Shay Moran · Indraneel Mukherjee · Robert Schapire
- 2021 Poster: Towards a Unified Information-Theoretic Framework for Generalization »
  Mahdi Haghifam · Gintare Karolina Dziugaite · Shay Moran · Dan Roy
- 2020 Poster: Synthetic Data Generators -- Sequential and Private »
  Olivier Bousquet · Roi Livni · Shay Moran
- 2020 Poster: Learning from Mixtures of Private and Public Populations »
  Raef Bassily · Shay Moran · Anupama Nandi
- 2020 Poster: Online Agnostic Boosting via Regret Minimization »
  Nataly Brukhim · Xinyi Chen · Elad Hazan · Shay Moran
- 2020 Poster: A Limitation of the PAC-Bayes Framework »
  Roi Livni · Shay Moran
- 2019 Poster: Private Learning Implies Online Learning: An Efficient Reduction »
  Alon Gonen · Elad Hazan · Shay Moran
- 2019 Spotlight: Private Learning Implies Online Learning: An Efficient Reduction »
  Alon Gonen · Elad Hazan · Shay Moran
- 2019 Poster: An adaptive nearest neighbor rule for classification »
  Akshay Balsubramani · Sanjoy Dasgupta · Yoav Freund · Shay Moran
- 2019 Spotlight: An adaptive nearest neighbor rule for classification »
  Akshay Balsubramani · Sanjoy Dasgupta · Yoav Freund · Shay Moran
- 2019 Poster: Learning to Screen »
  Alon Cohen · Avinatan Hassidim · Haim Kaplan · Yishay Mansour · Shay Moran
- 2019 Poster: Limits of Private Learning with Access to Public Data »
  Raef Bassily · Shay Moran · Noga Alon
- 2017 Poster: Submultiplicative Glivenko-Cantelli and Uniform Convergence of Revenues »
  Noga Alon · Moshe Babaioff · Yannai A. Gonczarowski · Yishay Mansour · Shay Moran · Amir Yehudayoff
- 2017 Spotlight: Submultiplicative Glivenko-Cantelli and Uniform Convergence of Revenues »
  Noga Alon · Moshe Babaioff · Yannai A. Gonczarowski · Yishay Mansour · Shay Moran · Amir Yehudayoff
- 2016 Oral: Supervised learning through the lens of compression »
  Ofir David · Shay Moran · Amir Yehudayoff