Poster
Bilevel Coreset Selection in Continual Learning: A New Formulation and Algorithm
Jie Hao · Kaiyi Ji · Mingrui Liu

Tue Dec 12 03:15 PM -- 05:15 PM (PST) @ Great Hall & Hall B1+B2 #1023
Event URL: https://github.com/MingruiLiu-ML-Lab/Bilevel-Coreset-Selection-via-Regularization
A coreset is a small set that summarizes a large dataset, such that training solely on the coreset achieves performance competitive with training on the full dataset. In rehearsal-based continual learning, the coreset is typically used in the memory replay buffer to store representative samples from previous tasks, and the coreset selection procedure is typically formulated as a bilevel problem. However, the typical bilevel formulation for coreset selection explicitly performs optimization over discrete decision variables with greedy search, which is computationally expensive. Several works consider other formulations to address this issue, but they ignore the nested nature of bilevel optimization and may not solve the bilevel coreset selection problem accurately. To address these issues, we propose a new bilevel formulation, where the inner problem finds a model that minimizes the expected training error under a given probability distribution over samples, and the outer problem learns a probability distribution with approximately $K$ (coreset size) nonzero entries such that the model learned in the inner problem minimizes the training error over the whole dataset. To ensure that the learned distribution has approximately $K$ nonzero entries, we introduce a novel regularizer based on the smoothed top-$K$ loss in the outer problem. We design a new optimization algorithm that provably converges to an $\epsilon$-stationary point with $O(1/\epsilon^4)$ computational complexity. We conduct extensive experiments in various continual learning settings, including balanced data, imbalanced data, and label noise, showing that our proposed formulation and new algorithm significantly outperform competitive baselines. From a bilevel optimization point of view, our algorithm significantly improves upon the vanilla greedy coreset selection method in terms of running time on continual learning benchmark datasets. The code is available at https://github.com/MingruiLiu-ML-Lab/Bilevel-Coreset-Selection-via-Regularization.
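For concreteness, the structure described in the abstract can be sketched as the following bilevel problem (the symbols $w$, $\theta$, $z_i$, $\ell$, $\lambda$, and $R_K$ are illustrative notation chosen here, not necessarily the paper's own):

$$\min_{w \in \Delta_n} \; \frac{1}{n} \sum_{i=1}^{n} \ell\big(\theta^*(w);\, z_i\big) \;+\; \lambda\, R_K(w) \quad \text{s.t.} \quad \theta^*(w) \in \arg\min_{\theta} \; \sum_{i=1}^{n} w_i\, \ell(\theta;\, z_i),$$

where $\Delta_n$ is the probability simplex over the $n$ training samples $z_1, \dots, z_n$, the inner objective is the expected training error under the sampling distribution $w$, and $R_K(w)$ is a smoothed top-$K$ regularizer that penalizes distributions whose mass spreads over many more than $K$ entries, so the learned $w$ effectively selects a coreset of roughly $K$ samples.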

Author Information

Jie Hao (George Mason University)
Kaiyi Ji (University at Buffalo)

Kaiyi Ji is an assistant professor in the Department of Computer Science and Engineering at the University at Buffalo. He was a postdoctoral research fellow in the Electrical Engineering and Computer Science Department at the University of Michigan, Ann Arbor, in 2022, working with Prof. Lei Ying. He received his Ph.D. from the Department of Electrical and Computer Engineering at The Ohio State University in December 2021, advised by Prof. Yingbin Liang. He was a visiting student research collaborator in the Department of Electrical Engineering at Princeton University, working with Prof. H. Vincent Poor. He previously obtained his B.S. degree from the University of Science and Technology of China in 2016.

Mingrui Liu (George Mason University)