Poster in Workshop: Human in the Loop Learning (HiLL) Workshop at NeurIPS 2022

Batch Active Learning from the Perspective of Sparse Approximation

Maohao Shen · Yibo Jacky Zhang · Bowen Jiang · Sanmi Koyejo


Abstract:

Active learning enables efficient model training by leveraging interactions between machine learning agents and human annotators. We propose a novel framework that formulates batch active learning from the perspective of sparse approximation. Our active learning method aims to find an informative subset of the unlabeled data pool such that the corresponding training loss function approximates that of the full data pool. We realize the framework as sparsity-constrained discontinuous optimization problems, which explicitly balance uncertainty and representation for large-scale applications and can be solved by greedy or proximal iterative hard thresholding algorithms. The proposed method adapts to various settings, including both Bayesian and non-Bayesian neural networks. Numerical experiments show that our method achieves competitive performance across different settings with lower computational complexity.
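To make the sparsity-constrained formulation concrete, the sketch below shows a hypothetical gradient-matching instantiation solved with proximal iterative hard thresholding (IHT): a gradient step on a least-squares approximation loss, followed by a hard-thresholding step that keeps only the k largest-magnitude pool weights. This is not the paper's exact objective or algorithm; the function iht_select, the per-example embedding matrix G, and all parameters are illustrative assumptions.

```python
import numpy as np

def iht_select(G, k, n_iters=200):
    """Select a batch of k pool points via iterative hard thresholding (IHT).

    Hypothetical objective (assumption, not the paper's exact loss):
        minimize 0.5 * || G.T @ w - G.sum(axis=0) ||^2
        subject to ||w||_0 <= k
    i.e. find sparse pool weights w whose weighted embedding sum
    approximates the full-pool sum.

    G : (n, d) array of per-example embeddings (e.g. gradient features).
    k : labeling budget (batch size).
    """
    n, d = G.shape
    target = G.sum(axis=0)                    # full-pool statistic to match
    lr = 1.0 / np.linalg.norm(G, 2) ** 2      # 1/L step size, L = ||G||_2^2
    w = np.zeros(n)
    for _ in range(n_iters):
        residual = G.T @ w - target           # current approximation error
        w = w - lr * (G @ residual)           # gradient step on the LS loss
        # proximal step for the l0 constraint: keep the k largest-magnitude
        # weights and zero out the rest (hard thresholding)
        keep = np.argsort(np.abs(w))[-k:]
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        w[~mask] = 0.0
    return np.flatnonzero(w)                  # indices of the selected batch

# Toy usage: pick a batch of 10 from a pool of 500 random embeddings.
rng = np.random.default_rng(0)
G = rng.standard_normal((500, 32))
print(iht_select(G, k=10))
```

The 1/L step size (with L the squared spectral norm of G) guarantees the gradient step is non-expansive for this quadratic loss; a greedy variant would instead add one index at a time, which trades the thresholding projection for per-step subset growth.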
