

Poster

Evidential Mixture Machines: Deciphering Multi-Label Correlations for Active Learning Sensitivity

Dayou Yu · Minghao Li · Weishi Shi · Qi Yu

West Ballroom A-D #6706
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Multi-label active learning is an essential and challenging aspect of contemporary machine learning, often hindered by the complexity of managing expansive and sparse label spaces. This challenge is intensified in active learning scenarios where labeling resources are limited. Drawing inspiration from existing mixture-of-Bernoulli models, which compress the label space into a more manageable weight-coefficient space by learning correlated Bernoulli components, we introduce a novel evidential mixture machines (EMM) model. EMM leverages mixture components obtained through unsupervised learning in the label space and improves prediction accuracy by predicting the coefficients evidentially and aggregating component offset predictions as proxy pseudo counts. The evidential learning of coefficients provides an uncertainty-aware connection from the input features to the predicted coefficients and components. Furthermore, our approach combines the evidential uncertainty with predicted label-embedding covariances for active sample selection, yielding a multi-source uncertainty metric that is richer than simple uncertainty scores. Experiments on synthetic data demonstrate the effectiveness of the evidential uncertainty prediction and of capturing label correlations through the predicted components, while experiments on real-world datasets show improved performance over existing multi-label active learning methods.
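To make the abstract's ingredients concrete, the sketch below illustrates the general pattern it describes, not the authors' EMM implementation: a mixture of Bernoulli components fit to a binary label matrix, a Dirichlet-style "evidential" uncertainty over the mixture coefficients, and a toy multi-source acquisition score that blends that uncertainty with a predicted label-embedding covariance. All function names, the EM fitting routine, and the scoring formula are assumptions for illustration only.

```python
# Illustrative sketch only (not the EMM model from the paper): Bernoulli mixture
# over a binary label space + Dirichlet-based evidential uncertainty over the
# mixture coefficients + a simple multi-source acquisition score.
import numpy as np

def bernoulli_mixture_em(Y, n_components, n_iter=50, seed=0):
    """Fit a mixture of Bernoulli components to a binary label matrix Y (N x L)
    with plain EM; returns mixing weights pi (K,) and component means mu (K, L)."""
    rng = np.random.default_rng(seed)
    N, L = Y.shape
    mu = rng.uniform(0.25, 0.75, size=(n_components, L))
    pi = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to
        # pi_k * prod_l mu_kl^y_nl * (1 - mu_kl)^(1 - y_nl)
        log_lik = Y @ np.log(mu.T + 1e-12) + (1 - Y) @ np.log(1 - mu.T + 1e-12)
        log_r = np.log(pi + 1e-12) + log_lik
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update component means and mixing weights
        Nk = r.sum(axis=0) + 1e-12
        mu = (r.T @ Y) / Nk[:, None]
        pi = Nk / N
    return pi, mu

def evidential_vacuity(evidence):
    """Uncertainty ("vacuity") of a Dirichlet over K mixture coefficients:
    K / (K + total evidence); low total evidence pushes the score toward 1."""
    K = evidence.shape[-1]
    return K / (K + evidence.sum(axis=-1))

def acquisition_score(evidence, label_cov, w=0.5):
    """Toy multi-source score: blend per-sample evidential vacuity with the
    trace of a predicted label-embedding covariance (assumed available)."""
    return w * evidential_vacuity(evidence) + (1 - w) * np.trace(label_cov, axis1=-2, axis2=-1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic sparse multi-label data: 200 samples, 30 labels
    Y = (rng.random((200, 30)) < 0.1).astype(float)
    pi, mu = bernoulli_mixture_em(Y, n_components=5)
    print("mixing weights:", np.round(pi, 3))
    # Placeholder per-sample evidence over the 5 coefficients and a fake
    # predicted label-embedding covariance; a real model would output these.
    evidence = rng.gamma(2.0, 1.0, size=(200, 5))
    cov = np.stack([np.eye(8) * rng.uniform(0.1, 1.0) for _ in range(200)])
    scores = acquisition_score(evidence, cov)
    print("top-5 samples to query:", np.argsort(-scores)[:5])
```

The intent of the sketch is the division of labor the abstract outlines: the label space is compressed into a small set of correlated Bernoulli components, per-sample uncertainty about the component coefficients drives acquisition, and a second uncertainty source (here a covariance trace, chosen arbitrarily) is mixed in rather than relying on a single score.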
