

Poster

Combinatorial Inference against Label Noise

Paul Hongsuck Seo · Geeho Kim · Bohyung Han

East Exhibition Hall B + C #125

Keywords: [ Large Scale Learning ] [ Algorithms ] [ Deep Learning ] [ Supervised Deep Networks ]


Abstract: Label noise is one of the critical factors that significantly degrade the generalization performance of deep neural networks. To handle the label noise issue in a principled way, we propose a classification framework that constructs multiple models in heterogeneous coarse-grained meta-class spaces and makes joint inference over the trained models to obtain the final predictions in the original (base) class space. Our approach reduces the effective noise level simply by constructing meta-classes and improves accuracy via combinatorial inference over multiple constituent classifiers. Since the proposed framework has distinct and complementary properties for the given problem, we can also incorporate additional off-the-shelf learning algorithms to further improve accuracy. We also introduce techniques to organize multiple heterogeneous meta-class sets using $k$-means clustering and to identify a desirable subset of them, leading to compact models. Our extensive experiments demonstrate outstanding performance in terms of accuracy and efficiency compared to state-of-the-art methods under various synthetic noise configurations and on a real-world noisy dataset.
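The abstract describes two ingredients: grouping base classes into heterogeneous coarse-grained meta-class partitions (via $k$-means), and combining the meta-class predictions of the constituent classifiers to score base classes. The sketch below illustrates these two steps only; it is not the paper's exact method. The class prototypes fed to $k$-means, the number of partitions and meta-classes, and the product-of-posteriors (sum of log-probabilities) combination rule are all assumptions made for illustration.

# Minimal sketch of meta-class construction and combinatorial inference,
# under the assumptions stated above (hypothetical prototypes, simple
# sum-of-log-posteriors combination; not the authors' exact procedure).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

num_base_classes = 10   # size of the original (base) class space
num_partitions = 3      # number of heterogeneous meta-class spaces
num_meta_classes = 4    # coarse-grained classes per partition

# Hypothetical class prototypes (e.g., averaged features per base class).
class_prototypes = rng.normal(size=(num_base_classes, 16))

# Build heterogeneous meta-class assignments with k-means, varying the seed
# so each partition groups the base classes differently.
partitions = []
for seed in range(num_partitions):
    km = KMeans(n_clusters=num_meta_classes, n_init=10, random_state=seed)
    partitions.append(km.fit_predict(class_prototypes))  # base class -> meta-class

# Stand-ins for the constituent classifiers' outputs on one test example:
# each partition's model yields a posterior over its own meta-classes.
meta_posteriors = [rng.dirichlet(np.ones(num_meta_classes)) for _ in partitions]

# Combinatorial inference: score each base class by accumulating the
# log-probability of the meta-class containing it in every partition.
scores = np.zeros(num_base_classes)
for assignment, posterior in zip(partitions, meta_posteriors):
    scores += np.log(posterior[assignment] + 1e-12)

predicted_base_class = int(np.argmax(scores))
print("meta-class assignments per partition:", partitions)
print("predicted base class:", predicted_base_class)

In this toy setup the random posteriors stand in for trained networks; the point is that each coarse classifier only needs to distinguish meta-classes (a lower-noise problem), while intersecting the partitions recovers a prediction in the full base-class space.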
