

Poster

The Best of Both Worlds: On the Dilemma of Out-of-distribution Detection

Qingyang Zhang · Qiuxuan Feng · Joey Tianyi Zhou · Yatao Bian · Qinghua Hu · Changqing Zhang

East Exhibit Hall A-C #4408
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Out-of-distribution (OOD) detection is essential for model trustworthiness: a trustworthy model should sensitively identify semantic OOD samples while robustly generalizing to covariate-shifted OOD samples. However, we discover that the superior OOD detection performance of state-of-the-art methods is achieved by secretly sacrificing OOD generalization ability: classification accuracy frequently collapses catastrophically when even slight noise is encountered. This phenomenon contradicts the goal of trustworthiness and significantly limits such models' deployment in the real world. What is the hidden reason behind this limitation? In this work, we theoretically demystify the "sensitive-robust" dilemma underlying previous OOD detection methods and derive a theory-inspired algorithm to overcome it. By decoupling the uncertainty learning objective from a Bayesian perspective, the conflict between OOD detection and OOD generalization is naturally harmonized, and strong performance on both can be expected. Empirical studies show that our method achieves superior performance on commonly used benchmarks. To the best of our knowledge, this is the first principled OOD detection method that achieves state-of-the-art OOD detection performance without sacrificing OOD generalization ability. Our code is available at https://github.com/QingyangZhang/DUL.
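
The abstract describes decoupling the uncertainty learning objective from the classification objective. The sketch below is not the authors' DUL method (see the linked repository for that); it is a minimal PyTorch illustration of what decoupled objectives can look like: cross-entropy trains the classifier on in-distribution data, while a separate uncertainty head is supervised by its own ID-vs-outlier loss, so the uncertainty objective never acts on the classification head directly (the shared backbone still receives gradients from both). The network layout, the loss weight lam, and the use of auxiliary outliers are all assumptions made for illustration.

```python
# Minimal sketch of decoupled objectives for OOD detection (illustrative only,
# NOT the DUL implementation). Assumptions: a shared backbone with two heads,
# an auxiliary outlier set for the uncertainty head, and a fixed loss weight.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DecoupledNet(nn.Module):
    def __init__(self, in_dim: int, num_classes: int, hidden: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.cls_head = nn.Linear(hidden, num_classes)  # semantic prediction
        self.unc_head = nn.Linear(hidden, 1)            # scalar OOD/uncertainty score

    def forward(self, x):
        h = self.backbone(x)
        return self.cls_head(h), self.unc_head(h).squeeze(-1)


def training_step(model, opt, x_id, y_id, x_ood, lam: float = 0.5):
    """One step with decoupled objectives: cross-entropy for classification,
    a binary ID-vs-outlier loss for the uncertainty head only."""
    logits_id, unc_id = model(x_id)
    _, unc_ood = model(x_ood)

    cls_loss = F.cross_entropy(logits_id, y_id)
    # Uncertainty head: low score on ID samples, high score on auxiliary outliers.
    unc_loss = F.binary_cross_entropy_with_logits(
        torch.cat([unc_id, unc_ood]),
        torch.cat([torch.zeros_like(unc_id), torch.ones_like(unc_ood)]),
    )
    loss = cls_loss + lam * unc_loss

    opt.zero_grad()
    loss.backward()
    opt.step()
    return cls_loss.item(), unc_loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = DecoupledNet(in_dim=32, num_classes=10)
    opt = torch.optim.SGD(model.parameters(), lr=1e-2)
    x_id = torch.randn(64, 32)
    y_id = torch.randint(0, 10, (64,))
    x_ood = torch.randn(64, 32) + 3.0  # stand-in for auxiliary outliers
    print(training_step(model, opt, x_id, y_id, x_ood))
```

The design point this sketch tries to convey is that the OOD score is read from a dedicated head rather than from the classification logits, so sharpening OOD separation does not have to distort the decision boundary used for covariate-shifted inputs; how DUL realizes this from a Bayesian perspective is detailed in the paper and repository.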
