Detecting out-of-distribution (OOD) data has become a critical component in ensuring the safe deployment of machine learning models in the real world. Existing OOD detection approaches primarily rely on the output or feature space for deriving OOD scores, while largely overlooking information from the gradient space. In this paper, we present GradNorm, a simple and effective approach for detecting OOD inputs by utilizing information extracted from the gradient space. GradNorm directly employs the vector norm of gradients, backpropagated from the KL divergence between the softmax output and a uniform probability distribution. Our key idea is that the magnitude of gradients is higher for in-distribution (ID) data than for OOD data, making it informative for OOD detection. GradNorm demonstrates superior performance, reducing the average FPR95 by up to 16.33% compared to the previous best method.
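The score described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a linear last layer with logits z = W h, and uses the fact that for the loss derived from the KL divergence to the uniform distribution, the gradient with respect to the logits is (softmax(z) - 1/C), so the gradient with respect to W is the outer product of that vector with the feature vector h. The function name `gradnorm_score` and the use of the L1 norm without a temperature term are simplifying assumptions for this sketch.

```python
import numpy as np

def gradnorm_score(h, W):
    """Sketch of a GradNorm-style OOD score (assumptions noted in the text):
    L1 norm of the gradient of the KL-to-uniform loss w.r.t. the
    last-layer weights W, for a feature vector h and logits z = W @ h."""
    z = W @ h
    z = z - z.max()                      # shift logits for numerical stability
    p = np.exp(z) / np.exp(z).sum()      # softmax probabilities
    C = p.shape[0]                       # number of classes
    # Gradient of the KL-derived loss w.r.t. the logits is (p - 1/C);
    # w.r.t. W it is the outer product (p - 1/C) h^T.
    g = p - 1.0 / C
    return np.abs(np.outer(g, h)).sum() # L1 norm of the weight gradient
```

Note the behavior this sketch exposes: when the softmax output is exactly uniform (e.g., all-zero logits), the gradient vanishes and the score is zero, while a confident, peaked output yields a large score, matching the intuition that ID inputs produce larger gradient magnitudes than OOD inputs.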
Author Information
Rui Huang (University of Wisconsin-Madison)
Andrew Geng (University of Wisconsin-Madison)
Yixuan Li (University of Wisconsin-Madison)
More from the Same Authors
- 2021 Poster: Can multi-label classification networks know what they don’t know?
  Haoran Wang · Weitang Liu · Alex Bocchieri · Yixuan Li
- 2021 Poster: ReAct: Out-of-distribution Detection With Rectified Activations
  Yiyou Sun · Chuan Guo · Yixuan Li
- 2020 Poster: Energy-based Out-of-distribution Detection
  Weitang Liu · Xiaoyun Wang · John Owens · Yixuan Li