

VRA: Variational Rectified Activation for Out-of-distribution Detection

Mingyu Xu · Zheng Lian · Bin Liu · Jianhua Tao

Great Hall & Hall B1+B2 (level 1) #1700
Wed 13 Dec 3 p.m. PST — 5 p.m. PST


Out-of-distribution (OOD) detection is critical to building reliable machine learning systems in the open world. Researchers have proposed various strategies to reduce model overconfidence on OOD data. Among them, ReAct is a typical and effective technique that truncates high activations to increase the gap between in-distribution and OOD data. Despite its promising results, is this technique the best choice? To answer this question, we leverage the variational method to find the optimal operation and verify the necessity of suppressing abnormally low and high activations and amplifying intermediate activations in OOD detection, rather than focusing only on high activations as ReAct does. This motivates us to propose a novel technique called ``Variational Rectified Activation (VRA)'', which simulates these suppression and amplification operations using piecewise functions. Experimental results on multiple benchmark datasets demonstrate that our method outperforms existing post-hoc strategies. Meanwhile, VRA is compatible with different scoring functions and network architectures. Our code is available at
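The suppression and amplification behavior described above can be sketched as a simple piecewise function. This is only an illustrative approximation based on the abstract, not the authors' implementation: the threshold values `alpha` and `theta` and the amplification offset `beta` are hypothetical hyperparameters chosen for demonstration.

```python
import numpy as np

def vra_rectify(z, alpha=0.5, theta=1.5, beta=0.3):
    """Illustrative piecewise rectification in the spirit of VRA.

    - Activations below alpha are suppressed to zero.
    - Intermediate activations are amplified by adding beta.
    - Activations above theta are truncated to theta (as in ReAct).

    All thresholds here are assumed values for demonstration only.
    """
    z = np.asarray(z, dtype=float)
    return np.where(
        z < alpha, 0.0,                      # suppress abnormally low activations
        np.where(z > theta, theta,           # truncate abnormally high activations
                 z + beta)                   # amplify intermediate activations
    )

# Example: low, intermediate, and high activations
print(vra_rectify([0.1, 1.0, 5.0]))  # low -> 0.0, mid -> 1.3, high -> 1.5
```

Applied to the penultimate-layer features of a pretrained network, such a rectification would then be combined with a post-hoc scoring function (e.g., an energy score) to separate in-distribution from OOD inputs.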
