Ensemble approaches to uncertainty estimation have recently been applied to the tasks of misclassification detection, out-of-distribution input detection and adversarial attack detection. Prior Networks have been proposed as an approach to efficiently emulate an ensemble of models for classification by parameterising a Dirichlet prior distribution over output distributions. These models have been shown to outperform alternative ensemble approaches, such as Monte-Carlo Dropout, on the task of out-of-distribution input detection. However, scaling Prior Networks to complex datasets with many classes is difficult using the training criteria originally proposed. This paper makes two contributions. First, we show that the appropriate training criterion for Prior Networks is the reverse KL-divergence between Dirichlet distributions. This addresses issues in the nature of the training-data target distributions, enabling Prior Networks to be successfully trained on classification tasks with arbitrarily many classes and improving out-of-distribution detection performance. Second, taking advantage of this new training criterion, the paper investigates using Prior Networks to detect adversarial attacks and proposes a generalized form of adversarial training. It is shown that constructing successful adaptive whitebox attacks, which both affect the prediction and evade detection, requires greater computational effort against Prior Networks trained on CIFAR-10 and CIFAR-100 with the proposed approach than against networks defended using standard adversarial training or MC-dropout.
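To make the first contribution concrete, below is a minimal PyTorch sketch of a reverse KL-divergence loss between Dirichlet distributions. It assumes a classifier whose logits are interpreted as log-concentrations of a Dirichlet; the in-domain target construction (a concentration of 100 on the true class, 1 elsewhere) and the function name are illustrative simplifications, not the paper's exact scheme.

```python
import torch
from torch.distributions import Dirichlet, kl_divergence

def reverse_kl_loss(logits, labels, target_concentration=100.0):
    # Model outputs log-concentrations; exponentiate to get alpha > 0.
    alpha_model = torch.exp(logits)
    # Hypothetical in-domain target: a Dirichlet sharply peaked on the
    # correct class (target_concentration on that class, 1.0 elsewhere).
    alpha_target = torch.ones_like(alpha_model)
    alpha_target.scatter_(1, labels.unsqueeze(1), target_concentration)
    # Reverse KL: KL(model || target), rather than the forward
    # KL(target || model) of the original Prior Network criterion.
    return kl_divergence(Dirichlet(alpha_model),
                         Dirichlet(alpha_target)).mean()

# Usage on a dummy batch (batch of 4, 10 classes):
logits = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = reverse_kl_loss(logits, labels)
```

The ordering of the two arguments is the whole point: reverse KL is mode-seeking, which is what lets the model match a sharp target Dirichlet without the pathological targets that arise under the forward-KL criterion when the number of classes grows.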
Author Information
Andrey Malinin (Yandex)
Mark Gales (University of Cambridge)
More from the Same Authors
- 2021 : Shifts: A Dataset of Real Distributional Shift Across Multiple Large-Scale Tasks »
  Andrey Malinin · Neil Band · Yarin Gal · Mark Gales · Alexander Ganshin · German Chesnokov · Alexey Noskov · Andrey Ploskonosov · Liudmila Prokhorenkova · Ivan Provilkov · Vatsal Raina · Vyas Raina · Denis Roginskiy · Mariya Shmatova · Panagiotis Tigas · Boris Yangel
- 2021 : Shifts: A Dataset of Real Distributional Shift Across Multiple Large-Scale Tasks »
  Andrey Malinin
- 2021 Poster: Scaling Ensemble Distribution Distillation to Many Classes with Proxy Targets »
  Max Ryabinin · Andrey Malinin · Mark Gales
- 2021 Poster: On the Periodic Behavior of Neural Network Training with Batch Normalization and Weight Decay »
  Ekaterina Lobacheva · Maxim Kodryan · Nadezhda Chirkova · Andrey Malinin · Dmitry Vetrov
- 2021 : Shifts Challenge: Robustness and Uncertainty under Real-World Distributional Shift + Q&A »
  Andrey Malinin · Neil Band · German Chesnokov · Yarin Gal · Alexander Ganshin · Mark Gales · Alexey Noskov · Liudmila Prokhorenkova · Mariya Shmatova · Vyas Raina · Vatsal Raina · Panagiotis Tigas · Boris Yangel
- 2018 Poster: Predictive Uncertainty Estimation via Prior Networks »
  Andrey Malinin · Mark Gales