Engineering Uncertainty Representations to Monitor Distribution Shifts
Thomas Bonnier · Benjamin Bosch
Event URL: https://openreview.net/forum?id=Koug1i2HpH

In some classification tasks, the true label is not known until months or even years after the classifier's prediction time. Once the model has been deployed, harmful dataset shift regimes can surface, and without careful model monitoring, the damage may prove irreversible by the time the true labels arrive. In this paper, we propose a method for practitioners to monitor distribution shifts on unlabeled data. We leverage two representations for quantifying and visualizing model uncertainty. The Adversarial Neighborhood Analysis assesses model uncertainty by aggregating predictions in the neighborhood of a data point and comparing them to the prediction at that single point. The Non-Conformity Analysis exploits the results of conformal prediction and uses a decision tree to display uncertain zones. We empirically test our approach on scenarios of synthetically generated shifts to demonstrate its efficacy.
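The abstract describes the two representations only at a high level. As an illustration of the first idea, here is a minimal Python sketch of a neighborhood-based uncertainty measure in the spirit of the Adversarial Neighborhood Analysis. The Gaussian perturbation scheme, the `radius` parameter, and the disagreement-rate aggregation are assumptions: the paper's exact construction, including how "adversarial" neighbors are generated, is not given here.

```python
import numpy as np

def neighborhood_uncertainty(model, x, n_samples=100, radius=0.1, seed=0):
    """Compare the prediction at a single point x with predictions
    aggregated over a sampled neighborhood of x.

    A high disagreement rate flags the point as uncertain. Gaussian
    noise stands in for the paper's (unspecified here) adversarial
    neighborhood construction.
    """
    rng = np.random.default_rng(seed)
    # Sample points in a Gaussian ball around x.
    neighbors = x + rng.normal(scale=radius, size=(n_samples, x.shape[0]))
    # Prediction at the single point.
    y_point = model.predict(x.reshape(1, -1))[0]
    # Aggregated predictions over the neighborhood.
    y_neighbors = model.predict(neighbors)
    # Fraction of neighbors whose predicted label disagrees with x's.
    return float(np.mean(y_neighbors != y_point))
```

Similarly, a hedged sketch of a Non-Conformity-Analysis-style procedure: the non-conformity score (one minus a predicted probability), the split-conformal p-value computation, and the use of a shallow `DecisionTreeRegressor` to surface uncertain input zones are all illustrative choices, not the paper's exact method.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

def conformal_p_values(model, X_cal, y_cal, X_test):
    """Split-conformal p-values for unlabeled test points.

    Non-conformity score: one minus the predicted probability of the
    true class (calibration) or of the predicted class (test, since
    labels are unavailable). Assumes y_cal holds integer labels
    0..K-1 aligned with the columns of predict_proba.
    """
    cal_proba = model.predict_proba(X_cal)
    cal_scores = 1.0 - cal_proba[np.arange(len(y_cal)), y_cal]
    test_scores = 1.0 - model.predict_proba(X_test).max(axis=1)
    # p-value: proportion of calibration scores at least as extreme.
    counts = (cal_scores[None, :] >= test_scores[:, None]).sum(axis=1)
    return (1 + counts) / (len(cal_scores) + 1)

def show_uncertain_zones(X_test, p_values, max_depth=3):
    # A shallow tree regressed on the p-values partitions the input
    # space; leaves with low mean p-value are the uncertain zones.
    tree = DecisionTreeRegressor(max_depth=max_depth, random_state=0)
    tree.fit(X_test, p_values)
    print(export_text(tree))
    return tree
```

Leaves with low mean p-values identify feature-space regions where test points deviate most from the calibration distribution, which is one plausible way a decision tree can "display uncertain zones".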

Author Information

Thomas Bonnier (Société Générale)
Benjamin Bosch (Société Générale)

More from the Same Authors

  • 2022 : Completing the Model Optimization Process by Correcting Patterns of Failure in Regression Tasks
    Thomas Bonnier
  • 2022 : Poster Session 1
    Andrew Lowy · Thomas Bonnier · Yiling Xie · Guy Kornowski · Simon Schug · Seungyub Han · Nicolas Loizou · xinwei zhang · Laurent Condat · Tabea E. Röber · Si Yi Meng · Marco Mondelli · Runlong Zhou · Eshaan Nichani · Adrian Goldwaser · Rudrajit Das · Kayhan Behdin · Atish Agarwala · Mukul Gagrani · Gary Cheng · Tian Li · Haoran Sun · Hossein Taheri · Allen Liu · Siqi Zhang · Dmitrii Avdiukhin · Bradley Brown · Miaolan Xie · Junhyung Lyle Kim · Sharan Vaswani · Xinmeng Huang · Ganesh Ramachandra Kini · Angela Yuan · Weiqiang Zheng · Jiajin Li