Many self-supervised methods have been proposed for image anomaly detection. These methods typically rely on data augmentation with predefined transformations such as flipping, cropping, and rotation. However, such techniques do not transfer straightforwardly to non-image data, such as time series or tabular data, and existing deep approaches have underperformed on tasks beyond images. In this work, we propose a novel active learning (AL) scheme that relies on neural autoregressive flows (NAF) for self-supervised anomaly detection, specifically on small-scale data. Unlike other generative models such as GANs or VAEs, flow-based models learn the probability density explicitly and can therefore assign exact likelihoods to normal data, which makes them well suited to detecting anomalies. The proposed NAF-AL method efficiently generates random samples from the latent space and transforms them into feature space, along with their likelihoods, via an invertible mapping. Samples with lower likelihoods are selected and then further screened by outlier detection using the Mahalanobis distance. These augmented samples, combined with the normal samples, are used to train a better detector that more closely approaches the decision boundary. Compared with random transformations, NAF-AL can be interpreted as a likelihood-oriented data augmentation that is more efficient and robust. Extensive experiments show that our approach outperforms existing baselines on multiple time series and tabular datasets, as well as in a real-world advanced-manufacturing application, with significant improvements in anomaly detection accuracy and robustness over the state of the art.
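The sampling-and-screening pipeline described above (sample latent codes, push them through an invertible map with exact likelihoods, keep the low-likelihood candidates, then filter by Mahalanobis distance) can be illustrated with a minimal sketch. This is not the paper's NAF implementation: it substitutes a simple affine (Cholesky) map fitted to synthetic "normal" data for a trained neural autoregressive flow, and the selection size and distance threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "normal" data; stand-in for the real training set.
X_normal = rng.normal(loc=[2.0, -1.0], scale=[0.5, 1.5], size=(500, 2))

# Fit an invertible affine map x = L z + mu (stand-in for a trained NAF).
mu = X_normal.mean(axis=0)
cov = np.cov(X_normal, rowvar=False)
L = np.linalg.cholesky(cov)                  # invertible latent -> feature map
log_det = np.log(np.abs(np.diag(L))).sum()   # log |det dx/dz|

# Step 1: draw latent samples and transform them to feature space,
# tracking exact log-likelihoods via the change-of-variables formula.
z = rng.standard_normal((2000, 2))
x = z @ L.T + mu
log_pz = -0.5 * (z ** 2).sum(axis=1) - z.shape[1] / 2 * np.log(2 * np.pi)
log_px = log_pz - log_det

# Step 2: keep the k lowest-likelihood samples (near the boundary of normality).
k = 200                                      # assumed selection size
cand = x[np.argsort(log_px)[:k]]

# Step 3: reject extreme outliers by Mahalanobis distance to the normal data,
# so the augmented set hugs the decision boundary rather than scattering far out.
inv_cov = np.linalg.inv(cov)
d = np.sqrt(np.einsum('ij,jk,ik->i', cand - mu, inv_cov, cand - mu))
augmented = cand[d < 4.0]                    # assumed distance threshold

# `augmented` would then be mixed with X_normal to train a better detector.
```

Because the likelihood and the Mahalanobis distance are computed under the same fitted density here, the two-stage selection keeps a shell of samples that are unlikely but not extreme, which is the likelihood-oriented augmentation the abstract contrasts with random transformations.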
Author Information
Jiaxin Zhang (Oak Ridge National Laboratory)
I am a research staff member in the Machine Learning and Data Analytics Group, Computer Science and Mathematics Division, at Oak Ridge National Laboratory (ORNL). My current research interest is Artificial Intelligence for Science and Engineering (AISE). My broader interests revolve around robust machine learning, uncertainty quantification, inverse problems, and numerical optimization.
Kyle Saleeby
Thomas Feldhausen
Sirui Bi (Oak Ridge National Laboratory)
Alex Plotkowski
David Womble
Related Events (a corresponding poster, oral, or spotlight)
- 2021 : Self-Supervised Anomaly Detection via Neural Autoregressive Flows with Active Learning
More from the Same Authors
- 2020 : A Nonlocal-Gradient Descent Method for Inverse Design in Nanophotonics
  Sirui Bi · Jiaxin Zhang · Guannan Zhang
- 2020 : Scalable Deep-Learning-Accelerated Topology Optimization for Additively Manufactured Materials
  Sirui Bi · Jiaxin Zhang · Guannan Zhang
- 2021 : Machine learning-enabled model-data integration for predicting subsurface water storage
  Dan Lu · Eric Pierce · Shih-Chieh Kao · David Womble · Li Li · Daniella Rempe
- 2021 Poster: On the Stochastic Stability of Deep Markov Models
  Jan Drgona · Sayak Mukherjee · Jiaxin Zhang · Frank Liu · Mahantesh Halappanavar
- 2020 : 9 - Thermodynamic Consistent Neural Networks for Learning Material Interfacial Mechanics
  Jiaxin Zhang
- 2019 Poster: Learning nonlinear level sets for dimensionality reduction in function approximation
  Guannan Zhang · Jiaxin Zhang · Jacob Hinkle