Poster in Workshop: Challenges in Deploying and Monitoring Machine Learning Systems

AutoSlicer: Scalable Automated Data Slicing for ML Model Analysis

Zifan Liu · Evan Rosen · Paul Suganthan


Abstract:

Automated slicing aims to identify subsets of evaluation data where a trained model performs anomalously. This is an important problem for machine learning pipelines in production since it plays a key role in model debugging and comparison, as well as the diagnosis of fairness issues. Scalability has become a critical requirement for any automated slicing system due to the large search space of possible slices and the growing scale of data. We present AutoSlicer, a scalable system that searches for problematic slices through distributed metric computation and hypothesis testing. We develop an efficient strategy that reduces the search space through pruning and prioritization. In our experiments, we show that our search strategy finds most of the anomalous slices while inspecting only a small portion of the search space.
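A minimal single-machine sketch of the idea the abstract describes: enumerate feature-value slices of evaluation data, prune slices with too little support, test each remaining slice's metric against the overall rate, and prioritize the surviving anomalies by size. All data, feature names, and thresholds below are hypothetical illustrations; AutoSlicer itself performs the metric computation and hypothesis testing in a distributed fashion, which this toy version does not attempt.

```python
import random
from itertools import combinations
from math import sqrt

random.seed(0)

# Hypothetical evaluation data: the model is deliberately made worse on
# (country="BR", device="mobile") rows -- the anomalous slice to find.
rows = []
for _ in range(2000):
    country = random.choice(["US", "BR"])
    device = random.choice(["mobile", "desktop"])
    p_correct = 0.55 if (country == "BR" and device == "mobile") else 0.9
    rows.append({"country": country, "device": device,
                 "correct": int(random.random() < p_correct)})

FEATURES = ["country", "device"]
MIN_SUPPORT = 50    # pruning: skip slices with too few examples
Z_THRESHOLD = 3.0   # flag slices whose accuracy is > 3 sigma below overall

overall = sum(r["correct"] for r in rows) / len(rows)

def slices(rows, features):
    """Enumerate all feature-value crosses of size 1..len(features)."""
    for k in range(1, len(features) + 1):
        for combo in combinations(features, k):
            groups = {}
            for r in rows:
                key = tuple((f, r[f]) for f in combo)
                groups.setdefault(key, []).append(r)
            yield from groups.items()

anomalies = []
for key, members in slices(rows, FEATURES):
    n = len(members)
    if n < MIN_SUPPORT:          # pruning step
        continue
    acc = sum(r["correct"] for r in members) / n
    # Simple one-sample z-test of slice accuracy vs. the overall rate,
    # standing in for the hypothesis testing the abstract mentions.
    se = sqrt(overall * (1 - overall) / n)
    z = (acc - overall) / se
    if z < -Z_THRESHOLD:         # significantly worse than overall
        anomalies.append((key, n, round(acc, 3), round(z, 1)))

# Prioritization: report larger (higher-impact) slices first.
anomalies.sort(key=lambda a: -a[1])
for key, n, acc, z in anomalies:
    print(dict(key), f"n={n} acc={acc} z={z}")
```

Pruning by support and a significance threshold are what keep the search tractable: crosses of many features explode combinatorially, but low-support slices can be discarded before their metrics are ever computed.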
