Poster

Making Sense of Dependence: Efficient Black-box Explanations Using Dependence Measure

Paul Novello · Thomas Fel · David Vigouroux

Hall J (level 1) #340

Keywords: [ Machine Learning ] [ Explainability ] [ Interpretability ] [ Computer Vision ] [ Kernel Methods ] [ Dependence Measure ] [ Sensitivity Analysis ] [ Black Box ] [ Deep Learning ]


Abstract:

This paper presents a new efficient black-box attribution method built on the Hilbert-Schmidt Independence Criterion (HSIC). Based on Reproducing Kernel Hilbert Spaces (RKHS), HSIC measures the dependence between regions of an input image and the output of a model using the kernel embedding of their distributions. It thus provides explanations enriched by RKHS representation capabilities. HSIC can be estimated very efficiently, significantly reducing the computational cost compared to other black-box attribution methods. Our experiments show that HSIC is up to 8 times faster than the previous best black-box attribution methods while being as faithful. Indeed, we improve or match the state of the art of both black-box and white-box attribution methods for several fidelity metrics on ImageNet with various recent model architectures. Importantly, we show that these advances can be transposed to efficiently and faithfully explain object detection models such as YOLOv4. Finally, we extend traditional attribution methods by proposing a new kernel enabling an ANOVA-like orthogonal decomposition of importance scores based on HSIC, allowing us to evaluate not only the importance of each image patch but also the importance of their pairwise interactions. Our implementation is available at \url{https://github.com/paulnovello/HSIC-Attribution-Method}.
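To make the core idea concrete, here is a minimal sketch of the standard (biased) HSIC estimator, HSIC(X, Y) ≈ tr(KHLH)/(n−1)², which measures dependence between two samples via their kernel matrices. This is an illustration of the general estimator only, not the authors' implementation; the function names, kernel choice (Gaussian RBF), and bandwidths are assumptions for the example.

```python
import numpy as np

def rbf_kernel(x, sigma):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased HSIC estimator: tr(K H L H) / (n - 1)^2,
    where H = I - (1/n) 11^T is the centering matrix."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma_x)
    L = rbf_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative attribution use: X holds perturbation indicators for one image
# patch across n masked forward passes, Y holds the model's output scores;
# a larger HSIC value indicates the patch matters more for the prediction.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 1))          # stand-in for patch perturbations
Y_dep = X + 0.1 * rng.standard_normal((200, 1))   # output depends on the patch
Y_ind = rng.standard_normal((200, 1))      # output independent of the patch
print(hsic(X, Y_dep), hsic(X, Y_ind))      # dependent pair scores higher
```

In the black-box setting, one such score is computed per image patch from the same set of masked forward passes, which is why the estimator's sample efficiency translates directly into fewer model queries.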
