Poster
RED: Looking for Redundancies for Data-Free Structured Compression of Deep Neural Networks
Edouard YVINEC · Arnaud Dapogny · Matthieu Cord · Kevin Bailly

Thu Dec 09 04:30 PM -- 06:00 PM (PST)

Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite involving considerable computational costs. The mainstream approaches for runtime acceleration consist of pruning connections (unstructured pruning) or, better, filters (structured pruning), both of which often require data to retrain the model. In this paper, we present RED, a data-free, unified approach to structured pruning. First, we propose a novel adaptive hashing of the scalar DNN weight distribution densities to increase the number of identical neurons, as represented by their weight vectors. Second, we prune the network by merging redundant neurons based on their relative similarity, as measured by the distance between their weight vectors. Third, we propose a novel uneven depthwise separation technique to further prune convolutional layers. We demonstrate through a large variety of benchmarks that RED largely outperforms other data-free pruning methods, often reaching performance similar to unconstrained, data-driven methods.
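The first two steps of the abstract can be illustrated with a minimal sketch: quantize each scalar weight to a shared set of values (a crude quantile-binning stand-in for the paper's adaptive density hashing), then merge neurons whose hashed weight vectors coincide. All function names, the bin count, and the tolerance below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hash_weights(w, n_bins=16):
    # Quantize scalar weights to shared bin centers; a simplified
    # stand-in (assumption) for RED's adaptive hashing of the
    # weight distribution densities.
    edges = np.quantile(w, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(w, edges[1:-1]), 0, n_bins - 1)
    centers = np.array([w[idx == k].mean() if np.any(idx == k) else 0.0
                        for k in range(n_bins)])
    return centers[idx]

def merge_redundant_neurons(W, tol=1e-8):
    # W: (n_neurons, n_inputs) weight matrix of one layer.
    # After hashing, neurons with (near-)identical weight vectors
    # are grouped, and one representative per group is kept.
    Wh = hash_weights(W)
    kept, groups = [], []
    for i, row in enumerate(Wh):
        for g, j in enumerate(kept):
            if np.linalg.norm(row - Wh[j]) <= tol:
                groups[g].append(i)
                break
        else:
            kept.append(i)
            groups.append([i])
    return Wh[kept], groups
```

For example, a layer whose first two neurons share identical weights collapses to two representatives: `merge_redundant_neurons(np.array([[1., 2., 3.], [1., 2., 3.], [5., 6., 7.]]))` returns a 2-row matrix and the groups `[[0, 1], [2]]`. Hashing increases such collisions, which is what makes the merging step effective in the data-free setting.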

Author Information

Edouard YVINEC (Computer Science Lab - Pierre and Marie Curie University, Paris, France)
Arnaud Dapogny (LIP6)
Matthieu Cord (Sorbonne University)
Kevin Bailly (ISIR, UMR 7222)
