

Poster

Dynamic Network Surgery for Efficient DNNs

Yiwen Guo · Anbang Yao · Yurong Chen

Area 5+6+7+8 #146

Keywords: [ Model Selection and Structure Learning ] [ Large Scale Learning and Big Data ] [ (Application) Object and Pattern Recognition ] [ Deep Learning or Neural Networks ]


Abstract: Deep learning has become a ubiquitous technology for improving machine intelligence. However, most existing deep models are structurally very complex, making them difficult to deploy on mobile platforms with limited computational power. In this paper, we propose a novel network compression method called dynamic network surgery, which can remarkably reduce network complexity through on-the-fly connection pruning. Unlike previous methods that accomplish this task in a greedy way, we properly incorporate connection splicing into the whole process to avoid incorrect pruning, turning it into continual network maintenance. The effectiveness of our method is demonstrated with experiments. Without any accuracy loss, our method can efficiently compress the number of parameters in LeNet-5 and AlexNet by factors of 108× and 17.7× respectively, showing that it outperforms the recent pruning method by considerable margins. Code and some models are available at https://github.com/yiwenguo/Dynamic-Network-Surgery.
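The sketch below is a minimal toy illustration of the prune-and-splice idea on a single linear layer, not the paper's implementation: a full weight vector W is kept alongside a binary mask T, gradients are computed with respect to the masked weights W * T but applied to every entry of W (so mistakenly pruned connections can recover and later be spliced back), and the mask is periodically refreshed by magnitude thresholds. The thresholds, update schedule, and toy regression problem here are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = X @ w_true, where only a few entries of w_true are non-zero.
n, d = 256, 20
w_true = np.zeros(d)
w_true[:4] = rng.normal(size=4)
X = rng.normal(size=(n, d))
y = X @ w_true

# Full weight vector W and binary mask T (all connections active at start).
W = rng.normal(scale=0.1, size=d)
T = np.ones(d)

lr = 0.05
prune_th, splice_th = 0.05, 0.10   # illustrative thresholds, not the paper's

for step in range(500):
    # Forward/backward through the *masked* weights W * T.
    pred = X @ (W * T)
    grad_eff = X.T @ (pred - y) / n        # dL / d(W * T)
    # Update every entry of W (pruned ones too) with this gradient, so
    # connections pruned by mistake can regain magnitude and be spliced back.
    W -= lr * grad_eff
    # Periodically refresh the mask: prune small weights, splice large ones.
    if step % 10 == 0:
        T = np.where(np.abs(W) < prune_th, 0.0, T)
        T = np.where(np.abs(W) > splice_th, 1.0, T)

print("active connections:", int(T.sum()), "of", d)
```

In this toy setup the mask typically shrinks toward the few truly useful connections while the rest are pruned; in the paper the same mask-and-splice mechanism is applied layer-wise to convolutional and fully connected weights during training.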
