Poster in Workshop: I Can’t Believe It’s Not Better: Understanding Deep Learning Through Empirical Falsification

On the Sparsity of Image Super-resolution Network

Chenyu Dong · Hailong Ma · Jinjin Gu · Ruofan Zhang · Jieming Li · Chun Yuan


Abstract:

The over-parameterization of neural networks has long been a subject of wide concern. It offers the opportunity to find, within an over-parameterized network, sub-networks that improve parameter efficiency. In our study, we use EDSR as the backbone network to explore parameter efficiency in super-resolution (SR) networks through the lens of sparsity. Specifically, we search for sparse sub-networks at two granularities, individual weights and convolution kernels, using various methods, and analyze the relationship between the structure and performance of these sub-networks. (1) At weight granularity, we observe the "Lottery Ticket Hypothesis" from a new perspective in the regression setting of SR. (2) At convolution kernel granularity, we apply several methods to examine how different sparse sub-networks affect network performance and find that, under certain rules, the performance of different sub-networks rarely depends on their structure. (3) We propose a simple and convenient width-sparsity method at convolution kernel granularity, which can improve the parameter efficiency of most SR networks.
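To make the two pruning granularities concrete, the following is a minimal sketch (not the authors' code) of how sparse sub-networks of this kind can be obtained in PyTorch on an EDSR-style residual block: unstructured magnitude pruning of individual weights, as in lottery-ticket experiments, and structured pruning of whole convolution kernels. The layer sizes, pruning ratios, and use of torch.nn.utils.prune are illustrative assumptions, not the paper's exact procedure.

# Sketch of weight-level and kernel-level pruning on an EDSR-style block.
# Sizes and sparsity ratios are assumptions made for illustration only.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


class ResBlock(nn.Module):
    """EDSR-style residual block: conv -> ReLU -> conv, plus a skip connection."""

    def __init__(self, n_feats: int = 64):
        super().__init__()
        self.conv1 = nn.Conv2d(n_feats, n_feats, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(n_feats, n_feats, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.conv2(torch.relu(self.conv1(x)))


block = ResBlock()

# Weight granularity: zero out the 70% of weights with the smallest magnitude
# across both convolutions (global unstructured L1 pruning).
prune.global_unstructured(
    [(block.conv1, "weight"), (block.conv2, "weight")],
    pruning_method=prune.L1Unstructured,
    amount=0.7,
)

# Kernel granularity: remove whole kernels by pruning along the output-channel
# dimension of one convolution (structured L2 pruning).
prune.ln_structured(block.conv2, name="weight", amount=0.5, n=2, dim=0)

# Report the resulting sparsity of each pruned weight tensor.
for name, module in [("conv1", block.conv1), ("conv2", block.conv2)]:
    w = module.weight
    sparsity = float((w == 0).sum()) / w.numel()
    print(f"{name}: {sparsity:.1%} of weights are zero")

The masked weights define a sparse sub-network whose SR performance (e.g. PSNR after fine-tuning or retraining from the original initialization) can then be compared against the dense baseline.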
