ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions
Hongyang Gao · Zhengyang Wang · Shuiwang Ji

Tue Dec 04 07:45 AM -- 09:45 AM (PST) @ Room 210 #66

Convolutional neural networks (CNNs) have shown great capability in solving various artificial intelligence tasks. However, their increasing model size has raised challenges in employing them in resource-limited applications. In this work, we propose to compress deep models by using channel-wise convolutions, which replace dense connections among feature maps with sparse ones in CNNs. Based on this novel operation, we build lightweight CNNs known as ChannelNets. ChannelNets use three instances of channel-wise convolutions, namely group channel-wise convolutions, depth-wise separable channel-wise convolutions, and the convolutional classification layer. Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in the number of parameters and computational cost without loss in accuracy. Notably, our work represents the first attempt to compress the fully-connected classification layer, which usually accounts for about 25% of total parameters in compact CNNs. Experimental results on the ImageNet dataset demonstrate that ChannelNets achieve consistently better performance compared to prior methods.
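As a rough illustration of the core idea (a minimal sketch, not the authors' implementation), a channel-wise convolution can be viewed as a 1D convolution slid along the channel dimension: each output channel connects to only a small window of neighboring input channels instead of all of them, which is where the parameter savings over a dense 1x1 convolution come from. All names below are illustrative.

```python
import numpy as np

def channel_wise_conv(x, kernel):
    """Slide a 1D kernel over the channel axis of x.

    x: array of shape (channels, height, width)
    kernel: 1D array of shape (k,), shared across all spatial positions

    Each output channel mixes only k neighboring input channels,
    so the operation needs just k weights, versus c_in * c_out
    weights for a dense 1x1 convolution mixing all channels.
    """
    c, h, w = x.shape
    k = kernel.shape[0]
    out = np.zeros((c - k + 1, h, w))
    for i in range(c - k + 1):
        # Contract the kernel against a window of k input channels.
        out[i] = np.tensordot(kernel, x[i:i + k], axes=1)
    return out

# Toy input: 8 channels of 2x2 feature maps, kernel of width 3.
x = np.ones((8, 2, 2))
y = channel_wise_conv(x, np.ones(3))
print(y.shape)  # (6, 2, 2): 8 - 3 + 1 output channels
```

With 8 input channels and a dense 1x1 convolution to 6 output channels, a standard layer would need 8 x 6 = 48 weights; the channel-wise version above uses only 3, independent of the channel counts. The paper's variants (group channel-wise, depth-wise separable channel-wise, and the convolutional classification layer) build on this basic operation.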

Author Information

Hongyang Gao (Texas A&M University)
Zhengyang Wang (Texas A&M University)
Shuiwang Ji (Texas A&M University)