Poster
S$^3$: Sign-Sparse-Shift Reparametrization for Effective Training of Low-bit Shift Networks
Xinlin Li · Bang Liu · Yaoliang Yu · Wulong Liu · Chunjing Xu · Vahid Partovi Nia

Tue Dec 07 08:30 AM -- 10:00 AM (PST)
Shift neural networks reduce computational complexity by removing expensive multiplication operations and quantizing continuous weights into low-bit discrete values, making them fast and energy-efficient compared to conventional neural networks. However, existing shift networks are sensitive to weight initialization and suffer degraded performance caused by the vanishing gradient and weight sign freezing problems. To address these issues, we propose S$^3$ re-parameterization, a novel technique for training low-bit shift networks. Our method decomposes each discrete parameter in a sign-sparse-shift 3-fold manner. In this way, it efficiently learns a low-bit network whose weight dynamics resemble those of full-precision networks and which is insensitive to weight initialization. Our proposed training method pushes the boundaries of shift neural networks and shows that 3-bit shift networks can compete with their full-precision counterparts in terms of top-1 accuracy on ImageNet.
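To make the 3-fold decomposition concrete, the sketch below is a minimal, hypothetical PyTorch rendering of the idea (the function names, the straight-through binarization, and the per-bit shift encoding are assumptions made here for illustration, not the authors' released implementation): each discrete shift weight is the product of a sign gate in {-1, +1}, a sparsity gate in {0, 1}, and a power-of-two factor whose exponent is a sum of binary bits, all driven by dense continuous latent tensors.

```python
import torch

def ste_binarize(x: torch.Tensor) -> torch.Tensor:
    """Heaviside step in the forward pass; identity (straight-through)
    gradient in the backward pass."""
    hard = (x > 0).float()
    return hard + (x - x.detach())

def s3_weight(w_sign, w_sparse, w_shift_bits):
    """Compose a discrete shift weight from three groups of latents.

    w_sign, w_sparse: continuous latent tensors, one entry per weight.
    w_shift_bits: list of latent tensors; their binarized values sum
    to the (non-negative) shift exponent.
    """
    sign = 2.0 * ste_binarize(w_sign) - 1.0             # in {-1, +1}
    sparse = ste_binarize(w_sparse)                     # in {0, 1}: prunes weight to zero
    shift = sum(ste_binarize(b) for b in w_shift_bits)  # integer exponent >= 0
    return sign * sparse * torch.pow(2.0, shift)        # values in {0, ±1, ±2, ±4, ...}

# Example: weights of a (hypothetical) 3-bit shift layer, i.e. values
# in {0, ±1, ±2, ±4}, built from four dense latent tensors.
latents = [torch.randn(64, 64, requires_grad=True) for _ in range(4)]
w = s3_weight(latents[0], latents[1], latents[2:])
w.sum().backward()  # dense gradients reach every latent via the STE
```

Because every latent receives a dense gradient through the straight-through estimator, a weight's sign can keep changing during training rather than freezing, which is the intuition behind the method's robustness to weight initialization.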

Author Information

Xinlin Li (Huawei Noah's Ark Lab)
Bang Liu (University of Montreal)
Yaoliang Yu (University of Waterloo)
Wulong Liu (Huawei Noah's Ark Lab)
Chunjing Xu (Huawei Technologies)
Vahid Partovi Nia (Huawei Noah's Ark Lab)