Learning Frequency Domain Approximation for Binary Neural Networks
Yixing Xu · Kai Han · Chang Xu · Yehui Tang · Chunjing XU · Yunhe Wang

Tue Dec 07 04:30 PM -- 06:00 PM (PST) @

Binary neural networks (BNNs) represent the original full-precision weights and activations as 1-bit values using the sign function. Since the gradient of the conventional sign function is almost zero everywhere and therefore cannot be used for back-propagation, several approaches have been proposed to alleviate the optimization difficulty by using an approximate gradient. However, those approximations corrupt the main direction of the factual gradient. To this end, we propose to estimate the gradient of the sign function in the Fourier frequency domain using a combination of sine functions for training BNNs, namely frequency domain approximation (FDA). The proposed approach does not affect the low-frequency information of the original sign function, which occupies most of the overall energy, while high-frequency coefficients are ignored to avoid a huge computational overhead. In addition, we embed a noise adaptation module into the training phase to compensate for the approximation error. Experiments on several benchmark datasets and neural architectures illustrate that the binary network learned with our method achieves state-of-the-art accuracy. Code will be available at https://gitee.com/mindspore/models/tree/master/research/cv/FDA-BNN.
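The core idea can be sketched with the classical Fourier series of the square wave: on a bounded interval, sign(x) equals (4/π) Σ sin((2i+1)x)/(2i+1) over odd harmonics, and truncating the series yields a smooth surrogate whose derivative is a sum of cosines and can be used in back-propagation. The sketch below is a minimal NumPy illustration of this truncated-series approximation, not the authors' implementation; the function names and the number of retained terms are illustrative assumptions.

```python
import numpy as np

def sign_fourier(x, n_terms=5):
    """Truncated Fourier (sine) series of sign(x) on (-pi, pi):
    (4/pi) * sum_{i=0}^{n_terms-1} sin((2i+1) x) / (2i+1)."""
    s = np.zeros_like(x, dtype=float)
    for i in range(n_terms):
        k = 2 * i + 1
        s += np.sin(k * x) / k
    return 4.0 / np.pi * s

def sign_fourier_grad(x, n_terms=5):
    """Derivative of the truncated series: (4/pi) * sum cos((2i+1) x).
    Unlike the true gradient of sign (zero almost everywhere),
    this surrogate is smooth and usable for back-propagation."""
    g = np.zeros_like(x, dtype=float)
    for i in range(n_terms):
        k = 2 * i + 1
        g += np.cos(k * x)
    return 4.0 / np.pi * g

if __name__ == "__main__":
    x = np.linspace(-3.0, 3.0, 7)
    # With more terms, the approximation approaches sign(x)
    # away from the discontinuity at 0.
    print(sign_fourier(x, n_terms=50))
```

Keeping only a few low-frequency terms retains most of the signal energy while keeping the per-step cost small, which matches the trade-off described in the abstract.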

Author Information

Yixing Xu (Huawei Noah's Ark Lab)
Kai Han (Huawei Noah's Ark Lab)
Chang Xu (The University of Sydney)
Yehui Tang (Peking University)
Chunjing Xu (Huawei Technologies)
Yunhe Wang (Huawei Noah's Ark Lab)
