The Binary Neural Network (BNN) excels at reducing the complexity of deep neural networks, but it suffers from severe performance degradation. One of the major impediments is the large quantization error between the full-precision weight vector and its binary counterpart. Previous works focus on compensating for the norm gap while leaving the angular bias largely untouched. In this paper, for the first time, we explore the influence of angular bias on the quantization error and then introduce the Rotated Binary Neural Network (RBNN), which considers the angle alignment between the full-precision weight vector and its binarized version. At the beginning of each training epoch, we rotate the full-precision weight vector toward its binary vector to reduce the angular bias. To avoid the high complexity of learning a large rotation matrix, we further introduce a bi-rotation formulation that learns two smaller rotation matrices. During training, we devise an adjustable rotated weight vector for binarization to escape potential local optima. Our rotation leads to around 50% weight flips, which maximizes the information gain. Finally, we propose a training-aware approximation of the sign function for the backward gradient. Experiments on CIFAR-10 and ImageNet demonstrate the superiority of RBNN over many state-of-the-art methods. Our source code, experimental settings, training logs, and binary models are available at https://github.com/lmbxmu/RBNN.
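The two core ideas in the abstract, rotating the weights toward their binary vector with two small rotation matrices and backpropagating through sign() with a smooth approximation, can be illustrated with a short sketch. This is a minimal, hedged illustration rather than the released RBNN code: the Kronecker-product factorization, the block size, and the tanh-based surrogate gradient are assumptions standing in for the paper's exact bi-rotation and training-aware sign approximation.

```python
# Minimal sketch (not the authors' implementation) of:
#  (i)  binarizing weights after a learned rotation factored into two small matrices,
#  (ii) a surrogate gradient for sign() in the backward pass.
import torch
import torch.nn as nn


class SignApprox(torch.autograd.Function):
    """sign() in the forward pass; a tanh-style surrogate gradient in the backward.
    The temperature t is a placeholder for a training-aware schedule."""

    @staticmethod
    def forward(ctx, x, t):
        ctx.save_for_backward(x)
        ctx.t = t
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        t = ctx.t
        # derivative of tanh(t * x), used as a stand-in for d sign / dx
        grad = t * (1.0 - torch.tanh(t * x) ** 2)
        return grad_output * grad, None


class RotatedBinaryLinear(nn.Module):
    """Binary linear layer whose weights are rotated before binarization.

    The full rotation is parameterized as R1 ⊗ R2 (Kronecker product) so that only
    two small matrices are learned; this factorization is one possible 'bi-rotation'
    realization, assumed here for illustration."""

    def __init__(self, in_features, out_features, block=16):
        super().__init__()
        assert in_features % block == 0
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        # Two small factors; keeping them (approximately) orthogonal is part of
        # the training procedure and is omitted in this sketch.
        self.R1 = nn.Parameter(torch.eye(in_features // block))
        self.R2 = nn.Parameter(torch.eye(block))

    def forward(self, x, t=1.0):
        R = torch.kron(self.R1, self.R2)               # full rotation, (in, in)
        w_rot = self.weight @ R.t()                    # rotated full-precision weights
        scale = w_rot.abs().mean(dim=1, keepdim=True)  # per-row norm compensation
        w_bin = scale * SignApprox.apply(w_rot, t)     # binarized (±scale) weights
        return nn.functional.linear(x, w_bin)


# Example usage (hypothetical shapes):
# layer = RotatedBinaryLinear(in_features=256, out_features=128, block=16)
# y = layer(torch.randn(4, 256), t=5.0)
```

A full training loop would additionally keep R1 and R2 close to orthogonal (e.g. via a penalty or an orthogonal parameterization) and sharpen the temperature t over epochs so the surrogate approaches the true sign function; both are omitted here for brevity.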
Author Information
Mingbao Lin (Xiamen University)
Rongrong Ji (Xiamen University, China)
Zihan Xu (Xiamen University, China)
Baochang Zhang (Beihang University)
Yan Wang (Pinterest)
Yongjian Wu (Tencent Technology (Shanghai) Co., Ltd.)
Feiyue Huang (Tencent)
Chia-Wen Lin (National Tsing Hua University)
More from the Same Authors
- 2022 Poster: Make Sharpness-Aware Minimization Stronger: A Sparsified Perturbation Approach »
  Peng Mi · Li Shen · Tianhe Ren · Yiyi Zhou · Xiaoshuai Sun · Rongrong Ji · Dacheng Tao
- 2022 Poster: FNeVR: Neural Volume Rendering for Face Animation »
  Bohan Zeng · Boyu Liu · Hong Li · Xuhui Liu · Jianzhuang Liu · Dapeng Chen · Wei Peng · Baochang Zhang
- 2022 Poster: PyramidCLIP: Hierarchical Feature Alignment for Vision-language Model Pretraining »
  Yuting Gao · Jinfeng Liu · Zihan Xu · Jun Zhang · Ke Li · Rongrong Ji · Chunhua Shen
- 2022 Poster: Learning Best Combination for Efficient N:M Sparsity »
  Yuxin Zhang · Mingbao Lin · ZhiHang Lin · Yiting Luo · Ke Li · Fei Chao · Yongjian Wu · Rongrong Ji
- 2022 Poster: Q-ViT: Accurate and Fully Quantized Low-bit Vision Transformer »
  Yanjing Li · Sheng Xu · Baochang Zhang · Xianbin Cao · Peng Gao · Guodong Guo
- 2021 Poster: Analogous to Evolutionary Algorithm: Designing a Unified Sequence Model »
  Jiangning Zhang · Chao Xu · Jian Li · Wenzhou Chen · Yabiao Wang · Ying Tai · Shuo Chen · Chengjie Wang · Feiyue Huang · Yong Liu
- 2021 Poster: Dual-stream Network for Visual Recognition »
  Mingyuan Mao · Peng Gao · Renrui Zhang · Honghui Zheng · Teli Ma · Yan Peng · Errui Ding · Baochang Zhang · Shumin Han
- 2020 Poster: UWSOD: Toward Fully-Supervised-Level Capacity Weakly Supervised Object Detection »
  Yunhang Shen · Rongrong Ji · Zhiwei Chen · Yongjian Wu · Feiyue Huang
- 2019 Poster: Variational Structured Semantic Inference for Diverse Image Captioning »
  Fuhai Chen · Rongrong Ji · Jiayi Ji · Xiaoshuai Sun · Baochang Zhang · Xuri Ge · Yongjian Wu · Feiyue Huang · Yan Wang
- 2019 Poster: FreeAnchor: Learning to Match Anchors for Visual Object Detection »
  Xiaosong Zhang · Fang Wan · Chang Liu · Rongrong Ji · Qixiang Ye
- 2019 Poster: Information Competing Process for Learning Diversified Representations »
  Jie Hu · Rongrong Ji · ShengChuan Zhang · Xiaoshuai Sun · Qixiang Ye · Chia-Wen Lin · Qi Tian