Spotlight
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Weihan Cao · Yifan Zhang · Jianfei Gao · Anda Cheng · Ke Cheng · Jian Cheng
Knowledge distillation (KD) is a widely used technique for training compact models in object detection. However, there has been little study of how to distill between heterogeneous detectors. In this paper, we empirically find that better FPN features from a heterogeneous teacher detector can help the student even though their detection heads and label assignments differ. However, directly aligning the feature maps to distill detectors suffers from two problems. First, the difference in feature magnitude between the teacher and the student can impose overly strict constraints on the student. Second, the FPN stages and channels with large feature magnitudes in the teacher model can dominate the gradient of the distillation loss, overwhelming the effects of other features in KD and introducing considerable noise. To address these issues, we propose imitating features with the Pearson Correlation Coefficient, which focuses on the relational information from the teacher and relaxes constraints on the magnitude of the features. Our method consistently outperforms existing detection KD methods and works for both homogeneous and heterogeneous student-teacher pairs. Furthermore, it converges faster. With a powerful MaskRCNN-Swin detector as the teacher, ResNet-50-based RetinaNet and FCOS achieve 41.5% and 43.9% $mAP$ on COCO2017, which are 4.1% and 4.8% higher than their baselines, respectively.
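The core idea, imitating features via the Pearson Correlation Coefficient rather than raw magnitudes, can be sketched as follows. This is a minimal illustration (not the authors' released code): each channel of the student and teacher FPN feature maps is standardized to zero mean and unit variance, after which the mean squared error between them is, up to a constant factor, equal to one minus the per-channel Pearson correlation. The function name `pkd_loss` and the NumPy formulation are illustrative assumptions.

```python
import numpy as np

def pkd_loss(feat_s, feat_t, eps=1e-8):
    """Feature-imitation loss based on the Pearson correlation coefficient.

    feat_s, feat_t: student/teacher feature maps of shape (C, H, W).
    Standardizing each channel removes per-channel magnitude differences,
    so the loss depends only on the relational (correlation) structure.
    """
    def standardize(x):
        x = x.reshape(x.shape[0], -1)                 # flatten to (C, H*W)
        mean = x.mean(axis=1, keepdims=True)
        std = x.std(axis=1, keepdims=True)
        return (x - mean) / (std + eps)               # zero mean, unit variance

    s, t = standardize(feat_s), standardize(feat_t)
    # For standardized channels, 0.5 * MSE = 1 - mean Pearson correlation:
    # mean((s-t)^2) = mean(s^2) + mean(t^2) - 2*mean(s*t) = 2 - 2r
    return 0.5 * ((s - t) ** 2).mean()
```

Because of the standardization, the loss is invariant to any positive affine rescaling of either feature map: a teacher channel with ten times the student's magnitude contributes no more gradient than any other, which addresses both problems the abstract identifies.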
Author Information
Weihan Cao (Institute of Automation Chinese Academy of Sciences)
Yifan Zhang (Institute of Automation, Chinese Academy of Sciences)
Jianfei Gao (Southeast University)
Anda Cheng (Institute of Automation, Chinese Academy of Sciences)
Ke Cheng (Institute of Automation, Chinese Academy of Sciences)
Jian Cheng (Institute of Automation, Chinese Academy of Sciences)
Related Events (a corresponding poster, oral, or spotlight)
- 2022 Poster: PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient »
More from the Same Authors
- 2022 Poster: GLIF: A Unified Gated Leaky Integrate-and-Fire Neuron for Spiking Neural Networks »
  Xingting Yao · Fanrong Li · Zitao Mo · Jian Cheng
- 2022 Spotlight: Lightning Talks 2B-2 »
  Chenjian Gao · Rui Ding · Lingzhi LI · Fan Yang · Xingting Yao · Jianxin Li · Bing Su · Zhen Shen · Tongda Xu · Shuai Zhang · Ji-Rong Wen · Lin Guo · Fanrong Li · Kehua Guo · Zhongshu Wang · Zhi Chen · Xiangyuan Zhu · Zitao Mo · Dailan He · Hui Xiong · Yan Wang · Zheng Wu · Wenbing Tao · Jian Cheng · Haoyi Zhou · Li Shen · Ping Tan · Liwei Wang · Hongwei Qin
- 2022 Spotlight: GLIF: A Unified Gated Leaky Integrate-and-Fire Neuron for Spiking Neural Networks »
  Xingting Yao · Fanrong Li · Zitao Mo · Jian Cheng
- 2022 Poster: Singular Value Fine-tuning: Few-shot Segmentation requires Few-parameters Fine-tuning »
  Yanpeng Sun · Qiang Chen · Xiangyu He · Jian Wang · Haocheng Feng · Junyu Han · Errui Ding · Jian Cheng · Zechao Li · Jingdong Wang
- 2020 Poster: Revisiting Parameter Sharing for Automatic Neural Channel Number Search »
  Jiaxing Wang · Haoli Bai · Jiaxiang Wu · Xupeng Shi · Junzhou Huang · Irwin King · Michael R Lyu · Jian Cheng
- 2019: MicroNet Challenge »
  Peisong Wang · Cong Leng · Jian Cheng · Zhongxia Yan · Hanrui Wang · Trevor Gale · Erich Elsen