
PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient

Weihan Cao, Yifan Zhang, Jianfei Gao, Anda Cheng, Ke Cheng, Jian Cheng

Abstract

Knowledge distillation (KD) is a widely used technique for training compact models in object detection. However, how to distill between heterogeneous detectors remains understudied. In this paper, we empirically find that better FPN features from a heterogeneous teacher detector can help the student even though their detection heads and label assignments differ. Directly aligning the feature maps to distill detectors, however, suffers from two problems. First, the difference in feature magnitude between the teacher and the student can impose overly strict constraints on the student. Second, the FPN stages and channels with large feature magnitudes from the teacher model can dominate the gradient of the distillation loss, overwhelming the effects of the other features in KD and introducing considerable noise. To address these issues, we propose to imitate features with the Pearson Correlation Coefficient, focusing on the relational information from the teacher and relaxing constraints on the magnitude of the features. Our method consistently outperforms existing detection KD methods and works for both homogeneous and heterogeneous student-teacher pairs; furthermore, it converges faster. With a powerful MaskRCNN-Swin detector as the teacher, ResNet-50 based RetinaNet and FCOS achieve 41.5% and 43.9% mAP on COCO2017, which are 4.1% and 4.8% higher than their baselines, respectively.
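
The abstract's core idea can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch implementation of a Pearson-correlation feature-imitation loss, not the authors' released code: each feature map is standardized to zero mean and unit variance, after which the MSE between the standardized maps depends only on their Pearson correlation, so magnitude differences between teacher and student features stop driving the gradient. The function name pkd_loss and the normalization axes (per sample and channel, over spatial locations) are assumptions; heterogeneous pairs with mismatched channel counts would additionally need a projection layer.

```python
import torch
import torch.nn.functional as F

def pkd_loss(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    """Sketch of a Pearson-correlation feature-imitation loss.

    feat_s, feat_t: student / teacher FPN features of shape (N, C, H, W),
    assumed here to have matching shapes. For standardized maps,
    MSE(x, y) ~= 2 * (1 - r), where r is the Pearson correlation, so
    minimizing this loss maximizes the correlation with the teacher.
    """
    assert feat_s.shape == feat_t.shape

    def standardize(x: torch.Tensor) -> torch.Tensor:
        # Treat each (sample, channel) map as one variable over H*W values.
        n, c, h, w = x.shape
        x = x.reshape(n, c, h * w)
        mean = x.mean(dim=-1, keepdim=True)
        std = x.std(dim=-1, keepdim=True)
        return (x - mean) / (std + 1e-6)  # zero mean, unit variance

    # Teacher features are detached: only the student receives gradients.
    return F.mse_loss(standardize(feat_s), standardize(feat_t).detach()) / 2
```

In practice this loss would be summed over the FPN levels and added to the student's ordinary detection loss with a weighting coefficient; those training details are not specified in the abstract.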

Topics & Keywords

Authors (6)

Weihan Cao
Yifan Zhang
Jianfei Gao
Anda Cheng
Ke Cheng
Jian Cheng

Citation Format

Cao, W., Zhang, Y., Gao, J., Cheng, A., Cheng, K., & Cheng, J. (2022). PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient. https://doi.org/10.48550/arXiv.2207.02039

Quick Access

View at Source: doi.org/10.48550/arXiv.2207.02039
Journal Information
Publication Year: 2022
Language: en
Total Citations: 112×
Source Database: Semantic Scholar
DOI: 10.48550/arXiv.2207.02039
Access: Open Access ✓