Semantic Scholar · Open Access · 2021 · 232 citations

General Instance Distillation for Object Detection

Xing Dai, Jiang, Zhao Wu, Yiping Bao, Zhicheng Wang, +2 more

Abstract

In recent years, knowledge distillation has been proven to be an effective solution for model compression. This approach can make lightweight student models acquire the knowledge extracted from cumbersome teacher models. However, previous distillation methods for detection generalize weakly across different detection frameworks and rely heavily on ground truth (GT), ignoring the valuable relation information between instances. Thus, we propose a novel distillation method for detection tasks based on discriminative instances, without considering the positive or negative labels distinguished by GT, which is called general instance distillation (GID). Our approach contains a general instance selection module (GISM) to make full use of feature-based, relation-based and response-based knowledge for distillation. Extensive results demonstrate that the student model achieves significant AP improvement and even outperforms the teacher in various detection frameworks. Specifically, RetinaNet with ResNet-50 achieves 39.1% mAP with GID on the COCO dataset, surpassing the 36.2% baseline by 2.9%, and even outperforming the ResNet-101 based teacher model at 38.1% AP.
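To make the abstract's idea concrete, below is a minimal sketch of the feature-based component of instance-level distillation: a mean-squared error between student and teacher feature maps, restricted to regions covered by selected instances. The mask here stands in for GISM's discriminative-instance selection; all function and variable names are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def feature_distillation_loss(student_feat, teacher_feat, instance_mask):
    """Illustrative feature-based distillation loss (not the paper's code).

    student_feat, teacher_feat: arrays of shape (C, H, W)
    instance_mask: boolean array of shape (H, W) marking selected instances
    Returns the mean squared error between the two feature maps,
    averaged over masked positions and channels.
    """
    mask = instance_mask[None, :, :].astype(float)  # broadcast over channels
    n = mask.sum() * student_feat.shape[0]          # masked positions x channels
    if n == 0:
        return 0.0
    diff = (student_feat - teacher_feat) ** 2
    return float((diff * mask).sum() / n)

# Toy example: 4-channel 8x8 feature maps; instances cover a 4x4 region.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((4, 8, 8))
student = teacher + 0.1 * rng.standard_normal((4, 8, 8))
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True
loss = feature_distillation_loss(student, teacher, mask)
```

In training, such a loss would be added to the student detector's usual detection loss; the full GID method additionally distills relation-based and response-based knowledge over the selected instances.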


Authors (7)

Xing Dai
Jiang
Zhao Wu
Yiping Bao
Zhicheng Wang
Sihan Liu
Erjin Zhou

Citation Format

Dai, X., Jiang, Wu, Z., Bao, Y., Wang, Z., Liu, S., et al. (2021). General Instance Distillation for Object Detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.1109/CVPR46437.2021.00775


Journal Information
Year Published
2021
Language
en
Total Citations
232×
Source Database
Semantic Scholar
DOI
10.1109/CVPR46437.2021.00775
Access
Open Access ✓