Improving Knowledge Distillation via Transferring Learning Ability
Long Liu
Tong Li
Hui Cheng
Abstract
Existing knowledge distillation methods generally follow a teacher-student paradigm, in which the student network learns solely from a well-trained teacher. However, this paradigm overlooks the inherent difference in learning ability between the teacher and student networks, giving rise to the capacity-gap problem. To address this limitation, we propose a novel method called SLKD.
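For context, the teacher-student paradigm the abstract refers to is the standard knowledge distillation setup of Hinton et al. (2015), in which the student is trained on a blend of hard labels and the teacher's temperature-softened outputs. The sketch below illustrates that baseline loss only; the function name and the hyperparameters `T` and `alpha` are illustrative choices, not part of SLKD, whose details the abstract does not give.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard teacher-student distillation loss (Hinton et al., 2015).

    Combines cross-entropy on ground-truth labels with a KL-divergence
    term that pulls the student's softened predictions toward the
    teacher's. T and alpha are illustrative hyperparameter values.
    """
    # Hard-label cross-entropy between student predictions and labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KL divergence at temperature T; the T**2 factor keeps
    # gradient magnitudes comparable across temperature settings.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * ce + (1.0 - alpha) * kl
```

When the teacher is much larger than the student, its softened targets can be hard for the student to match, which is the capacity-gap limitation SLKD is proposed to address.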
Topik & Kata Kunci
Penulis (3)
L
Long Liu
T
Tong Li
H
Hui Cheng
Akses Cepat
Journal Information
- Year of Publication: 2023
- Language: en
- Source Database: arXiv
- Access: Open Access