arXiv Open Access 2023

Improving Knowledge Distillation via Transferring Learning Ability

Long Liu, Tong Li, Hui Cheng

Abstract

Existing knowledge distillation methods generally use a teacher-student approach, where the student network solely learns from a well-trained teacher. However, this approach overlooks the inherent differences in learning abilities between the teacher and student networks, thus causing the capacity-gap problem. To address this limitation, we propose a novel method called SLKD.
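The teacher-student setup the abstract describes is the classic distillation objective of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence loss. The sketch below illustrates that standard objective only, not the paper's SLKD method; the function names and example logits are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Classic distillation loss: KL divergence between the teacher's and
    # student's temperature-softened distributions, scaled by T^2 so
    # gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return (T ** 2) * sum(pi * (math.log(pi) - math.log(qi))
                          for pi, qi in zip(p, q))

# One training example: the teacher's softened output supervises the student.
teacher = [8.0, 2.0, 1.0]
student = [5.0, 3.0, 2.0]
print(f"KD loss: {kd_loss(student, teacher):.4f}")
```

When the student's logits match the teacher's exactly, the loss is zero; the capacity-gap problem the abstract targets arises because a small student often cannot match a much larger teacher's distribution, no matter how long it trains.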


Citation

Liu, L., Li, T., & Cheng, H. (2023). Improving Knowledge Distillation via Transferring Learning Ability. arXiv:2304.11923. https://arxiv.org/abs/2304.11923

Journal Information
Year Published: 2023
Language: en
Source Database: arXiv
Access: Open Access ✓