arXiv
Open Access
2021
Simple Distillation Baselines for Improving Small Self-supervised Models
Jindong Gu
Wei Liu
Yonglong Tian
Abstract
While large self-supervised models have rivalled the performance of their supervised counterparts, small models still struggle. In this report, we explore simple baselines for improving small self-supervised models via distillation, called SimDis. Specifically, we present an offline-distillation baseline, which establishes a new state-of-the-art, and an online-distillation baseline, which achieves similar performance with minimal computational overhead. We hope these baselines will provide useful experience for relevant future research. Code is available at: https://github.com/JindongGu/SimDis/
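The abstract does not spell out the distillation objective, but the general idea behind such baselines can be illustrated with a minimal feature-level distillation loss: the small student is trained to match the (L2-normalized) embeddings of a larger teacher. This sketch is a generic assumption for illustration, not the SimDis objective itself; `distill_loss` and its cosine-similarity form are hypothetical choices.

```python
import numpy as np

def distill_loss(student_feats: np.ndarray, teacher_feats: np.ndarray) -> float:
    """Generic feature distillation: negative mean cosine similarity
    between L2-normalized student and teacher embeddings.
    Minimized (value -1.0) when the two sets of embeddings align."""
    s = student_feats / np.linalg.norm(student_feats, axis=1, keepdims=True)
    t = teacher_feats / np.linalg.norm(teacher_feats, axis=1, keepdims=True)
    return float(-np.mean(np.sum(s * t, axis=1)))

# Example: identical embeddings give the minimum loss of -1.0.
x = np.random.default_rng(0).normal(size=(4, 8))
print(distill_loss(x, x))  # -1.0
```

In an offline setup the teacher would be a frozen pretrained model; in an online setup teacher and student train jointly, which is how the abstract's "minimal computational overhead" variant is usually realized.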
Journal Information
- Year: 2021
- Language: en
- Source: arXiv
- Access: Open Access