
Simple Distillation Baselines for Improving Small Self-supervised Models

Jindong Gu, Wei Liu, Yonglong Tian

Abstract

While large self-supervised models have rivalled the performance of their supervised counterparts, small models still struggle. In this report, we explore simple baselines for improving small self-supervised models via distillation, called SimDis. Specifically, we present an offline-distillation baseline, which establishes a new state-of-the-art, and an online-distillation baseline, which achieves similar performance with minimal computational overhead. We hope these baselines will provide useful experience for relevant future research. Code is available at: https://github.com/JindongGu/SimDis/
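The report itself defines the exact offline and online objectives; as a rough illustration only, below is a minimal PyTorch sketch of offline distillation, assuming a frozen self-supervised teacher and a negative-cosine-similarity objective between teacher and student embeddings. The backbones, loss, and hyperparameters here are assumptions for illustration, not the paper's recipe (the paper's own setup is richer, e.g. multi-view self-supervised training with projection heads).

import torch
import torch.nn.functional as F
from torchvision import models

# Illustrative backbones: a large "teacher" and a small "student".
# The actual SimDis teacher/student pair and heads may differ.
teacher = models.resnet50(weights=None)  # load self-supervised weights in practice
student = models.resnet18(weights=None)
teacher.eval()
for p in teacher.parameters():
    p.requires_grad = False  # offline distillation: the teacher is frozen

optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)

def distill_loss(s_out, t_out):
    # Negative cosine similarity between L2-normalized embeddings,
    # an assumed stand-in for the paper's distillation objective.
    s = F.normalize(s_out, dim=-1)
    t = F.normalize(t_out, dim=-1)
    return -(s * t).sum(dim=-1).mean()

images = torch.randn(8, 3, 224, 224)  # stand-in for an augmented batch
with torch.no_grad():
    t_out = teacher(images)  # both default heads are 1000-d, so shapes match;
s_out = student(images)      # real setups use learned projection heads instead
loss = distill_loss(s_out, t_out)

optimizer.zero_grad()
loss.backward()
optimizer.step()

In the online variant described in the abstract, the teacher would be trained jointly with the student rather than frozen, which is what keeps the computational overhead minimal.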


Citation

Gu, J., Liu, W., & Tian, Y. (2021). Simple Distillation Baselines for Improving Small Self-supervised Models. arXiv:2106.11304. https://arxiv.org/abs/2106.11304

Journal Information
Year Published: 2021
Language: en
Source Database: arXiv
Access: Open Access ✓