Semantic Scholar · Open Access · 2020 · 43 citations

Deep Self-Supervised Representation Learning for Free-Hand Sketch

Peng Xu Zeyu Song Qiyue Yin Yi-Zhe Song Liang Wang

Abstract

In this paper, we tackle, for the first time, the problem of self-supervised representation learning for free-hand sketches. This importantly addresses a common problem faced by the sketch community – that annotated supervisory data are difficult to obtain. This problem is very challenging in that sketches are highly abstract and subject to different drawing styles, making existing solutions tailored for photos unsuitable. Key to the success of our self-supervised learning paradigm are our sketch-specific designs: (i) we propose a set of pretext tasks specifically designed for sketches that mimic different drawing styles, and (ii) we further exploit the use of the textual convolution network (TCN) together with the convolutional neural network (CNN) in a dual-branch architecture for sketch feature learning, as a means to accommodate the sequential stroke nature of sketches. We demonstrate the superiority of our sketch-specific designs through two sketch-related applications (retrieval and recognition) on a million-scale sketch dataset, and show that the proposed approach outperforms the state-of-the-art unsupervised representation learning methods, and significantly narrows the performance gap with supervised representation learning. PyTorch code of this work is available at https://github.com/zzz1515151/self-supervised_learning_sketch.
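The dual-branch idea from the abstract – a CNN branch over the rasterized sketch image and a TCN branch over the stroke-point sequence, fused into one embedding – can be illustrated roughly as below. This is a minimal NumPy toy sketch of the fusion concept only, not the authors' PyTorch implementation; all shapes, filter counts, and the (dx, dy, pen-state) stroke format are assumptions for the example, and the "learned" filters are random here.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_feature(img, kernels):
    # Toy CNN branch: each 2D kernel yields one pooled feature
    # (valid convolution, ReLU, global average pooling).
    kh, kw = kernels.shape[1:]
    feats = []
    for k in kernels:
        out = np.array([[np.sum(img[i:i + kh, j:j + kw] * k)
                         for j in range(img.shape[1] - kw + 1)]
                        for i in range(img.shape[0] - kh + 1)])
        feats.append(np.maximum(out, 0).mean())
    return np.array(feats)

def tcn_feature(strokes, kernels):
    # Toy TCN branch: 1D convolutions along the stroke sequence (T, C),
    # again one pooled feature per temporal kernel.
    kt = kernels.shape[1]
    feats = []
    for k in kernels:  # each k has shape (kt, C)
        out = np.array([np.sum(strokes[t:t + kt] * k)
                        for t in range(strokes.shape[0] - kt + 1)])
        feats.append(np.maximum(out, 0).mean())
    return np.array(feats)

# Fake inputs: a 28x28 rasterized sketch, plus a 50-point stroke sequence
# with 3 channels (dx, dy, pen-state) as in common vector-sketch formats.
img = rng.random((28, 28))
strokes = rng.random((50, 3))

cnn_k = rng.standard_normal((4, 3, 3))  # 4 random 3x3 spatial filters
tcn_k = rng.standard_normal((4, 5, 3))  # 4 random length-5 temporal filters

# Dual-branch fusion: concatenate the two branch features into one embedding.
embedding = np.concatenate([conv2d_feature(img, cnn_k),
                            tcn_feature(strokes, tcn_k)])
print(embedding.shape)  # (8,)
```

In the paper's actual setting, both branches are deep networks trained jointly on the sketch-specific pretext tasks; the point of the toy code is only that the raster image and the stroke sequence contribute complementary features to a single representation.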

Authors (5)

Peng Xu
Zeyu Song
Qiyue Yin
Yi-Zhe Song
Liang Wang

Citation Format

Xu, P., Song, Z., Yin, Q., Song, Y.-Z., & Wang, L. (2020). Deep Self-Supervised Representation Learning for Free-Hand Sketch. IEEE Transactions on Circuits and Systems for Video Technology. https://doi.org/10.1109/TCSVT.2020.3003048

Quick Access

View at Source: doi.org/10.1109/TCSVT.2020.3003048
Journal Information
Publication Year
2020
Language
en
Total Citations
43×
Source Database
Semantic Scholar
DOI
10.1109/TCSVT.2020.3003048
Access
Open Access ✓