arXiv Open Access 2020

Margin-Based Regularization and Selective Sampling in Deep Neural Networks

Berry Weinstein, Shai Fine, Yacov Hel-Or

Abstract

We derive a new margin-based regularization formulation, termed multi-margin regularization (MMR), for deep neural networks (DNNs). The MMR is inspired by principles that were applied in margin analysis of shallow linear classifiers, e.g., support vector machine (SVM). Unlike SVM, MMR is continuously scaled by the radius of the bounding sphere (i.e., the maximal norm of the feature vector in the data), which is constantly changing during training. We empirically demonstrate that by a simple supplement to the loss function, our method achieves better results on various classification tasks across domains. Using the same concept, we also derive a selective sampling scheme and demonstrate accelerated training of DNNs by selecting samples according to a minimal margin score (MMS). This score measures the minimal amount of displacement an input should undergo until its predicted classification is switched. We evaluate our proposed methods on three image classification tasks and six language text classification tasks. Specifically, we show improved empirical results on CIFAR10, CIFAR100 and ImageNet using state-of-the-art convolutional neural networks (CNNs) and BERT-BASE architecture for the MNLI, QQP, QNLI, MRPC, SST-2 and RTE benchmarks.
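The minimal margin score (MMS) described in the abstract lends itself to a compact illustration. The sketch below is not from the paper; it uses a hypothetical linear classifier, where the input-space distance to each pairwise decision boundary is available in closed form, and ranks a batch by that score, mirroring the selective-sampling idea: samples with the smallest MMS are the closest to having their predicted label flipped.

```python
import numpy as np

def minimal_margin_score(x, W, b):
    """For a linear classifier f_k(x) = W[k] @ x + b[k], return the smallest
    input-space displacement that flips the predicted class: the minimum over
    runner-up classes k of (f_top(x) - f_k(x)) / ||W[top] - W[k]||."""
    scores = W @ x + b
    top = int(np.argmax(scores))
    margins = [
        (scores[top] - scores[k]) / np.linalg.norm(W[top] - W[k])
        for k in range(len(scores)) if k != top
    ]
    return min(margins)

# Rank a toy batch: small MMS = near a decision boundary = most informative.
rng = np.random.default_rng(0)
W, b = rng.standard_normal((3, 4)), rng.standard_normal(3)
batch = rng.standard_normal((8, 4))
order = sorted(range(len(batch)),
               key=lambda i: minimal_margin_score(batch[i], W, b))
print(order[:4])  # indices of the 4 samples nearest a decision boundary
```

For a DNN the margin has no closed form, so a first-order approximation (score gap divided by the norm of the gradient of the score difference) would play the role of the exact linear-case ratio above.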

Citation Format

Weinstein, B., Fine, S., & Hel-Or, Y. (2020). Margin-Based Regularization and Selective Sampling in Deep Neural Networks. arXiv:2009.06011. https://arxiv.org/abs/2009.06011

Journal Information
Year Published: 2020
Language: English
Source Database: arXiv
Access: Open Access ✓