arXiv Open Access 2021

Reduced Softmax Unit for Deep Neural Network Accelerators

Raghuram S

Abstract

The Softmax activation layer is a very popular Deep Neural Network (DNN) component when dealing with multi-class prediction problems. However, in DNN accelerator implementations it creates additional complexity due to the need to compute the exponential of each of its inputs. In this brief, we propose a simplified version of the activation unit for accelerators, in which a single comparator unit produces the classification result by selecting the maximum among its inputs. Due to the nature of the activation function, we show that this result is always identical to the classification produced by the Softmax layer.
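The equivalence claimed in the abstract follows from the fact that the exponential is strictly increasing and the normalizing denominator is shared by all Softmax outputs, so the ordering of the inputs is preserved. A minimal sketch illustrating this (the function names here are illustrative, not taken from the paper):

```python
import math

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def argmax(values):
    # Index of the largest value -- all a comparator-based unit needs.
    return max(range(len(values)), key=lambda i: values[i])

# exp() is strictly increasing and the denominator is common to every
# output, so the largest logit always maps to the largest probability:
# comparing the raw logits yields the same class as the full Softmax.
logits = [2.0, -1.5, 0.3, 4.2]
assert argmax(logits) == argmax(softmax(logits))
```

The assertion holds for any input vector, which is why a hardware implementation that only needs the predicted class can replace the exponential and division circuitry with a comparator tree.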


Authors (1)

Raghuram S

Citation Format

S, R. (2021). Reduced Softmax Unit for Deep Neural Network Accelerators. https://arxiv.org/abs/2201.04562


Journal Information
Publication Year: 2021
Language: en
Source Database: arXiv
Access: Open Access ✓