DOAJ Open Access 2022

Enhanced gradient learning for deep neural networks

Ming Yan, Jianxi Yang, Cen Chen, Joey Tianyi Zhou, Yi Pan, +1 more

Abstract

Deep neural networks have achieved great success in both computer vision and natural language processing tasks. Improving gradient flow is crucial when training very deep neural networks. To address this challenge, a gradient enhancement approach is proposed by constructing short-circuit neural connections. The proposed short circuit is a unidirectional neural connection that back-propagates sensitivities, rather than gradients, from the deep layers to the shallow layers of the network. Moreover, the short circuit is further formulated as a gradient truncation operation in its connecting layers, which can be plugged into backbone models without introducing extra training parameters. Extensive experiments demonstrate that deep neural networks equipped with short-circuit connections improve over the baselines by a large margin on both computer vision and natural language processing tasks. The work provides a promising solution for low-resource scenarios, such as intelligent transport systems in computer vision and question answering in natural language processing.
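The abstract describes a short circuit as a connection that carries the deep layers' sensitivities (error signals) directly back to shallow layers, in place of the full chain of Jacobian products used by standard backpropagation. The sketch below is not the authors' implementation; it is a minimal numpy illustration, under the assumption that "truncation" means replacing the intermediate Jacobian-transpose chain with an identity, of how such a shortcut signal differs from the standard backpropagated gradient in a three-layer linear chain:

```python
import numpy as np

# Hypothetical illustration (not the paper's code): compare the standard
# backpropagated signal at a shallow layer with a "short-circuit"-style
# signal that delivers the output sensitivity directly.

rng = np.random.default_rng(0)
W1, W2, W3 = (rng.standard_normal((4, 4)) * 0.1 for _ in range(3))

x = rng.standard_normal(4)
h1 = W1 @ x            # shallow layer activation
h2 = W2 @ h1
y = W3 @ h2            # network output

target = np.zeros(4)
sensitivity = y - target            # error signal at the output layer

# Standard backprop: chain of Jacobian-transpose products down to h1.
grad_h1_standard = W2.T @ (W3.T @ sensitivity)

# Short-circuit-style truncation (assumed form): pass the deep
# sensitivity to the shallow layer directly, skipping the chain.
grad_h1_short = sensitivity

print(np.linalg.norm(grad_h1_standard), np.linalg.norm(grad_h1_short))
```

Because the intermediate Jacobian products shrink the signal in deep chains, the short-circuited signal retains a larger magnitude at the shallow layer, which is the intuition behind enhancing gradient flow.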

Authors (6)

Ming Yan

Jianxi Yang

Cen Chen

Joey Tianyi Zhou

Yi Pan

Zeng Zeng

Citation Format

Yan, M., Yang, J., Chen, C., Zhou, J.T., Pan, Y., & Zeng, Z. (2022). Enhanced gradient learning for deep neural networks. https://doi.org/10.1049/ipr2.12353

Quick Access

PDF not directly available

Check the original source →
View at Source doi.org/10.1049/ipr2.12353
Journal Information
Publication Year
2022
Source Database
DOAJ
DOI
10.1049/ipr2.12353
Access
Open Access ✓