Semantic Scholar · Open Access · 2019 · 4822 citations

A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures

Yong Yu Xiaosheng Si Changhua Hu Jian-xun Zhang

Abstract

Recurrent neural networks (RNNs) have been widely adopted in research areas concerned with sequential data, such as text, audio, and video. However, RNNs consisting of sigma cells or tanh cells are unable to learn the relevant information of input data when the input gap is large. By introducing gate functions into the cell structure, the long short-term memory (LSTM) could handle the problem of long-term dependencies well. Since its introduction, almost all the exciting results based on RNNs have been achieved by the LSTM. The LSTM has become the focus of deep learning. We review the LSTM cell and its variants to explore the learning capacity of the LSTM cell. Furthermore, the LSTM networks are divided into two broad categories: LSTM-dominated networks and integrated LSTM networks. In addition, their various applications are discussed. Finally, future research directions are presented for LSTM networks.
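For reference, the gated cell structure the abstract credits with handling long-term dependencies can be sketched in a few lines. This is the generic textbook LSTM formulation, not code from the reviewed paper; the function name, parameter layout, and sizes below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # logistic function used by the three gates
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, params):
    """One step of a standard LSTM cell. The gates control what the cell
    state forgets, stores, and exposes; the additive state update is what
    lets information survive large input gaps."""
    W, U, b = params["W"], params["U"], params["b"]  # weights stacked for i, f, o, g
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])          # input gate: how much new content to write
    f = sigmoid(z[H:2 * H])      # forget gate: how much old state to keep
    o = sigmoid(z[2 * H:3 * H])  # output gate: how much state to expose
    g = np.tanh(z[3 * H:4 * H])  # candidate cell content
    c = f * c_prev + i * g       # additive cell-state update
    h = o * np.tanh(c)           # hidden state / output
    return h, c

# toy usage with hypothetical dimensions (input size D, hidden size H)
rng = np.random.default_rng(0)
D, H = 3, 4
params = {
    "W": rng.standard_normal((4 * H, D)) * 0.1,
    "U": rng.standard_normal((4 * H, H)) * 0.1,
    "b": np.zeros(4 * H),
}
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_cell_step(rng.standard_normal(D), h, c, params)
```

Because the cell state `c` is updated additively rather than repeatedly squashed through a nonlinearity, gradients can flow across many time steps, which is the contrast the abstract draws with plain sigma- or tanh-cell RNNs.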

Authors (4)

Yong Yu

Xiaosheng Si

Changhua Hu

Jian-xun Zhang

Citation Format

Yu, Y., Si, X., Hu, C., & Zhang, J. (2019). A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures. Neural Computation. https://doi.org/10.1162/neco_a_01199

Quick Access

View at source: doi.org/10.1162/neco_a_01199
Journal Information
Publication Year: 2019
Language: en
Total Citations: 4822
Source Database: Semantic Scholar
DOI: 10.1162/neco_a_01199
Access: Open Access ✓