Semantic Scholar · Open Access · 2015 · 1710 citations

Character-Aware Neural Language Models

Yoon Kim Yacine Jernite D. Sontag Alexander M. Rush

Abstract

We describe a simple neural language model that relies only on character-level inputs. Predictions are still made at the word level. Our model employs a convolutional neural network (CNN) and a highway network over characters, whose output is given to a long short-term memory (LSTM) recurrent neural network language model (RNN-LM). On the English Penn Treebank the model is on par with the existing state-of-the-art despite having 60% fewer parameters. On languages with rich morphology (Arabic, Czech, French, German, Spanish, Russian), the model outperforms word-level/morpheme-level LSTM baselines, again with fewer parameters. The results suggest that on many languages, character inputs are sufficient for language modeling. Analysis of word representations obtained from the character composition part of the model reveals that the model is able to encode, from characters only, both semantic and orthographic information.
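The character-composition pipeline the abstract describes (character embeddings → CNN with max-over-time pooling → highway layer, producing one fixed-size vector per word for the LSTM) can be sketched as below. This is a simplified illustration, not the paper's implementation: the sizes, the single filter width, and all weight initializations are hypothetical (the paper uses multiple filter widths and trained parameters).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not taken from the paper): 50-character vocabulary,
# 15-dim character embeddings, 25 filters of width 3, 8-character words.
V_char, d_char, n_filters, width, word_len = 50, 15, 25, 3, 8

E = rng.normal(size=(V_char, d_char))            # character embedding table
W_conv = rng.normal(size=(n_filters, width, d_char))
b_conv = np.zeros(n_filters)

def char_cnn(char_ids):
    """Map a word (a sequence of character ids) to a fixed-size vector via
    convolution over its characters followed by max-over-time pooling."""
    X = E[char_ids]                              # (word_len, d_char)
    T = len(char_ids) - width + 1                # number of filter positions
    feats = np.empty((T, n_filters))
    for t in range(T):
        window = X[t:t + width]                  # (width, d_char)
        feats[t] = np.tanh(
            np.tensordot(W_conv, window, axes=([1, 2], [0, 1])) + b_conv
        )
    return feats.max(axis=0)                     # max over time -> (n_filters,)

# Highway layer: y = g * relu(W_H x + b_H) + (1 - g) * x,
# with transform gate g = sigmoid(W_T x + b_T).
W_H = rng.normal(size=(n_filters, n_filters)); b_H = np.zeros(n_filters)
W_T = rng.normal(size=(n_filters, n_filters)); b_T = np.full(n_filters, -2.0)

def highway(x):
    g = 1.0 / (1.0 + np.exp(-(W_T @ x + b_T)))   # transform gate
    return g * np.maximum(W_H @ x + b_H, 0.0) + (1.0 - g) * x

word = rng.integers(0, V_char, size=word_len)    # a word as character ids
vec = highway(char_cnn(word))
print(vec.shape)                                 # (25,): one vector per word
```

In the full model, the resulting per-word vectors replace word embeddings as inputs to the LSTM language model, which is what lets the network predict at the word level while reading only characters.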

Authors (4)

Yoon Kim

Yacine Jernite

D. Sontag

Alexander M. Rush

Citation Format

Kim, Y., Jernite, Y., Sontag, D., Rush, A.M. (2015). Character-Aware Neural Language Models. https://doi.org/10.1609/aaai.v30i1.10362

Quick Access

View at Source doi.org/10.1609/aaai.v30i1.10362
Journal Information
Publication Year
2015
Language
en
Total Citations
1710×
Database Source
Semantic Scholar
DOI
10.1609/aaai.v30i1.10362
Access
Open Access ✓