Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages

Machel Reid, Edison Marrese-Taylor, Yutaka Matsuo

Abstract

The contrast between the need for large amounts of data for current Natural Language Processing (NLP) techniques, and the lack thereof, is accentuated in the case of African languages, most of which are considered low-resource. To help circumvent this issue, we explore techniques exploiting the qualities of morphologically rich languages (MRLs), while leveraging pretrained word vectors in well-resourced languages. In our exploration, we show that a meta-embedding approach combining both pretrained and morphologically-informed word embeddings performs best in the downstream task of Xhosa-English translation.
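
The abstract describes a meta-embedding that combines pretrained high-resource word vectors with morphologically-informed subword representations. As a rough illustration of one common meta-embedding baseline (simple averaging of the two views), the sketch below averages a pretrained word vector with a character-trigram representation; the vocabulary entries, trigram scheme, vector dimensionality, and averaging step are illustrative assumptions, not necessarily the paper's exact method.

```python
# Illustrative sketch only: combine a pretrained high-resource word vector
# with a subword (character-trigram) representation via simple averaging,
# one common meta-embedding baseline. Vocabulary, trigram scheme, and the
# averaging step are assumptions, not the paper's reported method.
import numpy as np

rng = np.random.default_rng(0)
DIM = 300

# Hypothetical pretrained vectors (in practice loaded from, e.g., fastText).
pretrained = {"umntu": rng.normal(size=DIM)}

# Subword vectors for the low-resource corpus (created lazily at random here).
subword_vectors = {}

def char_trigrams(word):
    """Character trigrams over the word with boundary markers."""
    padded = f"<{word}>"
    return [padded[i:i + 3] for i in range(len(padded) - 2)]

def subword_embedding(word):
    """Average the vectors of the word's character trigrams."""
    vecs = [subword_vectors.setdefault(g, rng.normal(size=DIM))
            for g in char_trigrams(word)]
    return np.mean(vecs, axis=0)

def meta_embedding(word):
    """Average the pretrained and subword views when both are available."""
    sub = subword_embedding(word)
    if word in pretrained:
        return (pretrained[word] + sub) / 2.0
    return sub  # fall back to subwords for out-of-vocabulary words

print(meta_embedding("umntu").shape)  # (300,)
```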

Topics & Keywords

Authors (3)

Machel Reid
Edison Marrese-Taylor
Yutaka Matsuo

Citation Format

Reid, M., Marrese-Taylor, E., & Matsuo, Y. (2020). Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages. arXiv preprint arXiv:2003.04419. https://arxiv.org/abs/2003.04419

Journal Information
Publication Year: 2020
Language: en
Source Database: arXiv
Access: Open Access ✓