arXiv Open Access 2020

Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders

Yanbin Zhao Lu Chen Zhi Chen Kai Yu

Abstract

Text simplification (TS) rephrases long sentences into simplified variants while preserving inherent semantics. Traditional sequence-to-sequence models rely heavily on the quantity and quality of parallel sentences, which limits their applicability across languages and domains. This work investigates how to leverage large amounts of unpaired corpora for the TS task. We adopt the back-translation architecture from unsupervised neural machine translation (NMT), including denoising autoencoders for language modeling and automatic generation of parallel data by iterative back-translation. However, it is non-trivial to generate appropriate complex-simple pairs if we directly treat the sets of simple and complex corpora as two different languages, since the two types of sentences are quite similar and it is hard for the model to capture the characteristics of each type. To tackle this problem, we propose asymmetric denoising methods for sentences of different complexity. When modeling simple and complex sentences with autoencoders, we introduce different types of noise into the training process. Such a method can significantly improve the simplification performance. Our model can be trained in both unsupervised and semi-supervised manners. Automatic and human evaluations show that our unsupervised model outperforms previous systems, and that with limited supervision, our model performs competitively with multiple state-of-the-art simplification systems.
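The asymmetric denoising idea above can be sketched in code. The snippet below is a minimal illustration, not the paper's implementation: the corruption function (token dropout plus a local shuffle, a common DAE noise scheme) and the specific parameter values assigned to the simple versus complex side are assumptions made for demonstration.

```python
import random

def add_noise(tokens, drop_prob, shuffle_k, rng):
    """DAE input corruption: randomly drop tokens, then locally
    shuffle so each token moves fewer than shuffle_k positions."""
    kept = [t for t in tokens if rng.random() >= drop_prob]
    # Jitter each position by up to shuffle_k and re-sort: this yields
    # a shuffle that only reorders tokens within a small window.
    keys = [i + rng.uniform(0.0, shuffle_k) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept))]

rng = random.Random(0)
complex_sent = "the committee unanimously ratified the amendment".split()
simple_sent = "the group agreed to the change".split()

# Hypothetical asymmetric settings: light corruption for the complex
# corpus, heavier dropout and shuffling for the simple corpus, so each
# autoencoder learns the characteristics of its own sentence type.
noisy_complex = add_noise(complex_sent, drop_prob=0.05, shuffle_k=2, rng=rng)
noisy_simple = add_noise(simple_sent, drop_prob=0.20, shuffle_k=4, rng=rng)
```

With `drop_prob=0.0` and `shuffle_k=1` the function is the identity, which makes the corruption strength easy to tune per corpus.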


Authors (4)

Yanbin Zhao

Lu Chen

Zhi Chen

Kai Yu

Citation Format

Zhao, Y., Chen, L., Chen, Z., & Yu, K. (2020). Semi-Supervised Text Simplification with Back-Translation and Asymmetric Denoising Autoencoders. https://arxiv.org/abs/2004.14693

Journal Information
Publication Year
2020
Language
en
Source Database
arXiv
Access
Open Access ✓