
Replacing Language Model for Style Transfer

Pengyu Cheng, Ruineng Li

Abstract

We introduce the replacing language model (RLM), a sequence-to-sequence language modeling framework for text style transfer (TST). Our method autoregressively replaces each token of the source sentence with a text span that has a similar meaning but is in the target style. The new span is generated via a non-autoregressive masked language model, which better preserves the local-contextual meaning of the replaced token. This RLM generation scheme combines the flexibility of autoregressive models with the accuracy of non-autoregressive models, bridging the gap between sentence-level and word-level style transfer methods. To control the generation style more precisely, we further perform token-level style-content disentanglement on the hidden representations of RLM. Empirical results on real-world text datasets demonstrate the effectiveness of RLM compared with other TST baselines. The code is available at https://github.com/Linear95/RLM.
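To make the replacement scheme concrete, below is a minimal, runnable sketch of the token-by-token decoding loop the abstract describes. It is an illustration under assumptions, not the authors' implementation: the `propose_span` interface and the toy lexicon are hypothetical stand-ins for the non-autoregressive masked language model (the actual code is in the linked repository).

```python
from typing import Callable, Dict, List

def rlm_transfer(
    src_tokens: List[str],
    propose_span: Callable[[List[str], int, List[str], str], List[str]],
    target_style: str,
) -> List[str]:
    """Autoregressively rewrite src_tokens into target_style, one token at a time."""
    output: List[str] = []
    for i in range(len(src_tokens)):
        # Hypothetical interface: a non-autoregressive masked LM proposes a
        # replacement span for source position i, conditioned on the full
        # source sentence, the prefix generated so far, and the target style.
        span = propose_span(src_tokens, i, output, target_style)
        output.extend(span)
    return output

# Toy stand-in for the masked LM: a fixed lexicon lookup, used only to make
# the control flow executable; the real model generates spans in context.
TOY_LEXICON: Dict[str, List[str]] = {
    "hate": ["do", "not", "enjoy"],
    "terrible": ["disappointing"],
}

def toy_propose_span(src, i, prefix, style):
    return TOY_LEXICON.get(src[i], [src[i]])

print(rlm_transfer("i hate this terrible movie".split(), toy_propose_span, "polite"))
# -> ['i', 'do', 'not', 'enjoy', 'this', 'disappointing', 'movie']
```

Note that each source token can map to a multi-word span, which is what lets this word-level replacement scheme achieve sentence-level flexibility, as the abstract claims.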

Citation

Cheng, P., & Li, R. (2022). Replacing Language Model for Style Transfer. arXiv. https://arxiv.org/abs/2211.07343

Journal Information
Year published: 2022
Language: en
Source database: arXiv
Access: Open Access