arXiv Open Access 2022

BJTU-WeChat's Systems for the WMT22 Chat Translation Task

Yunlong Liang, Fandong Meng, Jinan Xu, Yufeng Chen, Jie Zhou

Abstract

This paper introduces the joint submission of the Beijing Jiaotong University and WeChat AI to the WMT'22 chat translation task for English-German. Based on the Transformer, we apply several effective variants. In our experiments, we utilize the pre-training-then-fine-tuning paradigm. In the first pre-training stage, we employ data filtering and synthetic data generation (i.e., back-translation, forward-translation, and knowledge distillation). In the second fine-tuning stage, we investigate speaker-aware in-domain data generation, speaker adaptation, prompt-based context modeling, target denoising fine-tuning, and boosted self-COMET-based model ensemble. Our systems achieve 0.810 and 0.946 COMET scores. The COMET scores of English-German and German-English are the highest among all submissions.
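The abstract mentions a boosted self-COMET-based model ensemble without detailing it. As an illustrative sketch only (not the authors' implementation), ensemble selection of this kind is often done greedily: start from the best single model on a dev set, then repeatedly add the candidate whose inclusion most improves the validation metric, stopping when nothing helps. The `greedy_ensemble` function and the `toy_score` stand-in for a COMET dev-set score below are hypothetical.

```python
def greedy_ensemble(candidates, score):
    """Greedily grow an ensemble: add the model that most improves the
    validation score; stop when no remaining candidate helps."""
    best = max(candidates, key=lambda m: score([m]))
    chosen = [best]
    remaining = [m for m in candidates if m != best]
    current = score(chosen)
    while remaining:
        # Evaluate every candidate extension of the current ensemble.
        gains = [(score(chosen + [m]), m) for m in remaining]
        top_score, top_model = max(gains)
        if top_score <= current:
            break  # no candidate improves the ensemble
        chosen.append(top_model)
        remaining.remove(top_model)
        current = top_score
    return chosen, current

# Toy scoring table standing in for COMET on a dev set:
# models "a" and "c" complement each other; adding "b" hurts.
def toy_score(models):
    table = {frozenset("a"): 0.70, frozenset("b"): 0.60,
             frozenset("c"): 0.65, frozenset("ab"): 0.68,
             frozenset("ac"): 0.80, frozenset("bc"): 0.66,
             frozenset("abc"): 0.75}
    return table[frozenset(models)]

print(greedy_ensemble(["a", "b", "c"], toy_score))  # → (['a', 'c'], 0.8)
```

In practice the scoring function would decode the dev set with the averaged or combined models and run COMET over the outputs, which is far more expensive than this toy table; the greedy loop itself is the point of the sketch.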


Citation Format

Liang, Y., Meng, F., Xu, J., Chen, Y., & Zhou, J. (2022). BJTU-WeChat's Systems for the WMT22 Chat Translation Task. https://arxiv.org/abs/2211.15009

Journal Information
Year of Publication
2022
Language
en
Source Database
arXiv
Access
Open Access ✓