arXiv Open Access 2024

Aligning Translation-Specific Understanding to General Understanding in Large Language Models

Yichong Huang, Baohang Li, Xiaocheng Feng, Chengpeng Fu, Wenshuai Huo, +2 others

Abstract

Large language models (LLMs) have exhibited remarkable abilities in understanding complex texts, offering a promising path towards human-like translation performance. However, this study reveals a misalignment between the translation-specific understanding and the general understanding inside LLMs. This misalignment leads LLMs to mistranslate, or translate overly literally, complicated concepts that they accurately comprehend in general scenarios (e.g., QA). To align the translation-specific understanding with the general one, we propose a novel translation process, DUAT (Difficult words Understanding Aligned Translation), which explicitly incorporates the general understanding of the complicated content that incurs inconsistent understanding to guide the translation. Specifically, DUAT performs cross-lingual interpretation of the difficult-to-translate words and enhances the translation with the generated interpretations. Furthermore, we reframe external tools to improve DUAT in detecting difficult words and generating helpful interpretations. We conduct experiments on the self-constructed benchmark Challenge-WMT, consisting of samples that are prone to mistranslation. Human evaluation results on high-resource and low-resource language pairs indicate that DUAT significantly facilitates the understanding alignment, which improves the translation quality (by up to +3.85 COMET) and reduces the literality of the translation by 25% to 51%.

Authors (7)

Yichong Huang, Baohang Li, Xiaocheng Feng, Chengpeng Fu, Wenshuai Huo, Ting Liu, Bing Qin

Citation Format

Huang, Y., Li, B., Feng, X., Fu, C., Huo, W., Liu, T., & Qin, B. (2024). Aligning Translation-Specific Understanding to General Understanding in Large Language Models. https://arxiv.org/abs/2401.05072

Journal Information
Year Published: 2024
Language: en
Source Database: arXiv
Access: Open Access ✓