
Pre-training Multi-party Dialogue Models with Latent Discourse Inference

Yiyang Li, Xinting Huang, Wei Bi, Hai Zhao

Abstract

Multi-party dialogues are more difficult for models to understand than one-to-one two-party dialogues, since they involve multiple interlocutors, resulting in interweaving reply-to relations and information flows. An effective way to overcome these obstacles is to pre-train a model that understands the discourse structure of multi-party dialogues, namely, to whom each utterance is replying. However, due to the lack of explicitly annotated discourse labels in multi-party dialogue corpora, previous works fail to scale up the pre-training process, leaving large amounts of unlabeled multi-party conversational data unused. To fully utilize this unlabeled data, we propose to treat the discourse structures as latent variables, then jointly infer them and pre-train the discourse-aware model with unsupervised latent-variable inference methods. Experiments on multiple downstream tasks show that our pre-trained model outperforms strong baselines by large margins and achieves state-of-the-art (SOTA) results, justifying the effectiveness of our method. The official implementation of this paper is available at https://github.com/EricLee8/MPD_EMVI.
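The abstract names the technique but not its mechanics, so a minimal sketch may help: an EM-style loop whose E-step infers the latent reply-to links with the current model, and whose M-step updates the model on the inferred structure. Everything below is a hypothetical illustration in PyTorch (a toy bag-of-words encoder and a hard E-step), not the authors' implementation; the repository name MPD_EMVI hints that the actual method combines EM and variational inference over a pre-trained language model, so treat this only as the shape of the idea.

# Illustrative hard-EM sketch of latent discourse inference.
# All names, shapes, and hyperparameters are hypothetical; the real
# implementation is at https://github.com/EricLee8/MPD_EMVI.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscourseAwareEncoder(nn.Module):
    """Toy stand-in for a dialogue encoder that scores reply-to links."""
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)  # bag-of-words utterance encoder
        self.scorer = nn.Bilinear(dim, dim, 1)         # scores (utterance, candidate parent)

    def utterance_states(self, utterances):
        # utterances: list of 1-D LongTensors of token ids, one per utterance
        return torch.stack([self.embed(u.unsqueeze(0)).squeeze(0) for u in utterances])

    def link_logits(self, states):
        # logits[i, j] = score that utterance i replies to utterance j (only j < i is valid)
        n, d = states.shape
        child = states.unsqueeze(1).expand(n, n, d).reshape(-1, d)
        parent = states.unsqueeze(0).expand(n, n, d).reshape(-1, d)
        logits = self.scorer(child, parent).view(n, n)
        future = torch.ones(n, n).triu().bool()  # j >= i cannot be a parent
        return logits.masked_fill(future, float("-inf"))

def em_step(model, optimizer, dialogue):
    logits = model.link_logits(model.utterance_states(dialogue))
    # E-step: infer the latent reply-to structure with the current model
    # (hard EM here; a soft/variational E-step would keep the full posterior).
    with torch.no_grad():
        pseudo_parents = logits[1:].argmax(dim=-1)  # row 0 is the root, has no parent
    # M-step: update the model to better explain the inferred structure.
    loss = F.cross_entropy(logits[1:], pseudo_parents)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

model = DiscourseAwareEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
dialogue = [torch.randint(0, 1000, (8,)) for _ in range(5)]  # 5 toy utterances
for step in range(3):
    print(f"step {step}: loss {em_step(model, optimizer, dialogue):.4f}")

The hard E-step is used here purely to keep the sketch short; a soft or variational variant would weight the M-step loss by the full posterior over parents instead of its argmax, and the real pre-training presumably optimizes standard dialogue objectives alongside the link likelihood.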

Citation Format

Li, Y., Huang, X., Bi, W., & Zhao, H. (2023). Pre-training Multi-party Dialogue Models with Latent Discourse Inference. arXiv preprint arXiv:2305.15175. https://arxiv.org/abs/2305.15175

Journal Information
Year of Publication: 2023
Language: en
Source Database: arXiv
Access: Open Access ✓