DOAJ Open Access 2024

A complex network approach to analyse pre-trained language models for ancient Chinese

Jianyu Zheng Xin'ge Xiao

Abstract

Ancient Chinese is a splendid treasure within Chinese culture. To facilitate its study and compilation, pre-trained language models for ancient Chinese have been developed, and researchers have since been actively exploring the factors contributing to their success. However, previous work did not study how language models organize the elements of ancient Chinese from a holistic perspective. Hence, we adopt complex networks to explore how language models organize the elements of the ancient Chinese language system. Specifically, we first analyse the characters’ and words’ co-occurrence networks in ancient Chinese. Then, we study characters’ and words’ attention networks, generated by attention heads within SikuBERT, from two aspects: static and dynamic network analysis. In the static network analysis, we find that (i) most attention networks exhibit small-world properties and scale-free behaviour, (ii) over 80% of attention networks exhibit high similarity with the corresponding co-occurrence networks, (iii) there exists a noticeable gap between characters’ and words’ attention networks across layers, while their fluctuations remain relatively consistent, and (iv) the attention networks generated by SikuBERT tend to be sparser compared with those from Chinese BERT. In the dynamic network analysis, we find that the sentence segmentation task does not significantly affect network metrics, while the part-of-speech tagging task makes attention networks sparser.
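The co-occurrence networks the abstract refers to can be illustrated with a minimal sketch. This is not the authors' code: the window size of 2, the toy two-line corpus (public-domain lines from the Analects), and the pure-Python graph representation are all our assumptions for illustration. Two characters are linked if they appear near each other in a sentence, and we compute two standard network metrics (density and mean local clustering, the latter being one ingredient of a small-world diagnosis):

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence_network(sentences, window=2):
    """Undirected character co-occurrence network: link two characters
    that appear within `window` positions of each other in a sentence."""
    adj = defaultdict(set)
    for sent in sentences:
        chars = list(sent)
        for i, c in enumerate(chars):
            for d in range(1, window + 1):
                if i + d < len(chars) and chars[i + d] != c:
                    adj[c].add(chars[i + d])
                    adj[chars[i + d]].add(c)
    return dict(adj)

def density(adj):
    """Fraction of possible edges that are present: 2m / (n(n-1))."""
    n = len(adj)
    m = sum(len(nbrs) for nbrs in adj.values()) // 2
    return 2 * m / (n * (n - 1)) if n > 1 else 0.0

def avg_clustering(adj):
    """Mean local clustering coefficient: for each node, the fraction
    of its neighbour pairs that are themselves linked."""
    total = 0.0
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
        total += 2 * links / (k * (k - 1))
    return total / len(adj) if adj else 0.0

# Toy corpus: two public-domain lines from the Analects.
sentences = ["学而时习之不亦说乎", "有朋自远方来不亦乐乎"]
net = cooccurrence_network(sentences, window=2)
print(len(net), round(density(net), 3), round(avg_clustering(net), 3))
```

A small-world check on a real corpus would additionally compare the network's clustering and average shortest-path length against a degree-matched random graph; the paper's attention networks are built analogously, with edge weights taken from SikuBERT's attention heads instead of raw co-occurrence counts.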


Authors (2)

Jianyu Zheng

Xin'ge Xiao

Citation Format

Zheng, J., & Xiao, X. (2024). A complex network approach to analyse pre-trained language models for ancient Chinese. https://doi.org/10.1098/rsos.240061

Journal Information
Year Published
2024
Source Database
DOAJ
DOI
10.1098/rsos.240061
Access
Open Access ✓