arXiv Open Access 2024

Rich Semantic Knowledge Enhanced Large Language Models for Few-shot Chinese Spell Checking

Ming Dong Yujing Chen Miao Zhang Hao Sun Tingting He

Abstract

Chinese Spell Checking (CSC) is a widely used technology that plays a vital role in speech-to-text (STT) and optical character recognition (OCR). Most existing CSC approaches rely on the BERT architecture and achieve excellent performance. However, limited by the scale of the foundation model, BERT-based methods do not work well in few-shot scenarios, showing certain limitations in practical applications. In this paper, we explore an in-context learning method named RS-LLM (Rich Semantic based LLMs) that introduces large language models (LLMs) as the foundation model. In addition, we study the impact of introducing various kinds of Chinese rich semantic information into our framework. We find that by introducing a small number of specific Chinese rich semantic structures, LLMs achieve better performance than BERT-based models on the few-shot CSC task. Furthermore, we conduct experiments on multiple datasets, and the experimental results verify the superiority of our proposed framework.
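The abstract does not spell out the prompt format RS-LLM uses, but a few-shot in-context learning setup for CSC can be sketched as below. The demonstration sentences, the pinyin "rich semantic" hint, and the helper name `build_csc_prompt` are all illustrative assumptions, not the authors' actual template.

```python
# Hypothetical sketch of a few-shot in-context prompt for Chinese Spell
# Checking (CSC): each demonstration pairs a misspelled sentence with its
# correction, and the query can carry optional "rich semantic" hints
# (e.g. shared pinyin between confusable characters).

def build_csc_prompt(demos, query, hints=None):
    """Assemble an in-context learning prompt from (wrong, right) demo pairs."""
    lines = ["Correct the spelling errors in the Chinese sentence."]
    for wrong, right in demos:
        lines.append(f"Input: {wrong}")
        lines.append(f"Output: {right}")
    if hints:  # optional rich semantic annotations for the query
        lines.append("Hints: " + "; ".join(hints))
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

demos = [
    ("我喜欢吃平果。", "我喜欢吃苹果。"),      # 平 -> 苹 (homophone: píng)
    ("他在图书管看书。", "他在图书馆看书。"),  # 管 -> 馆 (homophone: guǎn)
]
prompt = build_csc_prompt(demos, "今天天汽很好。",
                          hints=["汽/气 share pinyin 'qì'"])
print(prompt)
```

The assembled string would then be sent to an LLM, which completes the final `Output:` line with the corrected sentence.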


Authors (5)

Ming Dong

Yujing Chen

Miao Zhang

Hao Sun

Tingting He

Citation Format

Dong, M., Chen, Y., Zhang, M., Sun, H., & He, T. (2024). Rich Semantic Knowledge Enhanced Large Language Models for Few-shot Chinese Spell Checking. https://arxiv.org/abs/2403.08492

Journal Information
Publication Year
2024
Language
en
Source Database
arXiv
Access
Open Access ✓