
Semantics-aware BERT for Language Understanding

Zhuosheng Zhang, Yuwei Wu, Zhao Hai, Z. Li, Shuailiang Zhang, +2 more

Abstract

The latest work on language representations carefully integrates contextualized features into language model training, which has enabled a series of successes, especially in various machine reading comprehension and natural language inference tasks. However, the existing language representation models, including ELMo, GPT, and BERT, only exploit plain context-sensitive features such as character or word embeddings. They rarely consider incorporating structured semantic information, which can provide rich semantics for language representation. To promote natural language understanding, we propose to incorporate explicit contextual semantics from pre-trained semantic role labeling, and introduce an improved language representation model, Semantics-aware BERT (SemBERT), which is capable of explicitly absorbing contextual semantics over a BERT backbone. SemBERT keeps the convenient usability of its BERT precursor in a light fine-tuning way without substantial task-specific modifications. Compared with BERT, semantics-aware BERT is just as simple in concept but more powerful. It obtains new state-of-the-art results or substantially improves results on ten reading comprehension and language inference tasks.
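The abstract describes fusing explicit semantic role labels (SRL tags) with the token representations produced by a BERT backbone. The following is only a minimal, hypothetical sketch of that general idea, not the authors' actual implementation (the published model additionally handles multiple predicate-argument SRL sequences per sentence and subword-to-word alignment, which are omitted here); all dimensions and class/parameter names below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SemanticsFusion(nn.Module):
    """Illustrative fusion of BERT-style token states with SRL tag embeddings.

    Hypothetical simplification: one SRL tag per token, fused by
    concatenation followed by a linear projection back to hidden_size.
    """

    def __init__(self, hidden_size=768, num_srl_labels=30, srl_dim=10):
        super().__init__()
        self.srl_embedding = nn.Embedding(num_srl_labels, srl_dim)
        self.fusion = nn.Linear(hidden_size + srl_dim, hidden_size)

    def forward(self, token_repr, srl_label_ids):
        # token_repr: (batch, seq_len, hidden_size) from the BERT backbone
        # srl_label_ids: (batch, seq_len) SRL tag ids aligned to the tokens
        srl_repr = self.srl_embedding(srl_label_ids)
        fused = torch.cat([token_repr, srl_repr], dim=-1)
        return self.fusion(fused)

# Toy usage with random tensors standing in for encoder output and SRL tags.
fusion = SemanticsFusion()
tokens = torch.randn(2, 16, 768)
srl_tags = torch.randint(0, 30, (2, 16))
out = fusion(tokens, srl_tags)
print(out.shape)  # torch.Size([2, 16, 768])
```

The semantics-enriched representation produced this way can then be passed to a task-specific head and fine-tuned lightly, consistent with the abstract's claim of no substantial task-specific modifications.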

Authors (7)

Zhuosheng Zhang
Yuwei Wu
Zhao Hai
Z. Li
Shuailiang Zhang
Xi Zhou
Xiang Zhou

Citation Format

Zhang, Z., Wu, Y., Hai, Z., Li, Z., Zhang, S., Zhou, X., et al. (2019). Semantics-aware BERT for Language Understanding. https://doi.org/10.1609/AAAI.V34I05.6510

Quick Access

View at Source: doi.org/10.1609/AAAI.V34I05.6510

Journal Information
Year Published: 2019
Language: en
Total Citations: 395
Source Database: Semantic Scholar
DOI: 10.1609/AAAI.V34I05.6510
Access: Open Access ✓