DOAJ Open Access 2021

D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction

Yuan Huang Zhixing Li Wei Deng Guoyin Wang Zhimin Lin

Abstract

Relation extraction between entity pairs is an increasingly critical area in natural language processing. Recently, the pre‐trained bidirectional encoder representation from transformer (BERT) has performed excellently on text classification and sequence labelling tasks. Here, high‐level syntactic features that capture the dependency between each word and the target entities are incorporated into the pre‐trained language model. Our model also exploits the intermediate layers of BERT to acquire different levels of semantic information and designs multi‐granularity features for the final relation classification. Our model offers a significant improvement over published methods for relation extraction on widely used data sets.
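The abstract does not give the exact formulation, but one plausible reading of "dependency‐based attention" is weighting each token by its distance in the dependency tree to the target entities, then pooling a BERT layer's hidden states with those weights. The sketch below is an illustrative guess under that assumption; the function name, the distance‐to‐weight mapping, and the `temperature` parameter are all hypothetical, not taken from the paper.

```python
import numpy as np

def dependency_attention(hidden_states, dep_distances, temperature=1.0):
    """Hypothetical dependency-based attention: tokens closer to the target
    entities in the dependency tree receive larger weights.

    hidden_states: (seq_len, dim) token representations from a BERT layer
    dep_distances: (seq_len,) dependency-tree distance of each token to the
                   nearest target entity (0 for the entity tokens themselves)
    """
    # Softmax over negated distances: smaller distance -> larger weight.
    scores = -np.asarray(dep_distances, dtype=float) / temperature
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Weighted pooling yields a single relation-oriented sentence vector.
    return weights @ hidden_states, weights

# Toy example: 4 tokens with 3-dimensional hidden states.
h = np.eye(4, 3)
dist = [0, 1, 2, 3]          # token 0 is a target entity
vec, w = dependency_attention(h, dist)
```

In this toy run the weights decrease monotonically with dependency distance and sum to one, so the pooled vector is dominated by the entity token. Combining such pooled vectors from several intermediate BERT layers would be one way to realize the "multi‐granularity features" the abstract mentions.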

Authors (5)

Yuan Huang

Zhixing Li

Wei Deng

Guoyin Wang

Zhimin Lin

Citation Format

Huang, Y., Li, Z., Deng, W., Wang, G., Lin, Z. (2021). D‐BERT: Incorporating dependency‐based attention into BERT for relation extraction. https://doi.org/10.1049/cit2.12033

Journal Information
Year Published: 2021
Source Database: DOAJ
DOI: 10.1049/cit2.12033
Access: Open Access ✓