arXiv Open Access 2020

DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis

Hu Xu Bing Liu Lei Shu Philip S. Yu

Abstract

This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the worlds of general-purpose language models (such as ELMo and BERT) and domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains. This helps in learning domain language models with low resources. Experiments are conducted on an assortment of tasks in aspect-based sentiment analysis, demonstrating promising results.

Authors (4)

Hu Xu
Bing Liu
Lei Shu
Philip S. Yu

Citation Format

Xu, H., Liu, B., Shu, L., & Yu, P. S. (2020). DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis. arXiv. https://arxiv.org/abs/2004.13816

Journal Information
Year Published: 2020
Language: English (en)
Source Database: arXiv
Access: Open Access ✓