Cross-Domain Aspect Term Extraction Fusing Global and Local Semantics
Abstract
Aspect Term Extraction (ATE) is a critical task in aspect-level sentiment analysis, but its extraction and annotation costs are extremely high. When training and testing samples come from different domains, the performance of traditional methods often degrades significantly owing to the distribution differences between the domains. Existing methods focus on domain adaptation techniques that exploit rich semantic information within local contexts to achieve cross-domain ATE. However, they overlook the potential global long-range dependencies of aspect terms within the text, which limits the performance, scalability, and robustness of the models. To address these issues, this study proposes a cross-domain ATE model, CBiLSTM, which requires no additional manual labeling and integrates global and local semantic information. The model uses semantic information as a pivot: it first incorporates external semantic information into word embeddings to construct pivot information for both the source and target domains, and then encodes the global and local contextual semantics in parallel, thereby capturing more comprehensive semantic features and bridging the gap between the source and target domains. On three benchmark datasets, CBiLSTM achieves an average F1-score of 53.87%, outperforming the current state-of-the-art model by 0.49 percentage points. Experimental results demonstrate the superior performance and lower computational cost of CBiLSTM.
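The abstract does not give the model's internals, but the core idea of encoding global and local context in parallel and fusing the two can be sketched in a few lines. The sketch below is illustrative only, not the paper's architecture: it assumes local features come from a sliding-window (CNN-like) aggregation and global features from sentence-wide dot-product attention, fused per token by concatenation; all function names and dimensions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy token embeddings for a 6-token sentence (embedding dim 4).
T, D = 6, 4
emb = rng.standard_normal((T, D))

def local_features(x, window=3):
    """Local context: average over a sliding window (CNN-like receptive field)."""
    pad = window // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([xp[i:i + window].mean(axis=0) for i in range(x.shape[0])])

def global_features(x):
    """Global context: each token attends to every token in the sentence
    via softmax-normalized dot-product scores (captures long-range dependencies)."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x

# Parallel encoders fused by per-token concatenation; a tagger would then
# predict a BIO label for each fused token representation.
fused = np.concatenate([global_features(emb), local_features(emb)], axis=1)
print(fused.shape)  # (6, 8)
```

In a real model the two branches would be learned encoders (e.g., a BiLSTM and a convolutional layer), but the fusion pattern, two parallel views of the same tokens joined per position, is the same.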
Topics & Keywords
Authors
LIU Dage, YOU Jinguo, GENG Qiqi
Quick Access
- Publication Year
- 2025
- Source Database
- DOAJ
- DOI
- 10.19678/j.issn.1000-3428.0069205
- Access
- Open Access ✓