Yadi Xu and Noor Farizah Ibrahim
Adv. Artif. Intell. Mach. Learn., 4 (3):2468-2489
Yadi Xu: School of Computer Science, Universiti Sains Malaysia
Noor Farizah Ibrahim: School of Computer Science, Universiti Sains Malaysia
DOI: https://dx.doi.org/10.54364/AAIML.2024.43144
Article History: Received on: 27-Jun-24, Accepted on: 13-Aug-24, Published on: 20-Aug-24
Corresponding Author: Yadi Xu
Email: yadixu@student.usm.my
Citation: Yadi Xu, Noor Farizah Ibrahim. (2024). Improving Cross-Domain Aspect-Based Sentiment Analysis using Bert-BiLSTM Model and Dual Attention Mechanism. Adv. Artif. Intell. Mach. Learn., 4 (3):2468-2489.
Data across different domains are shaped by variations in language style and
expression, which makes it difficult to transfer domain-specific terms,
particularly aspect words. This complexity makes cross-domain aspect-based
sentiment analysis challenging. The article first introduces BERT to generate
word vectors as representations of the training texts, enriching text
semantics at the word-vector representation stage. To capture more nuanced
interaction information and context-related details, the paper proposes the
Bert-BiLSTM model with a dual attention mechanism (BB-DAM), which divides the
original input sequence into three parts: the preceding context, the aspect
words, and the following context. The dual attention mechanism evaluates the
interaction of the aspect words with the preceding context, the following
context, and their neighboring words across the three discourse segments,
allowing interaction information to be extracted comprehensively. Experimental
comparison with other modeling approaches shows that the BB-DAM model performs
well on fine-grained cross-domain sentiment analysis.
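To make the described architecture concrete, the sketch below shows one way the BERT-to-BiLSTM pipeline with dual attention over the preceding and following context could be assembled in PyTorch with Hugging Face Transformers. The class name DualAttentionABSA, the mean-pooled aspect query, the dot-product attention, and all hyper-parameters are illustrative assumptions, not the authors' released implementation.

```python
# Minimal BB-DAM-style sketch, assuming PyTorch + Hugging Face Transformers.
# All names and hyper-parameters here are assumptions for illustration only.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class DualAttentionABSA(nn.Module):
    """BERT embeddings -> BiLSTM -> dual attention over left/right context."""

    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=128, num_classes=3):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        self.bilstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                              batch_first=True, bidirectional=True)
        dim = 2 * lstm_hidden
        self.classifier = nn.Linear(3 * dim, num_classes)

    @staticmethod
    def _attend(query, keys):
        """Dot-product attention: one query vector over a segment of states."""
        if keys.size(1) == 0:                      # guard for an empty segment
            return torch.zeros_like(query)
        scores = torch.matmul(keys, query.unsqueeze(-1)).squeeze(-1)   # (B, T)
        weights = torch.softmax(scores, dim=-1)
        return torch.einsum("bt,btd->bd", weights, keys)               # (B, D)

    def forward(self, input_ids, attention_mask, aspect_start, aspect_end):
        # Contextual token embeddings from BERT, further refined by a BiLSTM.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        hidden, _ = self.bilstm(hidden)            # (B, T, 2 * lstm_hidden)

        # Split the sequence into preceding context / aspect words / following context.
        left = hidden[:, :aspect_start, :]
        aspect = hidden[:, aspect_start:aspect_end, :]
        right = hidden[:, aspect_end:, :]

        # Mean-pooled aspect representation serves as the attention query.
        aspect_vec = aspect.mean(dim=1)

        # Dual attention: the aspect attends to each context segment separately.
        left_vec = self._attend(aspect_vec, left)
        right_vec = self._attend(aspect_vec, right)

        return self.classifier(torch.cat([left_vec, aspect_vec, right_vec], dim=-1))


# Usage for a single example: "the battery life is great", aspect = "battery life".
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
enc = tokenizer("the battery life is great", return_tensors="pt")
model = DualAttentionABSA()
logits = model(enc["input_ids"], enc["attention_mask"], aspect_start=2, aspect_end=4)
print(logits.shape)  # torch.Size([1, 3]) -> e.g. negative / neutral / positive
```

The key design point mirrored from the abstract is that the aspect representation queries the two context segments independently, so interaction signals from the preceding and following context are captured before the three vectors are concatenated for classification.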