Advancing aspect-based sentiment analysis through deep learning models

Date

2025

Authors

Li, C.
Tang, H.
Zhang, J.
Guo, X.
Cheng, D.
Morimoto, Y.

Editors

Sheng, Q.Z.
Jiang, J.
Zhang, W.E.
Wu, J.
Ma, C.
Dobbie, G.
Zhang, X.
Manolopoulos, Y.
Mansoor, W.

Type

Book chapter

Citation

Event/exhibition information: 20th International Conference on Advanced Data Mining and Applications (ADMA), Sydney, Australia, 03/12/2024-05/12/2024
Source details - Title: Advanced Data Mining and Applications, ADMA 2024, 2025 / Sheng, Q.Z., Jiang, J., Zhang, W.E., Wu, J., Ma, C., Dobbie, G., Zhang, X., Manolopoulos, Y., Mansoor, W. (eds.), vol. 15391, Ch. 16, pp. 228-242

Abstract

Aspect-based sentiment analysis predicts sentiment polarity at a fine granularity. While graph convolutional networks (GCNs) are widely used to extract sentiment features, applying them naively to syntactic feature extraction can compromise information preservation. This study introduces an edge-enhanced GCN, named SentiSys, that navigates the syntactic graph while keeping feature information intact, leading to improved performance. Specifically, we first integrate a bidirectional long short-term memory (Bi-LSTM) network with a self-attention-based transformer. This combination encodes text effectively, preventing information loss and capturing long-range dependencies. A bidirectional GCN (Bi-GCN) with message passing is then employed to encode relationships between entities, and unnecessary information is filtered out using an aspect-specific masking technique. To validate the effectiveness of the proposed model, we conduct extensive evaluation experiments on four benchmark datasets. The experimental results demonstrate that SentiSys enhances performance on aspect-based sentiment analysis.
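The two graph-side steps named in the abstract, bidirectional message passing over a syntactic graph followed by aspect-specific masking, can be illustrated with a minimal NumPy sketch. All names, the mean-aggregation rule, and the dimensions below are illustrative assumptions for exposition, not the paper's actual SentiSys implementation (which also includes the Bi-LSTM/transformer encoder and edge enhancements omitted here).

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    # One message-passing step (assumed mean aggregation): average
    # neighbor features, then apply a linear transform and ReLU.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    agg = (adj @ feats) / deg
    return np.maximum(agg @ weight, 0.0)

def bi_gcn_with_mask(adj, feats, w_fwd, w_bwd, aspect_idx):
    # Bidirectional pass over the directed syntactic graph: one pass
    # along the dependency edges, one against them, then concatenate.
    h_fwd = gcn_layer(adj, feats, w_fwd)
    h_bwd = gcn_layer(adj.T, feats, w_bwd)
    h = np.concatenate([h_fwd, h_bwd], axis=-1)
    # Aspect-specific masking: zero out every token representation
    # outside the aspect span so later layers see only aspect features.
    mask = np.zeros((feats.shape[0], 1))
    mask[aspect_idx] = 1.0
    return h * mask

# Toy sentence with 5 tokens; tokens 2-3 form the aspect span.
rng = np.random.default_rng(0)
n_tokens, d_in, d_out = 5, 8, 4
adj = (rng.random((n_tokens, n_tokens)) > 0.6).astype(float)
feats = rng.standard_normal((n_tokens, d_in))
w_fwd = rng.standard_normal((d_in, d_out))
w_bwd = rng.standard_normal((d_in, d_out))

out = bi_gcn_with_mask(adj, feats, w_fwd, w_bwd, aspect_idx=[2, 3])
```

After masking, only the rows for the aspect tokens carry non-zero features; a downstream classifier would pool over those rows to predict the polarity for that aspect.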

Rights

Copyright 2025 The Author(s)
Access Condition Notes: Accepted manuscript available after 1 January 2026
