DATM: A Novel Data Agnostic Topic Modeling Technique With Improved Effectiveness for Both Short and Long Text

Date

2023

Authors

Bewong, M.
Wondoh, J.
Kwashie, S.
Liu, J.
Liu, L.
Li, J.
Islam, M.Z.
Kernot, D.

Type

Journal article

Citation

IEEE Access, 2023; 11:32826-32841

Abstract

Topic modelling is important for tackling several data mining tasks in information retrieval. While seminal topic modelling techniques such as Latent Dirichlet Allocation (LDA) have been proposed, the ubiquity of social media and the brevity of its texts pose unique challenges for such traditional techniques. Several extensions, including auxiliary aggregation, self-aggregation and direct learning, have been proposed to mitigate these challenges; however, some challenges remain. These include a lack of consistency in the topics generated and a decline in model performance in applications involving disparate document lengths. A recent paradigm shift towards neural topic models offers an alternative, but such models are ill suited to resource-constrained environments. This paper revisits LDA-style techniques, taking a theoretical approach to analysing the relationship between word co-occurrence and topic models. Our analysis shows that altering the word co-occurrences within a corpus can enhance topic discovery. We therefore propose a novel data transformation approach, dubbed DATM, to improve topic discovery within a corpus. A rigorous empirical evaluation shows that DATM is not only powerful on its own, but can also be used in conjunction with existing benchmark techniques to improve their effectiveness and consistency by up to two-fold.
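For readers unfamiliar with the LDA-style inference the abstract refers to, the sketch below shows a minimal collapsed Gibbs sampler for LDA in pure Python. It is illustrative only: it implements standard LDA inference, not the paper's DATM transformation, and the toy corpus, hyperparameters and function name are assumptions chosen for this example.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics=2, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampling for LDA (illustrative sketch, not DATM).

    docs: list of tokenised documents (lists of word strings).
    Returns, per topic, its words sorted by descending count.
    """
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    V = len(vocab)

    # Count tables: doc-topic, topic-word, and topic totals.
    ndk = [[0] * n_topics for _ in docs]
    nkw = [defaultdict(int) for _ in range(n_topics)]
    nk = [0] * n_topics

    # Random initial topic assignment for every token.
    z = []
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1
            nkw[k][w] += 1
            nk[k] += 1
        z.append(zd)

    # Gibbs sweeps: resample each token's topic from its full conditional,
    # which is proportional to (doc-topic count + alpha) * (topic-word
    # count + beta) / (topic total + V * beta).
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d][k] -= 1
                nkw[k][w] -= 1
                nk[k] -= 1
                weights = [
                    (ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                    for t in range(n_topics)
                ]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1
                nkw[k][w] += 1
                nk[k] += 1

    return [sorted(nkw[t], key=nkw[t].get, reverse=True) for t in range(n_topics)]
```

Because each sampling step depends only on word co-occurrence counts within documents, one can see why transforming those co-occurrences (as DATM proposes) directly changes the topics such a model can recover, especially for very short documents where per-document counts are sparse.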
