Irregularity-Informed Time Series Analysis: Adaptive Modelling of Spatial and Temporal Dynamics
Date
2024
Authors
Zheng, L.N.
Li, Z.
Dong, C.G.
Zhang, W.E.
Yue, L.
Xu, M.
Maennel, O.
Chen, W.
Type
Conference paper
Citation
Proceedings of the 33rd ACM International Conference on Information and Knowledge Management (CIKM 2024), 2024, pp. 3405-3414
Statement of Responsibility
Liangwei Nathan Zheng, Zhengyang Li, Chang George Dong, Wei Emma Zhang, Lin Yue, Miao Xu, Olaf Maennel, Weitong Chen
Conference Name
33rd ACM International Conference on Information and Knowledge Management (CIKM) (21 Oct 2024 - 25 Oct 2024: Boise, Idaho, USA)
Abstract
Irregular Time Series Data (IRTS) have become increasingly prevalent in real-world applications. We observe that IRTS can be divided into two specialized types: Natural Irregular Time Series (NIRTS) and Accidental Irregular Time Series (AIRTS). Existing methods either ignore the impact of irregular patterns or learn the irregular dynamics of NIRTS and AIRTS data statically, and they suffer from limited data availability due to the sparsity of IRTS. We propose a novel transformer-based framework for general irregular time series data that treats IRTS from four views: Locality, Time, Spatio, and Irregularity, to exploit the data to its fullest potential. Moreover, we design a sophisticated irregularity-gate mechanism that adaptively selects task-relevant information from irregularity, improving generalization across diverse IRTS data. We conduct extensive experiments to demonstrate the robustness of our approach on three datasets with high missing ratios (88.4%, 94.9%, and 60% missing values) and investigate the significance of irregularity information for both NIRTS and AIRTS through additional ablation studies. We release our implementation at https://github.com/IcurasLW/MTSFormer-Irregular_Time_Series.git.
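The abstract describes the irregularity gate only at a high level. As a rough illustration (not the authors' released implementation; the module and variable names such as IrregularityGate and mask_proj are assumptions for this sketch), a gate that mixes transformer hidden features with a learned embedding of the observation mask could look like the following PyTorch code:

    # Minimal sketch of an irregularity-aware gating module (illustrative only).
    import torch
    import torch.nn as nn

    class IrregularityGate(nn.Module):
        def __init__(self, d_model: int):
            super().__init__()
            self.mask_proj = nn.Linear(1, d_model)        # embed per-step missingness
            self.gate = nn.Linear(2 * d_model, d_model)   # produce gating weights

        def forward(self, h: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
            # h:    (batch, time, d_model) hidden features from the backbone
            # mask: (batch, time) with 1 for observed steps, 0 for missing steps
            m = self.mask_proj(mask.unsqueeze(-1))            # (batch, time, d_model)
            g = torch.sigmoid(self.gate(torch.cat([h, m], dim=-1)))
            return g * h + (1 - g) * m                        # gated mixture of views

    # Toy usage with random data and roughly 60% missing observations
    h = torch.randn(4, 16, 32)
    mask = (torch.rand(4, 16) > 0.6).float()
    out = IrregularityGate(32)(h, mask)
    print(out.shape)  # torch.Size([4, 16, 32])

Here the sigmoid gate decides, per time step and channel, how much the representation relies on the observed features versus the irregularity (missingness) embedding; the design in the linked repository may differ.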
Rights
© 2024 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution 4.0 International License.