Scalable teacher-forcing networks under spark environments for large-scale streaming problems

Date

2020

Authors

Za'in, C.
Ashfahani, A.
Pratama, M.
Lughofer, E.
Pardede, E.

Editors

Castellano, G.
Castiello, C.
Mencar, C.

Type

Conference paper

Citation

IEEE Conference on Evolving and Adaptive Intelligent Systems, 2020 / Castellano, G., Castiello, C., Mencar, C. (ed./s), vol.2020-May, iss.9122752, pp.1-8

Conference Name

12th IEEE International Conference on Evolving and Adaptive Intelligent Systems, EAIS 2020 (27 May 2020 - 29 May 2020 : Bari, Italy)

Abstract

Large-scale data streams remain an open issue in the existing literature. They feature a never-ending flow of information that mostly exceeds the capacity of a single processing node. Nonetheless, developing large-scale streaming algorithms on distributed platforms faces a major challenge due to the scalability issue: network complexity grows exponentially with the number of data batches, leading to a loss of accuracy if the model fusion phase is not properly designed. A large-scale streaming algorithm, namely the Scalable Teacher Forcing Network (ScatterNet), is proposed here. ScatterNet has an elastic structure to handle concept drift on the local scale, within a data batch, or on the global scale, across batches. It is built upon the teacher-forcing concept, which provides a short-term memory aptitude. ScatterNet features a data-free model fusion approach consisting of a zero-shot merging mechanism and online model selection. Our numerical study demonstrates a moderate improvement in prediction accuracy by ScatterNet, while gaining a competitive advantage in execution time over its counterparts.
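The teacher-forcing concept the abstract refers to can be illustrated with a minimal sketch: during training, the recurrent input at each step is the ground-truth previous target rather than the model's own prediction, which keeps errors from compounding across steps. The code below is a generic illustration of that idea on a toy recurrent model, not ScatterNet's actual implementation; the network shape, weights, and toy sine-wave task are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sequence task (assumed for illustration): predict the next value
# of a sine wave from the current one.
t = np.linspace(0, 4 * np.pi, 200)
series = np.sin(t)

hidden = 8
Wx = rng.normal(scale=0.1, size=(hidden, 1))        # input -> hidden
Wh = rng.normal(scale=0.1, size=(hidden, hidden))   # hidden -> hidden
Wo = rng.normal(scale=0.1, size=(1, hidden))        # hidden -> output

def forward(inputs):
    """Run the toy RNN over `inputs`, returning one prediction per step."""
    h = np.zeros((hidden, 1))
    preds = []
    for x in inputs:
        h = np.tanh(Wx * x + Wh @ h)   # simple recurrent update
        preds.append((Wo @ h).item())
    return preds

# Teacher forcing: the input at step k is the TRUE series value
# series[k], not the model's previous prediction, so the error at one
# step does not feed into the next step's input during training.
teacher_inputs = series[:-1]
targets = series[1:]
preds = forward(teacher_inputs)
mse = float(np.mean((np.array(preds) - targets) ** 2))
```

At inference time the model would run free-running instead, feeding its own previous prediction back in; teacher forcing is a training-time device that shortens the error horizon, which is the short-term memory aptitude the abstract alludes to.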

Rights

Copyright 2020 IEEE
