Neural networks in evolutionary dynamic constrained optimization: computational cost and benefits

dc.contributor.authorHasani Shoreh, M.
dc.contributor.authorHermoza Aragones, R.
dc.contributor.authorNeumann, F.
dc.contributor.conferenceEuropean Conference on Artificial Intelligence (ECAI) (29 Aug 2020 - 8 Sep 2020 : Santiago de Compostela, Spain)
dc.contributor.editorDeGiacomo, G.
dc.contributor.editorCatala, A.
dc.contributor.editorDilkina, B.
dc.contributor.editorMilano, M.
dc.contributor.editorBarro, S.
dc.contributor.editorBugarin, A.
dc.contributor.editorLang, J.
dc.date.issued2020
dc.description.abstractNeural networks (NN) have recently been applied together with evolutionary algorithms (EAs) to solve dynamic optimization problems. The NN estimates the position of the next optimum based on the best solutions from previous time steps. After a change is detected, the predicted solution can be used to move the EA's population to a promising region of the solution space, accelerating convergence and improving accuracy in tracking the optimum. While previous works show improved results, they neglect the overhead created by the NN. In this work, we include the time spent training the NN in the overall optimization time and compare the results with a baseline EA. We explore whether, once this overhead is taken into account, the NN is still able to improve the results, and under which conditions it can do so. The main difficulties in training the NN are: 1) obtaining enough samples to generalize predictions to new data, and 2) obtaining reliable samples. Since the NN collects data at each time step, a short time horizon yields too few samples to train it. To alleviate this, we propose collecting multiple individuals at each time step to speed up sample collection over shorter horizons. In environments with a high frequency of changes, the solutions produced by the EA are likely to be far from the true optimum, and training the NN on such unreliable data will in turn produce unreliable predictions. Moreover, since the time spent on the NN stays fixed regardless of the change frequency, a higher frequency of change means a larger NN overhead in proportion to the EA's running time. In general, after accounting for the generated overhead, we conclude that the NN is not suitable in environments with a high frequency of changes and/or short time horizons. However, it can be promising for low change frequencies, and especially in environments where the changes follow a pattern.
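The predict-and-reseed step the abstract describes can be sketched as follows. This is an illustrative sketch only: it substitutes a simple per-dimension autoregressive least-squares predictor for the paper's neural network, and the function names and parameters (`order`, `frac`, `noise`) are hypothetical, not taken from the paper.

```python
import numpy as np

def predict_next_optimum(history, order=2):
    """Predict the next optimum from the best solutions of previous time
    steps, fitting a per-dimension autoregressive model of lag `order`.
    `history` has shape (T, D): one best solution per time step."""
    history = np.asarray(history, dtype=float)
    T, D = history.shape
    if T <= order:                       # not enough samples collected yet
        return history[-1]
    pred = np.empty(D)
    for d in range(D):
        series = history[:, d]
        # Lagged design matrix: each row is [x_{t-order}, ..., x_{t-1}, 1].
        X = np.column_stack([series[i:T - order + i] for i in range(order)]
                            + [np.ones(T - order)])
        y = series[order:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        pred[d] = np.r_[series[-order:], 1.0] @ coef
    return pred

def reseed_population(pop, prediction, frac=0.5, noise=0.1, rng=None):
    """After a detected change, move a fraction of the EA population into a
    Gaussian cloud around the predicted optimum; keep the rest as-is."""
    rng = np.random.default_rng() if rng is None else rng
    pop = np.array(pop, dtype=float)
    k = int(frac * len(pop))
    pop[:k] = prediction + noise * rng.standard_normal((k, pop.shape[1]))
    return pop
```

For an optimum drifting linearly, e.g. a history `[[0, 0], [1, 2], [2, 4], [3, 6]]`, the predictor extrapolates the trend to `[4, 8]`, and `reseed_population` then concentrates part of the population around that point while the remainder preserves diversity.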
dc.description.statementofresponsibilityMaryam Hasani-Shoreh and Renato Hermoza Aragonés and Frank Neumann
dc.identifier.citationECAI 2020 - 24th European Conference on Artificial Intelligence, 2020 / DeGiacomo, G., Catala, A., Dilkina, B., Milano, M., Barro, S., Bugarin, A., Lang, J. (ed./s), vol.325, pp.275-282
dc.identifier.doi10.3233/FAIA200103
dc.identifier.isbn9781643681009
dc.identifier.issn0922-6389
dc.identifier.issn1879-8314
dc.identifier.orcidHermoza Aragones, R. [0000-0002-1669-046X]
dc.identifier.orcidNeumann, F. [0000-0002-2721-3618]
dc.identifier.urihttp://hdl.handle.net/2440/128273
dc.language.isoen
dc.publisherIOS Press BV
dc.publisher.placeAmsterdam, Netherlands
dc.relation.granthttp://purl.org/au-research/grants/arc/DP160102401
dc.relation.granthttp://purl.org/au-research/grants/arc/DP180103232
dc.relation.ispartofseriesFrontiers in Artificial Intelligence and Applications; 325
dc.rights© 2020 The authors and IOS Press. This article is published online with Open Access by IOS Press and distributed under the terms of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0).
dc.source.urihttp://ebooks.iospress.nl/doi/10.3233/FAIA325
dc.titleNeural networks in evolutionary dynamic constrained optimization: computational cost and benefits
dc.typeConference paper
pubs.publication-statusPublished

Files

Original bundle
Name: hdl_128273.pdf
Size: 613.34 KB
Format: Adobe Portable Document Format
Description: Published version