Abstract
Many real-world optimization problems are computationally expensive, and their objective functions evolve over time. Data-driven (also known as surrogate-assisted) evolutionary optimization has been recognized as an effective approach for expensive black-box optimization in static environments, but it has rarely been studied in dynamic environments. This article proposes a simple yet effective transfer learning framework that empowers data-driven evolutionary optimization to solve expensive dynamic optimization problems. Specifically, a hierarchical multioutput Gaussian process is proposed to capture the correlation among data collected at different time steps while the number of hyperparameters grows only linearly. Furthermore, an adaptive source task selection mechanism and a bespoke warm-starting initialization are proposed to better leverage the knowledge extracted from previous optimization processes. In this way, the data-driven evolutionary optimizer can jump-start the search in a new environment under a very limited computational budget. Experiments on synthetic benchmark test problems and a real-world case study demonstrate the effectiveness of our proposed algorithm in comparison with nine state-of-the-art peer algorithms.
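The abstract describes three components: a surrogate that shares information across time steps, adaptive selection of a source task, and warm-started initialization in the new environment. The snippet below is a minimal illustrative sketch of those ideas only, not the authors' hierarchical multioutput GP: it fits an independent scikit-learn GP per past time step, picks the past step whose surrogate predictions rank-correlate best with a few evaluations in the new environment (a crude stand-in for the adaptive source task selection), and seeds the new population around that step's best solutions. All function names (`select_source_task`, `fit_step_surrogate`, `warm_start_population`), the probe-point correlation test, and the use of scikit-learn/SciPy are assumptions made for illustration.

```python
# Hypothetical sketch of warm-started, surrogate-assisted optimization across
# time steps. Per-step GPs replace the paper's hierarchical multioutput GP.
import numpy as np
from scipy.stats import spearmanr
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel


def fit_step_surrogate(X, y):
    """Fit a GP surrogate for one time step (the paper instead couples steps
    through a hierarchical multioutput GP with linearly many hyperparameters)."""
    kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
    return GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)


def select_source_task(archives, f_new, probe_points):
    """Pick the past step whose surrogate predictions rank-correlate best
    (Spearman) with a handful of evaluations in the new environment."""
    y_new = np.array([f_new(x) for x in probe_points])
    best_t, best_rho = None, -np.inf
    for t, (_, _, model_t) in archives.items():
        rho, _ = spearmanr(y_new, model_t.predict(probe_points))
        if rho > best_rho:
            best_t, best_rho = t, rho
    return best_t


def warm_start_population(X_source, y_source, pop_size, sigma=0.05, rng=None):
    """Seed the new population around the best solutions of the source step."""
    rng = np.random.default_rng() if rng is None else rng
    elite = X_source[np.argsort(y_source)[:pop_size]]          # minimization
    return elite + sigma * rng.standard_normal(elite.shape)    # small jitter


# Toy usage: two past environments, then a slightly shifted new one.
rng = np.random.default_rng(0)
f = lambda x, shift: np.sum((x - shift) ** 2, axis=-1)
archives = {}
for t, shift in enumerate([0.0, 0.2]):
    X = rng.uniform(-1, 1, size=(30, 2))
    y = f(X, shift)
    archives[t] = (X, y, fit_step_surrogate(X, y))

probe = rng.uniform(-1, 1, size=(5, 2))
src = select_source_task(archives, lambda x: f(x, 0.25), probe)
X_s, y_s, _ = archives[src]
init_pop = warm_start_population(X_s, y_s, pop_size=10, rng=rng)
print("selected source step:", src, "| warm-start population shape:", init_pop.shape)
```

In this toy run the step whose landscape is closest to the new environment is chosen as the source, and its elite solutions (lightly perturbed) form the initial population, which is the jump-start effect the abstract refers to; the paper's actual mechanism is richer and operates on the multioutput GP.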
Original language | English |
---|---|
Pages (from-to) | 1396-1411 |
Number of pages | 16 |
Journal | IEEE Transactions on Evolutionary Computation |
Volume | 28 |
Issue number | 5 |
DOIs | |
Publication status | Published - 2024 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 1997-2012 IEEE.
Funding
This work was supported in part by the UKRI Future Leaders Fellowship under Grant MR/S017062/1 and Grant MR/X011135/1; in part by NSFC under Grant 62376056 and Grant 62076056; in part by the Royal Society under Grant IES/R2/212077; in part by EPSRC under Grant 2404317; in part by the Kan Tong Po Fellowship under Grant KTP\R1\231017; and in part by the Amazon Research Award and Alan Turing Fellowship.
Keywords
- Data-driven evolutionary optimization
- dynamic optimization
- kernel methods
- multioutput Gaussian processes (GPs)
- transfer optimization