The Capacitated Arc Routing Problem (CARP) is an NP-hard combinatorial optimisation problem with numerous real-world applications. Several divide-and-conquer approaches, controlled by one or more hyperparameters, have been proposed to tackle large-scale CARP instances. Tuning these hyperparameters can be computationally expensive due to the lack of a priori knowledge, the size of the configuration space, and the time required to solve a CARP instance. Motivated by this time-consuming task, we propose a scalable approach based on self-adaptive hierarchical decomposition (SASAHiD) to scale up existing methods. Taking SAHiD, a state-of-the-art decomposition method for large-scale CARP, as an example, we carry out experiments on two sets of real-world CARP instances with hundreds to thousands of tasks. The results demonstrate that SASAHiD significantly outperforms SAHiD while using fewer hyperparameters, thereby reducing the dimension of the associated configuration space. Moreover, we propose an incremental hyperparameter tuning approach across multiple problem instances to learn the hyperparameters of SASAHiD on a set of instances of different sizes. SASAHiD with optimised hyperparameters achieves better than, or competitive, results compared with SASAHiD using default hyperparameters when solving problem instances never seen in the training set. © 2019 IEEE.
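To make the divide-and-conquer idea concrete, the following is a minimal illustrative sketch (not the paper's SAHiD or SASAHiD algorithms, whose details are not given here) of decomposing a toy CARP: tasks are required edges `(u, v, demand)` over nodes with coordinates, tasks are first clustered into subproblems, then each subproblem is packed greedily into capacity-feasible routes. All function names and the clustering rule are assumptions made for this example.

```python
import math

def midpoint(task, coords):
    # Geometric midpoint of a required edge, used as the task's location.
    (x1, y1), (x2, y2) = coords[task[0]], coords[task[1]]
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def cluster_tasks(tasks, coords, k):
    """Split tasks into roughly k clusters by angle around the centroid
    (a simple stand-in for the hierarchical clustering a real method uses)."""
    pts = [midpoint(t, coords) for t in tasks]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    order = sorted(range(len(tasks)),
                   key=lambda i: math.atan2(pts[i][1] - cy, pts[i][0] - cx))
    size = math.ceil(len(tasks) / k)
    return [[tasks[i] for i in order[j:j + size]]
            for j in range(0, len(tasks), size)]

def greedy_routes(tasks, capacity):
    """Pack tasks into routes in order, never exceeding vehicle capacity
    (assumes each single task's demand fits within the capacity)."""
    routes, current, load = [], [], 0
    for t in tasks:
        if load + t[2] > capacity:
            routes.append(current)
            current, load = [], 0
        current.append(t)
        load += t[2]
    if current:
        routes.append(current)
    return routes

def decompose_and_solve(tasks, coords, capacity, k=2):
    # Divide: cluster the tasks; conquer: route each cluster; merge: concatenate.
    routes = []
    for sub in cluster_tasks(tasks, coords, k):
        routes.extend(greedy_routes(sub, capacity))
    return routes
```

A hyperparameter such as `k` (the number of clusters) is exactly the kind of setting whose tuning the abstract describes as expensive: its best value depends on the instance size and structure.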
Title of host publication: 2019 IEEE Symposium Series on Computational Intelligence, SSCI 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Published: Dec 2019
Bibliographical note: This work was supported by National Key R&D Program of China (Grant No. 2017YFC0804003), National Natural Science Foundation of China (Grant No. 61906083, 61976111), Shenzhen Peacock Plan (Grant No. KQTD2016112514355531), the Science and Technology Innovation Committee Foundation of Shenzhen (Grant No. ZDSYS201703031748284, JCYJ20180504165652917) and the Program for University Key Laboratory of Guangdong Province (Grant No. 2017KSYS008).
Keywords:
- capacitated arc routing problem
- incremental hyperparameter tuning
- permutation-based optimisation
- self-adaptive decomposition