Adaptive Granularity Learning Distributed Particle Swarm Optimization for Large-Scale Optimization

Zi-Jia WANG, Zhi-Hui ZHAN, Sam KWONG, Hu JIN, Jun ZHANG

Research output: Journal Publications › Journal Article (refereed) › peer-review

163 Citations (Scopus)

Abstract

Large-scale optimization has become a significant and challenging research topic in the evolutionary computation (EC) community. Although many improved EC algorithms have been proposed for large-scale optimization, slow convergence in the huge search space and entrapment in local optima among massive suboptima remain the key challenges. To address these two issues, this article proposes an adaptive granularity learning distributed particle swarm optimization (AGLDPSO) with the help of machine-learning techniques, including clustering analysis based on locality-sensitive hashing (LSH) and adaptive granularity control based on logistic regression (LR). In AGLDPSO, a master-slave multisubpopulation distributed model is adopted, where the entire population is divided into multiple subpopulations and these subpopulations are co-evolved. Compared with other large-scale optimization algorithms that use single-population evolution or a centralized mechanism, the multisubpopulation distributed co-evolution mechanism fully exchanges evolutionary information among different subpopulations to further enhance population diversity. Furthermore, we propose an adaptive granularity learning strategy (AGLS) based on LSH and LR. The AGLS helps determine an appropriate subpopulation size to control the learning granularity of the distributed subpopulations in different evolutionary states, balancing the exploration ability for escaping from massive suboptima and the exploitation ability for converging in the huge search space. The experimental results show that AGLDPSO performs better than, or at least comparably with, other state-of-the-art large-scale optimization algorithms, including the winner of the competition on large-scale optimization, on all 35 benchmark functions from both the IEEE Congress on Evolutionary Computation (IEEE CEC2010) and IEEE CEC2013 large-scale optimization test suites.
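To make the master-slave multisubpopulation idea in the abstract concrete, the Python sketch below shows a generic multi-subpopulation PSO in which the swarm is repeatedly partitioned, each chunk is evolved independently, and a crude rule toggles the subpopulation size. All names (agldpso_sketch, evolve_subpopulation), the sphere benchmark, the parameter values, and the stall-based granularity rule are hypothetical stand-ins for illustration only; the authors' AGLS instead estimates the evolutionary state with LSH-based clustering and logistic regression, and their experiments use the CEC2010/CEC2013 large-scale suites.

```python
import numpy as np

# Hypothetical benchmark for illustration; the paper evaluates on the
# IEEE CEC2010 and CEC2013 large-scale optimization test suites.
def sphere(x):
    return float(np.sum(x ** 2))

def evolve_subpopulation(rng, pos, vel, pbest, pbest_val, gbest, f,
                         w=0.7, c1=1.5, c2=1.5, iters=5):
    """Standard PSO updates for one subpopulation (one 'slave' task)."""
    n, d = pos.shape
    for _ in range(iters):
        r1, r2 = rng.random((n, d)), rng.random((n, d))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    return pos, vel, pbest, pbest_val

def agldpso_sketch(f, dim=100, pop_size=120, generations=50, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-100.0, 100.0, (pop_size, dim))
    vel = np.zeros((pop_size, dim))
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = float(pbest_val.min())
    sub_size = 30          # learning granularity (subpopulation size)
    prev_best = gbest_val
    for _ in range(generations):
        order = rng.permutation(pop_size)
        # Master: partition the swarm; each chunk plays the role of a slave's
        # subpopulation (evolved sequentially here rather than in parallel).
        for start in range(0, pop_size, sub_size):
            idx = order[start:start + sub_size]
            pos[idx], vel[idx], pbest[idx], pbest_val[idx] = evolve_subpopulation(
                rng, pos[idx], vel[idx], pbest[idx], pbest_val[idx], gbest, f)
        # Master: gather results and share the global best (information exchange).
        if pbest_val.min() < gbest_val:
            gbest_val = float(pbest_val.min())
            gbest = pbest[np.argmin(pbest_val)].copy()
        # Placeholder for AGLS: toggle the granularity based on recent progress.
        # The paper instead uses LSH-based clustering plus logistic regression
        # to judge the evolutionary state and set the subpopulation size.
        sub_size = 20 if gbest_val >= 0.95 * prev_best else 40
        prev_best = gbest_val
    return gbest, gbest_val

if __name__ == "__main__":
    best, best_val = agldpso_sketch(sphere)
    print(f"best value after sketch run: {best_val:.4e}")
```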
Original language: English
Article number: 9049400
Pages (from-to): 1175-1188
Number of pages: 14
Journal: IEEE Transactions on Cybernetics
Volume: 51
Issue number: 3
Early online date: 27 Mar 2020
DOIs
Publication status: Published - Mar 2021
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Funding

This work was supported in part by the Outstanding Youth Science Foundation under Grant 61822602, in part by the National Key Research and Development Program of China under Grant 2019YFB2102100, in part by the National Natural Science Foundation of China under Grant 61772207 and Grant 61873097, in part by the Guangdong Natural Science Foundation Research Team under Grant 2018B030312003, in part by the Ministry of Science and ICT through the National Research Foundation of Korea under Grant NRF-2019H1D3A2A01101977, and in part by the Hong Kong GRF-RGC General Research Fund under Grant 9042489 and Grant CityU 11206317.

Keywords

  • Adaptive granularity learning distributed particle swarm optimization (AGLDPSO)
  • large-scale optimization
  • locality-sensitive hashing (LSH)
  • logistic regression (LR)
  • master-slave multisubpopulation distributed
