Scalability of generalized adaptive differential evolution for large-scale continuous optimization

Zhenyu YANG, Ke TANG, Xin YAO

Research output: Journal Article (refereed), peer-reviewed

98 Citations (Scopus)


Differential evolution (DE) has become a powerful tool for global continuous optimization. Parameter adaptation is the most commonly used technique for improving its performance, and has underpinned the success of many adaptive DE variants. However, most studies of these adaptive DEs are limited to small-scale problems, e.g., those with fewer than 100 decision variables, which may be quite small compared with the requirements of real-world applications. The scalability of adaptive DE therefore remains unclear. In this paper, based on an analysis of the similarities and drawbacks of existing parameter adaptation schemes in DE, we propose a generalized parameter adaptation scheme. Applying this scheme to DE yields a new generalized adaptive DE (GaDE) algorithm. The scalability of GaDE is evaluated on 19 benchmark functions with problem sizes ranging from 50 to 1,000 decision variables. In comparison with three other algorithms, GaDE is very competitive in both performance and scalability. © 2010 Springer-Verlag.
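To make the idea of parameter adaptation in DE concrete, the sketch below implements classic DE/rand/1/bin with a simple self-adaptation rule in the style of jDE (each individual carries its own F and CR, which are occasionally resampled and survive only when they produce winning trial vectors). This is an illustrative sketch of the general technique, not the GaDE scheme proposed in the paper; all function and parameter names here are our own.

```python
import random


def sphere(x):
    """Simple separable benchmark: f(x) = sum of squares, minimum 0 at the origin."""
    return sum(xi * xi for xi in x)


def adaptive_de(f, dim=10, pop_size=20, bounds=(-5.0, 5.0), max_gens=400, seed=0):
    """DE/rand/1/bin with jDE-style self-adaptation of F and CR (illustrative sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(ind) for ind in pop]
    F = [0.5] * pop_size   # per-individual scale factors
    CR = [0.9] * pop_size  # per-individual crossover rates

    for _ in range(max_gens):
        for i in range(pop_size):
            # With small probability, resample this individual's control parameters.
            Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]

            # Mutation: pick three distinct individuals other than i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees at least one mutated component

            # Binomial crossover with bound clamping.
            trial = []
            for j in range(dim):
                if rng.random() < CRi or j == jrand:
                    v = pop[a][j] + Fi * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))
                else:
                    trial.append(pop[i][j])

            # Greedy selection: better trials keep their parameters alive.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i], F[i], CR[i] = trial, ft, Fi, CRi

    return min(fit)
```

The key adaptation idea is that parameter values are selected indirectly: F and CR settings persist only if the trial vectors they generate survive selection, so the population gradually concentrates on settings that work for the problem at hand.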
Original language: English
Pages (from-to): 2141-2155
Number of pages: 15
Journal: Soft Computing
Issue number: 11
Early online date: 10 Sept 2010
Publication status: Published - Nov 2011
Externally published: Yes

Bibliographical note

This work was partially supported by the Fund for Foreign Scholars in University Research and Teaching Programs (Grant No. B07033), National Natural Science Foundation of China Grants (No. 60802036 and U0835002), and an EPSRC project (No. EP/D052785/1) on “SEBASE: Software Engineering By Automated SEarch”.


Keywords

  • Differential evolution
  • Large-scale optimization
  • Parameter adaptation
  • Scalability


