Abstract
Designing advanced neural architectures for specific tasks can take weeks or even months of intensive investigation by experts with rich domain knowledge. In recent years, neural architecture search (NAS) has attracted the interest of many researchers due to its ability to design efficient neural architectures automatically. Among different search strategies, evolutionary algorithms have achieved significant success as derivative-free optimization methods. However, the tremendous computational cost of evolutionary neural architecture search dramatically restricts its application. In this paper, we explore how fitness approximation-based evolutionary algorithms can be applied to neural architecture search and propose NAS-EA-FA to accelerate the search process. We further exploit data augmentation and the diversity of neural architectures to enhance the algorithm, and present NAS-EA-FA V2. Experiments show that NAS-EA-FA V2 is at least five times faster than other state-of-the-art neural architecture search algorithms such as regularized evolution and iterative neural predictor on NASBench-101, and it is also the most effective and stable algorithm on NASBench-201. All the code used in this paper is available at https://github.com/fzjcdt/NAS-EA-FA. © 2021 IEEE.
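The core idea the abstract describes, using a cheap fitness approximation to rank candidate architectures and spending real evaluations only on the most promising ones, can be sketched as follows. This is an illustrative toy, not the paper's actual method: the bit-string search space, the mocked `true_fitness`, and the 1-nearest-neighbour surrogate are all assumptions made for the sake of a runnable example.

```python
import random

def true_fitness(arch):
    # Stand-in for an expensive training-and-validation run; in real NAS
    # this is the costly step the surrogate is meant to avoid.
    return sum(arch) / len(arch)

def surrogate_fitness(arch, archive):
    # 1-nearest-neighbour surrogate (an illustrative choice): predict the
    # fitness of the most similar already-evaluated architecture, measured
    # by Hamming distance over the bit-string encoding.
    nearest = min(archive, key=lambda p: sum(x != y for x, y in zip(p[0], arch)))
    return nearest[1]

def evolve(n_bits=8, pop_size=10, generations=5, top_k=2, seed=0):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    # Archive of (architecture, true fitness) pairs: the expensive evaluations.
    archive = [(a, true_fitness(a)) for a in population]
    for _ in range(generations):
        # Generate candidates by single-bit mutation.
        candidates = []
        for arch in population:
            child = arch[:]
            child[rng.randrange(n_bits)] ^= 1
            candidates.append(child)
        # Rank all candidates cheaply with the surrogate ...
        candidates.sort(key=lambda a: surrogate_fitness(a, archive), reverse=True)
        # ... and spend real evaluations only on the top-k.
        for arch in candidates[:top_k]:
            archive.append((arch, true_fitness(arch)))
        # Survivor selection from everything truly evaluated so far.
        archive.sort(key=lambda p: p[1], reverse=True)
        population = [a for a, _ in archive[:pop_size]]
    return archive[0]

best_arch, best_fit = evolve()
```

The key lever is `top_k`: the surrogate filters the whole candidate pool, so only a small fraction of architectures ever incur the expensive evaluation, which is where the speed-up over plain evolutionary search comes from.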
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Volume | 2021-July |
| ISBN (Print) | 9780738133669 |
| DOIs | |
| Publication status | Published - 18 Jul 2021 |
| Externally published | Yes |
Funding
This work was supported by the IEEE Computational Intelligence Society Graduate Student Research Grant 2020. It was also supported by the Guangdong Provincial Key Laboratory (Grant No. 2020B121201001), the Program for Guangdong Introducing Innovative and Entrepreneurial Teams (Grant No. 2017ZT07X386), the Shenzhen Science and Technology Program (Grant No. KQTD2016112514355531), and the Program for University Key Laboratory of Guangdong Province (Grant No. 2017KSYS008).
Keywords
- Diversity
- Evolutionary Algorithm
- Fitness Approximation
- Neural Architecture Search