Similarity surrogate-assisted evolutionary neural architecture search with dual encoding strategy
Published: 2024
Volume: 32, Issue: 2
Pages: 1017–1043
ISSN: 2688-1594
Container-title: Electronic Research Archive
Short-container-title: era
Author:
Xue Yu 1, Zhang Zhenman 1, Neri Ferrante 1,2
Affiliation:
1. School of Computer Science, Nanjing University of Information Science and Technology, Jiangsu, China; 2. School of Computer Science and Electronic Engineering, University of Surrey, United Kingdom
Abstract
<abstract><p>Neural architecture search (NAS), a promising method for automated neural architecture design, is often hampered by its overwhelming computational burden, especially the architecture evaluation process in evolutionary neural architecture search (ENAS). Although surrogate models based on regression or ranking can assist or replace the architecture evaluation process in ENAS to reduce the computational cost, these surrogate models remain susceptible to poor architectures and struggle to reliably identify good architectures in a search space. To solve these problems, we propose a novel surrogate-assisted NAS approach, which we call the similarity surrogate-assisted ENAS with dual encoding strategy (SSENAS). We propose a surrogate model based on similarity measurement to select excellent neural architectures from a large number of candidate architectures in a search space. Furthermore, we propose a dual encoding strategy for architecture generation and surrogate evaluation in ENAS to improve the exploration of well-performing neural architectures in a search space and to realize sufficiently informative representations of neural architectures, respectively. We have performed experiments on NAS benchmarks to verify the effectiveness of the proposed algorithm. The experimental results show that SSENAS can accurately find the best neural architecture in the NAS-Bench-201 search space after only 400 queries of the tabular benchmark. In the NAS-Bench-101 search space, it achieves results comparable to those of other algorithms. In addition, extensive experiments and analyses of the proposed algorithm show that the similarity-based surrogate model can gradually search for excellent neural architectures in a search space.</p></abstract>
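The abstract describes a surrogate that ranks candidate architectures by their similarity to known good architectures rather than by regressing their accuracy. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual method: architectures are represented as toy fixed-length encoding vectors (the paper's dual encoding strategy is not reproduced here), and each candidate is scored by its maximum cosine similarity to a set of elite encodings.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two architecture encoding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def similarity_surrogate_select(candidates, elites, k=2):
    """Score each candidate by its maximum cosine similarity to any
    elite (known well-performing) encoding; return indices of the
    top-k candidates, best first."""
    scores = [max(cosine_similarity(c, e) for e in elites) for c in candidates]
    return sorted(range(len(candidates)), key=lambda i: scores[i], reverse=True)[:k]

# Toy operation-encoding vectors (hypothetical, not the paper's encoding).
elites = [np.array([1.0, 0.0, 1.0, 0.0]), np.array([1.0, 1.0, 0.0, 0.0])]
candidates = [
    np.array([1.0, 0.0, 1.0, 0.0]),  # identical to an elite
    np.array([0.0, 0.0, 0.0, 1.0]),  # dissimilar to both elites
    np.array([1.0, 0.0, 0.9, 0.1]),  # close to the first elite
]
print(similarity_surrogate_select(candidates, elites, k=2))  # → [0, 2]
```

Under this assumption, only the selected candidates would be passed on for (expensive) true evaluation, which is how a similarity surrogate reduces the number of benchmark queries.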
Publisher
American Institute of Mathematical Sciences (AIMS)
Subject
General Mathematics