Affiliation:
1. Huawei Inc.
2. Shanghai Jiao Tong University
3. Beijing University of Posts and Telecommunications
Abstract
Neural architecture search (NAS) has attracted increasing attention. In recent years, individual search methods have been replaced by weight-sharing search methods for higher search efficiency, but the latter methods often suffer from lower stability. This article provides a literature review of these methods and attributes this issue to the optimization gap. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and we analyze the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Owing to the expertise of the authors, this article mainly focuses on the application of NAS to computer vision problems.
Publisher
Association for Computing Machinery (ACM)
Subject
General Computer Science, Theoretical Computer Science
References: 412 articles.
1. George Adam and Jonathan Lorraine. 2019. Understanding neural architecture search techniques. Retrieved from https://arxiv.org/abs/1904.00438.
2. MaskConnect: Connectivity Learning by Gradient Descent
3. Youhei Akimoto, Shinichi Shirakawa, Nozomu Yoshinari, Kento Uchida, Shota Saito, and Kouhei Nishida. 2019. Adaptive stochastic natural gradient method for one-shot neural architecture search. Retrieved from https://arxiv.org/abs/1905.08537.
4. Ultrafast Photorealistic Style Transfer via Neural Architecture Search
Cited by: 44 articles.