Affiliation:
1. University of Lille, CNRS, INRIA, France
Abstract
In recent years, research on applying optimization approaches to the automatic design of deep neural networks has become increasingly popular. Although various approaches have been proposed, a comprehensive survey and taxonomy of this active research topic is still missing. In this article, we propose a unified way to describe these optimization algorithms, focusing on the common and important search components of optimization algorithms: representation, objective function, constraints, initial solution(s), and variation operators. In addition to its large-scale search space, the problem is characterized by a mixed and variable design space, very expensive evaluations, and multiple black-box objective functions. Hence, this unified methodology has been extended to advanced optimization approaches, such as surrogate-based, multi-objective, and parallel optimization.
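To make the listed search components concrete, the sketch below shows how they might fit together in a toy (1+1) evolutionary search over layer widths of a feed-forward network. It is a minimal, hypothetical illustration under our own assumptions, not the survey's formalism: the names (random_architecture, evaluate_architecture, mutate) are invented, and the expensive black-box objective is mocked with a cheap placeholder.

```python
# Illustrative sketch of the search components named in the abstract:
# representation, objective, constraints, initial solution, variation operator.
# All function names and the scoring rule are assumptions, not the survey's notation.
import random

MAX_LAYERS, MAX_WIDTH = 6, 512  # constraints bounding the design space
WIDTHS = [16, 32, 64, 128, 256, 512]

def random_architecture():
    """Initial solution: a variable-length list of layer widths (the representation)."""
    depth = random.randint(1, MAX_LAYERS)
    return [random.choice(WIDTHS) for _ in range(depth)]

def evaluate_architecture(arch):
    """Objective function (black-box and expensive in practice; mocked here).

    A real setting would train and validate the network, which is why
    surrogate models are often used to approximate this evaluation.
    """
    params = sum(a * b for a, b in zip([1] + arch, arch))
    return abs(params - 50_000) / 50_000  # placeholder score, lower is better

def is_feasible(arch):
    """Constraints: bound the depth and width of a candidate architecture."""
    return 1 <= len(arch) <= MAX_LAYERS and all(1 <= w <= MAX_WIDTH for w in arch)

def mutate(arch):
    """Variation operator: drop one layer or change one layer's width."""
    child = arch[:]
    if random.random() < 0.5 and len(child) > 1:
        child.pop(random.randrange(len(child)))
    else:
        child[random.randrange(len(child))] = random.choice(WIDTHS)
    return child if is_feasible(child) else arch

# A (1+1) evolutionary loop tying the components together.
best = random_architecture()
best_score = evaluate_architecture(best)
for _ in range(100):
    candidate = mutate(best)
    score = evaluate_architecture(candidate)
    if score < best_score:
        best, best_score = candidate, score
print(best, best_score)
```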
Funder
PGMO project and University of Luxembourg
Publisher
Association for Computing Machinery (ACM)
Subject
General Computer Science, Theoretical Computer Science
Cited by
31 articles.