1. H. Liu, K. Simonyan, and Y. Yang, "DARTS: Differentiable architecture search," arXiv:1806.09055, 2018.
2. S. Falkner, A. Klein, and F. Hutter, "BOHB: Robust and efficient hyperparameter optimization at scale," in Proc. Int. Conf. Mach. Learn., 2018.
3. R. J. Williams, "Simple statistical gradient-following algorithms for connectionist reinforcement learning," Mach. Learn., 1992.
4. J. Bergstra and Y. Bengio, "Random search for hyper-parameter optimization," J. Mach. Learn. Res., 2012.
5. J. Turner et al., "BlockSwap: Fisher-guided block substitution for network compression on a budget," arXiv:1906.04113, 2019.