Publisher
Springer Science and Business Media LLC
References (28 articles)
1. Agostinelli F, Hoffman MD, Sadowski PJ, Baldi P. Learning activation functions to improve deep neural networks. In: Bengio Y, LeCun Y (eds) 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7–9, 2015, Workshop Track Proceedings, pp. 1–9 (2015)
2. Akiba T, Sano S, Yanase T, Ohta T, Koyama M. Optuna: a next-generation hyperparameter optimization framework. In: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 2623–2631 (2019)
3. Apicella A, Donnarumma F, Isgrò F, Prevete R. A survey on modern trainable activation functions. Neural Netw (2021)
4. Basirat M, Roth PM. The quest for the golden activation function. arXiv preprint arXiv:1808.00783 (2018)
5. Bergstra J, Yamins D, Cox DD, et al. Hyperopt: a Python library for optimizing the hyperparameters of machine learning algorithms. In: Proceedings of the 12th Python in Science Conference, vol. 13, p. 20. Citeseer (2013)
Cited by 6 articles.