HOW IMPORTANT ARE ACTIVATION FUNCTIONS IN REGRESSION AND CLASSIFICATION? A SURVEY, PERFORMANCE COMPARISON, AND FUTURE DIRECTIONS
Published: 2023
Volume: 4
Issue: 1
Pages: 21-75
ISSN: 2689-3967
Container title: Journal of Machine Learning for Modeling and Computing
Short container title: J Mach Learn Model Comput
Language: en
Authors: Jagtap, Ameya D.; Karniadakis, George Em
Abstract
Inspired by biological neurons, activation functions play an essential part in the learning process of any artificial neural network (ANN), and ANNs are now commonly used in many real-world problems. Various activation functions have been proposed in the literature for classification as well as regression tasks. In this work, we survey the activation functions that have been employed in the past as well as the current state-of-the-art. In particular, we trace the development of activation functions over the years and discuss their advantages, disadvantages, and limitations. We also discuss classical (fixed) activation functions, including rectifier units, and adaptive activation functions. In addition to a taxonomy of activation functions based on their characterization, we present a taxonomy based on their applications. To this end, we systematically compare various fixed and adaptive activation functions on classification datasets such as MNIST, CIFAR-10, and CIFAR-100. In recent years, a physics-informed machine learning framework has emerged for solving problems in scientific computing, so we also discuss the requirements that activation functions must satisfy in this framework. Furthermore, we compare different fixed and adaptive activation functions using machine learning libraries such as TensorFlow, PyTorch, and JAX. Our findings show that activation functions such as the rectified linear unit (ReLU) and its variants, which are currently the state-of-the-art for many classification problems, do not work well in physics-informed machine learning frameworks due to the stringent requirement that derivatives exist, whereas other activation functions such as hyperbolic tangent, swish, and sine give better performance, with superior results achieved with adaptive activation functions, especially for multiscale problems.
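The derivative requirement mentioned in the abstract can be made concrete with automatic differentiation. The following minimal JAX sketch (an illustrative assumption, not code from the article) compares the second derivative of ReLU, which vanishes almost everywhere, with that of the hyperbolic tangent and of a simple adaptive activation of the form tanh(a·x) with a trainable slope a; physics-informed models typically need such second derivatives to evaluate PDE residuals.

# Minimal sketch (not from the article): why smooth activations matter in
# physics-informed settings. PDE residuals usually require second derivatives
# of the network output, which are zero almost everywhere for piecewise-linear
# activations such as ReLU but remain informative for tanh, swish, and sine.
# The trainable slope `a` below is an assumed, simplified form of an adaptive
# activation, tanh(a * x); the article's exact parameterization may differ.

import jax
import jax.numpy as jnp

def relu(x):
    return jnp.maximum(x, 0.0)

def adaptive_tanh(x, a):
    # Adaptive activation: a is a trainable scalar slope parameter.
    return jnp.tanh(a * x)

x = 0.7  # sample scalar input at which the derivatives are evaluated

d2_relu = jax.grad(jax.grad(relu))(x)                 # 0.0 almost everywhere
d2_tanh = jax.grad(jax.grad(jnp.tanh))(x)             # smooth and nonzero
d2_adap = jax.grad(jax.grad(adaptive_tanh))(x, 2.0)   # slope a = 2.0

print(f"ReLU   d2/dx2 at {x}: {d2_relu}")
print(f"tanh   d2/dx2 at {x}: {d2_tanh}")
print(f"a-tanh d2/dx2 at {x}: {d2_adap}")

In practice the scalar a would be learned jointly with the network weights; the fixed value used here is only a simplified stand-in for the adaptive activation functions surveyed in the article.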
Cited by: 31 articles.