Abstract
As the component that introduces nonlinearity into a neural network, the activation function is crucial to the network's performance. This paper proposes an Efficient Asymmetric Nonlinear Activation Function (EANAF) for deep neural networks. Compared with existing activation functions, the proposed EANAF requires less computational effort, and it is self-regularized, asymmetric and non-monotonic. These desirable characteristics underpin its strong performance. To demonstrate the effectiveness of the function for object detection, the proposed activation function is compared with several state-of-the-art activation functions on typical backbone networks such as ResNet and CSPDarkNet. The experimental results demonstrate the superior performance of the proposed EANAF.
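The abstract does not give EANAF's formula, so the sketch below instead illustrates the same family of properties (smooth, asymmetric, non-monotonic, self-gated) using the well-known Swish activation, x · sigmoid(βx); it is an illustrative stand-in, not the paper's EANAF.

```python
import numpy as np

def swish(x, beta=1.0):
    """Swish activation: x * sigmoid(beta * x).
    NOTE: illustrative stand-in for EANAF, not the paper's function.
    Like the activation described in the abstract, it is smooth and
    asymmetric, and non-monotonic: it dips below zero for moderately
    negative inputs before saturating back toward 0."""
    return x / (1.0 + np.exp(-beta * x))

x = np.linspace(-6.0, 6.0, 13)
y = swish(x)
```

The non-monotonic dip is easy to verify numerically: swish(-1) ≈ -0.27 is lower than swish(-5) ≈ -0.03, even though -5 < -1.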
Funder
Ningbo Municipal Bureau Science and Technology
National Natural Science Foundation of China
University of Nottingham Ningbo China
Subject
Physics and Astronomy (miscellaneous), General Mathematics, Chemistry (miscellaneous), Computer Science (miscellaneous)
Cited by: 13 articles.