Author:
El Jaafari Ilyas, Ellahyani Ayoub, Charfi Said
Abstract
A convolutional neural network takes an input image, assigns importance (learnable weights and biases) to various aspects or objects in the image, and is thereby able to differentiate one from another. At each layer, a linear transformation is applied to the received data. To handle non-linear problems, neural networks rely on a mandatory unit called the activation function, which strongly influences the success of training deep neural networks. In this paper, a new rectified non-linear unit, called the Rectified Non-linear Unit (ReNU), is proposed. The presented activation function returns x − log(x + 1) for positive values and zero for negative ones. In backpropagation, the ReNU multiplies the received gradient by a value between 0 and 1 according to the importance of the neuron (close to 1 for the largest activations and close to 0 for the smallest ones), unlike the ReLU, which passes the same received gradient for all positive values. Using the proposed activation function, CNN performance improves remarkably. The ReNU has been tested on the MNIST and CIFAR-10 data sets and compared to the ReLU and ELU activation functions. The experimental results are satisfactory in terms of convergence speed and CNN accuracy when compared to known activation functions.
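For illustration only, the following is a minimal NumPy sketch (not taken from the paper) of what the described ReNU forward pass and its derivative could look like, alongside the ReLU derivative for comparison; the function names and the standalone-script structure are assumptions made here. Note that the derivative of x − log(x + 1) is x / (x + 1), which lies in (0, 1) and approaches 1 for large activations and 0 for small ones, matching the gradient-scaling behaviour claimed in the abstract.

```python
import numpy as np

def renu(x):
    """ReNU forward pass: x - log(x + 1) for positive inputs, 0 otherwise."""
    xp = np.maximum(x, 0.0)          # clamp negatives to 0 so log1p stays defined
    return xp - np.log1p(xp)

def renu_grad(x):
    """ReNU derivative: x / (x + 1) for positive inputs (a value in (0, 1)), 0 otherwise."""
    xp = np.maximum(x, 0.0)
    return xp / (xp + 1.0)

def relu_grad(x):
    """ReLU derivative for comparison: 1 for positive inputs, 0 otherwise."""
    return np.where(x > 0, 1.0, 0.0)

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.1, 1.0, 5.0, 50.0])
    print("ReNU(x)  :", renu(x))
    print("ReNU'(x) :", renu_grad(x))   # scales the incoming gradient by a factor in (0, 1)
    print("ReLU'(x) :", relu_grad(x))   # passes the incoming gradient unchanged for x > 0
```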
Subject
General Physics and Astronomy
Cited by
6 articles.