Modified Backpropagation Algorithm with Multiplicative Calculus in Neural Networks
Published: 2023-06-27
Issue: 3
Volume: 29
Pages: 55-61
ISSN: 2029-5731
Container-title: Elektronika ir Elektrotechnika
Short-container-title: ELEKTRON ELEKTROTECH
Abstract
Backpropagation is one of the most widely used algorithms for training feedforward deep neural networks. The algorithm requires a differentiable activation function, and it computes the gradient by proceeding backwards through the network from the last layer to the first. To calculate the gradient at a specific layer, the gradients of the subsequent layers are combined via the chain rule of calculus. One of the biggest disadvantages of backpropagation is that it requires a large amount of training time. To overcome this issue, this paper proposes a modified backpropagation algorithm based on multiplicative calculus. Multiplicative calculus provides an alternative to classical calculus, defining derivatives and integrals in terms of ratios and products rather than differences and sums. Performance analyses are presented for several case studies, and the results are compared with those of the classical backpropagation algorithm. The proposed modified backpropagation algorithm is found to converge to the solution in less time, thus providing faster training in the given case studies. It is also shown that the proposed algorithm avoids the local minima problem.
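For context, the multiplicative (or geometric) derivative of a positive function f is

f^{*}(x) = \lim_{h \to 0} \left( \frac{f(x+h)}{f(x)} \right)^{1/h} = \exp\!\left( \frac{f'(x)}{f(x)} \right),

so rates of change are expressed as ratios rather than differences. The sketch below is a minimal Python illustration, not the authors' implementation; the toy loss f(w) = w^2 + 1 and the learning rate are assumptions chosen so that the loss stays positive. It contrasts the classical additive weight update with a multiplicative-style update built from this derivative:

import math

def classical_step(w, grad, lr=0.1):
    # Classical additive update: w <- w - lr * f'(w)
    return w - lr * grad

def multiplicative_step(w, loss, grad, lr=0.1):
    # Multiplicative-style update: w <- w * (f*(w)) ** (-lr), where
    # f*(w) = exp(f'(w) / f(w)) is the multiplicative derivative of the
    # (assumed positive) loss f with respect to w.
    mult_deriv = math.exp(grad / loss)
    return w * mult_deriv ** (-lr)

# Toy run (illustrative, not from the paper): minimise f(w) = w**2 + 1,
# which is positive everywhere, starting from w = 2.
w = 2.0
for _ in range(100):
    loss = w ** 2 + 1.0
    grad = 2.0 * w
    w = multiplicative_step(w, loss, grad)
print(w)  # w decreases monotonically toward the minimiser at w = 0

Because the multiplicative step rescales each weight by a ratio, the update preserves the weight's sign. The paper's algorithm applies the multiplicative derivative inside backpropagation itself, which this toy sketch does not reproduce.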
Publisher
Kaunas University of Technology (KTU)
Subject
Electrical and Electronic Engineering