Affiliation:
1. School of Electrical Engineering, Computing and Mathematical Sciences, Curtin University, Kent Street, Perth, WA 6102, Australia
Abstract
This paper proposes a new optimization algorithm for backpropagation (BP) neural networks that fuses integer-order and fractional-order differentiation. While fractional-order differentiation has significant advantages in describing complex phenomena with long-term memory effects and nonlocality, its application in neural networks is often limited by a lack of physical interpretability and by inconsistencies with traditional integer-order models. To address these challenges, we propose a mixed integer-fractional (MIF) gradient descent algorithm for training neural networks and provide a detailed convergence analysis of the proposed algorithm. Numerical experiments illustrate that the new gradient descent algorithm not only speeds up the convergence of BP neural networks but also increases their classification accuracy.
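The paper's actual MIF update rule is not reproduced in this abstract, so the sketch below is only a rough illustration of the general idea: mixing an integer-order gradient with a fractional-order gradient estimate in a single descent step. The quadratic loss, the convex-combination weight `lam`, the fractional order `alpha`, the lower terminal `c`, and the first-order Caputo-type approximation are all illustrative assumptions, not details taken from the paper.

```python
import math

# Illustrative loss: f(w) = (w - 3)^2, with integer-order gradient f'(w) = 2*(w - 3).
def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    return 2.0 * (w - 3.0)

def caputo_frac_grad(w, alpha, c):
    """Rough first-order truncation of a Caputo-type fractional derivative
    of the loss at w with lower terminal c (assuming 0 < alpha < 1):
        D^alpha f(w) ~ f'(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha)
    This is an assumed approximation, not the paper's formulation."""
    return grad(w) * abs(w - c) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

def mif_step(w, lr=0.1, alpha=0.7, lam=0.5, c=0.0):
    """One hypothetical mixed integer-fractional update: a convex combination
    of the integer-order gradient and the fractional-order term."""
    g = lam * grad(w) + (1.0 - lam) * caputo_frac_grad(w, alpha, c)
    return w - lr * g

# Toy run: the iterate should approach the minimizer w = 3.
w = 0.0
for _ in range(50):
    w = mif_step(w)
print(f"w after 50 steps: {w:.4f}, loss: {loss(w):.6f}")
```

In this toy setting the fractional term simply rescales the descent direction, so the mixing weight `lam` controls how much the update behaves like ordinary gradient descent versus the fractional-order variant.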
Funder
Australian Research Council Linkage grant
Innovative Connections grant
Industry grant