Authors:
Kamilu K., Sulaiman M. I., Muhammad A. L., Mohamad A. W., Mamat M.
Abstract
In this paper, we construct a new conjugate gradient method for solving unconstrained optimization problems. The proposed method satisfies the sufficient descent property irrespective of the line search, and its global convergence was established under some suitable conditions. Further, the new method was used to train different sets of data via a feedforward neural network. Results obtained show that the proposed algorithm significantly reduces the computational time by speeding up the directional minimization and achieving a faster convergence rate.
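The abstract describes a nonlinear conjugate gradient method with the sufficient descent property. The paper's specific conjugate gradient coefficient is not reproduced here; as an illustrative sketch only, the following uses the standard Polak–Ribière+ coefficient with a backtracking Armijo line search and a steepest-descent restart safeguard, which together keep every search direction a descent direction:

```python
import numpy as np

def cg_minimize(f, grad, x0, max_iter=2000, tol=1e-6):
    """Generic nonlinear conjugate gradient sketch (PR+ coefficient)
    with a backtracking Armijo line search. Illustrative only: this is
    NOT the coefficient proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # Backtracking Armijo line search: shrink t until sufficient decrease
        t, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while t > 1e-12 and f(x + t * d) > fx + c * t * (g @ d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient, clipped at zero
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = cg_minimize(f, grad, np.array([-1.2, 1.0]))
```

Because the Armijo condition only accepts strictly decreasing steps along descent directions, the objective value is monotonically reduced at every iteration, which is the practical benefit of the sufficient descent property the abstract refers to.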
Publisher
Lviv Polytechnic National University
Subject
Computational Theory and Mathematics,Computational Mathematics
References: 44 articles.
1. Sulaiman I. M., Mamat M. A new conjugate gradient method with descent properties and its application to regression analysis. Journal of Numerical Analysis, Industrial and Applied Mathematics. 14 (1-2), 25-39 (2020).
2. Dennis J. E., Schnabel R. B. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. SIAM, Philadelphia (1993).
3. Abashar A., Mamat M., Rivaie M., Ismail M. Global convergence properties of a new class of conjugate gradient method for unconstrained optimization. Applied Mathematics and Computation. 8 (67), 3307-3319 (2014).
4. Rivaie M., Mamat M., Mohd I., Fauzi M. A comparative study of conjugate gradient coefficient for unconstrained optimization. Australian Journal of Basic and Applied Sciences. 5 (9), 947-951 (2011).
5. Rivaie M., Mamat M., Leong W. J., Mohd I. A new conjugate gradient coefficient for large scale nonlinear unconstrained optimization. International Journal of Mathematical Analysis. 6 (23), 1131-1146 (2012).
Cited by: 4 articles.