Abstract
The Conjugate Gradient (CG) method is a powerful iterative approach for solving large-scale minimization problems, characterized by its simplicity, low computational cost, and good convergence properties. In this paper, a new hybrid conjugate gradient method, HLB (Hadji-Laskri-Bechouat), is proposed and analysed for unconstrained optimization. Numerical comparisons of CGHLB with the PRP and RMIL+ methods, using the Dolan and Moré performance profiles for CPU time, show that CGHLB is more efficient.
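To make the general framework concrete, the sketch below shows a standard nonlinear CG iteration with an Armijo backtracking line search in Python. The HLB hybrid parameter is not given in this abstract, so the update shown uses the PRP formula (truncated at zero), one of the baselines mentioned above; the function and variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG loop with a PRP-type beta (illustrative stand-in,
    not the HLB parameter proposed in the paper)."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along the search direction d.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere-Polyak beta, truncated at zero (PRP+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple separable quadratic test function.
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])
print(nonlinear_cg(f, grad, np.array([0.0, 0.0])))
```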
Keywords: Unconstrained optimization, hybrid conjugate gradient method, line search, descent property, global convergence.
Publisher
Babes-Bolyai University Cluj-Napoca