Affiliation:
1. Laboratory of Informatics and Mathematics (LIM), Mohamed Cherif Messaadia University, Souk Ahras, Algeria
Abstract
Conjugate gradient (CG) methods are a popular class of iterative methods for solving linear systems of equations and nonlinear optimization problems.
In this paper, a new hybrid conjugate gradient (CG) method is presented and analyzed for solving unconstrained optimization problems, where the parameter $\beta_{k}$ is a convex combination of $\beta_{k}^{\mathrm{WYL}}$ and $\beta_{k}^{\mathrm{CD}}$.
Under the strong Wolfe line search, the new method satisfies the sufficient descent condition and enjoys global convergence. Preliminary numerical results show the efficiency of our method in comparison with other CG methods. Furthermore, the proposed HWYLCD algorithm is extended to solve the problem of a mode function.
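To give a concrete picture of the iteration described above, the following Python sketch implements a generic nonlinear CG loop in which $\beta_k$ is taken as a convex combination of the standard WYL and CD parameters from the literature. It is only a minimal sketch under stated assumptions: the fixed mixing weight `theta`, the fallback step size, and the use of SciPy's Wolfe line search are illustrative choices; the paper's own rule for the convex-combination parameter and the exact HWYLCD updates are not given in this excerpt.

```python
import numpy as np
from scipy.optimize import line_search


def hybrid_wylcd_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    """Sketch of a nonlinear CG iteration whose beta_k is a convex
    combination of the standard WYL and CD parameters.

    `theta` is a fixed illustrative mixing weight in [0, 1]; the paper's
    HWYLCD method presumably selects this weight by its own rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search returns a step length satisfying the Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # illustrative fallback when the line search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Standard WYL parameter:
        # g_{k+1}^T (g_{k+1} - (||g_{k+1}|| / ||g_k||) g_k) / ||g_k||^2
        beta_wyl = g_new @ (g_new - (np.linalg.norm(g_new) / np.linalg.norm(g)) * g) \
            / (np.linalg.norm(g) ** 2)
        # Standard CD (conjugate descent) parameter: ||g_{k+1}||^2 / (-d_k^T g_k)
        beta_cd = (np.linalg.norm(g_new) ** 2) / (-(d @ g))
        # Convex combination of the two parameters.
        beta = (1.0 - theta) * beta_wyl + theta * beta_cd
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For instance, `hybrid_wylcd_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(10))` drives a simple quadratic to its minimizer; the hybrid reduces to a pure WYL step when `theta = 0` and to a pure CD step when `theta = 1`.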