Abstract
Conjugate gradient methods play a vital role in solving large-scale optimization problems because of their ease of implementation, low memory requirements, and favorable convergence properties. In this paper, we propose a new conjugate gradient method whose search direction satisfies the sufficient descent property. We establish the global convergence of the new method under the strong Wolfe line search conditions. Numerical results show that the new method performs better than other relevant methods in the literature. Furthermore, we apply the new method to a portfolio selection problem.
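The abstract does not specify the paper's new conjugacy parameter, so the sketch below only illustrates the general structure such a method shares: a nonlinear conjugate gradient loop driven by a strong Wolfe line search. The Polak-Ribière-plus update used for beta is an assumed placeholder, not the authors' formula, and cg_minimize is a hypothetical helper name.

```python
# Minimal sketch of a nonlinear conjugate gradient loop with a strong Wolfe
# line search (scipy.optimize.line_search enforces the Wolfe conditions).
# The beta update below is a PR+ stand-in, NOT the paper's new formula.
import numpy as np
from scipy.optimize import line_search

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:        # stationarity test
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                   # line search failed: restart
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Conjugacy parameter (assumed PR+ placeholder); the nonnegativity
        # truncation helps keep d a descent direction.
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the 5-dimensional Rosenbrock function.
from scipy.optimize import rosen, rosen_der
x_star = cg_minimize(rosen, rosen_der, np.zeros(5))
```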
Subject
Management Science and Operations Research, Computer Science Applications, Theoretical Computer Science
Cited by
2 articles.