Abstract
At the beginning of this century, an era characterized by huge flows of emerging data, Dai and Liao proposed a conjugacy condition that has attracted the interest of many optimization researchers. About two decades later, the resulting method is recognized as a sophisticated conjugate gradient (CG) algorithm, and here we share our views on it in the framework of a review study. In this regard, we first discuss the modified Dai–Liao methods based on the modified secant equations given in the literature, mostly aimed at exploiting objective function values in addition to gradient information. Then, several adaptive, in a sense optimal, choices for the parameter of the method are studied. In particular, we devote part of our study to modified versions of the Hager–Zhang and Dai–Kou CG algorithms, which are well-known members of the Dai–Liao class of CG methods. Extensions of the classical CG methods based on the Dai–Liao approach are also reviewed. Finally, we discuss optimization models from practical disciplines that have been addressed by the Dai–Liao approach, including nonlinear systems of equations, image restoration, and compressed sensing.
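For readers unfamiliar with the method, a brief recap of the Dai–Liao scheme may help; the notation below is the standard one for unconstrained minimization of $f$ (with $g_k = \nabla f(x_k)$, iterates $x_{k+1} = x_k + \alpha_k d_k$, $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$) and the parameter symbol $t$ follows the common convention, which may differ from the review's own notation:
\[
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k, \qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}, \quad t \ge 0,
\]
which enforces the Dai–Liao conjugacy condition
\[
d_{k+1}^{\top} y_k = -t\, s_k^{\top} g_{k+1},
\]
and reduces to the classical conjugacy condition $d_{k+1}^{\top} y_k = 0$ when $t = 0$.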
Subject
Management Science and Operations Research, Computer Science Applications, Theoretical Computer Science
Cited by
13 articles.