Improved Accelerated Gradient Algorithms with Line Search for Smooth Convex Optimization Problems
Published: 2023-11-16
ISSN: 0217-5959
Container-title: Asia-Pacific Journal of Operational Research
Language: en
Short-container-title: Asia Pac. J. Oper. Res.
Author:
Li Ting (1),
Song Yongzhong (1),
Cai Xingju (1)
Affiliation:
1. School of Mathematical Sciences, Nanjing Normal University, Jiangsu Key Laboratory for NSLSCS, Nanjing 210023, P. R. China
Abstract
For smooth convex optimization problems, the optimal convergence rate of first-order algorithms is O(1/k²) in theory. This paper proposes three improved accelerated gradient algorithms that use the gradient information at the latest point. For the step size, new adaptive line search strategies are adopted to avoid using the global Lipschitz constant and to make the algorithms converge faster. By constructing a descent Lyapunov function, we prove that the proposed algorithms preserve the O(1/k²) convergence rate. Numerical experiments demonstrate that our algorithms perform better than some existing algorithms that attain the optimal convergence rate.
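The abstract describes accelerated gradient methods whose step sizes come from an adaptive line search rather than the global Lipschitz constant. The sketch below is a generic textbook variant of this idea (Nesterov/FISTA-style momentum with backtracking on a local Lipschitz estimate `L`), not the paper's three algorithms; the function names and parameters are illustrative assumptions.

```python
import numpy as np

def accel_grad_backtracking(f, grad, x0, L0=1.0, eta=2.0, max_iter=1000, tol=1e-10):
    """Accelerated gradient descent with backtracking line search.

    A minimal FISTA-style sketch (assumed, not the paper's method): the
    step size 1/L is chosen by backtracking so that the quadratic upper
    bound holds locally, avoiding the global Lipschitz constant while
    keeping the O(1/k^2) rate in function value.
    """
    x = y = np.asarray(x0, dtype=float)
    t, L = 1.0, L0
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: grow L until the local quadratic model upper-bounds f.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Standard momentum update for the extrapolation point y.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

On a simple quadratic f(x) = ½ xᵀAx − bᵀx, the iterates approach the minimizer A⁻¹b without L ever being supplied by the user; the backtracking loop discovers a sufficient local estimate on its own.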
Funder
National Natural Science Foundation of China
Postgraduate Research and Practice Innovation Program of Jiangsu Province
Publisher
World Scientific Pub Co Pte Ltd
Subject
Management Science and Operations Research