Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3

Authors:

Hedy Attouch, Zaki Chbani, Hassan Riahi

Abstract

In a Hilbert space setting H, given Φ : H → ℝ a convex continuously differentiable function, and α a positive parameter, we consider the inertial dynamic system with Asymptotic Vanishing Damping (AVD)α: ẍ(t) + (α/t)ẋ(t) + ∇Φ(x(t)) = 0. Depending on the value of α with respect to 3, we give a complete picture of the convergence properties as t → +∞ of the trajectories generated by (AVD)α, as well as of the iterations of the corresponding algorithms. Indeed, as shown by Su, Boyd, and Candès, the case α = 3 corresponds to a continuous version of the accelerated gradient method of Nesterov, with the rate of convergence Φ(x(t)) − min Φ = O(t^(−2)) for α ≥ 3. Our main result concerns the subcritical case α ≤ 3, where we show that Φ(x(t)) − min Φ = O(t^(−2α/3)). This overall picture shows a continuous variation of the rate of convergence of the values, Φ(x(t)) − min Φ = O(t^(−p(α))), with respect to α > 0: the coefficient p(α) increases linearly up to 2 as α goes from 0 to 3, then displays a plateau. We then examine the convergence of the trajectories to optimal solutions. As a new result, in the one-dimensional framework, for the critical value α = 3, we prove convergence of the trajectories. In the second part of this paper, we study the convergence properties of the associated forward-backward inertial algorithms, which aim to solve structured convex minimization problems of the form min {Θ := Φ + Ψ}, with Φ smooth and Ψ nonsmooth. The continuous dynamics serves as a guideline for this study. We obtain a similar rate of convergence for the sequence of iterates (xk): for α ≤ 3 we have Θ(xk) − min Θ = O(k^(−p)) for all p < 2α/3, and for α > 3, Θ(xk) − min Θ = o(k^(−2)). Finally, we show that the results are robust with respect to external perturbations.
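When Ψ = 0, the forward-backward inertial algorithms discussed in the abstract reduce to an inertial gradient scheme with extrapolation coefficient (k − 1)/(k + α − 1), which recovers Nesterov's accelerated gradient method at α = 3. A minimal sketch (not the authors' code; the quadratic test function, step size, and iteration count are illustrative assumptions):

```python
import numpy as np

def inertial_gradient(grad, x0, alpha=3.0, step=0.5, iters=2000):
    """Inertial gradient iteration associated with (AVD)_alpha:
        y_k     = x_k + (k-1)/(k+alpha-1) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad(y_k)
    For alpha = 3 this is the extrapolation rule of Nesterov's method."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1.0) / (k + alpha - 1.0) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# Illustrative problem (an assumption, not from the paper):
# Phi(x) = 0.5 * ||x||^2, so grad Phi(x) = x and min Phi = 0 at x = 0.
grad = lambda x: x
x_final = inertial_gradient(grad, x0=np.ones(5), alpha=3.0, step=0.5)
phi_final = 0.5 * float(np.dot(x_final, x_final))
```

For a nonsmooth Ψ one would replace the gradient step by the forward-backward (proximal-gradient) step on Θ = Φ + Ψ; the extrapolation coefficient above is unchanged.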

Publisher

EDP Sciences

Subject

Computational Mathematics, Control and Optimization, Control and Systems Engineering

Cited by 62 articles.

1. A Second Order Primal–Dual Dynamical System for a Convex–Concave Bilinear Saddle Point Problem;Applied Mathematics & Optimization;2024-01-17

2. The Second-Order Differential Equation Method for Solving the Variational Inequality Problem;Journal of Applied Analysis & Computation;2024

3. Optimal convergence rate of inertial gradient system with flat geometries and perturbations;Evolution Equations and Control Theory;2024

4. Inertial Newton Algorithms Avoiding Strict Saddle Points;Journal of Optimization Theory and Applications;2023-11-08

5. Accelerated Dynamical Approaches for a Class of Distributed Optimization with Set Constraints;2023 International Conference on New Trends in Computational Intelligence (NTCI);2023-11-03

