An accelerated Lyapunov function for Polyak’s Heavy-ball on convex quadratics

Author

Antonio Orvieto

Abstract

In 1964, Polyak showed that the Heavy-ball method, the simplest momentum technique, accelerates convergence on strongly convex problems in the vicinity of the solution. While Nesterov later developed a globally accelerated version, Polyak’s original algorithm remains simpler and more widely used in applications such as deep learning. Despite this popularity, the question of whether Heavy-ball is also globally accelerated has not yet been fully answered, and no convincing counterexample has been provided. This is largely due to the difficulty of finding an effective Lyapunov function: indeed, most proofs of Heavy-ball acceleration in the strongly convex quadratic setting rely on eigenvalue arguments. Our work adopts a different approach, studying momentum through the lens of quadratic invariants of simple harmonic oscillators. By utilizing the modified Hamiltonian of Störmer-Verlet integrators, we are able to construct a Lyapunov function that demonstrates an $$O(1/k^2)$$ rate for Heavy-ball on convex quadratic problems. Although our proof technique is restricted to linear regression, it is found to work well empirically on non-quadratic convex problems as well, and thus provides insight into the structure of Lyapunov functions to be used in the general convex case. As such, our paper makes a promising first step towards proving the acceleration of Polyak’s momentum method, and we hope it inspires further research around this question.
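
For concreteness, the Heavy-ball iteration the abstract refers to takes the standard form $$x_{k+1} = x_k - \eta \nabla f(x_k) + \beta (x_k - x_{k-1})$$. Below is a minimal Python sketch of this iteration on a convex quadratic $$f(x) = \tfrac{1}{2} x^\top A x - b^\top x$$ (the linear-regression setting the paper analyzes). The function name heavy_ball, the step size $$\eta = 1/L$$, and the momentum $$\beta = 0.9$$ are illustrative choices made here, not the tuning or the Lyapunov construction from the paper.

import numpy as np

def heavy_ball(A, b, x0, eta, beta, num_iters):
    # Polyak's Heavy-ball: x_{k+1} = x_k - eta * grad f(x_k) + beta * (x_k - x_{k-1})
    x_prev, x = x0.copy(), x0.copy()
    history = []
    for _ in range(num_iters):
        grad = A @ x - b  # gradient of f(x) = 0.5 x^T A x - b^T x
        x, x_prev = x - eta * grad + beta * (x - x_prev), x
        history.append(x.copy())
    return history

# Illustrative problem: a random 20-dimensional convex quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M / 20                               # symmetric positive semi-definite Hessian
b = rng.standard_normal(20)
L = np.linalg.eigvalsh(A).max()                # smoothness constant (largest eigenvalue)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # a minimizer of f
f = lambda x: 0.5 * x @ (A @ x) - b @ x

history = heavy_ball(A, b, np.zeros(20), eta=1.0 / L, beta=0.9, num_iters=500)
gaps = [f(xk) - f(x_star) for xk in history]
print(f"gap at k=100: {gaps[99]:.3e}, at k=500: {gaps[499]:.3e}")

Plotting the suboptimality gaps against a $$1/k^2$$ envelope is one way to observe empirically the accelerated decay that the paper’s Lyapunov function certifies in this quadratic setting.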

Funder

Hector Stiftung

Max Planck Institute for Intelligent Systems

Publisher

Springer Science and Business Media LLC
