Abstract
We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that asymptotically these methods converge superlinearly, the corresponding rates of convergence still remain unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods, which are based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form $\left(\frac{n L^2}{\mu^2 k}\right)^{k/2}$ and $\left(\frac{n L}{\mu k}\right)^{k/2}$, respectively, where $k$ is the iteration counter, $n$ is the dimension of the problem, $\mu$ is the strong convexity parameter, and $L$ is the Lipschitz constant of the gradient.
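To make the stated BFGS rate concrete, the following minimal Python sketch runs the standard BFGS inverse-Hessian update with unit steps (as in local analyses) on a small strongly convex quadratic and prints the local gradient norm alongside the quantity $\left(\frac{n L}{\mu k}\right)^{k/2}$ from the abstract. The quadratic test problem, its spectrum, the iteration horizon, and the fixed unit step are illustrative assumptions, not the paper's setup; the printed bound is shown only for scale.

```python
import numpy as np

np.random.seed(0)
n = 5                                   # problem dimension (assumed, for illustration)
eigs = np.linspace(1.0, 1.9, n)         # spectrum of the quadratic: mu = 1.0, L = 1.9
A = np.diag(eigs)                       # Hessian of f(x) = 0.5 * x^T A x
mu, L = eigs[0], eigs[-1]

def grad(x):
    return A @ x

x = np.random.randn(n)
H = np.eye(n)                           # inverse Hessian approximation, H_0 = I
g = grad(x)

for k in range(1, 26):
    x_new = x - H @ g                   # quasi-Newton step with unit step size
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g
    sy = s @ y                          # positive for a strongly convex quadratic
    if sy > 1e-16:
        # Standard BFGS update of the inverse Hessian approximation:
        # H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T
        rho = 1.0 / sy
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)
    x, g = x_new, g_new
    lam = np.sqrt(g @ np.linalg.solve(A, g))      # local norm of the gradient
    bound = (n * L / (mu * k)) ** (k / 2)         # BFGS rate from the abstract
    print(f"k={k:2d}  local grad norm={lam:.3e}  (nL/(mu k))^(k/2)={bound:.3e}")
```

Note that the expression $\left(\frac{n L}{\mu k}\right)^{k/2}$ drops below 1 only once $k$ exceeds $n L / \mu$, so on a well-conditioned problem like the one above it becomes informative after roughly ten iterations.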
Publisher
Springer Science and Business Media LLC
Subject
General Mathematics, Software
Cited by
17 articles.