Abstract
In the previous paper, Bello-Cruz et al. (J Optim Theory Appl 188:378–401, 2021), we showed that the quadratic growth condition plays a key role in obtaining Q-linear convergence of the widely used forward–backward splitting method with Beck–Teboulle's line search. In this paper, we analyze the quadratic growth condition via second-order variational analysis for various structured optimization problems that arise in machine learning and signal processing. These include, for example, the Poisson linear inverse problem as well as $\ell_1$-regularized optimization problems. As a by-product of this approach, we also obtain several full characterizations of the uniqueness of the optimal solution to the Lasso problem, which complement and extend recent important results in this direction.
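For orientation, the following is a minimal sketch of forward–backward splitting (proximal gradient) with a backtracking line search, applied to the Lasso problem $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$. It is only an illustration under stated assumptions: the step-size rule shown is a standard sufficient-decrease backtracking test in the spirit of Beck–Teboulle's rule, the tolerance and synthetic data are arbitrary, and the precise line-search condition analyzed by Bello-Cruz et al. may differ in detail.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fbs_lasso(A, b, lam, sigma0=1.0, beta=0.5, max_iter=500, tol=1e-10):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    The step size sigma is shrunk until a quadratic upper bound on the
    smooth part holds at the prox point (backtracking line search)."""
    m, n = A.shape
    x = np.zeros(n)
    f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2   # smooth part
    grad = lambda z: A.T @ (A @ z - b)
    sigma = sigma0
    for _ in range(max_iter):
        g = grad(x)
        while True:
            # Forward (gradient) step followed by backward (prox) step.
            x_new = soft_threshold(x - sigma * g, sigma * lam)
            d = x_new - x
            # Sufficient-decrease test on the smooth part; shrink sigma if it fails.
            if f(x_new) <= f(x) + g @ d + (0.5 / sigma) * (d @ d):
                break
            sigma *= beta
        if np.linalg.norm(d) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Small synthetic example (illustrative data only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = fbs_lasso(A, b, lam=0.1)
print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-6)))
```

Under a quadratic growth condition at the solution set, iterations of this type are known to converge Q-linearly, which is the behavior the paper characterizes for such structured problems.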
Funder
Australian Research Council
National Science Foundation
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Management Science and Operations Research, Control and Optimization
References (53 articles)
1. Aragón Artacho, F.J., Geoffroy, M.H.: Characterizations of metric regularity of subdifferentials. J. Convex Anal. 15, 365–380 (2008)
2. Aragón Artacho, F.J., Geoffroy, M.H.: Metric subregularity of the convex subdifferential in Banach spaces. J. Nonlinear Convex Anal. 15, 35–47 (2014)
3. Azé, D., Corvellec, J.-N.: Nonlinear local error bounds via a change of metric. J. Fixed Point Theory Appl. 16, 251–372 (2014)
4. Bauschke, H.H., Bolte, J., Teboulle, M.: A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications. Math. Oper. Res. 42, 330–348 (2017)
5. Bauschke, H.H., Combettes, P.L.: Convex Analysis and Monotone Operator Theory in Hilbert Spaces. Springer, New York (2011)
Cited by: 4 articles