Abstract
In a Hilbertian framework, for the minimization of a general convex differentiable function f, we introduce new inertial dynamics and algorithms that generate trajectories and iterates which converge rapidly towards the minimum norm minimizer of f. Our study is based on the non-autonomous version of the Polyak heavy ball method which, at time t, is associated with the strongly convex function obtained by adding to f a Tikhonov regularization term with vanishing coefficient $$\varepsilon(t)$$. In this dynamic, the damping coefficient is proportional to the square root of the Tikhonov regularization parameter $$\varepsilon(t)$$. By adjusting the speed of convergence of $$\varepsilon(t)$$ towards zero, we obtain both rapid convergence towards the infimal value of f and the strong convergence of the trajectories towards the element of minimum norm of the set of minimizers of f. In particular, we obtain an improved version of the dynamic of Su-Boyd-Candès for the accelerated gradient method of Nesterov. This study naturally leads to corresponding first-order algorithms obtained by temporal discretization. In the case of a proper lower semicontinuous convex function f, we study the proximal algorithms in detail and show that they benefit from similar properties.
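To fix ideas, one natural reading of this description (the abstract itself does not display the equation) is the second-order dynamic obtained by applying the heavy ball method, with damping proportional to $$\sqrt{\varepsilon(t)}$$, to the Tikhonov-regularized function $$f(\cdot)+\frac{\varepsilon(t)}{2}\Vert \cdot \Vert^{2}$$:

$$\ddot{x}(t) + \delta \sqrt{\varepsilon(t)}\,\dot{x}(t) + \nabla f(x(t)) + \varepsilon(t)\,x(t) = 0,$$

where $$\delta > 0$$ denotes the proportionality constant of the damping term. As an illustration of how first-order algorithms arise by temporal discretization (this is a standard finite-difference sketch, not necessarily the scheme analyzed in the paper), taking a step size h, the centered difference $$\ddot{x}(t_k) \approx (x_{k+1}-2x_k+x_{k-1})/h^{2}$$ and the backward difference $$\dot{x}(t_k) \approx (x_k-x_{k-1})/h$$ with $$\varepsilon_k = \varepsilon(t_k)$$ give the inertial gradient iteration

$$x_{k+1} = x_k + \bigl(1-\delta h\sqrt{\varepsilon_k}\bigr)(x_k - x_{k-1}) - h^{2}\bigl(\nabla f(x_k) + \varepsilon_k x_k\bigr),$$

in which the extrapolation coefficient $$1-\delta h\sqrt{\varepsilon_k}$$ tends to 1 as $$\varepsilon_k \rightarrow 0$$, while the vanishing Tikhonov term $$\varepsilon_k x_k$$ biases the iterates towards the minimum norm solution.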
Publisher
Springer Science and Business Media LLC
Cited by
1 article.