Authors:
Zhang Jinchao, Zhu Wei, Wang Wei, Wu Zhaochong, Zhang Xiaojun
Abstract
To solve optimization problems on general matrix manifolds with differentiable objective functions, we propose an accelerated hybrid Riemannian conjugate gradient method. Specifically, the acceleration scheme of the proposed method uses a modified stepsize that is determined multiplicatively from the Wolfe line search. The search direction is generated by a computationally inexpensive hybrid conjugate parameter. We show that the proposed method converges globally to a stationary point. Numerical experiments on problems such as the orthogonal Procrustes problem and minimization of the Brockett cost function show that our approach outperforms state-of-the-art Riemannian conjugate gradient algorithms.
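The orthogonal Procrustes problem mentioned in the abstract (minimize ||AX - B||_F over orthogonal X) can be sketched with a minimal Riemannian solver. The sketch below uses plain Riemannian steepest descent with a QR-based retraction and Armijo backtracking, not the authors' accelerated hybrid conjugate gradient scheme; the helper names (`qr_retract`, `procrustes_rgd`) and all parameter values are illustrative assumptions.

```python
import numpy as np

def sym(M):
    """Symmetric part of a square matrix."""
    return 0.5 * (M + M.T)

def qr_retract(X, xi):
    """QR-based retraction of the tangent vector xi at X onto O(n)."""
    Q, R = np.linalg.qr(X + xi)
    # Enforce a positive diagonal of R so the factorization is unique.
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d  # scales the columns of Q by +/-1

def procrustes_rgd(A, B, iters=500):
    """Minimize ||A X - B||_F^2 over orthogonal X by Riemannian
    gradient descent with Armijo backtracking (illustrative sketch)."""
    n = A.shape[1]
    X = np.eye(n)
    cost = lambda X: np.linalg.norm(A @ X - B) ** 2
    for _ in range(iters):
        G = 2.0 * A.T @ (A @ X - B)      # Euclidean gradient
        g = G - X @ sym(X.T @ G)         # project onto the tangent space of O(n)
        gnorm2 = np.sum(g * g)
        if gnorm2 < 1e-16:
            break
        t, f0 = 1.0, cost(X)
        # Armijo backtracking along the retracted curve t -> R_X(-t g)
        while cost(qr_retract(X, -t * g)) > f0 - 1e-4 * t * gnorm2 and t > 1e-12:
            t *= 0.5
        X = qr_retract(X, -t * g)
    return X
```

As a sanity check, for B = A Q with Q orthogonal the iterates should recover a minimizer with near-zero cost, matching the classical closed-form SVD solution of the Procrustes problem.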