Author:
Babaie-Kafaki Saman, Dargahi Fatemeh, Aminifard Zohre
Abstract
We suggest a revised form of a classic measure function to be employed in the optimization model of the nonnegative matrix factorization problem. More precisely, using sparse matrix approximations, a revision term is embedded into the model to penalize the ill-conditioning of the computational trajectory toward the factorization elements. Then, as an extension of the Euclidean norm, we employ the ellipsoid norm to obtain adaptive formulas for the Dai–Liao parameter in a least-squares framework. In essence, the parametric choices here are obtained by pushing the Dai–Liao direction toward the direction of a well-functioning three-term conjugate gradient algorithm. In our scheme, the well-known BFGS and DFP quasi–Newton updating formulas are used to characterize the positive definite matrix factor of the ellipsoid norm. To assess how effective our model revisions and algorithmic modifications are, we conduct classic computational tests and evaluate the outputs. As reported, the results lend considerable support to our analytical efforts.
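The paper's adaptive parameter formulas and the revised measure function are not reproduced in this abstract. As background, the following is a minimal sketch of the classical Dai–Liao search direction that the parametric choices described above build on; the function name and the fixed value of the parameter `t` are illustrative, not taken from the paper.

```python
import numpy as np

def dai_liao_direction(g_new, s, y, d_old, t=1.0):
    """Classical Dai-Liao conjugate gradient direction:

        d_{k+1} = -g_{k+1} + beta_k * d_k,
        beta_k  = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k),

    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    The scalar t > 0 is the Dai-Liao parameter that the paper
    tunes adaptively (here it is simply a fixed input)."""
    denom = d_old @ y  # curvature term d_k^T y_k
    beta = (g_new @ y - t * (g_new @ s)) / denom
    return -g_new + beta * d_old

# Small worked example with hand-checkable numbers:
g_new = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])
s = np.array([0.5, 0.5])
d_old = np.array([1.0, 0.0])
d_new = dai_liao_direction(g_new, s, y, d_old, t=1.0)
# beta = (1 - 0.5) / 1 = 0.5, so d_new = [-1, 0] + 0.5 * [1, 0] = [-0.5, 0]
```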
Funder
Libera Università di Bolzano
Publisher
Springer Science and Business Media LLC