Abstract
This article proposes a continuous-time optimization approach, in place of traditional optimization methods, to address the nuclear norm minimization (NNM) problem. Reformulating the NNM problem in matrix form, we propose a Lagrangian programming neural network (LPNN) to solve it. Moreover, the convergence conditions of the LPNN are established by the Lyapunov method. Convergence experiments are presented to demonstrate the convergence of the LPNN. Compared with traditional NNM algorithms, the proposed algorithm achieves better image recovery performance.
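The LPNN dynamics from the article are not reproduced here. As a point of reference, a standard discrete-time baseline for NNM is the proximal-gradient (singular value thresholding) iteration for matrix completion; the sketch below is a minimal illustration under assumed settings (function names `svt` and `nnm_complete`, the threshold `tau`, and the demo sizes are all hypothetical, not taken from the article).

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def nnm_complete(M, mask, tau=0.2, steps=500):
    """Recover a low-rank matrix from the observed entries (mask == 1)
    by proximal-gradient nuclear norm minimization with unit step size."""
    X = np.zeros_like(M)
    for _ in range(steps):
        # gradient step on the observed-entry data-fit term,
        # followed by the nuclear-norm proximal step
        X = svt(X - mask * (X - M), tau)
    return X

# demo: rank-2 ground truth, roughly 60% of entries observed
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = (rng.random(M.shape) < 0.6).astype(float)
X = nnm_complete(M, mask)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

This discrete iteration plays the role that the continuous-time LPNN flow plays in the article: both drive the iterate toward a low-nuclear-norm matrix consistent with the observations.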
Funder
The Science and Technology Research Program of Chongqing Municipal Education Commission
The Opening Fund of Chongqing Engineering Research Center of Internet of Things and Intelligent Control Technology
Science and Technology Innovation Smart Agriculture Project of Science and Technology Department, Wanzhou District of Chongqing
The Opening Project of Sichuan Province University Key Laboratory of Bridge Non-destruction Detecting and Engineering Computing
Publisher
Public Library of Science (PLoS)