Affiliation:
1. Zhengzhou University of Light Industry
Abstract
Dimensionality reduction techniques are often applied in existing low-rank representation (LRR)-based methods, for computational efficiency or other purposes. However, the two steps of reducing dimensionality and learning the low-rank representation coefficients are performed independently, so the adaptability of the representation coefficients to the original data space is not guaranteed. This article proposes a novel model, low-rank representation with adaptive dimensionality reduction (LRRARD) via manifold optimization for clustering, which integrates dimensionality reduction and the learning of low-rank representation coefficients into a unified framework. The model introduces a low-dimensional projection matrix to find the projection that best fits the original data space. The projection matrix and the low-rank representation coefficients interact with each other, so that the best projection matrix and the best representation coefficients are obtained simultaneously. In addition, a manifold optimization method, i.e., unconstrained optimization over a constrained search space, is employed to obtain the optimal projection matrix. Experimental results on several real datasets demonstrate the superiority of the proposed method.
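The abstract does not give the paper's objective function, so the following is only a hypothetical sketch of the manifold-optimization ingredient it mentions: treating the orthonormality constraint on a projection matrix P (P^T P = I, the Stiefel manifold) as the search space itself, so that a constrained problem becomes unconstrained optimization on the manifold. The variance-maximization objective, the QR retraction, and all function names below are illustrative assumptions, not the authors' actual LRRARD formulation.

```python
import numpy as np

def stiefel_retract(P):
    """Map a matrix back onto the Stiefel manifold (P^T P = I) via QR.

    Sign-fixing the R factor makes the retraction deterministic.
    """
    Q, R = np.linalg.qr(P)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def optimize_projection(X, d, steps=300, lr=0.1):
    """Toy stand-in for the manifold-optimization step: Riemannian
    gradient ascent of the captured variance trace(P^T C P) over the
    Stiefel manifold of D x d orthonormal projection matrices.
    (The real LRRARD objective couples P with the LRR coefficients.)
    """
    C = np.cov(X, rowvar=False)                     # D x D sample covariance
    D = C.shape[0]
    rng = np.random.default_rng(0)
    P = stiefel_retract(rng.standard_normal((D, d)))  # random orthonormal start
    for _ in range(steps):
        G = 2.0 * C @ P                             # Euclidean gradient of trace(P^T C P)
        # Project the gradient onto the tangent space of the manifold at P,
        # then retract the updated point back onto the manifold.
        G_riem = G - P @ (P.T @ G + G.T @ P) / 2.0
        P = stiefel_retract(P + lr * G_riem)
    return P
```

Because the constraint is enforced by the retraction rather than by a penalty term, every iterate is a valid orthonormal projection, which is the practical appeal of "unconstrained optimization in a constrained search space" that the abstract highlights.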
Funder
National Natural Science Foundation of China
Key Technologies R&D Program of Henan Province
Academic Degrees & Graduate Education Reform Project of Henan Province
Key Research Project of Colleges and Universities of Henan Province
Key Science and Technology Development Program of Henan Province
Science and Technology Project of Henan Province
Training Program of Young Backbone Teachers in Colleges and Universities of Henan Province
Research on Key Technologies of Blockchain System Security
Startup Project of Doctor Scientific Research of Zhengzhou University of Light Industry
Publisher
Association for Computing Machinery (ACM)
Cited by
1 article.