Affiliation:
1. College of Computer Science and Technology, HuaQiao University, Quanzhou, Fujian, China
2. College of Mathematics and Computer Science, Fuzhou University, Fuzhou, Fujian, China
Abstract
Feature selection is an important data preprocessing step in data mining and machine learning that reduces the number of features without deteriorating model performance. Recently, sparse regression has received considerable attention in feature selection due to its good performance. However, because the l2,0-norm regularization term is non-convex, the resulting problem is hard to solve, and most existing methods relax it with the l2,1-norm. Unlike these methods, this paper proposes a novel method that solves the l2,0-norm regularized least squares problem directly via iterative hard thresholding, which produces an exactly row-sparse weight matrix so that features can be selected more precisely. Furthermore, two homotopy strategies are derived to reduce the computational time of the optimization, making the method more practical for real-world applications. The proposed method is verified on eight biological datasets; experimental results show that it achieves higher classification accuracy with fewer selected features than its approximate convex counterparts and other state-of-the-art feature selection methods.
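The core idea described above can be sketched as follows: alternate a gradient step on the least-squares loss with a row-wise hard-thresholding step that keeps only the k rows of the weight matrix with the largest l2-norm, enforcing an exact l2,0 (row-sparsity) constraint. This is a hypothetical minimal sketch of generic iterative hard thresholding, not the paper's exact algorithm (which also employs homotopy strategies to accelerate the optimization); all function and parameter names here are illustrative.

```python
import numpy as np

def row_hard_threshold(W, k):
    """Keep the k rows of W with the largest l2-norm; zero out the rest.
    This enforces an exact row-sparsity (||W||_{2,0} <= k) constraint."""
    norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(norms)[-k:]          # indices of the k largest-norm rows
    W_thr = np.zeros_like(W)
    W_thr[keep] = W[keep]
    return W_thr

def iht_feature_selection(X, Y, k, step=None, n_iter=200):
    """Sketch of iterative hard thresholding for
        min_W ||XW - Y||_F^2  s.t.  ||W||_{2,0} <= k.
    Nonzero rows of the returned W indicate the selected features."""
    d = X.shape[1]
    c = Y.shape[1]
    if step is None:
        # 1/L, where L = ||X||_2^2 is the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    W = np.zeros((d, c))
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Y)           # gradient of the least-squares loss
        W = row_hard_threshold(W - step * grad, k)
    return W
```

Because the thresholding step zeroes out all but k rows at every iteration, the final weight matrix is exactly row-sparse, and the selected feature subset can be read off directly, in contrast to l2,1-relaxed solutions that require an additional ranking or truncation step.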
Subject
Artificial Intelligence, Computer Vision and Pattern Recognition, Theoretical Computer Science
Cited by
5 articles.