Abstract:
Feature selection is an important data preprocessing step in data mining and machine learning that can reduce the number of features without degrading a model's performance. Recently, sparse regression has received considerable attention in the feature selection task due to its good performance. However, because the l2,0-norm regularization term is non-convex, the resulting problem is hard to solve, and most existing methods relax it to the l2,1-norm. Unlike these methods, this paper proposes a novel method that solves the l2,0-norm regularized least squares problem directly via iterative hard thresholding, which produces an exactly row-sparse solution for the weight matrix, so features can be selected more precisely. Furthermore, two homotopy strategies are derived to reduce the computational time of the optimization method, making it more practical for real-world applications. The proposed method is verified on eight biological datasets; experimental results show that it achieves higher classification accuracy with fewer selected features than its approximate convex counterparts and other state-of-the-art feature selection methods. © 2022 - IOS Press. All rights reserved.
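For readers who want to see the mechanics, below is a minimal sketch of the row-wise iterative hard thresholding the abstract describes, applied to the l2,0-norm regularized least squares objective min_W ||XW - Y||_F^2 + lam * ||W||_{2,0}. The function name l20_iht, the step-size rule, and the stopping criterion are illustrative assumptions, not the authors' implementation; in particular, the paper's homotopy strategies (e.g., warm-starting over a decreasing sequence of lam values) are omitted here.

```python
import numpy as np

def l20_iht(X, Y, lam, step=None, n_iter=500, tol=1e-6):
    """Proximal-gradient (iterative hard thresholding) sketch for
    min_W ||X W - Y||_F^2 + lam * ||W||_{2,0},
    where ||W||_{2,0} counts the nonzero rows of W.
    Illustrative only -- not the paper's reference implementation."""
    d, c = X.shape[1], Y.shape[1]
    W = np.zeros((d, c))
    if step is None:
        # step = 1/L, with L = 2 * sigma_max(X)^2 the gradient's Lipschitz constant
        step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    # rows whose l2 norm falls below this after the gradient step are zeroed:
    # this is the proximal operator of step * lam * ||.||_{2,0}
    thresh = np.sqrt(2.0 * step * lam)
    for _ in range(n_iter):
        grad = 2.0 * X.T @ (X @ W - Y)   # gradient of the least-squares term
        Z = W - step * grad              # gradient descent step
        Z[np.linalg.norm(Z, axis=1) < thresh] = 0.0  # row-wise hard thresholding
        if np.linalg.norm(Z - W) < tol * max(1.0, np.linalg.norm(W)):
            W = Z
            break
        W = Z
    # nonzero rows of W index the selected features
    selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 0)
    return W, selected
```

Because the thresholding step keeps or discards whole rows of W, the solution is exactly row-sparse, which is what lets features be read off directly from the nonzero rows rather than from an approximate l2,1-norm relaxation.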
Source: Intelligent Data Analysis
ISSN: 1088-467X
Year: 2022
Issue: 1
Volume: 26
Page: 57-73
Impact Factor: 1.7 (JCR@2022); 0.900 (JCR@2023)
ESI HC Threshold: 61
JCR Journal Grade: 4
CAS Journal Grade: 4