AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

A General Framework for Sparsity Regularized Feature Selection via Iteratively Reweighted Least Square Minimization
Hanyang Peng, Yong Fan


Abstract


A variety of feature selection methods based on sparsity regularization have been developed with different loss functions and sparse regularization functions. Capitalizing on the existing sparsity regularized feature selection methods, we propose a general sparsity regularized feature selection (GSR-FS) algorithm that optimizes an ℓ2,r-norm (0 < r ≤ 2) based loss function with an ℓ2,p-norm (0 < p ≤ 2) based sparse regularization. The ℓ2,r-norm based loss function brings flexibility to balance data fitting and robustness to outliers by tuning its parameter, and the ℓ2,p-norm based regularization function with 0 < p ≤ 1 is able to boost the sparsity for feature selection. To solve the optimization problem, which involves multiple non-smooth and non-convex functions when r < 1 and/or p < 1, we develop an efficient solver under the general umbrella of Iteratively Reweighted Least Squares (IRLS) algorithms. Our algorithm is proved to converge with a theoretical convergence order of at least min(2 − r, 2 − p). Experimental results demonstrate that our method achieves competitive feature selection performance on publicly available datasets compared with state-of-the-art feature selection methods, at reduced computational cost.
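The abstract names the ingredients of the objective but not its exact form or the solver details. Below is a minimal IRLS sketch under a natural reading: a linear model X W ≈ Y with objective min over W of Σ_i ||x_i^T W − y_i||_2^r + λ Σ_j ||w^j||_2^p (where w^j is the j-th row of W), with features ranked by the row norms of W. Every name here (gsr_fs_irls, lam, eps) and the ridge initialization are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def gsr_fs_irls(X, Y, lam=1.0, r=1.0, p=1.0, n_iter=100, eps=1e-8):
        """Illustrative IRLS sketch (not the authors' code) for
        min_W  sum_i ||x_i^T W - y_i||_2**r + lam * sum_j ||w^j||_2**p.
        X: (n, d) data, Y: (n, c) target/indicator matrix, W: (d, c)."""
        n, d = X.shape
        # ridge initialization so the row norms of W start nonzero
        W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
        for _ in range(n_iter):
            R = X @ W - Y                                        # residuals, (n, c)
            # per-sample weights from the l2,r loss: (r/2)*||r_i||^(r-2)
            dw = (r / 2.0) * (np.linalg.norm(R, axis=1) + eps) ** (r - 2)
            # per-feature weights from the l2,p regularizer: (p/2)*||w^j||^(p-2)
            gw = (p / 2.0) * (np.linalg.norm(W, axis=1) + eps) ** (p - 2)
            # closed-form weighted least-squares update
            A = X.T @ (dw[:, None] * X) + lam * np.diag(gw)
            W = np.linalg.solve(A, X.T @ (dw[:, None] * Y))
        scores = np.linalg.norm(W, axis=1)                       # row norms rank features
        return W, np.argsort(scores)[::-1]

    # toy usage: 100 samples, 20 features, 3-class one-hot targets
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    Y = np.eye(3)[rng.integers(0, 3, size=100)]
    W, ranking = gsr_fs_irls(X, Y, lam=0.5, r=1.0, p=0.5)
    print(ranking[:5])   # indices of the five top-ranked features

The small eps keeps the negative-exponent weights finite when a residual or weight row shrinks to zero; choosing r near 1 down-weights samples with large residuals (robustness to outliers), while p ≤ 1 drives more rows of W toward zero, consistent with the sparsity behavior the abstract describes.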

Keywords


feature selection; general framework; sparsity regularization
