A Probabilistic Derivation of LASSO and L12-Norm Feature Selections

Authors

  • Di Ming, The University of Texas at Arlington
  • Chris Ding, The University of Texas at Arlington
  • Feiping Nie, Northwestern Polytechnical University

DOI:

https://doi.org/10.1609/aaai.v33i01.33014586

Abstract

LASSO and ℓ2,1-norm based feature selection have achieved success in many application areas. In this paper, we first derive LASSO and ℓ1,2-norm feature selection from a probabilistic framework, which provides a point of view independent of the usual sparse coding one. From there, we further propose a feature selection approach based on the probability-derived ℓ1,2-norm. We point out an inflexibility in standard feature selection: the widely used ℓ2,1-norm, which enforces joint sparsity across all the data instances, forces the features selected for all the different classes to be exactly the same. The probability-derived ℓ1,2-norm feature selection allows the selected features to differ across classes, and the resulting features lead to better classification on six benchmark datasets.
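To make the contrast in the abstract concrete, the following sketch computes the two matrix norms on a small hypothetical feature-by-class weight matrix. The matrix `W` and the ℓ1,2 convention used (the ℓ2 norm of the row-wise ℓ1 norms, i.e., inner norm applied within each row first) are assumptions for illustration, not taken from the paper itself.

```python
import numpy as np

# Hypothetical 4-feature x 3-class weight matrix; rows index features,
# columns index classes. Not from the paper; illustrative only.
W = np.array([
    [0.9, 0.0, 0.0],   # feature used by a single class
    [0.5, 0.4, 0.6],   # feature shared by all classes
    [0.0, 0.0, 0.0],   # feature selected by no class
    [0.0, 0.7, 0.0],   # feature used by a different single class
])

# l2,1-norm: sum of the l2 norms of the rows. Minimizing it drives
# entire rows to zero, so every class is forced to keep or discard
# the same set of features (the joint-sparsity effect).
l21 = np.sum(np.linalg.norm(W, ord=2, axis=1))

# l1,2-norm (one common convention): l2 norm of the row-wise l1 norms.
# The inner l1 tolerates zeros within a nonzero row, so a feature can
# be active for some classes and inactive for others.
l12 = np.linalg.norm(np.sum(np.abs(W), axis=1), ord=2)

print(round(l21, 4), round(l12, 4))  # 2.4775 1.8841
```

Under the ℓ2,1 penalty, rows 1 and 4 each pay their full ℓ2 cost even though each serves only one class; the ℓ1,2-style penalty does not force those rows toward all-or-nothing selection, which is the flexibility the abstract refers to.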

Published

2019-07-17

How to Cite

Ming, D., Ding, C., & Nie, F. (2019). A Probabilistic Derivation of LASSO and L12-Norm Feature Selections. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4586-4593. https://doi.org/10.1609/aaai.v33i01.33014586

Section

AAAI Technical Track: Machine Learning