AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Multiclass Capped ℓp-Norm SVM for Robust Classifications
Feiping Nie, Xiaoqian Wang, Heng Huang

Last modified: 2017-02-13

Abstract


The support vector machine (SVM) is one of the most successful machine learning models and has been applied to numerous real-world problems. Because SVM methods use the hinge loss or squared hinge loss for classification, they usually outperform other approaches, e.g. methods based on the least squares loss. However, like most supervised learning algorithms, they learn classifiers from the labeled training data without any specific strategy for handling noisy samples. In many real-world applications the training set contains outliers, which can misguide classifier learning and lead to suboptimal performance. To address this problem, we propose a novel capped ℓp-norm SVM classification model that employs a capped ℓp-norm based hinge loss in the objective, which can handle both light and heavy outliers. We use the new formulation to naturally build the multiclass capped ℓp-norm SVM. More importantly, we derive novel optimization algorithms to efficiently minimize the capped ℓp-norm based objectives, and rigorously prove their convergence. We present experimental results showing that the new capped ℓp-norm SVM method consistently improves classification performance, especially as the data noise level increases.
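The robustness mechanism described above can be illustrated with a minimal sketch. It assumes the capped ℓp-norm hinge loss takes the common capped-norm form min(max(0, 1 − y·f(x))^p, ε), where ε is the cap; the exact formulation, parameter names, and defaults here are illustrative, not the paper's definitive implementation:

```python
import numpy as np

def capped_lp_hinge(margins, p=1.0, eps=1.0):
    """Sketch of a capped ℓp-norm hinge loss: min(max(0, 1 - m)^p, eps).

    `margins` holds y * f(x) for each sample. The cap `eps` bounds each
    sample's contribution to the objective, so a heavy outlier (a very
    negative margin) cannot dominate the loss the way it would under the
    ordinary hinge loss.
    """
    hinge = np.maximum(0.0, 1.0 - margins)  # standard hinge part
    return np.minimum(hinge ** p, eps)      # cap limits outlier influence

# Correctly classified, near-boundary, and heavy-outlier samples:
margins = np.array([2.0, 0.5, -5.0])
print(capped_lp_hinge(margins, p=1.0, eps=1.0))
```

With these settings the outlier's loss is clipped to ε = 1.0 instead of growing linearly with its (negative) margin, which is what keeps the learned classifier from being misguided by noisy training points.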

Keywords


Capped ℓp-Norm SVM; Robust Classification

Full Text: PDF