Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence, 31
Issue:
No. 1: Thirty-First AAAI Conference On Artificial Intelligence
Track:
Machine Learning Methods
Abstract:
The support vector machine (SVM) is one of the most successful machine learning methods and has been applied to numerous real-world applications. Because SVM methods use the hinge loss or squared hinge loss function for classification, they usually outperform other classification approaches, e.g., methods based on the least-squares loss function. However, like most supervised learning algorithms, they learn classifiers from the labeled training data without any specific strategy for handling noisy data. In many real-world applications, the training set often contains outliers, which can mislead classifier learning and make the classification performance suboptimal. To address this problem, we propose a novel capped Lp-norm SVM classification model that uses a capped Lp-norm based hinge loss in the objective, which can handle both light and heavy outliers. We use the new formulation to naturally build a multiclass capped Lp-norm SVM. More importantly, we derive novel optimization algorithms to efficiently minimize the capped Lp-norm based objectives, and rigorously prove the convergence of the proposed algorithms. We present experimental results showing that the new capped Lp-norm SVM method consistently improves classification performance, especially as the data noise level increases.
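For intuition, one common form of a capped Lp-norm hinge loss is, per sample, min(max(0, 1 - y_i(w·x_i + b))^p, eps): once a margin violation's p-th power exceeds the cap eps, its contribution to the objective stops growing, so heavy outliers cannot dominate training. Below is a minimal, hypothetical sketch of this idea. The function names, the cap parameter eps, and the plain subgradient-descent trainer are illustrative assumptions; the paper derives its own re-weighted optimization algorithms with proven convergence, which this sketch does not reproduce.

```python
import numpy as np

def capped_lp_hinge(margins, p=1.0, eps=2.0):
    """Capped Lp-norm hinge loss: min(max(0, 1 - m)^p, eps).

    Margin violations whose p-th power exceeds eps are capped, so
    heavy outliers contribute at most eps to the objective.
    """
    return np.minimum(np.maximum(0.0, 1.0 - margins) ** p, eps)

def train_capped_lp_svm(X, y, p=1.0, eps=2.0, lam=1e-2, lr=1e-2, n_iters=500):
    """Subgradient descent on the (non-convex) capped objective:
    lam/2 * ||w||^2 + mean_i capped_lp_hinge(y_i * (w.x_i + b)).

    Samples whose loss has hit the cap get a zero (sub)gradient,
    so outliers stop influencing the decision boundary.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iters):
        margins = y * (X @ w + b)
        h = np.maximum(0.0, 1.0 - margins)
        # active: margin violated but loss not yet capped
        active = (h > 0.0) & (h ** p < eps)
        # d/dw of h_i^p is -p * h_i^(p-1) * y_i * x_i on active samples
        coef = np.zeros(n)
        coef[active] = p * h[active] ** (p - 1.0)
        grad_w = lam * w - (coef * y) @ X / n
        grad_b = -(coef * y).sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

if __name__ == "__main__":
    # toy demo: linearly separable data plus a few flipped-label outliers
    rng = np.random.default_rng(0)
    n = 200
    y = np.where(rng.random(n) < 0.5, 1.0, -1.0)
    X = y[:, None] * 1.5 + rng.normal(size=(n, 2))
    y[:10] *= -1.0  # inject label-noise outliers
    w, b = train_capped_lp_svm(X, y, p=1.0, eps=2.0)
    acc = np.mean(np.sign(X @ w + b) == y)
    print(f"training accuracy (incl. noisy labels): {acc:.2f}")
```

Note that capping makes the objective non-convex, which is why the paper's convergence proof for its dedicated algorithms matters; the subgradient loop above is only a didactic stand-in.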
DOI:
10.1609/aaai.v31i1.10948