Journal of Xidian University ›› 2019, Vol. 46 ›› Issue (1): 64-72. doi: 10.19665/j.issn1001-2400.2019.01.011


Robust support vector machines and their sparse algorithms

AN Yali, ZHOU Shuisheng, CHEN Li, WANG Baojun

  School of Mathematics and Statistics, Xidian University, Xi'an 710071, China
  • Received: 2018-04-03  Online: 2019-02-20  Published: 2019-03-05
  • Contact: Shuisheng ZHOU  E-mail: sszhou@mail.xidian.edu.cn

Abstract:

Based on a nonconvex and smooth loss, the robust support vector machine (RSVM) is insensitive to outliers in classification problems. However, the existing algorithms for the RSVM are not suitable for large-scale problems, because they iteratively solve quadratic programming problems, which leads to a large amount of computation and slow convergence. To overcome this drawback, a method with a faster convergence rate is used to solve the RSVM. Then, by using the least-squares idea, a generalized exponentially robust LSSVM (ER-LSSVM) model is proposed, which is solved by the algorithm with a faster convergence rate. Moreover, the robustness of the ER-LSSVM is interpreted theoretically. Finally, utilizing a low-rank approximation of the kernel matrix, the sparse RSVM algorithm (SR-SVM) and the sparse ER-LSSVM algorithm (SER-LSSVM) are proposed for handling large-scale problems. Extensive experimental results illustrate that the proposed algorithms outperform the related algorithms in terms of convergence speed, test accuracy and training time.
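To make the two key ingredients of the abstract concrete, the following is a minimal Python sketch of (i) a bounded, nonconvex, smooth exponential-type robust loss and (ii) a Nyström-style low-rank approximation of the kernel matrix. The particular loss form, parameter names (mu, gamma, m) and the landmark-sampling scheme are illustrative assumptions, not the exact constructions used in the paper.

  import numpy as np

  # Illustrative bounded, nonconvex, smooth loss (an assumption, not the paper's
  # exact formulation): an exponential-type squared-error loss that saturates for
  # large errors, so an outlier contributes at most 1/mu to the objective.
  def exp_robust_loss(error, mu=1.0):
      return (1.0 - np.exp(-mu * error ** 2)) / mu

  # Nystrom low-rank approximation of an RBF kernel matrix, K ~= C @ W_pinv @ C.T,
  # built from m randomly chosen landmark points; this is the generic idea behind
  # using a low-rank kernel approximation to obtain sparse solutions.
  def rbf_kernel(X, Y, gamma=0.5):
      d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
      return np.exp(-gamma * d2)

  def nystrom_factors(X, m=50, gamma=0.5, seed=0):
      rng = np.random.default_rng(seed)
      idx = rng.choice(len(X), size=m, replace=False)
      C = rbf_kernel(X, X[idx], gamma)        # n x m cross kernel
      W = rbf_kernel(X[idx], X[idx], gamma)   # m x m landmark kernel
      return C, np.linalg.pinv(W)             # K_approx = C @ W_pinv @ C.T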

Key words: robust support vector machines, nonconvex and smooth loss, sparse solution, low-rank approximation

CLC Number: TP391