Journal of Xidian University ›› 2019, Vol. 46 ›› Issue (1): 64-72. doi: 10.19665/j.issn1001-2400.2019.01.011

Robust support vector machines and their sparse algorithms

AN Yali, ZHOU Shuisheng, CHEN Li, WANG Baojun

  1. School of Mathematics and Statistics, Xidian Univ., Xi’an 710071, China
  • Received: 2018-04-03  Online: 2019-02-20  Published: 2019-03-05
  • Contact: Shuisheng ZHOU
  • About the first author: AN Yali (1992-), female, M.S. candidate at Xidian University. E-mail: 1171602902@qq.com
  • Supported by: the National Natural Science Foundation of China (61772020)

Abstract:

Based on a nonconvex and smooth loss, the robust support vector machine (RSVM) is insensitive to outliers in classification problems. However, the existing algorithms for the RSVM are not suitable for large-scale problems, because they need to solve quadratic programming problems iteratively, which leads to a large amount of computation and slow convergence. To overcome these drawbacks, a method with a faster convergence rate is first used to solve the RSVM. Then, based on the idea of least squares, a generalized exponentially robust LSSVM (ER-LSSVM) model is proposed, which is solved by an algorithm with a faster convergence rate, and the robustness of the ER-LSSVM is interpreted theoretically. Finally, by utilizing a low-rank approximation of the kernel matrix, the sparse RSVM algorithm (SR-SVM) and the sparse ER-LSSVM algorithm (SER-LSSVM) are proposed for handling large-scale training problems. Experimental results illustrate that the proposed algorithms outperform the related algorithms in terms of convergence speed, test accuracy and training time.
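The abstract names two ingredients but gives neither explicitly: a bounded, nonconvex smooth loss that caps the influence of outliers, and a low-rank approximation of the kernel matrix that makes large-scale training tractable. The following Python sketch illustrates both in a generic way; the exponential-type loss form, the function names, and the Nyström-style factorization are illustrative assumptions, not the paper's exact formulations.

```python
import numpy as np

def robust_exp_loss(margin, gamma=1.0):
    """Bounded, smooth surrogate of the hinge loss (illustrative form only).

    Uses 1 - exp(-gamma * max(0, 1 - margin)^2): near the decision boundary it
    behaves like a smoothed hinge loss, but it saturates at 1, so badly
    misclassified points (outliers) cannot dominate the objective.
    """
    hinge = np.maximum(0.0, 1.0 - margin)
    return 1.0 - np.exp(-gamma * hinge ** 2)

def nystrom_factor(X, landmark_idx, kernel):
    """Nystrom-style low-rank factor F with K approximately F @ F.T.

    Only the n x m block of the kernel matrix is evaluated (m landmarks),
    which is what keeps the cost manageable on large training sets.
    """
    C = kernel(X, X[landmark_idx])                  # n x m block of K
    W = C[landmark_idx, :]                          # m x m landmark block
    U, s, _ = np.linalg.svd(W)                      # W is symmetric PSD
    return C @ U @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12)))  # n x m

# Toy usage with an RBF kernel: F stands in for the full 500 x 500 kernel matrix.
rbf = lambda A, B: np.exp(-0.5 * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
X = np.random.randn(500, 10)
F = nystrom_factor(X, np.random.choice(500, 50, replace=False), rbf)
print(F.shape)  # (500, 50)
```

With such a factor, the kernel expansion can be restricted to the landmark points, which is roughly the sense in which the resulting solution is sparse; the paper's SR-SVM and SER-LSSVM presumably combine this kind of approximation with the robust loss above.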

Key words: robust support vector machines, nonconvex and smooth loss, sparse solution, low-rank approximation

CLC Number: 

  • TP391