Core Chinese science and technology journal (Peking University core journal list)

Source journal of the Chinese Science Citation Database (CSCD)

Indexed by Chemical Abstracts (CA, USA)

Indexed by Mathematical Reviews (MR, USA)

Indexed by the Russian Abstracts Journal (AJ)

Issue 2, Jul. 2016
Citation: YUAN Yu-Ping, AN Zeng-Long. Support vector machine in the primal space based on the ramp loss function [J]. Journal of East China Normal University (Natural Sciences), 2016, (2): 20-29. doi: 2016.02.003

Support vector machine in the primal space based on the ramp loss function

doi: 2016.02.003
  • Received Date: 2015-03-04
  • Publish Date: 2016-03-25
  • To address the sensitivity of the standard support vector machine to noise, a new support vector regression (SVR) method based on an asymmetric quadratic, controlled-insensitive (ramp) loss function is proposed. Using the concave-convex procedure (CCCP) and a smoothing technique, the non-convex optimization problem is transformed into a continuous, twice-differentiable convex optimization problem. The resulting model is solved with an Armijo-Newton algorithm that terminates in a finite number of iterations, and the convergence of the algorithm is analyzed. The algorithm preserves the sparseness of the support vectors while limiting the influence of abnormal values in the training sample. Experimental results show that the proposed support vector regression model retains good generalization ability and fits both simulated data and benchmark data well. Compared with the standard support vector machine (SVM) model, the proposed model reduces the effects of noise and outliers and is more robust.
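  • The sketch below illustrates, under stated assumptions, the kind of CCCP iteration the abstract describes: a linear ramp-loss SVR solved in the primal, where the ramp (capped ε-insensitive) loss is written as a difference of convex functions and each outer step linearizes the concave part. It is not the authors' implementation; the function name ramp_svr_cccp and the parameters C, eps (insensitive zone) and t (ramp cap) are illustrative, and the convex subproblem is solved here by plain subgradient descent rather than the paper's smoothed Armijo-Newton method.

    ```python
    # Minimal NumPy sketch (assumptions noted above), not the authors' code.
    # Ramp loss: R(r) = min(t, max(0, |r| - eps)) = H(r) - G(r),
    # with convex parts H(r) = max(0, |r| - eps) and G(r) = max(0, |r| - eps - t).
    # CCCP: linearize -C*G at the current residuals, then solve the convex remainder.
    import numpy as np

    def ramp_svr_cccp(X, y, C=1.0, eps=0.1, t=1.0, outer=10, inner=500, lr=1e-3):
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(outer):
            # Residuals at the current iterate fix the linearization of -C*G.
            r = X @ w + b - y
            beta = C * np.sign(r) * (np.abs(r) > eps + t)   # subgradient of C*G(r)
            for _ in range(inner):
                r = X @ w + b - y
                # Subgradient of the convex part C*H(r) = C*max(0, |r| - eps).
                g = C * np.sign(r) * (np.abs(r) > eps)
                coef = g - beta                              # convexified loss term
                w -= lr * (w + X.T @ coef)
                b -= lr * coef.sum()
        return w, b

    # Toy usage: noisy linear data with a few large outliers that the ramp loss caps.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.normal(size=200)
    y[:5] += 10.0
    w, b = ramp_svr_cccp(X, y)
    print(np.round(w, 2), round(b, 2))
    ```

    Because the cap t bounds each sample's loss, points with residuals beyond eps + t contribute only a constant and therefore stop pulling on the solution, which is the mechanism behind the robustness to outliers claimed in the abstract.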
