YUAN Y P, AN Z L. Support vector regression machine in the primal space based on the ramp loss function [J]. Journal of East China Normal University (Natural Science), 2016, 2016(2): 20-29. DOI: 2016.02.003.
Aiming at the problem that the standard support vector machine is sensitive to noise, a new support vector regression (SVR) machine based on an asymmetric, quadratic-insensitive ramp loss function is proposed. Using the concave-convex procedure (CCCP) and a smoothing technique, the non-convex optimization problem is transformed into a continuous, twice-differentiable convex optimization problem. The resulting model is solved with an Armijo-Newton algorithm that terminates in finitely many steps, and the convergence of the algorithm is analyzed. The algorithm not only preserves the sparsity of the support vectors but also controls outliers in the training sample. Experimental results show that the proposed model keeps good generalization ability and fits both simulated and benchmark data with reasonable accuracy. Compared with the standard support vector machine (SVM) model, it reduces the influence of noise and outliers and exhibits stronger robustness.
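The core transformation described in the abstract, writing a clipped (ramp) loss as a difference of two convex functions so that CCCP can replace the concave part by its tangent at each iteration, can be sketched as follows. This is a generic illustration, not the paper's exact asymmetric loss: the function names, the tube width `eps`, and the clipping level `t` are all chosen here for demonstration.

```python
import numpy as np

def q_eps(z, eps=0.1):
    """Quadratic eps-insensitive loss: zero inside the tube, squared distance outside."""
    return np.maximum(np.abs(z) - eps, 0.0) ** 2

def ramp(z, eps=0.1, t=1.0):
    """Ramp (clipped) loss: saturates at t, so large outliers stop pulling on the fit."""
    return np.minimum(q_eps(z, eps), t)

def h(z, eps=0.1, t=1.0):
    """Convex function whose negation is the concave part of the DC split."""
    return np.maximum(q_eps(z, eps) - t, 0.0)

def dh(z, eps=0.1, t=1.0):
    """(Sub)derivative of h with respect to the residual z."""
    return np.where(q_eps(z, eps) > t,
                    2.0 * np.sign(z) * np.maximum(np.abs(z) - eps, 0.0), 0.0)

z = np.linspace(-3.0, 3.0, 601)

# DC identity: ramp = q_eps - h, a difference of two convex functions.
assert np.allclose(ramp(z), q_eps(z) - h(z))

# One CCCP step replaces -h by its tangent at the current residual z0, giving a
# convex surrogate that upper-bounds the ramp loss and touches it at z0;
# minimizing the surrogate therefore decreases the non-convex objective.
z0 = 2.0
surrogate = q_eps(z) - (h(z0) + dh(z0) * (z - z0))
assert np.all(surrogate >= ramp(z) - 1e-9)
assert np.isclose(q_eps(z0) - h(z0), ramp(z0))
```

The saturation of the loss at `t` is what removes the influence of outliers, and the upper-bound property of the surrogate is what makes each CCCP iteration a descent step on the original non-convex problem.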
[1] XIU F J, ZHANG Y, JIAN C L. Fuzzy SVM with a new fuzzy membership function [J]. Neural Computing and Applications, 2006, 15: 268-276.
[2] LIU Y H, CHEN Y T. Face recognition using total margin-based adaptive fuzzy support vector machines [J]. IEEE Transactions on Neural Networks, 2007, 18(1): 178-192.
[3] YU S, YANG X W, HAO Z F, et al. An adaptive support vector machine learning algorithm for large classification problem [J]. Lecture Notes in Computer Science, 2006, 3971: 981-990.
[4] LIN C F, WANG S D. Fuzzy support vector machines [J]. IEEE Transactions on Neural Networks, 2002, 13(2): 464-471.
[5] JIN B, ZHANG Y Q. Classifying very large data sets with minimum enclosing ball based support vector machine [C]//Proceedings of the 2006 IEEE International Conference on Fuzzy Systems. Vancouver, BC, 2006: 364-368.
[6] BO L, WANG L, JIAO L. Recursive finite Newton algorithm for support vector regression in the primal [J]. Neural Computation, 2007, 19(4): 1082-1096.
[7] CHEN X B, YANG J, LIANG J, et al. Recursive robust least squares support vector regression based on maximum correntropy criterion [J]. Neurocomputing, 2012, 97: 63-73.
[8] YANG J Y, ZHANG Y Y, ZHU Y S. Research on classification performance of support vector machines with insensitive loss functions [J]. Journal of Xi'an Jiaotong University, 2007, 41(11): 1315-1320.
[9] HUANG H, LIU Y. Fuzzy support vector machines for pattern recognition and data mining [J]. International Journal of Fuzzy Systems, 2002, 4(3): 3-12.
[10] WANG L, JIA H, LI J. Training robust support vector machine with smooth ramp loss in the primal space [J]. Neurocomputing, 2008, 71(13/14/15): 3020-3025.
[11] ZHAO Y, SUN J. Robust support vector regression in the primal [J]. Neural Networks, 2008, 21(10): 1548-1555.
[12] YANG S H, HU B G. A stagewise least square loss function for classification [C]//Proceedings of the SIAM International Conference on Data Mining. 2008: 120-131.
[13] KIMELDORF G S, WAHBA G. A correspondence between Bayesian estimation on stochastic processes and smoothing by splines [J]. Annals of Mathematical Statistics, 1970, 41(2): 495-502.
[14] YUILLE A L, RANGARAJAN A. The concave-convex procedure (CCCP) [J]. Neural Computation, 2003, 15(4): 915-936.
[15] FUNG G, MANGASARIAN O L. Finite Newton method for Lagrangian support vector machine classification [J]. Neurocomputing, 2003, 55(1/2): 39-55.