An Exponential Laplace Loss Function Based Robust ELM for Regression Estimation

WANG Kuaini, CAO Jinde, LIU Qingshan

Citation: WANG Kuaini, CAO Jinde, LIU Qingshan. An Exponential Laplace Loss Function Based Robust ELM for Regression Estimation[J]. Applied Mathematics and Mechanics, 2019, 40(11): 1169-1178. doi: 10.21656/1000-0887.400240

doi: 10.21656/1000-0887.400240
Funds: The National Natural Science Foundation of China (61833005; 61907033); the China Postdoctoral Science Foundation (2018M642129)
Author information:

    WANG Kuaini (1982—), female, lecturer, Ph.D. (E-mail: wangkuaini1219@sina.com); CAO Jinde (1963—), male, professor, Ph.D., doctoral supervisor (corresponding author, E-mail: jdcao@seu.edu.cn).

  • CLC number: TP181

  • Abstract: Datasets from real-world problems are usually contaminated by various kinds of noise. When the extreme learning machine (ELM) is trained on such datasets, it shows low prediction accuracy and large fluctuations in the predicted results. To overcome this drawback, an exponential Laplace loss function capable of weakening the influence of noise was adopted. This loss function is built upon the Gauss kernel function; it is differentiable, non-convex and bounded, and it approaches the Laplace function. It was introduced into the extreme learning machine to form the exponential Laplace loss function based robust ELM for regression (ELRELM) model. An iteratively reweighted algorithm was applied to solve the optimization problem of the model. In each iteration, noisy sample points are assigned smaller weights, which effectively improves the prediction accuracy. Experiments on real-world datasets verify that the proposed model achieves better learning performance and robustness than the compared algorithms.
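The abstract describes the training scheme only in outline; the exact loss form and update rules are in the full paper. As a minimal sketch of how a bounded, Gauss-kernel-style loss turns ELM training into iteratively reweighted least squares, the snippet below assumes a weight of the form w_i = exp(-r_i^2 / (2*sigma^2)) on each residual r_i, a sigmoid hidden layer, and illustrative parameter values (`n_hidden`, `C`, `sigma`); none of these choices are taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: a sine curve with small Gauss noise,
# plus a few gross outliers injected into the first 10 targets.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
y[:10] += 5.0

# ELM hidden layer: random input weights and biases, sigmoid activation.
n_hidden = 50
W = rng.standard_normal((X.shape[1], n_hidden))
b = rng.standard_normal(n_hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # hidden-layer output matrix

C = 100.0    # regularization parameter (assumed value)
sigma = 1.0  # width parameter of the bounded loss (assumed value)

# Initial fit: ordinary regularized ELM (ridge solution).
beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ y)

# Iteratively reweighted least squares: each pass solves a weighted
# regularized ELM, then downweights samples with large residuals,
# so outliers contribute less and less to the next solution.
for _ in range(20):
    r = y - H @ beta
    w = np.exp(-r**2 / (2 * sigma**2))  # bounded-loss weights (assumed form)
    Hw = H * w[:, None]                 # diag(w) @ H
    beta_new = np.linalg.solve(H.T @ Hw + np.eye(n_hidden) / C, Hw.T @ y)
    if np.linalg.norm(beta_new - beta) < 1e-6:
        beta = beta_new
        break
    beta = beta_new

y_pred = H @ beta
```

Because the weights decay exponentially in the squared residual, the injected outliers end up with near-zero influence on the final output weights, which is the mechanism the abstract credits for the improved prediction accuracy.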
Publication history
  • Received: 2019-08-20
  • Published: 2019-11-01
