A Dynamic AdaBoost Algorithm With Adaptive Changes of Loss Function
Authors: Gao, Yunlong; Ji, Guoli; Yang, Zijiang; Pan, Jinyan
DOI: http://dx.doi.org/10.1109/TSMCC.2012.2227471
Date: 2012-11
Keywords: statistical view; classifiers; regression; ensemble; classification; margins
Funding: National Natural Science Foundation of China [61203176, 61174161]; Key Research Project of Fujian Province of China [2009H0044]; Fundamental Research Funds for the Central Universities in China, Xiamen University [2011121047201112G018, CXB2011035]; Natural Sciences and Engineering Research Council of Canada.
Abstract: AdaBoost is a method to improve a given learning algorithm's classification accuracy by combining its hypotheses. Adaptivity, one of AdaBoost's most significant advantages, drives it to maximize the smallest margin, which gives it good generalization ability. However, when the samples with large negative margins are noisy or atypical, the maximized margin is actually a "hard margin": the adaptive feature makes AdaBoost sensitive to sampling fluctuations and prone to overfitting. Traditional schemes therefore prevent AdaBoost from overfitting by heavily damping the influence of samples with large negative margins. However, samples with large negative margins are not always noisy or atypical, so these traditional schemes for preventing overfitting may not be reasonable. To learn a classifier with high generalization performance while preventing overfitting, it is necessary to perform statistical analysis on the margins of the training samples. Here, the Hoeffding inequality is adopted as a statistical tool to divide training samples into reliable samples and temporarily unreliable samples. A new boosting algorithm, named DAdaBoost, is introduced to deal with reliable and temporarily unreliable samples separately. Since DAdaBoost adjusts its weighting scheme dynamically, its loss function is not fixed; in fact, it is a series of nonconvex functions that gradually approach the 0-1 loss as the algorithm evolves. By defining a virtual classifier, the dynamically adjusted weighting scheme is unified into the progression of DAdaBoost, and an upper bound on the training error is derived. Experiments on both synthetic and real-world data show that DAdaBoost has many merits; based on these experiments, we conclude that DAdaBoost can effectively prevent AdaBoost from overfitting.
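The abstract does not give the paper's exact update rules, but its core idea, using the Hoeffding inequality to separate reliable samples from temporarily unreliable ones and damping only the latter, can be sketched in code. The Python sketch below is illustrative only: the names dadaboost_sketch and hoeffding_threshold, the choice of delta, the use of decision stumps as base learners, the treatment of the t boosting rounds as t bounded observations of each sample's margin, and the specific weight-capping rule for unreliable samples are all assumptions of this sketch, not the published DAdaBoost formulas.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def hoeffding_threshold(t, delta=0.05):
    """Deviation bound from the Hoeffding inequality for the mean of t
    observations bounded in [-1, 1], holding with probability >= 1 - delta.
    Treating the t boosting rounds as t bounded margin observations is an
    assumption of this sketch, not the paper's derivation."""
    return np.sqrt(2.0 * np.log(2.0 / delta) / t)

def dadaboost_sketch(X, y, n_rounds=50, delta=0.05):
    """Illustrative DAdaBoost-style loop; y must take values in {-1, +1}.
    Samples whose normalized margin falls below -eps_t are treated as
    temporarily unreliable, and their exponential weight growth is capped
    so that suspected noise does not dominate later rounds."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)          # sample weights
    F = np.zeros(n)                  # ensemble score: sum_t alpha_t * h_t(x)
    alpha_sum = 0.0
    learners, alphas = [], []
    for t in range(1, n_rounds + 1):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h = stump.predict(X)
        err = np.clip(np.sum(w[h != y]), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1.0 - err) / err)
        F += alpha * h
        alpha_sum += alpha
        margins = y * F / alpha_sum            # normalized margins in [-1, 1]
        eps_t = hoeffding_threshold(t, delta)  # shrinks as rounds accumulate
        unreliable = margins < -eps_t
        upd = np.exp(-alpha * y * h)           # standard AdaBoost reweighting
        upd[unreliable] = np.minimum(upd[unreliable], 1.0)  # damp unreliable samples
        w = w * upd
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, np.array(alphas)

# Usage on a toy noisy problem (hypothetical example; labels mapped to -1/+1):
# from sklearn.datasets import make_classification
# X, y = make_classification(n_samples=200, flip_y=0.1, random_state=0)
# learners, alphas = dadaboost_sketch(X, 2 * y - 1)
```

On data with label noise, the capped update keeps suspect samples from accumulating the exponential weight that makes plain AdaBoost overfit, which is the qualitative behavior the abstract attributes to DAdaBoost.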
Language: English
Publisher: IEEE T SYST MAN CY C
Content type: Journal article
Source URL: http://dspace.xmu.edu.cn/handle/2288/92369
Collection: Information Technology - Published Papers
Recommended citation:
GB/T 7714: Gao, Yunlong, Ji, Guoli, Yang, Zijiang, et al. A Dynamic AdaBoost Algorithm With Adaptive Changes of Loss Function[J]. IEEE T SYST MAN CY C, 2012. DOI: 10.1109/TSMCC.2012.2227471.
APA: Gao, Yunlong, Ji, Guoli, Yang, Zijiang, & Pan, Jinyan. (2012). A Dynamic AdaBoost Algorithm With Adaptive Changes of Loss Function. IEEE T SYST MAN CY C. doi:10.1109/TSMCC.2012.2227471
MLA: Gao, Yunlong, et al. "A Dynamic AdaBoost Algorithm With Adaptive Changes of Loss Function." IEEE T SYST MAN CY C (2012). doi:10.1109/TSMCC.2012.2227471