Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture
Authors: Chen, C. L. Philip (1,2,3); Liu, Zhulin (3)
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Year: 2018
Volume: 29; Issue: 1; Pages: 10-24
Keywords: Big data; big data modeling; broad learning system (BLS); deep learning; incremental learning; random vector functional-link neural networks (RVFLNN); single layer feedforward neural networks (SLFN); singular value decomposition (SVD)
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2017.2716952
Corresponding Author: Chen, C. L. Philip (philip.chen@ieee.org)
Abstract: This paper proposes the Broad Learning System (BLS), which aims to offer an alternative to learning in deep structures. Deep structures and their learning suffer from a time-consuming training process because of the large number of connecting parameters in filters and layers; moreover, they require a complete retraining process whenever the structure is insufficient to model the system. The BLS is established as a flat network, where the original inputs are transferred and placed as "mapped features" in feature nodes, and the structure is expanded in the wide sense through "enhancement nodes." Incremental learning algorithms are developed for fast remodeling in broad expansion, without retraining, whenever the network is deemed to need expansion. Two incremental learning algorithms are given: one for the increment of the feature nodes (or filters in a deep structure) and one for the increment of the enhancement nodes. The designed model and algorithms are highly versatile for rapid model selection. In addition, another incremental learning algorithm is developed for the case in which an already-modeled system encounters a new incoming input; the system can then be remodeled incrementally without retraining from the beginning. Model reduction using singular value decomposition is also conducted, with satisfactory results, to simplify the final structure. Compared with existing deep neural networks, experimental results on the Modified National Institute of Standards and Technology (MNIST) database and the NYU NORB object recognition benchmark demonstrate the effectiveness of the proposed BLS.
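The abstract describes the BLS architecture only in prose; the minimal NumPy sketch below may help make it concrete. It assumes random, untuned feature and enhancement mappings, tanh activations, and a ridge-regularized pseudoinverse readout; the function names (train_bls, predict_bls) and hyperparameters are illustrative assumptions, not the authors' reference implementation, and the paper's sparse-autoencoder feature tuning and incremental pseudoinverse updates are omitted.

```python
# Minimal sketch of a broad learning system, assuming random feature/enhancement
# mappings and a ridge-regression readout (illustration only, not the paper's code).
import numpy as np

def train_bls(X, Y, n_feature_nodes=100, n_enhance_nodes=200, reg=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    # Mapped feature nodes: random linear map of the raw input plus a bias column
    # (the paper additionally fine-tunes these maps with sparse autoencoding).
    We = rng.standard_normal((X.shape[1] + 1, n_feature_nodes))
    Z = np.tanh(np.hstack([X, np.ones((X.shape[0], 1))]) @ We)
    # Enhancement nodes: nonlinear expansion of the mapped features.
    Wh = rng.standard_normal((n_feature_nodes + 1, n_enhance_nodes))
    H = np.tanh(np.hstack([Z, np.ones((Z.shape[0], 1))]) @ Wh)
    # Flat hidden layer: feature nodes and enhancement nodes placed side by side.
    A = np.hstack([Z, H])
    # Output weights via a ridge-regularized pseudoinverse, approximating W = A^+ Y.
    # (The paper's incremental algorithms update this solution when nodes or
    # inputs are added, instead of recomputing it from scratch.)
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)
    return We, Wh, W

def predict_bls(X, We, Wh, W):
    Z = np.tanh(np.hstack([X, np.ones((X.shape[0], 1))]) @ We)
    H = np.tanh(np.hstack([Z, np.ones((Z.shape[0], 1))]) @ Wh)
    return np.hstack([Z, H]) @ W
```

Because only the output weights are solved for in closed form, training cost is dominated by one least-squares solve rather than iterative backpropagation, which is the efficiency argument the abstract makes.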
Funding Projects: Macao Science and Technology Development Fund [019/2015/A1]; UM Research Grants; National Natural Science Foundation of China [61572540]
WOS Keywords: FUNCTIONAL-LINK NET; NEURAL-NETWORKS; ALGORITHM; SELECTION; REPRESENTATION; APPROXIMATION; REGRESSION; SHRINKAGE
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Accession Number: WOS:000419558900002
Funding Organizations: Macao Science and Technology Development Fund; UM Research Grants; National Natural Science Foundation of China
Document Type: Journal Article
Source URL: http://ir.ia.ac.cn/handle/173211/28278
Collection: Institute of Automation, Chinese Academy of Sciences
Affiliations:
1. Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100080, Peoples R China
2. Dalian Maritime Univ, Dalian 116026, Peoples R China
3. Univ Macau, Fac Sci & Technol, Dept Comp & Informat Sci, Macau 99999, Peoples R China
Recommended Citation
GB/T 7714: Chen, C. L. Philip, Liu, Zhulin. Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(1): 10-24.
APA: Chen, C. L. Philip, & Liu, Zhulin. (2018). Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 29(1), 10-24.
MLA: Chen, C. L. Philip, et al. "Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 29.1 (2018): 10-24.