Image classification based on convolutional neural networks with cross-level strategy
Liu, Yu1; Yin, Baocai1; Yu, Jun1; Wang, Zengfu1,2
Journal: MULTIMEDIA TOOLS AND APPLICATIONS
Publication Date: 2017-04-01
Volume: 76  Issue: 8  Pages: 11065-11079
Keywords: Convolutional Neural Networks (CNNs); Image Classification; Network Architecture; Feature Representation; Deep Learning
DOI: 10.1007/s11042-016-3540-x
Document Type: Article
Abstract: In the past few years, convolutional neural networks (CNNs) have exhibited great potential in the field of image classification. In this paper, we present a novel strategy named cross-level to improve existing network architectures, in which the different levels of feature representation are connected only in series. The basic idea of cross-level is to establish a convolutional layer between two nonadjacent levels, aiming to extract richer multi-scale features at each feature representation level. The proposed cross-level strategy can be naturally integrated into an existing network without any change to its original architecture, which makes it very practical and convenient. Four popular convolutional networks for image classification are employed to illustrate its implementation in detail. Experimental results on the dataset adopted by the ImageNet Large-Scale Visual Recognition Challenge (ILSVRC) verify the effectiveness of the cross-level strategy on image classification. Furthermore, a new convolutional network with cross-level architecture is presented to demonstrate the potential of the proposed strategy in future network design.
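Illustrative sketch (not from the paper): the abstract describes adding a convolutional layer between two nonadjacent feature levels so that multi-scale features can be fused with the normal in-series path. The PyTorch-style module below is one possible reading of that idea; the class name, kernel sizes, strides, and the fusion by channel concatenation are assumptions made here for illustration, not details taken from the authors' implementation.

# Minimal sketch of a "cross-level" connection, assuming fusion by concatenation.
import torch
import torch.nn as nn

class CrossLevelBlock(nn.Module):
    def __init__(self, c_in, c_mid, c_out):
        super().__init__()
        # Ordinary in-series levels (level i -> i+1 -> i+2), each halving resolution.
        self.level1 = nn.Sequential(
            nn.Conv2d(c_in, c_mid, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        self.level2 = nn.Sequential(
            nn.Conv2d(c_mid, c_out, 3, stride=2, padding=1), nn.ReLU(inplace=True))
        # Cross-level convolution between the two nonadjacent levels (i and i+2):
        # stride 4 so its output matches the spatial size of level2.
        self.cross = nn.Conv2d(c_in, c_out, 3, stride=4, padding=1)

    def forward(self, x):
        serial = self.level2(self.level1(x))      # standard series path
        skip = self.cross(x)                      # cross-level path (extra scale)
        return torch.cat([serial, skip], dim=1)   # fusion by channel concat (assumed)

if __name__ == "__main__":
    block = CrossLevelBlock(c_in=3, c_mid=32, c_out=64)
    out = block(torch.randn(1, 3, 224, 224))
    print(out.shape)  # torch.Size([1, 128, 56, 56])

Because the cross-level convolution is strided to land on the same spatial size as the deeper level, it can be dropped into an existing network without altering the original series of layers, which is the property the abstract emphasizes.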
WOS Keywords: REPRESENTATION; RECOGNITION; FEATURES; SCALE
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Accession Number: WOS:000400570400043
Funding: National Natural Science Foundation of China (61472393, 61303150); National Science and Technology Major Project of the Ministry of Science and Technology of China (2012GB102007); Anhui Province Initiative Funds on Intelligent Speech Technology and Industrialization (13Z02008); IFLYTEK CO., LTD.
Content Type: Journal Article
Source URL: http://ir.hfcas.ac.cn:8080/handle/334002/33471
Collection: Hefei Institutes of Physical Science / Institute of Intelligent Machines, Chinese Academy of Sciences
Author Affiliations:
1. Univ Sci & Technol China, Dept Automat, Hefei 230027, Peoples R China
2. Chinese Acad Sci, Inst Intelligent Machines, Hefei 230031, Peoples R China
Recommended Citation:
GB/T 7714: Liu, Yu, Yin, Baocai, Yu, Jun, et al. Image classification based on convolutional neural networks with cross-level strategy[J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2017, 76(8): 11065-11079.
APA: Liu, Yu, Yin, Baocai, Yu, Jun, & Wang, Zengfu. (2017). Image classification based on convolutional neural networks with cross-level strategy. MULTIMEDIA TOOLS AND APPLICATIONS, 76(8), 11065-11079.
MLA: Liu, Yu, et al. "Image classification based on convolutional neural networks with cross-level strategy". MULTIMEDIA TOOLS AND APPLICATIONS 76.8 (2017): 11065-11079.