Deep Unbiased Embedding Transfer for Zero-Shot Learning
Jia, Zhen [2]; Zhang, Zhang [2]; Wang, Liang [1,2,3]; Shan, Caifeng [4]; Tan, Tieniu [1,2,3]
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Year: 2020
Volume: 29; Issue: 29; Pages: 1958-1971
Keywords: Visualization; Feature extraction; Semantics; Training; Seals; Prototypes; Indexes; Zero-shot learning; image classification; projection domain shift; convolutional neural network; generative adversarial network
ISSN: 1057-7149
DOI: 10.1109/TIP.2019.2947780
Abstract (English):

Zero-shot learning aims to recognize objects that do not appear in the training dataset. Previous prevalent mapping-based zero-shot learning methods suffer from the projection domain shift problem because the unseen classes are absent from the training stage. To alleviate the projection domain shift problem, a deep unbiased embedding transfer (DUET) model is proposed in this paper. The DUET model is composed of a deep embedding transfer (DET) module and an unseen visual feature generation (UVG) module. In the DET module, a novel combined embedding transfer net, which integrates the complementary merits of linear and nonlinear embedding mapping functions, is proposed to connect the visual space and the semantic space. Moreover, an end-to-end joint training process is implemented to train the visual feature extractor and the combined embedding transfer net simultaneously. In the UVG module, a visual feature generator trained within a conditional generative adversarial framework is used to synthesize visual features of the unseen classes, thereby mitigating the projection domain shift problem. Furthermore, a quantitative index, namely the score of resistance on domain shift (ScoreRDS), is proposed to evaluate how well different models resist the projection domain shift problem. Experiments on five zero-shot learning benchmarks verify the effectiveness of the proposed DUET model. As demonstrated by the qualitative and quantitative analysis, the unseen class visual feature generation, the combined embedding transfer net and the end-to-end joint training process all contribute to alleviating projection domain shift in zero-shot learning.
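The abstract only names the components of the DET module; as a rough illustration, the combined embedding transfer net can be pictured as two parallel mappings from the visual space into the semantic space. The following is a minimal PyTorch sketch under stated assumptions, not the paper's implementation: the linear and nonlinear branches are simply averaged, the nonlinear branch is a two-layer MLP, and classification scores come from cosine similarity with class attribute vectors; the dimensions (visual_dim, semantic_dim, hidden_dim), the combination rule and the training loss are all hypothetical.

```python
# Minimal sketch of a combined embedding transfer net (visual -> semantic space).
# Assumptions (not from the paper): branch outputs are averaged, the nonlinear
# branch is a two-layer MLP, and compatibility is scored by cosine similarity
# against class attribute vectors. All sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CombinedEmbeddingTransferNet(nn.Module):
    def __init__(self, visual_dim=2048, semantic_dim=85, hidden_dim=1024):
        super().__init__()
        # Linear embedding branch: a single projection matrix.
        self.linear_branch = nn.Linear(visual_dim, semantic_dim)
        # Nonlinear embedding branch: a small MLP (hypothetical depth/width).
        self.nonlinear_branch = nn.Sequential(
            nn.Linear(visual_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, semantic_dim),
        )

    def forward(self, visual_features):
        # Combine the complementary branches; plain averaging is an assumption.
        return 0.5 * (self.linear_branch(visual_features)
                      + self.nonlinear_branch(visual_features))

def classify(net, visual_features, class_attributes):
    """Score each image against candidate class attribute prototypes."""
    embedded = F.normalize(net(visual_features), dim=-1)   # (N, semantic_dim)
    prototypes = F.normalize(class_attributes, dim=-1)     # (C, semantic_dim)
    return embedded @ prototypes.t()                        # (N, C) compatibility scores
```

In the full DUET pipeline described by the abstract, such a net would be trained end to end together with the CNN feature extractor, and the UVG module would additionally synthesize unseen-class visual features with a conditional GAN so that the learned mapping is not biased toward the seen classes only.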

Funding: Natural Science Foundation of China [61721004]; National Key R&D Program of China [2016YFB1001000]; Natural Science Foundation of China [61525306]; Natural Science Foundation of China [61633021]
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Accession Number: WOS:000501324900023
Content Type: Journal Article
Source URL: http://ir.ia.ac.cn/handle/173211/29391
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding Author: Zhang, Zhang
Author Affiliations:
1. Univ Chinese Acad Sci, Beijing 100190, Peoples R China
2. Chinese Acad Sci, Natl Lab Pattern Recognit, Ctr Res Intelligent Percept & Comp, Inst Automat, Beijing 100190, Peoples R China
3. CAS Ctr Excellence Brain Sci & Intelligence Techn, Beijing 100190, Peoples R China
4. Chinese Acad Sci, AIR, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Jia, Zhen, Zhang, Zhang, Wang, Liang, et al. Deep Unbiased Embedding Transfer for Zero-Shot Learning[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2020, 29(29): 1958-1971.
APA: Jia, Zhen, Zhang, Zhang, Wang, Liang, Shan, Caifeng, & Tan, Tieniu. (2020). Deep Unbiased Embedding Transfer for Zero-Shot Learning. IEEE TRANSACTIONS ON IMAGE PROCESSING, 29(29), 1958-1971.
MLA: Jia, Zhen, et al. "Deep Unbiased Embedding Transfer for Zero-Shot Learning." IEEE TRANSACTIONS ON IMAGE PROCESSING 29.29 (2020): 1958-1971.