Neural Encoding and Decoding With Distributed Sentence Representations
Jingyuan Sun1,3; Shaonan Wang1,3; Jiajun Zhang1,3; Chengqing Zong1,2,3
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
2020
Volume: 0, Issue: 0, Pages: 0
Keywords: brain-machine interface, distributed semantic representations, neural decoding, neural encoding
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2020.3027595
Corresponding author: Wang, Shaonan (shaonan.wang@nlpr.ia.ac.cn)
Abstract

Building computational models to account for the cortical representation of language plays an important role in understanding the human linguistic system. Recent progress in distributed semantic models (DSMs), especially Transformer-based models, has brought advances in many language understanding tasks, making DSMs a promising methodology for probing brain language processing. DSMs have been shown to reliably explain cortical responses to word stimuli. However, characterizing brain activity during sentence processing has been much less exhaustively explored with DSMs, especially with deep neural network-based methods. How do cortical sentence representations relate to DSMs? Which linguistic features captured by a DSM better explain its correlation with the brain activity evoked by sentence stimuli? Could distributed sentence representations help reveal the semantic selectivity of different brain areas? We address these questions through the lens of neural encoding and decoding, fueled by the latest developments in natural language representation learning. We begin by evaluating a wide range of 12 DSMs on predicting and deciphering functional magnetic resonance imaging (fMRI) data from humans reading sentences. Most models are found to deliver high accuracy in the left middle temporal gyrus (LMTG) and left occipital complex (LOC). Notably, encoders trained with Transformer-based DSMs consistently outperform the other unsupervised structured models and all the unstructured baselines. We then design probing and ablation tasks to explain what matters for the DSMs to perform well in predicting and deciphering brain activity. We find that the features that significantly influence a DSM's performance are highly diverse across different ROIs and are not uniform across models. We also illustrate DSMs' selectivity to different concept categories and divide the cortical semantic system into topic-labeled parcels.
Our results corroborate and extend previous findings on the relation between DSMs and neural activation patterns, and contribute to building solid brain-machine interfaces with deep neural network representations.
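The neural encoding approach the abstract describes can be illustrated with a minimal sketch (not the authors' code): a voxelwise encoding model maps sentence embeddings from a DSM to fMRI responses with regularized linear regression, and accuracy is measured as the per-voxel correlation between predicted and observed responses on held-out sentences. All shapes, the ridge penalty, and the random stand-in data below are assumptions for illustration.

```python
import numpy as np

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X^T X + alpha I)^{-1} X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

def encoding_accuracy(X_train, Y_train, X_test, Y_test, alpha=1.0):
    """Per-voxel Pearson correlation between predicted and observed responses."""
    W = fit_ridge(X_train, Y_train, alpha)
    pred = X_test @ W
    pred_c = pred - pred.mean(axis=0)
    obs_c = Y_test - Y_test.mean(axis=0)
    num = (pred_c * obs_c).sum(axis=0)
    den = np.sqrt((pred_c ** 2).sum(axis=0) * (obs_c ** 2).sum(axis=0)) + 1e-12
    return num / den

# Toy usage: random data standing in for DSM sentence embeddings and fMRI voxels.
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 16))        # 80 sentences, 16-dim embeddings
W_true = rng.standard_normal((16, 50))   # simulated linear response of 50 voxels
Y = X @ W_true + 0.1 * rng.standard_normal((80, 50))
r = encoding_accuracy(X[:60], Y[:60], X[60:], Y[60:])
print(r.shape)  # one correlation score per voxel
```

In practice such studies cross-validate the ridge penalty per voxel and use real DSM features; this sketch only shows the fit-predict-correlate skeleton.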

Funding projects: Natural Science Foundation of China [61906189]; Beijing Municipal Science and Technology Project [Z181100008918017]; Beijing Advanced Innovation Center for Language Resources; Beijing Academy of Artificial Intelligence [BAAI2019QN0504]
WOS research areas: Computer Science; Engineering
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS accession number: WOS:000616310400011
Funding organizations: Natural Science Foundation of China; Beijing Municipal Science and Technology Project; Beijing Advanced Innovation Center for Language Resources; Beijing Academy of Artificial Intelligence
Document type: Journal article
Source URL: http://ir.ia.ac.cn/handle/173211/40572
Collection: National Laboratory of Pattern Recognition — Natural Language Processing
Affiliations: 1. National Laboratory of Pattern Recognition, CASIA, Beijing, China
2.University of Chinese Academy of Sciences, Beijing, China
3.CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing, China
Recommended citation formats
GB/T 7714
Jingyuan Sun, Shaonan Wang, Jiajun Zhang, et al. Neural Encoding and Decoding With Distributed Sentence Representations[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 0(0): 0.
APA: Jingyuan Sun, Shaonan Wang, Jiajun Zhang, & Chengqing Zong. (2020). Neural Encoding and Decoding With Distributed Sentence Representations. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 0(0), 0.
MLA: Jingyuan Sun, et al. "Neural Encoding and Decoding With Distributed Sentence Representations". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 0.0 (2020): 0.