Inner Attention based Recurrent Neural Networks for Answer Selection
Wang Bingning; Liu Kang; Zhao Jun
2016
Conference Date | 2016
Conference Location | Germany
Keywords | Answer Selection; Question Answering; Deep Learning
Volume | Volume 1, Long Paper
Pages | 1288-1297
Abstract | Attention based recurrent neural networks have shown advantages in representing natural language sentences (Hermann et al., 2015; Rocktäschel et al., 2015; Tan et al., 2015). Based on recurrent neural networks (RNN), external attention information was added to hidden representations to get an attentive sentence representation. Despite the improvement over non-attentive models, the attention mechanism under RNN is not well studied. In this work, we analyze the deficiency of traditional attention based RNN models quantitatively and qualitatively. Then we present three new RNN models that add attention information before the RNN hidden representation, which shows an advantage in representing sentences and achieves new state-of-the-art results on the answer selection task. |
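The abstract's core idea (weighting the input words by their relevance to the question *before* the RNN reads them, rather than attending over hidden states afterwards) can be illustrated with a minimal NumPy sketch. All parameter names, the bilinear relevance score, and the sigmoid gating here are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

d, h, T = 8, 16, 5          # embedding size, hidden size, answer length

# Toy parameters, randomly initialized for illustration only.
M_q = rng.standard_normal((d, d)) * 0.1   # bilinear relevance matrix (assumed form)
W_xh = rng.standard_normal((h, d)) * 0.1  # input-to-hidden weights
W_hh = rng.standard_normal((h, h)) * 0.1  # hidden-to-hidden weights

q = rng.standard_normal(d)                # question sentence representation
X = rng.standard_normal((T, d))           # answer word embeddings, one row per word

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Inner attention: gate each input embedding by its relevance to the
# question BEFORE the recurrence, so attention shapes what the RNN sees.
gates = sigmoid(X @ M_q @ q)              # one scalar gate in (0, 1) per word
X_att = X * gates[:, None]                # attended inputs

# A plain tanh RNN over the attended inputs; the final state serves
# as the answer representation in this sketch.
s = np.zeros(h)
for x_t in X_att:
    s = np.tanh(W_xh @ x_t + W_hh @ s)

print(s.shape)  # → (16,)
```

In contrast, the "external" attention the abstract criticizes would run the RNN first and only then weight its hidden states by the question.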
Language | English
Content Type | Conference Paper
Source URL | [http://ir.ia.ac.cn/handle/173211/20182]
Collection | Institute of Automation_National Laboratory of Pattern Recognition_Natural Language Processing Group
Corresponding Author | Liu Kang
Affiliation | Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714) | Wang Bingning, Liu Kang, Zhao Jun. Inner Attention based Recurrent Neural Networks for Answer Selection[C]. In: . Germany, 2016.