Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning
Kang, Xiaomian1,2; Zhao, Yang1,2; Zhang, Jiajun1,2,3; Zong, Chengqing1,2,4
2020-11
Conference date: November 16–20, 2020
Conference venue: Online
Keywords: Document-level NMT; Neural Machine Translation; Reinforcement Learning; Context Selection
Abstract

Document-level neural machine translation has yielded attractive improvements. However, the majority of existing methods roughly use all context sentences within a fixed scope, neglecting the fact that different source sentences need different amounts of context. To address this problem, we propose an effective approach to select dynamic context so that the document-level translation model can exploit the more useful selected context sentences to produce better translations. Specifically, we introduce a selection module, independent of the translation module, to score each candidate context sentence. We then propose two strategies to explicitly select a variable number of context sentences and feed them into the translation module. The two modules are trained end-to-end via reinforcement learning, with a novel reward that encourages the selection and utilization of dynamic context sentences. Experiments demonstrate that our approach selects adaptive context sentences for different source sentences and significantly improves the performance of document-level translation methods.
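
The sketch below illustrates, in PyTorch, the kind of selection-plus-policy-gradient loop the abstract describes: a selection module scores each candidate context sentence against the current source sentence, a variable-size subset is sampled, and a REINFORCE-style loss is weighted by a reward. All names here (ContextSelector, select_and_reinforce, reward_fn) are illustrative assumptions; the paper's two selection strategies and its specific reward are not reproduced.

```python
# Minimal sketch (not the authors' code) of dynamic context selection trained
# with REINFORCE. Module and function names are illustrative assumptions.
import torch
import torch.nn as nn


class ContextSelector(nn.Module):
    """Scores each candidate context sentence against the current source sentence."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, src_repr: torch.Tensor, ctx_reprs: torch.Tensor) -> torch.Tensor:
        # src_repr: (hidden_dim,) embedding of the current source sentence
        # ctx_reprs: (num_candidates, hidden_dim) embeddings of candidate context sentences
        expanded = src_repr.unsqueeze(0).expand_as(ctx_reprs)
        logits = self.scorer(torch.cat([expanded, ctx_reprs], dim=-1)).squeeze(-1)
        return torch.sigmoid(logits)  # per-candidate selection probability


def select_and_reinforce(selector, src_repr, ctx_reprs, reward_fn):
    """Sample a variable-size subset of context sentences and return a REINFORCE loss."""
    probs = selector(src_repr, ctx_reprs)
    dist = torch.distributions.Bernoulli(probs)
    mask = dist.sample()                # 1 = keep this candidate context sentence
    selected = ctx_reprs[mask.bool()]   # dynamic number of selected sentences
    reward = reward_fn(selected)        # scalar, e.g. translation-quality gain minus a length cost
    loss = -(dist.log_prob(mask).sum() * reward)
    return selected, loss


if __name__ == "__main__":
    torch.manual_seed(0)
    selector = ContextSelector(hidden_dim=8)
    src = torch.randn(8)
    ctx = torch.randn(5, 8)  # five candidate context sentences
    # Hypothetical reward: favor small contexts, standing in for a quality-based gain.
    selected, loss = select_and_reinforce(selector, src, ctx, lambda s: 1.0 - 0.1 * s.shape[0])
    loss.backward()
    print(selected.shape, loss.item())
```

In practice, reward_fn would compare the translation produced with the selected context against a baseline such as the context-free translation, which is roughly the role the paper's novel reward plays.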

Language: English
Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/44305
Collection: National Laboratory of Pattern Recognition - Natural Language Processing
Author affiliations:
1. National Laboratory of Pattern Recognition, Institute of Automation, CAS, Beijing, China
2. Beijing Academy of Artificial Intelligence, Beijing, China
3. CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing, China
4. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
Recommended citation (GB/T 7714):
Kang, Xiaomian, Zhao, Yang, Zhang, Jiajun, et al. Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning[C]. Online, November 16–20, 2020.