A Compact and Language-Sensitive Multilingual Translation Method
Wang, Yining (1,3); Zhou, Long (1,3); Zhang, Jiajun (1,3); Zhai, Feifei (4); Xu, Jingfang (4); Zong, Chengqing (1,2,3)
Date: 2019-07
Conference Date: July 28 - August 2, 2019
Conference Venue: Florence, Italy
Abstract

Multilingual neural machine translation (Multi-NMT) with one encoder-decoder model has made remarkable progress due to its simple deployment. However, this multilingual translation paradigm does not make full use of language commonality and parameter sharing between encoder and decoder. Furthermore, in most cases this paradigm cannot outperform individual models trained on bilingual corpora. In this paper, we propose a compact and language-sensitive method for multilingual translation. To maximize parameter sharing, we first present a universal representor to replace both the encoder and the decoder. To make the representor sensitive to specific languages, we further introduce a language-sensitive embedding, attention, and discriminator that enhance model performance. We verify our method in various translation scenarios, including one-to-many, many-to-many, and zero-shot. Extensive experiments demonstrate that our proposed methods remarkably outperform strong standard multilingual translation systems on WMT and IWSLT datasets. Moreover, we find that our model is especially helpful in low-resource and zero-shot translation scenarios.

Content Type: Conference Paper
Source URL: http://ir.ia.ac.cn/handle/173211/39233
Collection: National Laboratory of Pattern Recognition_Natural Language Processing
Corresponding Author: Zhang, Jiajun
Author Affiliations:
1. National Laboratory of Pattern Recognition, CASIA, Beijing, China
2. CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing, China
3. University of Chinese Academy of Sciences, Beijing, China
4. Sogou Inc., Beijing, China
Recommended Citation
GB/T 7714
Wang, Yining, Zhou, Long, Zhang, Jiajun, et al. A Compact and Language-Sensitive Multilingual Translation Method[C]. Florence, Italy, July 28 - August 2, 2019.