Apr, 2019
Consistency by Agreement in Zero-shot Neural Machine Translation
Maruan Al-Shedivat, Ankur P. Parikh
TL;DR
By reformulating multilingual translation as probabilistic inference, the paper defines the notion of zero-shot consistency and introduces an agreement-based training method that encourages the model to produce equivalent translations of parallel sentences into auxiliary languages. Evaluated on several public zero-shot translation benchmarks, NMT models trained with the agreement objective typically gain 2-3 BLEU on zero-shot translation directions without degrading performance on supervised translation.
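As a rough illustration of the agreement idea described above (not the authors' implementation; the `model.translate` / `model.log_prob` interface below is a hypothetical stand-in), the sketch adds a term that rewards the model when its translations of both sides of a parallel pair into an auxiliary language agree:

```python
# Hedged sketch of agreement-based training for zero-shot NMT.
# NOTE: `model.translate` and `model.log_prob` are assumed interfaces,
# not the paper's actual code; this only illustrates the agreement term.
import torch

def agreement_loss(model, x, y, aux_lang):
    """Agreement term for a parallel pair (x, y), e.g. En-Fr, and an
    auxiliary language aux_lang, e.g. De, giving two zero-shot directions.

    Assumed interface:
      model.translate(src_tokens, tgt_lang)            -> decoded token ids
      model.log_prob(src_tokens, tgt_lang, tgt_tokens) -> scalar log-likelihood
    """
    # Decode each side into the auxiliary language; no gradients flow
    # through the discrete decoding step.
    with torch.no_grad():
        z_from_x = model.translate(x, tgt_lang=aux_lang)
        z_from_y = model.translate(y, tgt_lang=aux_lang)

    # Each side should assign high likelihood to the auxiliary-language
    # translation produced from the other side, i.e. the two should agree.
    return -(model.log_prob(y, aux_lang, z_from_x) +
             model.log_prob(x, aux_lang, z_from_y))

# During training, this term would be added (with some weight) to the usual
# supervised cross-entropy computed on the parallel pair itself.
```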
Abstract
Generalization and reliability of multilingual translation often highly depend on the amount of available parallel data for each language pair of interest. In this paper, we focus on zero-shot generalization---a challenging setup that tests models on translation directions they were not trained on.