BriefGPT.xyz
Jul, 2019
Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation
Aizhan Imankulova, Raj Dabre, Atsushi Fujita, Kenji Imamura
TL;DR
This paper proposes a novel multilingual multistage fine-tuning approach that combines domain adaptation, multilingualism, and back-translation, using transfer learning on out-of-domain data to help improve the quality of low-resource Japanese–Russian neural machine translation.
Abstract
This paper proposes a novel multilingual multistage fine-tuning approach for low-resource neural machine translation (NMT), taking a challenging Japanese–Russian pair as a case study.