BriefGPT.xyz
May, 2021
Selective Knowledge Distillation for Neural Machine Translation
Fusheng Wang, Jianhao Yan, Fandong Meng, Jie Zhou
TL;DR
This work presents a new analysis of training samples in neural machine translation and knowledge distillation, and proposes batch-level and global-level sample selection strategies to improve knowledge distillation. Experiments show that the method raises BLEU scores on the WMT'14 English->German and WMT'19 Chinese->English machine translation tasks.
Abstract
Neural machine translation (NMT) models achieve state-of-the-art performance on many translation benchmarks. As an active research field in NMT, knowledge distillation is widely applied to enhance the model's performance.