December 2020
Revisiting Robust Neural Machine Translation: A Transformer Case Study
Peyman Passban, Puneeth S. M. Saladi, Qun Liu
TL;DR
This paper introduces a data-driven technique called TAFT, which uses a fine-tuning strategy to train the Transformer model on noise-injected data, and proposes two novel techniques, CD and DCD, that help the model handle noise better, ultimately achieving higher robustness on English-German translation corpora.
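
To make the general idea of noise-injected fine-tuning concrete, the sketch below shows one way noisy-source / clean-target pairs could be generated for such a fine-tuning pass. The specific noise operations (drop, swap, duplicate) and the noise rate are assumptions for illustration only, not the paper's TAFT, CD, or DCD procedures.

    # Illustrative sketch only: character-level noise injection of the kind often
    # used to fine-tune NMT models for robustness. The noise types and rate here
    # are assumptions, not the paper's exact TAFT, CD, or DCD methods.
    import random

    def add_char_noise(sentence, noise_prob=0.1, seed=0):
        """Return a copy of `sentence` with random character-level noise."""
        rng = random.Random(seed)
        chars = list(sentence)
        out = []
        i = 0
        while i < len(chars):
            if rng.random() < noise_prob:
                op = rng.choice(["drop", "swap", "dup"])
                if op == "drop":
                    i += 1                           # drop this character
                    continue
                if op == "swap" and i + 1 < len(chars):
                    out += [chars[i + 1], chars[i]]  # transpose with the next character
                    i += 2
                    continue
                out.append(chars[i] * 2)             # duplicate the character
                i += 1
                continue
            out.append(chars[i])
            i += 1
        return "".join(out)

    # Pair each noisy source with its original (clean) target for a fine-tuning pass.
    clean_pairs = [("the cat sat on the mat", "die Katze sass auf der Matte")]
    noisy_pairs = [(add_char_noise(src), tgt) for src, tgt in clean_pairs]
    print(noisy_pairs)

In this kind of setup, a Transformer trained on clean data is typically fine-tuned on a mixture of clean and perturbed source sentences so that it learns to produce the same target despite input noise.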
Abstract
Transformers (Vaswani et al., 2017) have brought a remarkable improvement in the performance of neural machine translation (NMT) systems, but they could be surprisingly vulnerable to …