May 2016
A Coverage Embedding Model for Neural Machine Translation
Haitao Mi, Baskaran Sankaran, Zhiguo Wang, Abe Ittycheriah
TL;DR
This paper improves the attention mechanism of a neural machine translation system by adding an explicit coverage embedding model, addressing the problems of repeated and dropped translations. Experiments on a large-scale Chinese-to-English translation task show that the model significantly improves translation quality over a large-vocabulary NMT system across different test sets.
Abstract
In this paper, we enhance the attention-based neural machine translation by adding an explicit coverage embedding model to alleviate issues of repeating and dropping translations in NMT. For each source word, our …
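The coverage idea described above can be illustrated with a minimal sketch: each source word keeps a coverage embedding that starts "full" and shrinks as attention consumes it. The names (`coverage_step`, `dim`) and the simple multiplicative update are illustrative assumptions, not the paper's actual neural-network update rule.

```python
import numpy as np

def coverage_step(coverage, attention):
    """Shrink each source word's coverage embedding by its attention weight.

    coverage:  (src_len, dim) array, one embedding per source word
    attention: (src_len,) attention weights summing to 1

    Heavily attended words lose coverage, so later decoding steps are
    discouraged from attending to them again (reducing repetition), while
    words with remaining coverage are favored (reducing drops).
    """
    return coverage * (1.0 - attention)[:, None]

src_len, dim = 4, 3
coverage = np.ones((src_len, dim))            # start from full coverage
attention = np.array([0.7, 0.1, 0.1, 0.1])    # one decoding step's weights
coverage = coverage_step(coverage, attention)
```

In a real model the coverage embedding would be fed back into the attention score computation at each step; this sketch only shows the bookkeeping side of the update.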