Feb, 2018
Decoding-History-Based Adaptive Control of Attention for Neural Machine Translation
Junyang Lin, Shuming Ma, Qi Su, Xu Sun
TL;DR
Proposes an adaptive attention-control mechanism based on decoding history, which addresses the frequent repetition caused by attention that lacks decoding-history information. The method achieves high translation accuracy on Chinese-English and English-Vietnamese translation tasks and generates translations with fewer repetitions.
Abstract
The attention-based sequence-to-sequence model has proved successful in neural machine translation (NMT). However, the attention without consideration of decoding history, which includes the past information in the decoder, …