BriefGPT.xyz
Jan, 2018
Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling
Tao Shen, Tianyi Zhou, Guodong Long, Jing Jiang, Sen Wang...
TL;DR
This paper proposes a novel Reinforced Self-Attention (ReSA) model that combines soft and hard attention. By introducing Reinforced Sequence Sampling (RSS) together with a reward signal, it extracts the sparse dependencies in long sentences both effectively and efficiently, and achieves state-of-the-art performance on the SNLI and SICK datasets.
Abstract
Many natural language processing tasks solely rely on sparse dependencies between a few tokens in a sentence. Soft attention mechanisms show promising performance in modeling local/global dependencies by soft pro…