BriefGPT.xyz
Oct, 2018
Recurrent Attention Unit
Guoqiang Zhong, Guohua Yue, Xiao Ling
TL;DR
This paper proposes a recurrent neural network model called the Recurrent Attention Unit (RAU), which integrates an attention mechanism into the internal structure of the GRU. By adding an attention gate, it improves the GRU's capacity for long-term memory; for sequence data, the model can adaptively select regions or positions of the sequence and pay more attention to the selected regions during learning. Experimental results show that RAU outperforms GRU and other baseline methods on image classification, sentiment classification, and language modeling.
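The summary above describes RAU as a GRU cell augmented with an extra attention gate. As a rough illustration of that idea, the sketch below adds a sigmoid "attention gate" to a standard NumPy GRU cell and uses it to re-weight the candidate state. The gate equations for the attention term are an assumption for illustration, not the paper's exact formulation; the class name `RAUCell` and all weight shapes are likewise hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RAUCell:
    """GRU cell extended with an attention gate (illustrative sketch).

    The attention gate `a` below is an assumed form: it is computed
    like the other gates and used to re-weight the candidate state,
    letting the cell emphasize selected parts of the input sequence.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        def W():
            # Each gate reads the concatenated [input, hidden] vector.
            return rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
        self.Wz, self.Wr, self.Wh, self.Wa = W(), W(), W(), W()

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)   # update gate (standard GRU)
        r = sigmoid(self.Wr @ xh)   # reset gate  (standard GRU)
        a = sigmoid(self.Wa @ xh)   # attention gate (assumed addition)
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        # The attention gate scales the candidate before the usual GRU blend.
        h_tilde = a * h_tilde
        return (1.0 - z) * h + z * h_tilde

# Usage: run a short random sequence through the cell.
cell = RAUCell(input_size=4, hidden_size=8)
h = np.zeros(8)
for x in np.random.default_rng(1).standard_normal((5, 4)):
    h = cell.step(x, h)
print(h.shape)  # (8,)
```

Because the hidden state starts at zero and the candidate is tanh-bounded, the state stays in (-1, 1) regardless of the gate values, which keeps the recurrence numerically stable.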
Abstract
The recurrent neural network (RNN) has been successfully applied to many sequence learning problems, such as handwriting recognition, image description, natural language processing and video motion analysis.