March 2016
Recurrent Dropout without Memory Loss
Stanislau Semeniuta, Aliaksei Severyn, Erhardt Barth
TL;DR
This paper proposes a new regularization method for recurrent neural networks that drops neurons directly in the recurrent connections without causing loss of long-term memory. Experiments show that the method yields consistent improvements on natural language processing benchmarks, even when combined with conventional feed-forward dropout.
Abstract
This paper presents a novel approach to recurrent neural network (RNN) regularization. Differently from the widely adopted dropout method, which is applied to forward connections of feed-forward architectures or RNNs, we propose to drop neurons directly in recurrent connections in a way that does not cause loss of long-term memory. Experiments on NLP benchmarks show consistent improvements even when the approach is combined with conventional feed-forward dropout.
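To make the idea concrete, here is a minimal NumPy sketch (not the authors' code) of one LSTM step in which dropout is applied only to the candidate cell update g, so the f * c_prev path that carries long-term memory is never masked. The gate packing, function names, and the per-step mask are illustrative assumptions for this sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dropout(x, rate, rng, train=True):
    # Inverted dropout: zero each element with probability `rate`,
    # rescale survivors so the expected activation is unchanged.
    if not train or rate == 0.0:
        return x
    mask = rng.binomial(1, 1.0 - rate, size=x.shape)
    return x * mask / (1.0 - rate)

def lstm_step(x, h_prev, c_prev, W, U, b, drop_rate, rng, train=True):
    """One LSTM step with recurrent dropout on the cell update only.

    Gate pre-activations are packed as [i, f, o, g] along the last axis.
    Dropping g (rather than h_prev or c_prev) leaves the f * c_prev
    path intact, so the dropout mask never erases stored memory.
    """
    z = x @ W + h_prev @ U + b
    i, f, o, g = np.split(z, 4, axis=-1)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c_prev + i * dropout(g, drop_rate, rng, train)
    h = o * np.tanh(c)
    return h, c

# Hypothetical usage: run a short sequence through the cell.
rng = np.random.default_rng(0)
n_in, n_hid = 8, 16
W = rng.normal(0.0, 0.1, (n_in, 4 * n_hid))
U = rng.normal(0.0, 0.1, (n_hid, 4 * n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for t in range(20):
    x = rng.normal(size=n_in)
    h, c = lstm_step(x, h, c, W, U, b, drop_rate=0.25, rng=rng)
```

Since only the additive update into the cell state is masked, earlier contents of c decay solely through the forget gate, which is what distinguishes this scheme from naively dropping the hidden state or the cell state itself.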