Jul, 2017
Orthogonal Recurrent Neural Networks with Scaled Cayley Transform
Kyle Helfrich, Devin Willmott, Qiang Ye
TL;DR
This paper proposes an update scheme that parametrizes the Cayley transform with a skew-symmetric matrix to maintain an orthogonal recurrent weight matrix while overcoming the restriction to matrices without negative-one eigenvalues. Across several experiments, the method outperforms other unitary RNNs while requiring fewer trainable parameters.
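The scaled Cayley parameterization summarized above can be sketched in a few lines: an orthogonal matrix is formed as W = (I + A)^{-1}(I - A)D, where A is skew-symmetric and D is a diagonal matrix of ±1 entries. This is a minimal NumPy sketch for illustration, not the authors' implementation; the function name and the random test matrix are my own.

```python
import numpy as np

def scaled_cayley(A, d):
    """Build an orthogonal matrix via the scaled Cayley transform.

    A : skew-symmetric matrix (A = -A^T)
    d : vector of +/-1 diagonal scaling entries
    """
    n = A.shape[0]
    I = np.eye(n)
    # W = (I + A)^{-1} (I - A) D; solve instead of explicit inverse
    return np.linalg.solve(I + A, I - A) @ np.diag(d)

# Illustrative check: any skew-symmetric A and +/-1 vector d yield an orthogonal W.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                       # skew-symmetric by construction
d = np.array([1.0, -1.0, 1.0, -1.0])
W = scaled_cayley(A, d)
print(np.allclose(W.T @ W, np.eye(4)))  # True: W is orthogonal
```

Since A is skew-symmetric, I + A is always invertible, so the parameterization is defined for every A; the diagonal scaling D is what lifts the classical Cayley transform's exclusion of matrices with -1 eigenvalues.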
Abstract
Recurrent neural networks (RNNs) are designed to handle sequential data but suffer from vanishing or exploding gradients. Recent work on unitary recurrent neural networks (uRNNs) has been used to address this issue …