Ian D. Jordan, Piotr Aleksander Sokol, Il Memming Park
TL;DR: Using continuous-time analysis, we gain an intuitive understanding of the inner workings of gated recurrent units (GRUs). We uncover several unexpected dynamical features, and we were unable to train GRU networks to produce continuous attractors, which are hypothesized to exist in biological neural networks.
Abstract
Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their incredible success in natural language, speech, and video processing, little is understood about the specific …
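For orientation, the sketch below writes out the standard discrete-time GRU update (Cho et al., 2014) and its forward-Euler reading as an ordinary differential equation. This is only a minimal illustration of the continuous-time viewpoint mentioned in the TL;DR; the notation ($W_\ast$, $U_\ast$, $b_\ast$ for input, recurrent, and bias parameters) is chosen here for exposition, and the exact formulation analyzed in the paper may differ.

\begin{align*}
z_t &= \sigma\!\left(W_z x_t + U_z h_{t-1} + b_z\right) && \text{update gate} \\
r_t &= \sigma\!\left(W_r x_t + U_r h_{t-1} + b_r\right) && \text{reset gate} \\
\tilde{h}_t &= \tanh\!\left(W_h x_t + U_h \left(r_t \odot h_{t-1}\right) + b_h\right) && \text{candidate state} \\
h_t &= z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t \\
    &= h_{t-1} + (1 - z_t) \odot \left(\tilde{h}_t - h_{t-1}\right),
\end{align*}

which can be read as a forward-Euler step (with unit step size) of the flow

\begin{equation*}
\dot{h} = \left(1 - z(h, x)\right) \odot \left(\tilde{h}(h, x) - h\right),
\end{equation*}

so the hidden state relaxes toward the candidate state at a rate controlled by the update gate. The fixed points and attractors of this flow are the kind of dynamical objects a continuous-time analysis of GRUs examines; gate conventions vary between implementations, which swaps the roles of $z$ and $1 - z$ but not the qualitative picture.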