Jul, 2017
Bayesian Sparsification of Recurrent Neural Networks
Ekaterina Lobacheva, Nadezhda Chirkova, Dmitry Vetrov
TL;DR
This paper sparsifies recurrent neural networks by applying Sparse Variational Dropout together with Binary Variational Dropout, achieving high sparsity with little loss of quality on sentiment analysis and language modeling tasks.
Abstract
Recurrent neural networks show state-of-the-art results in many text analysis tasks but often require a lot of memory to store their weights. Recently proposed Sparse Variational Dropout eliminates the majority of the weights in a feed-forward neural network without significant loss of quality.
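
The core mechanism here is Sparse Variational Dropout (Molchanov et al., 2017): each weight gets a learned mean and variance, and weights whose implied dropout rate grows unbounded are pruned. Below is a minimal, hypothetical sketch of that mechanism for a single fully connected layer in PyTorch; the paper applies the same per-weight treatment to the weight matrices of an RNN and combines it with Binary Variational Dropout for the recurrent connections. The class name, initialization values, and the log-alpha pruning threshold of 3 are illustrative choices, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseVDLinear(nn.Module):
    """Sketch of a linear layer with Sparse Variational Dropout.

    Each weight w_ij ~ N(theta_ij, sigma_ij^2). Weights whose dropout rate
    alpha = sigma^2 / theta^2 becomes large carry no signal and are pruned.
    """

    def __init__(self, in_features, out_features, threshold=3.0):
        super().__init__()
        self.theta = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        self.log_sigma2 = nn.Parameter(
            torch.full((out_features, in_features), -10.0)
        )
        self.threshold = threshold  # prune where log(alpha) > threshold

    @property
    def log_alpha(self):
        # log(alpha) = log(sigma^2) - log(theta^2)
        return self.log_sigma2 - torch.log(self.theta ** 2 + 1e-8)

    def forward(self, x):
        if self.training:
            # Local reparameterization: sample pre-activations, not weights.
            mu = F.linear(x, self.theta)
            var = F.linear(x ** 2, self.log_sigma2.exp())
            return mu + var.clamp_min(1e-8).sqrt() * torch.randn_like(mu)
        # At test time, use mean weights with pruned entries zeroed out.
        mask = (self.log_alpha < self.threshold).float()
        return F.linear(x, self.theta * mask)

    def kl(self):
        # Approximation of the KL divergence term from the Sparse VD paper.
        k1, k2, k3 = 0.63576, 1.87320, 1.48695
        la = self.log_alpha
        neg_kl = k1 * torch.sigmoid(k2 + k3 * la) - 0.5 * F.softplus(-la) - k1
        return -neg_kl.sum()
```

In training, the layer's kl() term would be added to the task loss (typically scaled by the number of training examples), and the reported sparsity is the fraction of weights whose log-alpha exceeds the threshold at the end of training.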