February 2019
Parameter Efficient Training of Deep Convolutional Neural Networks by Dynamic Sparse Reparameterization
Hesham Mostafa, Xin Wang
TL;DR
The paper introduces a new dynamic sparse reparameterization method that trains deep convolutional neural networks more parameter-efficiently, achieving the best accuracy under a fixed parameter budget, and finds that exploring structural degrees of freedom during training improves network performance more effectively than adding extra parameters.
Abstract
Deep neural networks are typically highly over-parameterized, with pruning techniques able to remove a significant fraction of network parameters with little loss in accuracy. Recently, techniques based on dynamic …
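
To make the idea in the TL;DR concrete, below is a minimal NumPy sketch of one dynamic sparse reparameterization step, assuming magnitude-based pruning followed by random regrowth under a fixed non-zero-parameter budget. The function name `reallocate_parameters`, the `prune_fraction` argument, and the random regrowth rule are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def reallocate_parameters(weights, mask, prune_fraction=0.2, rng=None):
    # Hypothetical helper illustrating one reallocation step under a fixed
    # parameter budget: prune the smallest-magnitude active weights, then
    # regrow an equal number at randomly chosen inactive positions.
    rng = rng if rng is not None else np.random.default_rng()
    flat_w = weights.ravel().copy()
    flat_m = mask.ravel().copy()

    active = np.flatnonzero(flat_m)            # indices of non-zero (trainable) weights
    inactive = np.flatnonzero(flat_m == 0)     # zeroed positions available for regrowth
    n_move = min(int(prune_fraction * active.size), inactive.size)
    if n_move == 0:
        return weights, mask

    # Prune: deactivate the active weights with the smallest magnitude.
    smallest = active[np.argsort(np.abs(flat_w[active]))[:n_move]]
    flat_m[smallest] = 0
    flat_w[smallest] = 0.0

    # Regrow: activate an equal number of previously inactive positions,
    # left at zero so the new connections are learned from scratch.
    regrow = rng.choice(inactive, size=n_move, replace=False)
    flat_m[regrow] = 1

    return flat_w.reshape(weights.shape), flat_m.reshape(mask.shape)
```

In a training loop, a step like this would typically be applied to each sparse layer's weight tensor every few hundred updates, with the mask re-applied to the weights after every optimizer step so the total number of non-zero parameters never exceeds the budget.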