Jul, 2022
DropNet: Reducing Neural Network Complexity via Iterative Pruning
John Tan Chong Min, Mehul Motani
TL;DR
DropNet simplifies deep neural networks by iteratively pruning the nodes/filters with the lowest activation values. Experiments show that up to 90% of nodes/filters can be removed without loss of accuracy, and DropNet is shown to perform similarly to a greedy algorithm.
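The iterative pruning idea in the TL;DR can be illustrated with a minimal sketch: repeatedly drop the fraction of hidden units whose mean absolute post-activation value is lowest. The toy network, data, and pruning rate below are assumptions for illustration, not the paper's actual architecture or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network: x -> ReLU(x @ W1), with a boolean mask
# marking which hidden units are still alive (hypothetical setup).
n_in, n_hidden = 8, 32
W1 = rng.normal(size=(n_in, n_hidden))
X = rng.normal(size=(100, n_in))
mask = np.ones(n_hidden, dtype=bool)

def hidden_activations(X, W1, mask):
    h = np.maximum(X @ W1, 0.0)  # ReLU post-activations
    return h * mask              # pruned units output zero

# Iteratively prune 20% of the *surviving* units per round,
# scoring each unit by its mean |activation| over the data.
for round_ in range(5):
    h = hidden_activations(X, W1, mask)
    score = np.abs(h).mean(axis=0)     # mean |activation| per unit
    score[~mask] = np.inf              # already-pruned units sort last
    k = max(1, int(0.2 * mask.sum()))  # 20% of surviving units
    drop = np.argsort(score)[:k]       # lowest-scoring surviving units
    mask[drop] = False
    print(f"round {round_}: {mask.sum()} units remain")
```

Starting from 32 units, five rounds at a 20% rate leave 12 units; the paper's reported result is that far more aggressive reduction (up to 90% of nodes/filters) can be reached without hurting accuracy.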
Abstract
Modern deep neural networks require a significant amount of computing time and power to train and deploy, which limits their usage on edge devices. Inspired by the iterative weight pruning in the Lottery Ticket Hypothesis, we propose …