BriefGPT.xyz
Jun, 2019
Deep ReLU Networks Have Surprisingly Few Activation Patterns
Boris Hanin, David Rolnick
TL;DR
This work shows a substantial gap between the theoretical expressivity of deep neural networks and what they achieve in practice: the number of activation patterns realized by a model is limited both at initialization and during training, which may prevent existing methods from attaining the full expressive power of deep networks.
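The claim can be illustrated empirically. Below is a minimal sketch (an assumption-laden toy, not the paper's actual experiment) that builds a small random ReLU network and counts how many distinct activation patterns it realizes over many sampled inputs; the count is tiny compared to the combinatorial ceiling of all possible on/off assignments of the units.

```python
import numpy as np

rng = np.random.default_rng(0)

def activation_pattern(x, weights, biases):
    """Return the on/off (1/0) pattern of every ReLU unit for input x."""
    pattern = []
    h = x
    for W, b in zip(weights, biases):
        z = W @ h + b
        pattern.extend((z > 0).astype(int).tolist())
        h = np.maximum(z, 0.0)
    return tuple(pattern)

# A random depth-3, width-8 network on 2-D inputs (hypothetical sizes).
widths = [2, 8, 8, 8]
weights = [rng.standard_normal((widths[i + 1], widths[i])) for i in range(3)]
biases = [rng.standard_normal(widths[i + 1]) for i in range(3)]

# Sample many inputs; the number of distinct patterns observed is far
# below the 2**24 conceivable on/off assignments of the 24 hidden units.
samples = rng.uniform(-1.0, 1.0, size=(20000, 2))
patterns = {activation_pattern(x, weights, biases) for x in samples}
print(len(patterns), "distinct patterns out of", 2 ** 24, "possible")
```

Varying the depth, width, or sampling range changes the exact count, but the qualitative picture the paper describes persists: observed patterns remain a vanishing fraction of the combinatorial maximum.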
Abstract
The success of deep networks has been attributed in part to their expressivity: per parameter, deep networks can approximate a richer class …