June 2023
Hidden symmetries of ReLU networks
J. Elisenda Grigsby, Kathryn Lindsey, David Rolnick
TL;DR
We study the parameter space of feedforward ReLU neural network architectures, prove that for any architecture with no layer narrower than the input layer there exist parameter settings with no hidden symmetries, and experimentally approximate the functional dimension of different network architectures at initialization.
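For context, the known ("non-hidden") symmetries of a ReLU network come from the positive homogeneity of the activation: for any c > 0,

ReLU(c·z) = c·ReLU(z),

so multiplying a neuron's incoming weights and bias by c while dividing its outgoing weights by c leaves the realized function unchanged, as does permuting the neurons within a layer. Hidden symmetries are any parameter redundancies beyond these.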
Abstract
The parameter space for any fixed architecture of feedforward ReLU neural networks serves as a proxy during training for the associated class of functions, but how faithful is this representation? It is known that many different parameter settings θ can determine the same function f.
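As a rough illustration of the experiment mentioned in the TL;DR, one can approximate a network's functional dimension at initialization as the numerical rank of the Jacobian of the map from parameters to outputs on a sample batch. The JAX sketch below is not the authors' code; the layer widths, batch size, and rank tolerance are illustrative assumptions.

```python
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, widths):
    # Random Gaussian initialization for a fully connected ReLU net.
    params = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        key, w_key, b_key = jax.random.split(key, 3)
        params.append((jax.random.normal(w_key, (n_out, n_in)) / jnp.sqrt(n_in),
                       jax.random.normal(b_key, (n_out,))))
    return params

def relu_net(params, x):
    # Feedforward pass with ReLU hidden activations, linear output layer.
    for W, b in params[:-1]:
        x = jax.nn.relu(W @ x + b)
    W, b = params[-1]
    return W @ x + b

def functional_dimension(params, xs, tol=1e-6):
    flat, unravel = ravel_pytree(params)

    def eval_map(theta):
        # Map a flat parameter vector to the stacked outputs on the batch.
        p = unravel(theta)
        return jnp.concatenate([relu_net(p, x) for x in xs])

    # Numerical rank of d(outputs)/d(parameters); the batch must be large
    # enough that the rank is not capped by the number of output values.
    J = jax.jacobian(eval_map)(flat)
    s = jnp.linalg.svd(J, compute_uv=False)
    return int(jnp.sum(s > tol * s[0]))

key = jax.random.PRNGKey(0)
widths = [4, 8, 8, 1]  # input dim 4, two hidden layers of width 8, scalar output
params = init_params(key, widths)
xs = list(jax.random.normal(jax.random.PRNGKey(1), (128, widths[0])))
print(functional_dimension(params, xs))
```

With 121 parameters and 128 scalar outputs here, the rank can in principle reach the full parameter count; redundancy from symmetries shows up as a rank strictly below it.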