Oct, 2022
The Asymmetric Maximum Margin Bias of Quasi-Homogeneous Neural Networks
Daniel Kunin, Atsushi Yamamura, Chao Ma, Surya Ganguli
TL;DR
This work studies the maximum-margin bias of quasi-homogeneous neural networks trained with gradient flow on an exponential loss. It finds that gradient flow implicitly favors a subset of the parameters, which can reduce the robustness of quasi-homogeneous models; it analyzes the mechanism behind this model simplification and reveals Neural Collapse as a universal phenomenon.
Abstract
In this work, we explore the maximum-margin bias of quasi-homogeneous neural networks trained with gradient flow on an exponential loss and …
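For context on the setting named in the title and TL;DR, the sketch below writes out the standard objects involved: gradient flow on an exponential loss and the contrast between homogeneous and quasi-homogeneous scaling. The notation (data $\{(x_i, y_i)\}$, parameter groups $\theta_k$, exponents $q_k$, degree $L$) is ours, and the quasi-homogeneous form shown is an illustrative reading rather than the paper's exact definition.

Gradient flow on the exponential loss over data $\{(x_i, y_i)\}_{i=1}^{n}$ with labels $y_i \in \{\pm 1\}$:
\[
\dot{\theta}(t) = -\nabla_\theta \mathcal{L}(\theta(t)),
\qquad
\mathcal{L}(\theta) = \sum_{i=1}^{n} \exp\!\big(-y_i\, f(\theta; x_i)\big).
\]
A homogeneous network of degree $L$ scales all parameters uniformly: for all $\alpha > 0$,
\[
f(\alpha\,\theta; x) = \alpha^{L} f(\theta; x),
\]
whereas a quasi-homogeneous model (illustrative form, assumed here) allows each parameter group $\theta_k$ its own exponent $q_k \ge 0$:
\[
f(\alpha^{q_1}\theta_1, \dots, \alpha^{q_p}\theta_p; x) = \alpha\, f(\theta; x).
\]
The asymmetry among the exponents $q_k$ is what allows the resulting maximum-margin bias to favor some parameters over others, as described in the TL;DR above.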