BriefGPT.xyz
Aug, 2020
Improve Generalization and Robustness of Neural Networks via Weight Scale Shifting Invariant Regularizations
Ziquan Liu, Yufei Cui, Antoni B. Chan
TL;DR
This paper introduces an improved regularizer for neural networks that accounts not only for weight decay but also for the effect of weight-scale shifting on regularization, effectively penalizing the network's intrinsic norm and improving generalization performance and adversarial robustness.
Abstract
Using weight decay to penalize the L2 norms of weights in neural networks has been a standard training practice to regularize the complexity of networks. In this paper, we show that a family of regularizers, incl…
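The weight-scale shifting the title refers to can be illustrated concretely: in a ReLU network, multiplying one layer's weights by a constant c > 0 and dividing the next layer's by c leaves the network function unchanged, yet changes the L2 (weight-decay) penalty. A minimal NumPy sketch of this effect follows; the product-of-norms penalty shown as a scale-invariant alternative is an illustrative assumption, not necessarily the exact regularizer proposed in the paper.

```python
import numpy as np

# Two-layer ReLU network: f(x) = W2 @ relu(W1 @ x).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

relu = lambda z: np.maximum(z, 0.0)
f = lambda A, B: B @ relu(A @ x)

# Weight-scale shifting: scale one layer up, the next one down.
c = 10.0
W1s, W2s = c * W1, W2 / c

# The output is unchanged (ReLU is positively homogeneous)...
assert np.allclose(f(W1, W2), f(W1s, W2s))

# ...but the standard weight-decay penalty changes:
l2 = lambda A, B: np.sum(A**2) + np.sum(B**2)
print(l2(W1, W2), l2(W1s, W2s))  # the two values differ

# An illustrative scale-invariant alternative: product of layer norms.
prod = lambda A, B: np.linalg.norm(A) * np.linalg.norm(B)
assert np.isclose(prod(W1, W2), prod(W1s, W2s))
```

Because weight decay is not invariant to this rescaling, it can penalize two networks that compute the same function differently, which is the gap the paper's scale-shift-invariant regularizers aim to close.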