BriefGPT.xyz
Jun, 2020
GradAug: A New Regularization Method for Deep Neural Networks
Taojiannan Yang, Sijie Zhu, Chen Chen
TL;DR
We propose a new regularization method (GradAug) that samples sub-networks of different widths from the full network and regularizes them with randomly transformed versions of the training samples, introducing self-guided disturbances during training. GradAug generalizes well and achieves strong results on image classification, object detection, and instance segmentation.
Abstract
We propose a new regularization method to alleviate over-fitting in deep neural networks. The key idea is utilizing randomly transformed training samples to regularize a set of sub-networks, which are originated …
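To make the idea concrete, here is a minimal, hypothetical sketch of a GradAug-style loss, not the authors' implementation: the full network is trained on the original sample, while width-sampled sub-networks (here, a prefix of the hidden units of a tiny two-layer net) are trained on a randomly transformed copy of the same sample, and all losses are summed. The network, the width-sampling scheme, and the noise transform are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2, width):
    """Logits of a sub-network keeping only the first `width` fraction
    of hidden units (a simple stand-in for width sampling)."""
    k = max(1, int(round(W1.shape[0] * width)))
    h = np.maximum(0.0, W1[:k] @ x)        # ReLU hidden layer, truncated
    return W2[:, :k] @ h

def cross_entropy(logits, y):
    z = logits - logits.max()              # numerically stable log-softmax
    logp = z - np.log(np.exp(z).sum())
    return -logp[y]

def gradaug_loss(x, y, W1, W2, widths, transform):
    """Full-network loss on the original sample plus sub-network losses
    on a randomly transformed sample (the self-guided regularization)."""
    loss = cross_entropy(forward(x, W1, W2, 1.0), y)
    for w in widths:
        loss += cross_entropy(forward(transform(x), W1, W2, w), y)
    return loss

# Toy usage: Gaussian noise stands in for a random image transform.
x = rng.normal(size=4)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(3, 8))
noise = lambda v: v + 0.01 * rng.normal(size=v.shape)
total = gradaug_loss(x, 0, W1, W2, widths=[0.5, 0.75], transform=noise)
```

In the paper's actual setting the transforms are standard data augmentations (e.g. random crops and rotations) and the sub-networks are width-sampled slices of a deep CNN; this sketch only illustrates the shape of the combined objective.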