BriefGPT.xyz
Nov, 2015
Symmetry-invariant optimization in deep networks
Vijay Badrinarayanan, Bamdev Mishra, Roberto Cipolla
TL;DR
This paper proposes two symmetry-invariant gradient-based weight updates; learning with these updates improves test performance without sacrificing the computational efficiency of the weight update. Experiments on the MNIST dataset provide evidence that the updates are effective, and training results are also shown for an image segmentation problem using these weight updates.
Abstract
Recent works have highlighted scale invariance or symmetry that is present in the weight space of a typical deep network and the adverse effect that it has on the Euclidean gradient based stochastic gradient descent …
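The scale symmetry the abstract refers to can be seen concretely: in a ReLU network, multiplying one layer's weights by a positive scalar and dividing the next layer's weights by the same scalar leaves the network's function unchanged, so many weight settings represent the same model. A minimal NumPy sketch (not code from the paper, only an illustration of the symmetry) follows:

```python
import numpy as np

# Illustration of the rescaling symmetry in a two-layer ReLU network.
# Since relu(a * z) = a * relu(z) for any a > 0, scaling W1 by a and
# W2 by 1/a leaves the network output unchanged.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # first-layer weights
W2 = rng.standard_normal((2, 4))   # second-layer weights
x = rng.standard_normal(3)         # an arbitrary input

def forward(W1, W2, x):
    # Two-layer network with a ReLU hidden layer
    return W2 @ np.maximum(W1 @ x, 0.0)

a = 7.5  # any positive scale factor
y = forward(W1, W2, x)
y_scaled = forward(a * W1, W2 / a, x)
print(np.allclose(y, y_scaled))  # True: both parameterizations agree
```

The Euclidean gradient, by contrast, is not invariant under this rescaling (the gradient with respect to `a * W1` shrinks by `1/a`), which is the adverse effect on plain SGD that motivates the symmetry-invariant updates.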