January 2020
Towards Stabilizing Batch Statistics in Backward Propagation of Batch Normalization
Junjie Yan, Ruosi Wan, Xiangyu Zhang, Wei Zhang, Yichen Wei...
TL;DR
This paper proposes a new normalization method, Moving Average Batch Normalization (MABN), which fully recovers the performance of vanilla BN in small-batch settings and introduces no extra nonlinear operations at inference; its effectiveness is demonstrated through both theoretical analysis and experiments.
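The idea summarized above, normalizing with moving-average statistics so that a tiny batch cannot destabilize the normalizer, can be illustrated with a short sketch. The PyTorch module below is only a simplified illustration, not the paper's exact formulation: the class name `MovingAverageBN2d`, the momentum value, and the choice to drop mean subtraction and track a single running second moment are assumptions made here, and the full MABN additionally replaces the batch statistics that appear in backward propagation and centralizes the convolution weights.

```python
import torch
import torch.nn as nn


class MovingAverageBN2d(nn.Module):
    """Sketch of normalization driven by moving-average statistics.

    The per-batch second moment is only used to update an exponential
    moving average; normalization itself always uses the moving average,
    so a very small batch cannot destabilize the normalizer.
    """

    def __init__(self, num_features: int, momentum: float = 0.98, eps: float = 1e-5):
        super().__init__()
        self.momentum = momentum
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        self.register_buffer("running_var", torch.ones(1, num_features, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Per-batch second moment over batch and spatial dimensions.
            batch_var = (x * x).mean(dim=(0, 2, 3), keepdim=True)
            with torch.no_grad():
                self.running_var.mul_(self.momentum).add_(
                    (1.0 - self.momentum) * batch_var)
        # Both training and inference normalize with the moving average,
        # so inference reduces to a plain per-channel affine transform.
        x_hat = x / torch.sqrt(self.running_var + self.eps)
        return self.weight * x_hat + self.bias


if __name__ == "__main__":
    bn = MovingAverageBN2d(num_features=8)
    small_batch = torch.randn(2, 8, 16, 16)   # batch size of only 2
    out = bn(small_batch)
    out.mean().backward()
    print(out.shape)                          # torch.Size([2, 8, 16, 16])
```

Note that the inference path is a plain per-channel affine transform, which matches the claim that no extra nonlinear operations are introduced at inference time.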
Abstract
Batch Normalization (BN) is one of the most widely used techniques in the field of Deep Learning, but its performance can degrade severely when the batch size is insufficient. This weakness limits the usage of BN on many …