Batch normalization (BN) is extensively employed in various network
architectures by performing standardization within mini-batches.
A full understanding of this process remains a central goal of the deep
learning community.
Unlike existing works, which usually analyze only the standardization
within mini-batches, this paper introduces Group Whitening (GW), a new
normalization method that combines the advantages of whitening with those
of Group Normalization while avoiding the drawbacks of plain batch
normalization. From the perspective of model representational capacity, it
analyzes how the representational ability of batch normalization relates
to the batch size (number of groups), and experiments on ResNet and
ResNeXt verify the performance advantages of Group Whitening across
different architectures.
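The core operation described above can be sketched in a few lines. This is a minimal, illustrative implementation, not the paper's reference code: it assumes the simplest setting, a single activation vector whose channels are reshaped into groups and ZCA-whitened using the sample's own statistics, so the result does not depend on the mini-batch (the function name `group_whitening` and the parameter choices are this sketch's own).

```python
import numpy as np

def group_whitening(x, num_groups=4, eps=1e-5):
    """Sketch of group whitening for one activation vector x of shape (C,).

    Channels are reshaped into a (num_groups, C // num_groups) matrix,
    centered per group, then ZCA-whitened with the resulting
    num_groups x num_groups covariance. All statistics come from the
    sample itself, not from the mini-batch.
    """
    c = x.shape[0]
    assert c % num_groups == 0, "C must be divisible by num_groups"
    xg = x.reshape(num_groups, c // num_groups).astype(float)
    xg -= xg.mean(axis=1, keepdims=True)        # center each group
    cov = xg @ xg.T / xg.shape[1]               # (g, g) covariance
    # ZCA whitening matrix: cov^{-1/2} via eigendecomposition
    eigval, eigvec = np.linalg.eigh(cov + eps * np.eye(num_groups))
    zca = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    return (zca @ xg).reshape(c)

x = np.random.RandomState(0).randn(64)
y = group_whitening(x, num_groups=4)
```

After this transform the grouped representation has (approximately) zero mean and identity covariance within each sample, which is the decorrelation property that distinguishes whitening from the per-group standardization of Group Normalization.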