BriefGPT.xyz
Apr 2021
"BNN - BN = ?": Training Binary Neural Networks without Batch Normalization
Tianlong Chen, Zhenyu Zhang, Xu Ouyang, Zechun Liu, Zhiqiang Shen...
TL;DR
By adopting techniques such as adaptive gradient clipping, scaled weight standardization, and specialized bottleneck blocks, this work extends the BN-free training framework to binary neural networks and demonstrates for the first time that BN layers can be removed entirely from BNN training and inference without loss of performance.
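Two of the ingredients named in the TL;DR, adaptive gradient clipping (AGC) and scaled weight standardization, can be sketched as follows. This is a minimal NumPy illustration of the general BN-free recipe, not the paper's implementation; the threshold `lam`, the epsilon floors, and the fan-in scaling are assumptions here.

```python
import numpy as np

def adaptive_gradient_clip(w, g, lam=0.02, eps=1e-3):
    """Rescale each unit's gradient when its norm grows too large
    relative to the corresponding weight norm (AGC sketch)."""
    # Unit-wise norms: one value per output row.
    w_norm = np.maximum(np.linalg.norm(w, axis=1, keepdims=True), eps)
    g_norm = np.maximum(np.linalg.norm(g, axis=1, keepdims=True), 1e-6)
    # Where ||g|| / ||w|| exceeds lam, shrink g back onto the threshold.
    clipped = g * (lam * w_norm / g_norm)
    return np.where(g_norm / w_norm > lam, clipped, g)

def scaled_weight_standardization(w, gamma=1.0, eps=1e-5):
    """Standardize each unit's weights to zero mean and fan-in-scaled
    variance, so activations stay well-scaled without a BN layer."""
    mean = w.mean(axis=1, keepdims=True)
    var = w.var(axis=1, keepdims=True)
    fan_in = w.shape[1]
    return gamma * (w - mean) / np.sqrt(var * fan_in + eps)
```

With `lam=0.02`, a row whose gradient norm is ten times its weight norm is scaled down to the threshold, while well-behaved rows pass through unchanged; the standardized weights have zero mean per unit, which is what lets the network dispense with BN's re-centering.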
Abstract
Batch normalization (BN) is a key facilitator and considered essential for state-of-the-art binary neural networks (BNN). However, the BN layer is costly to calculate and is typically implemented with non-binary …