BriefGPT.xyz
Feb, 2018
Learning Compact Neural Networks with Regularization
Samet Oymak
TL;DR
Studies regularized gradient descent for deep neural networks, quantifying the complexity of the constraint set and its covering dimension to show how regularization can speed up training, improve generalization, and yield more efficient, compact models.
Abstract
We study the impact of regularization for learning neural networks. Our goal is speeding up training, improving generalization performance …
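
The TL;DR and abstract describe training under an explicit constraint set via regularized (projected) gradient descent. As a rough illustration of that idea only, and not the paper's actual algorithm or code, the sketch below runs projected gradient descent on a one-hidden-layer ReLU network with a sparsity constraint; the network sizes, step size, hard-threshold projection, and all variable names are illustrative assumptions.

```python
# Minimal sketch: projected gradient descent with a sparsity constraint.
# Everything here (sizes, step size, projection) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a planted sparse one-hidden-layer ReLU network.
n, d, k = 200, 20, 10                      # samples, input dim, hidden units
X = rng.normal(size=(n, d))
v = rng.normal(size=k) / np.sqrt(k)        # output-layer weights (held fixed)
W_star = rng.normal(size=(k, d)) * (rng.random((k, d)) < 0.2)  # sparse "ground truth"
y = np.maximum(X @ W_star.T, 0.0) @ v

def loss_and_grad(W):
    """Squared loss of the network and its gradient with respect to W."""
    H = np.maximum(X @ W.T, 0.0)           # ReLU activations, shape (n, k)
    r = H @ v - y                          # residuals, shape (n,)
    loss = 0.5 * np.mean(r ** 2)
    G = ((np.outer(r, v) * (H > 0)).T @ X) / n   # backprop through the ReLU
    return loss, G

def project_sparse(W, s):
    """Keep the s largest-magnitude entries of W, zero out the rest."""
    flat = np.abs(W).ravel()
    if s >= flat.size:
        return W
    thresh = np.partition(flat, -s)[-s]
    return W * (np.abs(W) >= thresh)

W = rng.normal(size=(k, d)) / np.sqrt(d)   # initialization
lr, budget = 0.1, 50                       # step size and nonzero budget (assumed)
for _ in range(300):
    _, G = loss_and_grad(W)
    W = project_sparse(W - lr * G, budget) # gradient step, then projection back onto the constraint set

final_loss, _ = loss_and_grad(W)
print(f"final loss: {final_loss:.4f}, nonzeros in W: {int((W != 0).sum())}")
```

The sparsity projection stands in for the constraint sets the paper considers (e.g. pruning, weight sharing, low rank); swapping in a different projection changes the kind of compact model being learned.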