Dec, 2018
Dropout Regularization in Hierarchical Mixture of Experts
Ozan İrsoy, Ethem Alpaydın
TL;DR
This work proposes a dropout variant for the hierarchical mixture of experts model. It prevents overfitting in trees with many levels, improving generalization and producing smoother fits.
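To make the idea concrete, below is a minimal, hypothetical sketch of a soft decision tree (hierarchical mixture of experts) in which whole subtrees are stochastically dropped during training. It only illustrates the general notion of tree-structured dropout; the `Node` class, `predict` function, and `p_drop` parameter are illustrative names, not the paper's actual mechanism or code.

```python
import numpy as np

rng = np.random.default_rng(0)

class Node:
    """One node of a soft tree (hierarchical mixture of experts).

    Internal nodes hold a gating vector that softly routes the input to
    their two children; leaves hold a linear expert.
    """
    def __init__(self, dim, depth):
        if depth == 0:                      # leaf: linear expert
            self.w = rng.normal(scale=0.1, size=dim + 1)
            self.left = self.right = None
        else:                               # internal node: gate + two subtrees
            self.g = rng.normal(scale=0.1, size=dim + 1)
            self.left = Node(dim, depth - 1)
            self.right = Node(dim, depth - 1)

def predict(node, x, p_drop=0.0, training=False):
    """Soft-tree prediction with an illustrative subtree dropout.

    With probability p_drop an internal node skips the soft mixture and
    routes the example entirely through one sampled child (the other
    subtree is "dropped"); at test time the full soft mixture is used.
    This is a toy stand-in for a tree-structured dropout, not the
    paper's exact rule.
    """
    xb = np.append(x, 1.0)                              # add bias term
    if node.left is None:                               # leaf expert
        return xb @ node.w
    gate = 1.0 / (1.0 + np.exp(-(xb @ node.g)))         # soft routing probability
    if training and rng.random() < p_drop:
        # Drop one subtree: hard-route through a child sampled from the gate.
        child = node.left if rng.random() < gate else node.right
        return predict(child, x, p_drop, training)
    return gate * predict(node.left, x, p_drop, training) + \
           (1 - gate) * predict(node.right, x, p_drop, training)

# Example: a depth-3 tree on 5-dimensional inputs.
tree = Node(dim=5, depth=3)
x = rng.normal(size=5)
print(predict(tree, x, p_drop=0.5, training=True))      # stochastic, regularized pass
print(predict(tree, x))                                 # deterministic test-time pass
```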
Abstract
Dropout is a very effective method in preventing overfitting and has become the go-to regularizer for multi-layer neural networks in recent years. …