Zhiyuan Li, Ruosong Wang, Dingli Yu, Simon S. Du, Wei Hu...
TL;DR: This work corrects CNN-GP and CNTK with a new operation, Local Average Pooling (LAP), and adopts the image-preprocessing technique of Coates et al., raising classification accuracy on CIFAR-10 to 89%, on par with AlexNet.
Abstract
Recent research shows that for training with $\ell_2$ loss, convolutional neural networks (CNNs) whose width (number of channels in convolutional layers) goes to infinity correspond to regression with respect to the CNN Gaussian Process kernel (CNN-GP) if only the last layer is trained
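The TL;DR mentions the paper's Local Average Pooling (LAP) operation. As a rough, hedged illustration of the underlying idea (the paper applies averaging to kernel values, whereas this toy sketch averages raw pixel values with a stride-1 window; the function name and parameters are hypothetical, not from the paper):

```python
import numpy as np

def local_average_pool(x, window=3):
    """Stride-1 average pooling over a square window.

    Hypothetical sketch of the local-averaging idea behind LAP;
    the paper's LAP operates on kernel entries, not on raw images.
    Edge padding keeps the output the same spatial size as the input.
    """
    h, w = x.shape
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.empty_like(x, dtype=float)
    for i in range(h):
        for j in range(w):
            # Replace each position with the mean of its local neighborhood.
            out[i, j] = xp[i:i + window, j:j + window].mean()
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
pooled = local_average_pool(img, window=3)
print(pooled.shape)  # (4, 4): spatial size is preserved
```

Because the window slides with stride 1 and the input is edge-padded, the output keeps the input's spatial dimensions; each entry is smoothed toward its neighbors, which is the intuition for why such averaging can regularize CNN-GP/CNTK predictions.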