BriefGPT.xyz
Oct, 2019
Why bigger is not always better: on finite and infinite neural networks
Laurence Aitchison
TL;DR
This work shows that infinite Bayesian neural networks lack representation learning (equivalently, kernel learning), which leads to worse performance, and introduces a new class of infinite networks, bottleneck infinite networks, that inherit both the theoretical tractability of infinite networks and the representation-learning ability of finite ones.
Abstract
Recent work has shown that the outputs of convolutional neural networks become Gaussian process (GP) distributed when we take the number of channels to infinity. In principle, these infinite networks should perfo […]
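The GP limit mentioned in the abstract can be checked empirically. The sketch below (not from the paper; the network and initialization scheme are illustrative assumptions) samples many randomly initialized one-hidden-layer ReLU networks and verifies that the output variance at a fixed input approaches the analytic infinite-width (NNGP) value.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_outputs(x, width, n_samples=2000):
    """Outputs f(x) of n_samples independent random one-hidden-layer ReLU nets
    with variance-1/fan-in Gaussian weights (standard NNGP scaling)."""
    outs = np.empty(n_samples)
    for i in range(n_samples):
        W1 = rng.standard_normal((width, x.shape[0])) / np.sqrt(x.shape[0])
        w2 = rng.standard_normal(width) / np.sqrt(width)
        outs[i] = w2 @ np.maximum(W1 @ x, 0.0)  # ReLU hidden layer
    return outs

x = np.ones(3)  # input chosen so each preactivation has unit variance
outs = random_net_outputs(x, width=512)

# Analytic NNGP variance for ReLU with unit-variance preactivations:
# E[relu(z)^2] = Var(z) / 2 = 0.5, so Var(f(x)) -> 0.5 as width grows.
print(np.var(outs))  # should be close to 0.5
```

At large width the empirical distribution of `outs` is close to a zero-mean Gaussian with this variance; what such a network cannot do in the infinite limit is adapt its kernel to data, which is the gap the paper's bottleneck construction addresses.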