Jul, 2020
A Revision of Neural Tangent Kernel-based Approaches for Neural Networks
Kyung-Su Kim, Aurélie C. Lozano, Eunho Yang
TL;DR
Using the neural tangent kernel approach, prior work derived an upper bound on the network's training error, a generalization error bound that does not grow with network size, and a simple, analytic kernel function that can outperform the corresponding network; however, these results are undermined by an issue with the network scaling factor. This paper revises the original approaches, establishing tighter error bounds and resolving the scaling problem.
Abstract
Recent theoretical works based on the neural tangent kernel (NTK) have shed light on the optimization and generalization of over-parameterized networks, and partially bridge the gap between their practical success…
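
To make the quantities in the TL;DR concrete, the sketch below (an illustration, not code from the paper) computes the empirical neural tangent kernel of a toy two-layer ReLU network in JAX under the NTK parameterization, in which the 1/√width scaling factor at issue in the revision appears explicitly. The names (init_params, f, empirical_ntk) and the toy architecture are assumptions made here for illustration.

```python
# Minimal sketch: empirical NTK of a two-layer ReLU network in JAX.
# Illustrative only; the toy network and all names are assumptions, not the paper's code.
import jax
import jax.numpy as jnp

def init_params(key, d_in, width):
    # Standard-normal weights; the scaling lives in the forward pass (NTK parameterization).
    k1, k2 = jax.random.split(key)
    return {"W1": jax.random.normal(k1, (width, d_in)),
            "W2": jax.random.normal(k2, (width,))}

def f(params, x):
    # f(x) = (1/sqrt(m)) * W2 . relu((1/sqrt(d)) * W1 x) -- the scaling NTK analyses rely on.
    h = jax.nn.relu(params["W1"] @ x / jnp.sqrt(x.shape[0]))
    return params["W2"] @ h / jnp.sqrt(h.shape[0])

def empirical_ntk(params, x1, x2):
    # K(x1, x2) = <grad_theta f(x1), grad_theta f(x2)>: inner product of parameter gradients.
    g1, g2 = jax.grad(f)(params, x1), jax.grad(f)(params, x2)
    return sum(jnp.vdot(a, b) for a, b in zip(jax.tree_util.tree_leaves(g1),
                                              jax.tree_util.tree_leaves(g2)))

key = jax.random.PRNGKey(0)
params = init_params(key, d_in=4, width=4096)
x1, x2 = jnp.ones(4), jnp.arange(4.0)
print(empirical_ntk(params, x1, x2))
```

As the width grows, this empirical kernel concentrates around a deterministic limiting kernel; NTK-based training and generalization bounds of the kind summarized above are stated in terms of that limit.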