Jun, 2022
Limitations of the NTK for Understanding Generalization in Deep Learning
Nikhil Vyas, Yamini Bansal, Preetum Nakkiran
TL;DR
This paper studies the neural tangent kernel (NTK) and its empirical variants through the lens of scaling laws, and finds that they cannot fully explain important aspects of neural network generalization. In practical settings, we show that finite-width neural networks have significantly better data scaling exponents than their corresponding empirical and infinite NTKs at initialization, demonstrating the limitations of NTK-based approaches for understanding the generalization of real networks on natural datasets.
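The comparison above rests on the notion of a data scaling exponent: test error typically follows a power law in dataset size, error(n) ≈ C · n^(−α), so α can be estimated as the (negated) slope of a log-log fit. The function name and the synthetic curves below are purely illustrative, not taken from the paper; this is a minimal sketch of how such exponents are measured.

```python
import math

def fit_scaling_exponent(ns, errors):
    """Least-squares fit of error ~ C * n**(-alpha) in log-log space.

    Returns (alpha, C). Hypothetical helper for illustration only.
    """
    xs = [math.log(n) for n in ns]
    ys = [math.log(e) for e in errors]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    # Slope of the log-log regression line; alpha is its negation.
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return -slope, math.exp(intercept)

# Synthetic example: a "network" curve with a steeper exponent than a
# "kernel" curve (alpha = 0.5 vs. 0.3); a larger alpha means the error
# falls faster as more data is added.
ns = [1e3, 1e4, 1e5, 1e6]
net_err = [2.0 * n ** -0.5 for n in ns]
ntk_err = [2.0 * n ** -0.3 for n in ns]

alpha_net, _ = fit_scaling_exponent(ns, net_err)
alpha_ntk, _ = fit_scaling_exponent(ns, ntk_err)
print(alpha_net, alpha_ntk)
```

On these synthetic curves the fit recovers the exponents exactly; on real learning curves one would fit over measured test errors at several dataset sizes.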
Abstract
The "neural tangent kernel" (NTK) (Jacot et al., 2018) and its empirical variants have been proposed as a proxy to capture certain behaviors of real neural networks. In this work, we study NTKs through the lens of scaling laws …