TL;DR: By studying the large-dimensional behavior of neural networks and kernel ridge regression, we determine the exact order of the generalization error of kernel ridge regression, show how its curve evolves across different values of s, and identify a saturation effect.
Abstract
Motivated by studies of neural networks (e.g., the neural tangent kernel
theory), we perform a study on the large-dimensional behavior of kernel ridge
regression (KRR) where the sample size $n \asymp d^{\gamma}$