BriefGPT.xyz
Mar, 2024
Benign overfitting in leaky ReLU networks with moderate input dimension
Kedar Karhadkar, Erin George, Michael Murray, Guido Montúfar, Deanna Needell
TL;DR
This work studies benign overfitting in binary classification using two-layer leaky ReLU networks trained with the hinge loss. By characterizing the signal-to-noise ratio (SNR) of the model parameters, it shows that a high SNR yields benign overfitting while a low SNR yields harmful overfitting, attributes both the benign and non-benign regimes to an approximate margin-maximization property, and relaxes the near-orthogonality requirement on the training data.
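To make the setting concrete, below is a minimal NumPy sketch of the kind of model the summary describes: a two-layer leaky ReLU network with fixed outer weights, evaluated under the hinge loss, with one subgradient step on the hidden weights. The leak parameter, widths, and initialization here are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(z, alpha=0.1):
    # alpha is an assumed leak parameter, not taken from the paper
    return np.where(z > 0, z, alpha * z)

def network(X, W, a, alpha=0.1):
    # f(x) = sum_j a_j * sigma(<w_j, x>): two-layer net, outer weights a fixed
    return leaky_relu(X @ W.T, alpha) @ a

def hinge_loss(f, y):
    # labels y in {-1, +1}; loss is max(0, 1 - y f(x)) averaged over the sample
    return np.mean(np.maximum(0.0, 1.0 - y * f))

# Toy data: n points in input dimension d, m hidden units
n, d, m = 20, 50, 8
X = rng.standard_normal((n, d))
y = rng.choice([-1.0, 1.0], size=n)

W = rng.standard_normal((m, d)) / np.sqrt(d)  # hidden-layer weights
a = rng.choice([-1.0, 1.0], size=m) / m       # fixed random outer weights

# One subgradient step of the hinge loss with respect to the hidden weights
lr = 0.5
f = network(X, W, a)
active = (1.0 - y * f) > 0                     # points violating the margin
sigma_prime = np.where(X @ W.T > 0, 1.0, 0.1)  # leaky ReLU derivative
grad = -((active * y)[:, None] * sigma_prime).T @ X * a[:, None] / n
W_new = W - lr * grad

print(hinge_loss(network(X, W, a), y), hinge_loss(network(X, W_new, a), y))
```

In this regime d is comparable to n (a "moderate" input dimension), rather than the much larger d required by earlier near-orthogonality assumptions.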
Abstract
The problem of benign overfitting asks whether it is possible for a model to perfectly fit noisy training data and still generalize well. We study benign overfitting in two-layer …