BriefGPT.xyz
Oct, 2023
On the Over-Memorization During Natural, Robust and Catastrophic Overfitting
Runqi Lin, Chaojian Yu, Bo Han, Tongliang Liu
TL;DR
By examining different types of overfitting with a focus on natural patterns, this study identifies an over-memorization phenomenon in deep neural networks and proposes a framework called "Distraction Over-Memorization," which holistically mitigates different types of overfitting by removing or augmenting high-confidence natural patterns. Experiments demonstrate that the method is effective across various training paradigms.
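The summary above describes mitigating over-memorization by intervening on training examples the network already predicts with high confidence. A minimal sketch of that idea, assuming a simple confidence threshold `tau` (a hypothetical parameter, not taken from the paper) that drops over-confident samples from the cross-entropy loss:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def masked_loss(logits, labels, tau=0.95):
    """Cross-entropy that zero-weights samples whose predicted
    confidence exceeds `tau`, so training stops reinforcing
    patterns the network has already over-memorized.
    (Illustrative sketch only; the paper's actual procedure
    may differ.)"""
    probs = softmax(logits)
    conf = probs.max(axis=1)
    keep = conf < tau                       # mask out over-confident samples
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return (ce * keep).sum() / max(keep.sum(), 1)
```

For example, with `logits = [[5.0, 0.0], [0.2, 0.1]]` the first sample is predicted at ~0.99 confidence and is excluded, so the loss is computed from the second sample alone.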
Abstract
Overfitting negatively impacts the generalization ability of deep neural networks (DNNs) in both natural and adversarial training. Existing …