March 2024
Towards Adversarially Robust Dataset Distillation by Curvature Regularization
Eric Xue, Yijiang Li, Haoyang Liu, Yifan Shen, Haohan Wang
TL;DR
This work proposes a new method that incorporates curvature regularization into the distillation process, so that models trained on the resulting distilled datasets maintain high accuracy while acquiring better adversarial robustness, at a much lower computational cost. Experiments show that the method outperforms standard adversarial training in both accuracy and robustness, and that it can generate robust distilled datasets that withstand a variety of adversarial attacks.
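To make the idea concrete, below is a minimal sketch (PyTorch, not the authors' released code) of how a CURE-style curvature penalty could be folded into a distillation objective; the step size `h`, the weight `lam`, and the `distill_loss` term are illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn.functional as F

def curvature_penalty(model, x, y, h=1e-2):
    """Finite-difference curvature estimate ||grad L(x + h*z) - grad L(x)||^2 / h^2,
    where z is the (detached) sign of the input gradient. Assumes `x` is the batch
    of synthetic images being optimized (a leaf tensor with requires_grad=True)."""
    loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(loss, x, create_graph=True)[0]

    # Probe direction along the gradient sign; detached so that only the
    # curvature term itself is differentiated.
    z = grad.sign().detach()
    loss_pert = F.cross_entropy(model(x + h * z), y)
    grad_pert = torch.autograd.grad(loss_pert, x, create_graph=True)[0]

    return ((grad_pert - grad) ** 2).sum() / (h ** 2)

# Illustrative use inside one distillation step (names are placeholders):
#   total = distill_loss(model, syn_x, syn_y) + lam * curvature_penalty(model, syn_x, syn_y)
#   total.backward(); optimizer.step()
```

Because the penalty only needs two input-gradient evaluations per step, it is considerably cheaper than running a full multi-step adversarial attack inside the distillation loop, which is the intuition behind the reduced computational overhead claimed above.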
Abstract
Dataset distillation (DD) allows datasets to be distilled to fractions of their original size while preserving the rich distributional information, so that models trained on the distilled datasets can achieve a comparable accuracy while saving significant computational loads.