BriefGPT.xyz
Nov 2018
Dataset Distillation
Tongzhou Wang, Jun-Yan Zhu, Antonio Torralba, Alexei A. Efros
TL;DR
In this paper, we explore an alternative formulation called dataset distillation: the model is kept fixed, and a small number of synthetic data points is optimized so that training on them approximates the model trained on the original data. This approach offers advantages over alternative methods, and we demonstrate it experimentally on multiple datasets.
Abstract
Model distillation aims to distill the knowledge of a complex model into a simpler one. In this paper, we consider an alternative formulation called dataset distillation: we keep the model fixed and instead attempt to distill the knowledge from a large training dataset into a small one.
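As a concrete illustration of the bi-level optimization the abstract describes, here is a minimal sketch, assuming a tiny linear classifier and random stand-in "real" data; the function name `distill`, the hyperparameters, and the use of a single inner gradient step are illustrative assumptions, not the paper's released implementation.

```python
# Hypothetical sketch of dataset distillation: learn a few synthetic points so that
# a model trained on them (here, for one inner gradient step) fits the real data.
import torch
import torch.nn.functional as F

def distill(x_real, y_real, n_syn=10, n_classes=10, dim=64,
            inner_lr=0.1, outer_lr=0.01, steps=200):
    # Learnable synthetic inputs with fixed, evenly assigned labels.
    x_syn = torch.randn(n_syn, dim, requires_grad=True)
    y_syn = torch.arange(n_syn) % n_classes
    opt = torch.optim.Adam([x_syn], lr=outer_lr)

    for _ in range(steps):
        # Fresh random weights each outer step; the architecture stays fixed.
        w = (torch.randn(dim, n_classes) * 0.01).requires_grad_(True)

        # Inner step: one gradient-descent update of the model on the synthetic data.
        inner_loss = F.cross_entropy(x_syn @ w, y_syn)
        grad_w, = torch.autograd.grad(inner_loss, w, create_graph=True)
        w_updated = w - inner_lr * grad_w

        # Outer step: the updated model should fit the real data;
        # backpropagate through the inner update into the synthetic points.
        outer_loss = F.cross_entropy(x_real @ w_updated, y_real)
        opt.zero_grad()
        outer_loss.backward()
        opt.step()

    return x_syn.detach()

# Toy usage with random stand-in data.
x_real = torch.randn(256, 64)
y_real = torch.randint(0, 10, (256,))
x_distilled = distill(x_real, y_real)
```

Backpropagating the real-data loss through the inner update is what drives the synthetic points to encode the training signal of the full dataset, which is the core idea summarized above.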