BriefGPT.xyz
Dec, 2019
Data-Free Adversarial Distillation
Gongfan Fang, Jie Song, Chengchao Shen, Xinchao Wang, Da Chen...
TL;DR
This work proposes a novel adversarial distillation mechanism for crafting a compact student model without any real data. On both classification and semantic segmentation, the data-free approach performs comparably to, and sometimes better than, data-driven methods.
Abstract
Knowledge distillation (KD) has made remarkable progress in the last few years and has become a popular paradigm for model compression and knowledge transfer. However, almost all existing KD algorithms are data-driven.
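The adversarial scheme described in the TL;DR can be illustrated with a deliberately tiny sketch. This is an assumed toy setup, not the paper's actual architecture: the teacher, student, and generator are all scalar linear functions with hand-derived gradients. A generator g(z) = a·z synthesizes inputs from noise, the student minimizes its squared discrepancy with the fixed teacher on those synthetic inputs, and the generator simultaneously maximizes that same discrepancy, so no real data ever enters the loop.

```python
import random

random.seed(0)

def teacher(x):
    # Pretrained teacher, treated as a fixed black box: t(x) = 2x.
    return 2.0 * x

w = 0.0   # student parameter: s(x) = w * x
a = 1.0   # generator parameter: g(z) = a * z
lr_student, lr_gen = 0.05, 0.01

for step in range(1000):
    z = random.uniform(-1.0, 1.0)   # latent noise
    x = a * z                        # synthetic sample, no real data used
    err = teacher(x) - w * x         # teacher-student discrepancy
    # Student: gradient *descent* on the squared discrepancy err**2,
    # using dL/dw = -2 * err * x.
    w += lr_student * 2.0 * err * x
    # Generator: gradient *ascent* on the same objective,
    # using dL/da = 2 * err * (2 - w) * z.
    a += lr_gen * 2.0 * err * (2.0 - w) * z
    a = max(-3.0, min(3.0, a))       # keep generated samples bounded

print(round(w, 3))  # student weight approaches the teacher's slope of 2.0
```

Because the generator keeps steering samples toward regions of large teacher-student disagreement, the student is pushed to match the teacher everywhere the generator can reach, which is the intuition behind data-free adversarial distillation.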