BriefGPT.xyz
Oct, 2023
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images
Logan Frank, Jim Davis
TL;DR
Data-free knowledge distillation (KD) trains a student network using OpenGL shader images combined with data augmentation, achieving state-of-the-art results across multiple datasets and network architectures while being more stable than existing generator-based data-free KD methods.
Abstract
Knowledge distillation (KD) has been a popular and effective method for model compression. One important assumption of KD is that the original training dataset is always available. However, this is not always the case due to …
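For context on the distillation setup the abstract refers to, here is a minimal sketch of the standard KD objective (temperature-softened KL divergence between teacher and student outputs, in the style of Hinton et al.). This is background illustration only; the temperature value, array shapes, and function names are assumptions, not details from this paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in the original KD formulation."""
    p = softmax(teacher_logits, T)           # soft teacher targets
    log_q = np.log(softmax(student_logits, T))
    return (T ** 2) * np.mean(np.sum(p * (np.log(p) - log_q), axis=-1))

# A student whose logits match the teacher's incurs (near-)zero loss.
t = np.array([[2.0, 0.5, -1.0]])
print(kd_loss(t.copy(), t))
```

In the data-free setting described above, the inputs fed to both networks would come from synthetic sources (here, OpenGL shader images) rather than the original training data; the loss itself is unchanged.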