BriefGPT.xyz
Apr, 2021
Dual Discriminator Adversarial Distillation for Data-free Model Compression
Haoran Zhao, Xin Sun, Junyu Dong, Hui Yu, Huiyu Zhou
TL;DR
Proposes a novel data-free knowledge distillation method named Dual Discriminator Adversarial Distillation (DDAD), which trains a compact student network on generated samples to approximate its teacher network, yielding efficient neural networks for computer vision tasks.
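The core idea above — distilling a compact student from a teacher without any real training data, using synthetic samples instead — can be illustrated with a minimal sketch. This is not the paper's DDAD method (it omits the dual discriminators and the adversarial generator entirely); it is a hypothetical toy in which linear maps stand in for the teacher and student networks, and random draws stand in for generated samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed "teacher": a linear map standing in for a trained network.
W_teacher = rng.normal(size=(4, 2))

def teacher(x):
    return x @ W_teacher

# "Student" of the same shape, initialized from scratch, trained to mimic the teacher.
W_student = np.zeros((4, 2))

def student(x):
    return x @ W_student

# Data-free loop: no real training set is touched; a stand-in "generator"
# draws synthetic inputs, and the student minimizes the mean squared error
# to the teacher's outputs on those samples.
lr = 0.1
for step in range(500):
    x = rng.normal(size=(32, 4))      # synthetic batch in place of real data
    err = student(x) - teacher(x)     # student-teacher mismatch on the batch
    grad = x.T @ err / len(x)         # gradient of 0.5 * MSE w.r.t. W_student
    W_student -= lr * grad

# After training, the student closely matches the teacher on fresh inputs.
x_test = rng.normal(size=(8, 4))
gap = np.max(np.abs(student(x_test) - teacher(x_test)))
```

In DDAD itself the synthetic samples are not plain random noise: they are produced by a generator trained adversarially against the student and teacher, which is what the dual-discriminator design addresses.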
Abstract
Knowledge distillation has been widely used to produce portable and efficient neural networks which can be well applied on edge devices for compu…