June 2024
Small Scale Data-Free Knowledge Distillation
He Liu, Yikai Wang, Huaping Liu, Fuchun Sun, Anbang Yao
TL;DR
SSD-KD is a data-free knowledge distillation method that improves training efficiency by distilling with only a small scale of inverted (synthetic) data, demonstrating strong performance and highly efficient training on image classification and semantic segmentation benchmarks.
Abstract
Data-free knowledge distillation is able to utilize the knowledge learned by a large teacher network to augment the training of a smaller student network without accessing the original training data, avoiding privacy, security, and proprietary risks in real applications.
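To make the idea concrete, here is a minimal sketch of a data-free distillation loop in PyTorch: a small batch of synthetic "inverted" inputs is first optimized so that the frozen teacher classifies them confidently, and the student is then distilled on that batch with a temperature-scaled KL loss. The toy models, the inversion objective, and all hyperparameters are illustrative assumptions, not the paper's actual SSD-KD recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy teacher and student (assumed shapes; any classifier pair works).
teacher = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Sequential(nn.Linear(32, 10))
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher is frozen; only its outputs are queried

# --- Step 1: data inversion ------------------------------------------------
# Optimize a small batch of inputs so the teacher assigns them to arbitrary
# target classes -- no access to the original training data is needed.
inputs = torch.randn(16, 32, requires_grad=True)
targets = torch.randint(0, 10, (16,))
inv_opt = torch.optim.Adam([inputs], lr=0.1)
for _ in range(100):
    inv_opt.zero_grad()
    F.cross_entropy(teacher(inputs), targets).backward()
    inv_opt.step()

# --- Step 2: distillation --------------------------------------------------
# Train the student to match the teacher's softened predictions on the
# small synthetic batch (temperature value is an assumption).
T = 4.0
synthetic = inputs.detach()
stu_opt = torch.optim.SGD(student.parameters(), lr=0.01)
for _ in range(200):
    stu_opt.zero_grad()
    with torch.no_grad():
        t_logits = teacher(synthetic)
    s_logits = student(synthetic)
    kd_loss = F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * T * T  # standard temperature-squared scaling of the KD gradient
    kd_loss.backward()
    stu_opt.step()
```

In a full method the two steps alternate and the synthetic pool stays deliberately small, which is where the training-efficiency gain highlighted in the TL;DR comes from.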