Aug 2023
NormKD: Normalized Logits for Knowledge Distillation
Zhihao Chi, Tu Zheng, Hengjia Li, Zheng Yang, Boxi Wu...
TL;DR
This paper proposes a normalization-based knowledge distillation method (NormKD) that improves distillation by customizing the temperature for each sample, showing clear advantages on image classification tasks. Moreover, NormKD can be easily applied to other logit-based methods, reaching performance close to, or even exceeding, that of feature-based methods.
Abstract
Logit-based knowledge distillation has received less attention in recent years, since feature-based methods perform better in most cases. Nevertheless, we find it still has untapped potential when we re-investigate the temperature…
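The per-sample temperature idea described in the TL;DR can be illustrated with a short sketch. The snippet below is a minimal illustration, assuming the temperature for each sample is derived from the standard deviation of that sample's teacher logits; the scaling constant `alpha`, the function name `normkd_loss`, and the exact normalization are assumptions for illustration, not the paper's verified formulation.

```python
# Minimal sketch: logit-based KD with a per-sample temperature
# (assumed here to be the std of each sample's teacher logits).
import torch
import torch.nn.functional as F


def normkd_loss(student_logits: torch.Tensor,
                teacher_logits: torch.Tensor,
                alpha: float = 1.0) -> torch.Tensor:
    """KL distillation loss with a per-sample temperature.

    student_logits, teacher_logits: (batch, num_classes)
    alpha: hypothetical global scale applied to each sample's temperature.
    """
    # Per-sample temperature from the spread of the teacher's logits,
    # so every sample's softened distribution has comparable sharpness.
    temp = teacher_logits.std(dim=1, keepdim=True) * alpha  # (batch, 1)

    student_log_prob = F.log_softmax(student_logits / temp, dim=1)
    teacher_prob = F.softmax(teacher_logits / temp, dim=1)

    # Standard KD practice rescales the loss by T^2; here T varies per sample.
    kl = F.kl_div(student_log_prob, teacher_prob, reduction='none').sum(dim=1)
    return (kl * temp.squeeze(1) ** 2).mean()


# Usage example with random logits.
if __name__ == "__main__":
    s = torch.randn(8, 100)
    t = torch.randn(8, 100)
    print(normkd_loss(s, t).item())
```

Because the temperature is computed independently for each sample rather than fixed globally, samples with very peaked or very flat teacher logits are softened to a similar degree, which is the intuition behind normalizing the logits before distillation.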