Jun, 2023
Faithful Knowledge Distillation
Tom A. Lamb, Rudy Bunel, Krishnamurthy Dvijotham, M. Pawan Kumar...
TL;DR
This paper examines the relative reliability of teacher-student pairs in knowledge distillation (KD). It proposes a faithful imitation framework, provides both empirical and certified methods for assessing a student's relative calibration with respect to its teacher, and introduces a faithful distillation method; experiments on the MNIST and Fashion-MNIST datasets demonstrate its advantages.
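As a rough illustration of the empirical side of such a check (a sketch under assumed details, not the paper's procedure): one way to probe whether a student stays faithful to its teacher is to sample perturbations inside an ℓ∞ ε-ball around an input and test top-1 agreement. The PyTorch names below (`teacher`, `student`, `eps`, `n_samples`) are hypothetical.

```python
# Minimal sketch (not the authors' code) of an *empirical* faithfulness check:
# sample perturbations in an l-infinity ball of radius eps around an input and
# test whether the student's top-1 prediction matches the teacher's on every sample.
import torch


def empirically_faithful(teacher: torch.nn.Module,
                         student: torch.nn.Module,
                         x: torch.Tensor,
                         eps: float = 0.1,
                         n_samples: int = 100) -> bool:
    """Return True if student and teacher agree on the top class for all
    sampled x' with ||x' - x||_inf <= eps (inputs assumed to lie in [0, 1])."""
    teacher.eval()
    student.eval()
    with torch.no_grad():
        for _ in range(n_samples):
            # Uniform noise in [-eps, eps], clamped back to the valid input range.
            delta = (2 * torch.rand_like(x) - 1) * eps
            x_pert = torch.clamp(x + delta, 0.0, 1.0)
            t_pred = teacher(x_pert.unsqueeze(0)).argmax(dim=1)
            s_pred = student(x_pert.unsqueeze(0)).argmax(dim=1)
            if not torch.equal(t_pred, s_pred):
                return False  # found a perturbation where the student deviates
    return True
```

A certified variant, as the TL;DR suggests, would replace the sampling loop with a formal verifier that bounds both networks' outputs over the entire ε-ball rather than at finitely many samples.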
Abstract
Knowledge distillation (KD) has received much attention due to its success in compressing networks to allow for their deployment in resource-constrained systems. While the problem of adversarial robustness has been …
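For context on the KD objective the abstract refers to, here is a minimal sketch of the standard soft-label distillation loss of Hinton et al.; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, and the paper's faithful distillation method differs from this common baseline.

```python
# Minimal sketch of the standard soft-label KD objective (Hinton et al.),
# shown only as background; not the paper's faithful-distillation loss.
import torch
import torch.nn.functional as F


def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            labels: torch.Tensor,
            T: float = 4.0,
            alpha: float = 0.9) -> torch.Tensor:
    """Weighted sum of KL divergence on temperature-softened logits and
    ordinary cross-entropy on the ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

Matching softened teacher probabilities in this way does not by itself guarantee that the student agrees with the teacher under input perturbations, which is the gap the faithfulness framework summarized above is concerned with.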