Mar, 2020
Explaining Knowledge Distillation by Quantifying the Knowledge
Xu Cheng, Zhefan Rao, Yilan Chen, Quanshi Zhang
TL;DR
This paper proposes a method to explain the success of knowledge distillation by quantifying and analyzing the task-relevant and task-irrelevant visual concepts encoded in the intermediate layers of a deep neural network (DNN). The authors design three mathematical metrics to evaluate a DNN's feature representations. In experiments, various DNNs were diagnosed, and the above hypotheses were verified.
Abstract
This paper presents a method to interpret the success of knowledge distillation by quantifying and analyzing task-relevant and task-irrelevant …