May 2022
Mapping Emulation for Knowledge Distillation
Jing Ma, Xiang Xiang, Zihan Zhang, Yuwen Tan, Yiming Wan...
TL;DR
This paper takes a new geometric perspective that views the source-blind knowledge distillation problem as aligning the distributions generated by the teacher and the student. It proposes the MEKD architecture, which emulates the inverse mapping via generative adversarial training, with theoretical guarantees based on universal function approximation and optimal mass transport theory. The method outperforms existing source-blind KD methods on various benchmarks.
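The paper's exact MEKD architecture is not reproduced here; the sketch below only illustrates the general idea the summary describes, in the spirit of data-free adversarial distillation: a generator emulates a mapping from latent codes to pseudo-inputs, and the student is trained so that its output distribution on those pseudo-inputs aligns with the teacher's, with no access to the source data. All module shapes, names (`generator`, `latent_dim`, the two-layer MLPs), and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Minimal, hypothetical sketch of adversarial teacher-student distribution
# alignment without source data (NOT the paper's exact MEKD).
import torch
import torch.nn as nn
import torch.nn.functional as F

latent_dim, in_dim, num_classes = 32, 128, 10

# Toy stand-ins for the real networks.
teacher = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))
student = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, num_classes))
generator = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, in_dim))

# The teacher is fixed; only its outputs on pseudo-inputs are used.
for p in teacher.parameters():
    p.requires_grad_(False)
teacher.eval()

opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)

for step in range(1000):
    z = torch.randn(64, latent_dim)

    # Generator step: push pseudo-inputs toward regions where the student
    # still disagrees with the teacher (maximize output divergence).
    x = generator(z)
    g_loss = -F.kl_div(F.log_softmax(student(x), dim=1),
                       F.softmax(teacher(x), dim=1),
                       reduction="batchmean")
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

    # Student step: align the student's output distribution with the
    # teacher's on the same pseudo-inputs (minimize the divergence).
    x = generator(z).detach()
    with torch.no_grad():
        t_prob = F.softmax(teacher(x), dim=1)
    s_loss = F.kl_div(F.log_softmax(student(x), dim=1), t_prob,
                      reduction="batchmean")
    opt_s.zero_grad()
    s_loss.backward()
    opt_s.step()
```

Alternating the two steps gives the adversarial dynamic the TL;DR alludes to: the generator keeps finding inputs where the two output distributions differ, and the student closes that gap.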
Abstract
This paper formalizes the source-blind knowledge distillation problem that is essential to federated learning. A new geometric perspective is presented to view such a problem as aligning generated distributions between the teacher and the student.