Mar, 2020
Knowledge distillation via adaptive instance normalization
Jing Yang, Brais Martinez, Adrian Bulat, Georgios Tzimiropoulos
TL;DR
This paper proposes a new knowledge distillation method based on transferring feature statistics, such as channel-wise means and variances, from the teacher to the student, together with a new adaptive instance normalization loss, to improve model compression.
Abstract
This paper addresses the problem of model compression via knowledge distillation. To this end, we propose a new knowledge distillation method …
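To make the statistics-transfer idea concrete, the following is a minimal PyTorch-style sketch: it computes per-channel means and standard deviations of teacher and student feature maps, re-normalizes the student features with the teacher's statistics in the style of adaptive instance normalization (AdaIN), and penalizes the statistics mismatch. The function names (`channel_stats`, `adain_transfer`, `stats_distillation_loss`) and the plain MSE matching loss are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def channel_stats(feat, eps=1e-5):
    # feat: (N, C, H, W) feature map.
    # Returns per-instance, per-channel mean and standard deviation.
    mean = feat.mean(dim=(2, 3))
    std = feat.var(dim=(2, 3), unbiased=False).add(eps).sqrt()
    return mean, std

def adain_transfer(student_feat, teacher_feat, eps=1e-5):
    # Re-normalize student features with the teacher's channel-wise
    # statistics (adaptive instance normalization).
    s_mean, s_std = channel_stats(student_feat, eps)
    t_mean, t_std = channel_stats(teacher_feat, eps)
    normalized = (student_feat - s_mean[:, :, None, None]) / s_std[:, :, None, None]
    return normalized * t_std[:, :, None, None] + t_mean[:, :, None, None]

def stats_distillation_loss(student_feat, teacher_feat):
    # Penalize the mismatch between student and teacher channel statistics;
    # a plain MSE here is an assumption standing in for the paper's AdaIN loss.
    s_mean, s_std = channel_stats(student_feat)
    t_mean, t_std = channel_stats(teacher_feat)
    return F.mse_loss(s_mean, t_mean) + F.mse_loss(s_std, t_std)

# Example usage with arbitrary feature-map shapes: the loss is differentiable
# with respect to the student features, so it can be added to a training loop.
teacher_feat = torch.randn(8, 64, 32, 32)
student_feat = torch.randn(8, 64, 32, 32, requires_grad=True)
loss = stats_distillation_loss(student_feat, teacher_feat)
loss.backward()
```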