June 2020
Knowledge Distillation Meets Self-Supervision
Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy
TL;DR
This paper presents a new knowledge distillation method that uses self-supervision signals as an auxiliary task to extract richer knowledge from a pretrained teacher model and transfer it to the student network, achieving strong performance across a range of benchmarks.
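As a rough sketch of the idea (not the authors' exact SSKD implementation): a self-supervised signal, such as contrastive similarity over a batch of augmented views, can be computed by both the teacher and the student, and the student additionally learns to mimic the teacher's similarity distribution. The function name and the choice of contrastive similarity here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def ss_transfer_loss(feat_s: torch.Tensor, feat_t: torch.Tensor,
                     tau: float = 0.5) -> torch.Tensor:
    """Match the student's pairwise similarity distribution to the teacher's.

    feat_s, feat_t: (N, D) embeddings of the same N augmented inputs,
    taken from hypothetical auxiliary projection heads on each network.
    """
    zs = F.normalize(feat_s, dim=1)
    zt = F.normalize(feat_t, dim=1)
    sim_s = zs @ zs.t() / tau  # student cosine-similarity logits
    sim_t = zt @ zt.t() / tau  # teacher cosine-similarity logits
    # Exclude self-similarities so each row is a distribution over *other* samples.
    mask = torch.eye(sim_s.size(0), dtype=torch.bool, device=sim_s.device)
    sim_s = sim_s.masked_fill(mask, -1e9)
    sim_t = sim_t.masked_fill(mask, -1e9)
    # The student mimics the teacher's soft similarity structure via KL divergence.
    return F.kl_div(F.log_softmax(sim_s, dim=1),
                    F.softmax(sim_t, dim=1),
                    reduction='batchmean')
```

In training, this auxiliary term would be added to the usual cross-entropy and distillation losses with a weighting hyperparameter.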
Abstract
Knowledge distillation, which involves extracting the "dark knowledge" from a teacher network to guide the learning of a student network, has emerged as an important technique for model compression and transfer learning.
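For context, the standard "dark knowledge" transfer described above is commonly implemented as a KL divergence between temperature-softened teacher and student outputs, following Hinton et al.'s formulation; a minimal PyTorch sketch:

```python
import torch.nn.functional as F

def kd_loss(logits_s, logits_t, T: float = 4.0):
    """Classic knowledge-distillation loss: the student matches the
    teacher's temperature-softened class distribution ("dark knowledge")."""
    p_t = F.softmax(logits_t / T, dim=1)          # softened teacher probabilities
    log_p_s = F.log_softmax(logits_s / T, dim=1)  # softened student log-probs
    # T^2 rescaling keeps gradient magnitudes comparable to the hard-label loss.
    return F.kl_div(log_p_s, p_t, reduction='batchmean') * (T * T)
```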