Dec 2020
Progressive Network Grafting for Few-Shot Knowledge Distillation
Chengchao Shen, Xinchao Wang, Youtan Yin, Jie Song, Sihui Luo...
TL;DR
This paper presents a dual-stage method for few-shot knowledge distillation that uses only a small number of unlabeled samples. It effectively reduces the complexity and size of deep neural networks while achieving performance on par with conventional distillation approaches.
Abstract
Knowledge distillation has demonstrated encouraging performances in deep model compression. Most existing approaches, however, require massive labeled data to accomplish the knowledge transfer, making the model compression a cumbersome and costly process.
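
The block-wise grafting idea summarized in the TL;DR lends itself to a short sketch. Below is a minimal PyTorch illustration, assuming toy convolutional blocks and a feature-matching loss on teacher outputs; the names (make_block, fit, unlabeled) and all training details are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of the dual-stage grafting scheme, under the assumptions
# stated above. Block definitions, losses, and loop sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_block(c_in, c_out):
    # Hypothetical convolutional block standing in for one network stage.
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1),
                         nn.BatchNorm2d(c_out), nn.ReLU())

# Toy teacher and student sharing the same block-wise layout.
teacher_blocks = nn.ModuleList([make_block(3, 64), make_block(64, 64), make_block(64, 64)])
student_blocks = nn.ModuleList([make_block(3, 64), make_block(64, 64), make_block(64, 64)])
teacher = nn.Sequential(*teacher_blocks).eval()
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher stays frozen throughout

# A handful of unlabeled samples: the few-shot setting uses no labels at all.
unlabeled = [torch.randn(8, 3, 32, 32) for _ in range(4)]

def fit(net, params, lr):
    # Match the grafted network's outputs to the frozen teacher's outputs.
    opt = torch.optim.Adam(params, lr=lr)
    for x in unlabeled:
        loss = F.mse_loss(net(x), teacher(x))
        opt.zero_grad()
        loss.backward()
        opt.step()

# Stage 1: graft each student block into the teacher on its own, and train
# its parameters intertwined with the surrounding frozen teacher blocks.
for i in range(len(student_blocks)):
    net = nn.Sequential(*teacher_blocks[:i], student_blocks[i], *teacher_blocks[i + 1:])
    fit(net, student_blocks[i].parameters(), lr=1e-3)

# Stage 2: progressively connect the trained student blocks and fine-tune
# them jointly, so they adapt to each other and eventually replace the teacher.
for upto in range(2, len(student_blocks) + 1):
    net = nn.Sequential(*student_blocks[:upto], *teacher_blocks[upto:])
    fit(net, student_blocks[:upto].parameters(), lr=1e-4)
```

Here a plain MSE on block outputs stands in for whichever distillation loss the paper actually optimizes; the point of the sketch is the two-stage structure, where each student block first learns in the teacher's context and the grafted span then grows until the student is self-contained.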