May, 2022
Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt
Xinyin Ma, Xinchao Wang, Gongfan Fang, Yongliang Shen, Weiming Lu
TL;DR
This work proposes PromptDFD, a prompt-based data-free knowledge distillation method that leverages a pre-trained generative model to supply language priors, further improving the quality of data synthesis and yielding significant gains in distillation performance.
Abstract
Data-free knowledge distillation (DFKD) conducts knowledge distillation while eliminating the dependence on the original training data, and has recently achieved impressive results in accelerating pre-trained language models.
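To make the idea above concrete, the following is a minimal, illustrative sketch of the general recipe behind prompt-based data-free distillation: a pre-trained generative language model synthesizes text from a topic prompt, and a student is trained to match the teacher's outputs on that synthetic batch instead of real data. This is not the authors' implementation; the model names (gpt2, bert-base-uncased, distilbert-base-uncased), the fixed prompt, and all hyperparameters are assumptions, and the paper's reinforced prompt (optimizing the prompt itself) is omitted here.

```python
# Illustrative sketch only: prompt-driven data synthesis + distillation.
# All model names, the prompt, and hyperparameters are placeholder assumptions.
import torch
import torch.nn.functional as F
from transformers import (AutoModelForCausalLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pre-trained generator supplies the language prior (GPT-2 as a stand-in).
gen_tok = AutoTokenizer.from_pretrained("gpt2")
generator = AutoModelForCausalLM.from_pretrained("gpt2").to(device).eval()

# Teacher (in practice a task-finetuned model) and a smaller student.
cls_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
teacher = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased").to(device).eval()
student = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased").to(device)

optimizer = torch.optim.AdamW(student.parameters(), lr=3e-5)
temperature = 2.0  # softens teacher logits for distillation
prompt = "Write a short movie review:"  # hypothetical fixed topic prompt

for step in range(100):
    # 1) Synthesize a batch of sentences from the prompt; no real data is used.
    inputs = gen_tok(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        samples = generator.generate(**inputs, do_sample=True, top_p=0.95,
                                     max_new_tokens=40, num_return_sequences=8,
                                     pad_token_id=gen_tok.eos_token_id)
    texts = gen_tok.batch_decode(samples, skip_special_tokens=True)

    # 2) Distill: match the student's logits to the teacher's on the synthetic batch.
    batch = cls_tok(texts, padding=True, truncation=True, return_tensors="pt").to(device)
    with torch.no_grad():
        t_logits = teacher(**batch).logits
    s_logits = student(**batch).logits
    loss = F.kl_div(F.log_softmax(s_logits / temperature, dim=-1),
                    F.softmax(t_logits / temperature, dim=-1),
                    reduction="batchmean") * temperature ** 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The "reinforced prompt" in the title presumably refers to learning the prompt itself (e.g., via a reward on the synthesized samples) so that generations stay thematically relevant to the downstream task; the fixed prompt above is only the simplest stand-in for that component.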