BriefGPT.xyz
Jun, 2020
Self-Knowledge Distillation: A Simple Way for Better Generalization
Kyungyul Kim, ByeongMoon Ji, Doyoung Yoon, Sangheum Hwang
TL;DR
This paper introduces an effective regularization method called progressive self-knowledge distillation, which applies to any supervised learning task with hard targets. It improves a model's generalization performance and confidence estimates, and achieves experimental results that outperform the baselines.
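As a rough illustration of the idea in the TL;DR, the sketch below softens a hard one-hot target by mixing in the model's own past prediction, with the mixing weight growing as training progresses. The function name, the linear schedule, and `alpha_max` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def progressive_soft_target(hard_onehot, past_probs, epoch, total_epochs, alpha_max=0.8):
    """Blend the one-hot label with the model's own past prediction.

    The mixing weight alpha grows linearly with training progress, so the
    targets are refined progressively (the schedule here is illustrative).
    """
    alpha = alpha_max * (epoch / total_epochs)
    return (1.0 - alpha) * hard_onehot + alpha * past_probs

# Example: 3-class problem, true class is class 0.
hard = np.array([1.0, 0.0, 0.0])
past = np.array([0.7, 0.2, 0.1])   # model's prediction from an earlier epoch
soft = progressive_soft_target(hard, past, epoch=5, total_epochs=10)
```

Because both inputs are probability distributions and the blend is convex, the resulting soft target still sums to 1 and can be used directly in a standard cross-entropy loss.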
Abstract
The generalization capability of deep neural networks has been substantially improved by applying a wide spectrum of regularization methods, e.g., restricting function space, injecting randomness during training, …