Aug, 2022
MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition
Chuanguang Yang, Zhulin An, Helong Zhou, Linhang Cai, Xiang Zhi...
TL;DR
This paper proposes applying Self-KD to image Mixup (MixSKD). The method mutually distills the feature maps and probability distributions between the original images and their Mixup image, so that cross-image knowledge guides the network's learning. Experiments show that MixSKD outperforms other state-of-the-art Self-KD and data augmentation methods.
Abstract
Unlike conventional Knowledge Distillation (KD), Self-KD allows a network to learn knowledge from itself without any guidance from extra networks. This paper proposes to perform Self-KD from image Mixup (MixSKD).
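To make the idea concrete, below is a minimal sketch of the probability-distribution part of Mixup-based self-distillation as described in the TL;DR: the prediction on the Mixup image is encouraged to match the mixup of the (detached) predictions on the two source images, alongside the usual supervised terms. The function name `mixskd_style_loss`, the Beta mixing coefficient, and the single KL term are illustrative assumptions, not the authors' implementation; the actual MixSKD also distills intermediate feature maps.

```python
import torch
import torch.nn.functional as F

def mixskd_style_loss(model, x, y, alpha=0.2, temperature=4.0):
    """Cross-entropy on original and mixed images plus a soft consistency
    term between the prediction on the Mixup image and the mixup of the
    predictions on the two source images (a sketch, not the paper's code)."""
    # Sample a mixing coefficient and build the Mixup image from a shuffled batch.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0), device=x.device)
    x_mix = lam * x + (1.0 - lam) * x[perm]

    logits = model(x)          # predictions on the original images
    logits_mix = model(x_mix)  # prediction on the Mixup image

    # Supervised terms: standard CE on originals, mixed CE on the Mixup image.
    ce = F.cross_entropy(logits, y)
    ce_mix = lam * F.cross_entropy(logits_mix, y) + \
             (1.0 - lam) * F.cross_entropy(logits_mix, y[perm])

    # Self-distillation: align the mixed prediction with the mixup of the
    # detached soft predictions on the original images.
    with torch.no_grad():
        soft_target = lam * F.softmax(logits / temperature, dim=1) + \
                      (1.0 - lam) * F.softmax(logits[perm] / temperature, dim=1)
    kd = F.kl_div(F.log_softmax(logits_mix / temperature, dim=1),
                  soft_target, reduction="batchmean") * temperature ** 2

    return ce + ce_mix + kd
```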