Nov 2020
Channel-wise Knowledge Distillation for Dense Prediction
Channel-wise Distillation for Semantic Segmentation
Changyong Shu, Yifan Liu, Jianfei Gao, Lin Xu, Chunhua Shen
TL;DR
Uses a KL divergence for channel-wise distillation in semantic segmentation; at a lower computational cost, it outperforms all existing spatial distillation methods and can be regarded as an effective knowledge distillation approach.
Abstract
Knowledge distillation (KD) has been proven to be a simple and effective tool for training compact models. Almost all KD variants for semantic segmentation align the student and teacher networks' feature maps in …
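
The TL;DR above describes the core idea: each channel's activation map is normalized into a probability distribution over spatial locations, and a KL divergence pulls the student's per-channel distribution toward the teacher's. Below is a minimal PyTorch sketch of such a channel-wise distillation loss, assuming matched feature shapes (N, C, H, W); the function name, temperature value, and reduction scheme are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn.functional as F

def channel_wise_distillation_loss(student_feat: torch.Tensor,
                                   teacher_feat: torch.Tensor,
                                   tau: float = 4.0) -> torch.Tensor:
    # Both feature maps are assumed to have shape (N, C, H, W).
    n, c, _, _ = student_feat.shape
    # Flatten the spatial dimensions so the softmax runs over all H*W
    # positions within each channel.
    s = student_feat.reshape(n, c, -1)
    t = teacher_feat.reshape(n, c, -1)
    # Teacher: probability distribution over spatial positions per channel.
    p_t = F.softmax(t / tau, dim=-1)
    # Student: log-probabilities for the KL term.
    log_p_s = F.log_softmax(s / tau, dim=-1)
    # KL(teacher || student), summed over spatial positions, averaged over
    # samples and channels, and scaled by tau^2 as is common in
    # temperature-based distillation.
    kl = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=-1)
    return kl.mean() * (tau ** 2)

In practice this term would be added, with a weighting factor, to the student's usual segmentation loss during training.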