Apr, 2024
MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution
Yuxuan Jiang, Chen Feng, Fan Zhang, David Bull
TL;DR
We propose a novel Multi-Teacher Knowledge Distillation (MTKD) framework designed specifically for image super-resolution. It guides the learning of a compact student network by combining and enhancing the outputs of multiple teacher models, and optimizes training by measuring differences in both the spatial and frequency domains, achieving a clear improvement in super-resolution performance.
Abstract
Knowledge distillation (KD) has emerged as a promising technique in deep learning, typically employed to enhance a compact student network through learning from its high-performance but more complex teacher variants.
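As a rough illustration of the idea described above, the sketch below fuses the outputs of several teacher models into a single distillation target and compares the student's output against it in both the spatial and frequency domains. This is a minimal NumPy sketch, not the paper's actual method: the simple averaging in `combine_teachers`, the FFT-magnitude comparison, and the `alpha` weighting are all illustrative assumptions.

```python
import numpy as np

def combine_teachers(teacher_outputs):
    # Hypothetical aggregation: average the super-resolved outputs
    # of several teacher models into one distillation target.
    # (The actual MTKD framework uses a learned combination.)
    return np.mean(np.stack(teacher_outputs), axis=0)

def distillation_loss(student_out, target, alpha=0.5):
    # Spatial-domain term: mean absolute error between images.
    spatial = np.mean(np.abs(student_out - target))
    # Frequency-domain term: compare 2D FFT magnitudes, so the
    # student is also penalized for missing high-frequency detail.
    f_s = np.abs(np.fft.fft2(student_out))
    f_t = np.abs(np.fft.fft2(target))
    freq = np.mean(np.abs(f_s - f_t))
    # Weighted sum of the two terms; alpha is an assumed hyperparameter.
    return alpha * spatial + (1 - alpha) * freq

# Toy example with random arrays standing in for SR outputs.
rng = np.random.default_rng(0)
teachers = [rng.random((8, 8)) for _ in range(3)]
student = rng.random((8, 8))
target = combine_teachers(teachers)
loss = distillation_loss(student, target)
```

The frequency term reflects the paper's stated idea of observing differences in both domains; in practice a super-resolution student would be trained on batches of multi-channel image tensors rather than single 2D arrays.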