BriefGPT.xyz
Jun, 2024
Teaching with Uncertainty: Unleashing the Potential of Knowledge Distillation in Object Detection
Junfei Yi, Jianxu Mao, Tengfei Liu, Mingjie Li, Hanyu Gu...
TL;DR
Proposes a feature-based knowledge-uncertainty distillation paradigm that integrates seamlessly with existing distillation methods. It introduces knowledge uncertainty via Monte Carlo dropout, improving the student model's ability to explore latent knowledge, and its effectiveness is validated on object detection tasks.
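The idea summarized above can be illustrated with a minimal NumPy sketch: Monte Carlo dropout perturbs the teacher's feature map several times, the variance across perturbations serves as a knowledge-uncertainty estimate, and that estimate modulates a feature-mimicking loss. All function names and the specific weighting scheme below are hypothetical illustrations, not the paper's implementation:

```python
import numpy as np

def mc_dropout_uncertainty(teacher_feat, n_samples=20, p=0.5, seed=0):
    """Estimate knowledge uncertainty of a teacher feature map via
    Monte Carlo dropout: apply independent random dropout masks and
    measure the per-element variance across the perturbed features.
    (Illustrative sketch; the paper's exact formulation may differ.)"""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(n_samples):
        mask = rng.random(teacher_feat.shape) > p
        # Inverted-dropout scaling keeps the expected activation unchanged.
        samples.append(teacher_feat * mask / (1.0 - p))
    samples = np.stack(samples)
    return samples.mean(axis=0), samples.var(axis=0)

def uncertainty_weighted_kd_loss(student_feat, teacher_feat, uncertainty):
    """Feature-mimicking MSE, down-weighted where the teacher's
    knowledge is uncertain (hypothetical weighting scheme)."""
    weight = 1.0 / (1.0 + uncertainty)
    return float(np.mean(weight * (student_feat - teacher_feat) ** 2))
```

A plug-in design like this is what allows the uncertainty term to be combined with existing feature-based distillation losses: the weighting multiplies whatever per-element mimicking loss is already in use.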
Abstract
Knowledge distillation (KD) is a widely adopted and effective method for compressing models in object detection tasks. Particularly, feature-based …