May 2021
Revisiting Knowledge Distillation for Object Detection
Amin Banitalebi-Dehkordi
TL;DR
This work proposes an object detection distillation method based on training a student model on pseudo labels and then fine-tuning it. The approach can improve model performance using unlabeled data, reduces the need for annotations, and can also be applied to domain adaptation. Experiments show that the method achieves better object detection performance.
Abstract
The existing solutions for object detection distillation rely on the availability of both a teacher model and ground-truth labels. We propose a new perspective to relax this constraint. In our framework, a student is first trained on pseudo labels generated by the teacher, and then fine-tuned using labeled data when available.