Oct 2020

Comprehensive Attention Self-Distillation for Weakly Supervised Object Detection

TL;DR: Comprehensive Attention Self-Distillation (CASD) is a training approach for Weakly Supervised Object Detection (WSOD). It computes a comprehensive attention map aggregated over multiple transformations of the same image and then self-distills the WSOD network toward that map, so the network detects objects consistently regardless of how the input is transformed.
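The core mechanism can be illustrated with a toy sketch. Everything below is an assumption for illustration only (function names, the elementwise-max aggregation, and the squared-error distillation loss are stand-ins, not the paper's exact formulation): attention maps from several transformations of one image are aggregated into a "comprehensive" map, and each individual map is pulled toward it.

```python
import numpy as np

def comprehensive_attention_distillation(attention_maps):
    """Toy sketch of the CASD idea (assumed details, not the paper's code):
    aggregate per-transformation attention maps into a comprehensive map
    (elementwise max, i.e. the union of evidence), then score how far each
    individual map is from that shared target (the self-distillation loss)."""
    comprehensive = np.maximum.reduce(attention_maps)
    # Mean squared gap between each map and the comprehensive target
    loss = float(np.mean([np.mean((m - comprehensive) ** 2)
                          for m in attention_maps]))
    return comprehensive, loss

# Hypothetical attention maps from an image and its horizontal flip
# (the flipped map is assumed to be already aligned back to the original)
orig = np.array([[0.9, 0.1], [0.2, 0.0]])
flip = np.array([[0.1, 0.8], [0.2, 0.0]])
comp, loss = comprehensive_attention_distillation([orig, flip])
```

Minimizing such a loss pushes every transformed view toward the same attention target, which is the "consistent detection across transformations" behavior the summary describes.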