BriefGPT.xyz
Keyword: comprehensive attention self-distillation
Search results - 1
Comprehensive Attention Self-Distillation for Weakly Supervised Object Detection
Comprehensive Attention Self-Distillation (CASD) is a new training approach for Weakly Supervised Object Detection (WSOD).
4 years ago