BriefGPT.xyz
Aug, 2021
Fast Convergence of DETR with Spatially Modulated Co-Attention
Peng Gao, Minghang Zheng, Xiaogang Wang, Jifeng Dai, Hongsheng Li
TL;DR
By introducing a spatially weighted co-attention mechanism that reshapes how object queries attend to image features, the training efficiency of the DETR model is substantially improved.
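The spatial weighting described above can be sketched as follows. This is a minimal NumPy illustration of the core idea (a Gaussian-like spatial prior, centered on a query's predicted object location, multiplied into the co-attention map before renormalization), not the authors' implementation; the function name, shapes, and the `scale` hyperparameter are assumptions for illustration.

```python
import numpy as np

def smca_weights(logits, center, scale, H, W):
    """Modulate co-attention logits for one object query with a
    Gaussian-like spatial prior (illustrative sketch, not SMCA-DETR code).

    logits: (H*W,) raw cross-attention logits over feature-map locations
    center: (cx, cy) predicted object center, normalized to [0, 1]
    scale:  spread of the Gaussian prior (assumed hyperparameter)
    """
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # offset of each feature-map location from the predicted center
    dx = xs / W - center[0]
    dy = ys / H - center[1]
    # log of an (unnormalized) Gaussian; adding it to the logits is
    # equivalent to multiplying the attention map by the Gaussian
    # before the softmax renormalizes it
    log_prior = -(dx ** 2 + dy ** 2) / (2 * scale ** 2)
    modulated = logits + log_prior.reshape(-1)
    # softmax over all spatial locations
    e = np.exp(modulated - modulated.max())
    return e / e.sum()

# toy usage: uniform logits, prior centered in the middle of an 8x8 map
H, W = 8, 8
attn = smca_weights(np.zeros(H * W), (0.5, 0.5), 0.2, H, W)
```

With uniform logits, the modulated attention concentrates around the predicted center, which is the mechanism credited with speeding up convergence: queries no longer have to learn where to look from scratch.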
Abstract
The recently proposed Detection Transformer (DETR) model successfully applies the Transformer to object detection and achieves comparable per…