Jun, 2021
CAT: Cross Attention in Vision Transformer
Hezheng Lin, Xing Cheng, Xiangyu Wu, Fan Yang, Dong Shen...
TL;DR
Explores replacing the conventional self-attention mechanism with a cross-attention mechanism to make Transformers practical for computer vision tasks; experiments show that the approach reaches state-of-the-art levels on ImageNet-1K, COCO, and ADE20K while reducing computational cost.
Abstract
Since Transformer has found widespread use in NLP, the potential of Transformer in CV has been realized and has inspired many new approaches.
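As a rough illustration of the cross-attention idea summarized above, the sketch below shows a generic multi-head cross-attention layer in PyTorch, where queries come from one set of patch tokens and keys/values come from another. This is a minimal sketch under stated assumptions, not the authors' CAT implementation (whose block design and patch partitioning differ); the class name CrossAttention, the dim/num_heads arguments, and the token shapes are illustrative.

```python
# Minimal cross-attention sketch (illustrative, not the CAT paper's code):
# queries are computed from one token set, keys/values from another.
import torch
import torch.nn as nn


class CrossAttention(nn.Module):
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.to_q = nn.Linear(dim, dim)       # queries from the first token set
        self.to_kv = nn.Linear(dim, dim * 2)  # keys and values from the second set
        self.proj = nn.Linear(dim, dim)

    def forward(self, x_q: torch.Tensor, x_kv: torch.Tensor) -> torch.Tensor:
        # x_q: (B, Nq, C) query tokens; x_kv: (B, Nk, C) key/value tokens
        B, Nq, C = x_q.shape
        Nk = x_kv.shape[1]
        H = self.num_heads
        q = self.to_q(x_q).reshape(B, Nq, H, C // H).transpose(1, 2)          # (B, H, Nq, C/H)
        kv = self.to_kv(x_kv).reshape(B, Nk, 2, H, C // H).permute(2, 0, 3, 1, 4)
        k, v = kv[0], kv[1]                                                    # (B, H, Nk, C/H)
        attn = (q @ k.transpose(-2, -1)) * self.scale                          # (B, H, Nq, Nk)
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, Nq, C)                     # (B, Nq, C)
        return self.proj(out)


if __name__ == "__main__":
    # Hypothetical usage: tokens inside one patch attend to tokens from elsewhere.
    layer = CrossAttention(dim=96, num_heads=4)
    patch_tokens = torch.randn(2, 49, 96)   # query tokens within a patch
    other_tokens = torch.randn(2, 16, 96)   # key/value tokens from other patches
    print(layer(patch_tokens, other_tokens).shape)  # torch.Size([2, 49, 96])
```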