BriefGPT.xyz
May, 2023
Dual Path Transformer with Partition Attention
Zhengkai Jiang, Liang Liu, Jiangning Zhang, Yabiao Wang, Mingang Chen...
TL;DR
This paper introduces a novel dual attention mechanism combining local attention generated by convolutional neural networks with long-range attention generated by a Vision Transformer. To address computational complexity and memory footprint, it proposes a Multi-Head Partition Attention (MHPA) mechanism, and builds on it a hierarchical vision backbone, DualFormer, which achieves strong performance across multiple computer vision tasks.
Abstract
This paper introduces a novel attention mechanism, called dual attention, which is both efficient and effective. The dual attention mechanism consists of two parallel components: local attention generated by convolutional neural networks and long-range attention generated by a Vision Transformer.
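The complexity reduction behind partition attention can be illustrated with a minimal sketch: restricting attention to non-overlapping spatial windows drops the cost from O((H·W)²) for global attention to O(H·W·p²) for windows of side p. This is a generic NumPy sketch of single-head windowed attention, not the paper's exact MHPA formulation (the partition scheme and head layout here are assumptions):

```python
import numpy as np

def partition_attention(x, part):
    """Attention computed independently inside each non-overlapping
    part x part spatial window (a generic sketch, not the paper's MHPA).

    x: (H, W, C) feature map; part: partition side length.
    """
    H, W, C = x.shape
    assert H % part == 0 and W % part == 0, "H and W must divide by part"
    out = np.zeros_like(x)
    scale = 1.0 / np.sqrt(C)
    for i in range(0, H, part):
        for j in range(0, W, part):
            # Flatten one window into (part*part, C) tokens.
            win = x[i:i + part, j:j + part].reshape(-1, C)
            # Scaled dot-product attention restricted to the window.
            attn = win @ win.T * scale
            attn = np.exp(attn - attn.max(axis=-1, keepdims=True))
            attn /= attn.sum(axis=-1, keepdims=True)
            out[i:i + part, j:j + part] = (attn @ win).reshape(part, part, C)
    return out

# Tokens in one partition never attend outside it, which is what
# bounds the cost; a parallel convolutional branch (as in the paper's
# dual attention) would supply the complementary local information.
feat = np.random.randn(8, 8, 16)
y = partition_attention(feat, part=4)
print(y.shape)
```

Because each window is processed independently, perturbing a token outside a window leaves that window's output unchanged, which is easy to verify numerically.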