BriefGPT.xyz
Jul, 2023
Simple parameter-free self-attention approximation
Yuwen Zhai, Jing Hao, Liang Gao, Xinyu Li, Yiping Gao...
TL;DR
We propose SPSA, a self-attention approximation that requires no trainable parameters. It has linear complexity, is combined with convolution to capture global spatial features, and its effectiveness is validated by extensive experiments on image classification and object detection tasks.
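This summary does not specify how SPSA is constructed, so the following is only an illustrative sketch of the general idea of a parameter-free, linear-complexity attention approximation: drop the learned Q/K/V projections (using the input itself for all three) and apply a positive feature map so the key-value product can be precomputed once, making the cost linear in token length. The function name and feature map here are assumptions for illustration, not the paper's method.

```python
import numpy as np

def parameter_free_linear_attention(x):
    """Attention-like aggregation with no learned parameters.

    x: (N, d) array of token features. Q, K, V are all x itself
    (parameter-free); a positive feature map replaces softmax so the
    (d, d) key-value summary can be formed once, giving O(N * d^2)
    cost -- linear in the number of tokens N.
    """
    phi = lambda t: np.maximum(t, 0.0) + 1e-6  # positive feature map
    q, k, v = phi(x), phi(x), x
    kv = k.T @ v                 # (d, d) summary, independent of N
    z = q @ k.sum(axis=0)        # (N,) per-token normalization
    return (q @ kv) / z[:, None] # (N, d) weighted token mixtures
```

Because the attention weights are positive and normalized, each output token is a weighted average of the input tokens; the quadratic N-by-N attention matrix is never materialized.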
Abstract
The hybrid model of self-attention and convolution is one of the methods to lighten ViT. The quadratic computational complexity of self-attention with respect to token length limits the efficiency of ViT on edge …