Oct, 2023
Tailoring Self-Attention for Graph via Rooted Subtrees
Siyuan Huang, Yunchong Song, Jiayue Zhou, Zhouhan Lin
TL;DR
This paper introduces Subtree Attention (STA), a novel multi-hop graph attention mechanism that bridges the gap between local and global attention and attains linear time complexity through an efficient formulation. A comprehensive evaluation on ten node-classification datasets shows that STA-based models outperform existing graph transformers and mainstream graph neural networks.
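To make the idea of attending over rooted-subtree (multi-hop) neighborhoods concrete, here is a minimal illustrative sketch, not the authors' implementation: each node attends over hop-wise aggregated representations of its own rooted subtree. The module name `MultiHopAttention`, the dense adjacency handling, and the plain softmax over hop levels are assumptions for brevity; the paper's linear-time kernelized formulation is not reproduced here.

```python
# Illustrative sketch only: generic multi-hop attention over rooted-subtree
# (k-hop) levels. Dense adjacency is used for clarity, not efficiency.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHopAttention(nn.Module):
    def __init__(self, in_dim, hid_dim, num_hops=3):
        super().__init__()
        self.num_hops = num_hops
        self.proj = nn.Linear(in_dim, hid_dim)
        self.query = nn.Linear(hid_dim, hid_dim)
        self.key = nn.Linear(hid_dim, hid_dim)

    def forward(self, x, adj):
        # x: [N, in_dim] node features; adj: [N, N] dense adjacency matrix.
        deg = adj.sum(-1).clamp(min=1.0)
        a_norm = adj / deg.unsqueeze(-1)      # row-normalized adjacency
        h = self.proj(x)                      # hop-0 (root) representation
        hops, cur = [h], h
        for _ in range(self.num_hops):
            cur = a_norm @ cur                # propagate one more hop outward
            hops.append(cur)
        H = torch.stack(hops, dim=1)          # [N, K+1, hid_dim] subtree levels
        q = self.query(h).unsqueeze(1)        # each root queries its own levels
        k = self.key(H)
        att = F.softmax((q * k).sum(-1) / k.size(-1) ** 0.5, dim=1)  # [N, K+1]
        return (att.unsqueeze(-1) * H).sum(1)  # attention-weighted mix of hops

# Usage: out = MultiHopAttention(16, 32)(torch.randn(5, 16), torch.eye(5))
```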
Abstract
Attention mechanisms have made significant strides in graph learning, yet they still exhibit notable limitations: local attention faces challenges in capturing long-range information due to the inherent problems …