February 2024
Masked Attention is All You Need for Graphs
David Buterez, Jon Paul Janet, Dino Oglic, Pietro Lio
TL;DR
The paper proposes a simple, attention-based alternative for learning on graphs, called masked attention for graphs (MAG). MAG achieves state-of-the-art performance on long-range tasks and outperforms strong message-passing baselines and far more involved attention-based methods on over 55 node- and graph-level tasks. Compared with graph neural networks, it also shows significantly better transfer learning capabilities and comparable or better time and memory scaling, with memory scaling sub-linearly in the number of nodes or edges.
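The TL;DR describes MAG only at a high level. The snippet below is a minimal sketch, not the authors' implementation, of the general idea of masking self-attention with graph structure: scores between non-adjacent nodes are set to negative infinity before the softmax, so each node attends only to its neighbours. The function name `masked_graph_attention` and the choice of the adjacency matrix (with self-loops) as the mask are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch of graph-masked self-attention (single head, identity
# projections for brevity). Not the MAG architecture from the paper.
import torch

def masked_graph_attention(x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    """x: (N, d) node features; adj: (N, N) 0/1 adjacency, assumed to include self-loops."""
    d = x.size(-1)
    scores = x @ x.transpose(0, 1) / d ** 0.5              # (N, N) pairwise attention scores
    scores = scores.masked_fill(adj == 0, float("-inf"))   # block attention along non-edges
    weights = torch.softmax(scores, dim=-1)                # each row normalises over neighbours
    return weights @ x                                      # (N, d) updated node features

# Toy usage: a 4-node path graph with self-loops.
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]], dtype=torch.float)
x = torch.randn(4, 8)
print(masked_graph_attention(x, adj).shape)  # torch.Size([4, 8])
```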
Abstract
Graph neural networks (GNNs) and variations of the message passing algorithm are the predominant means for learning on graphs, largely due to their flexibility, speed, and satisfactory performance. The design of …