Feb 2024
Two Heads Are Better Than One: Boosting Graph Sparse Training via Semantic and Topological Awareness
Guibin Zhang, Yanwei Yue, Kun Wang, Junfeng Fang, Yongduo Sui...
TL;DR
Graph Sparse Training (GST) dynamically adjusts sparsity at the data level and, guided by the Equilibria Sparsification Principle, balances topological and semantic information to produce a sparse graph with maximal topological integrity and no performance degradation.
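As a rough illustration of the data-level sparsification idea (not the paper's actual algorithm), the sketch below scores each edge with a blend of a semantic term (feature similarity of its endpoints) and a simple topological term (a degree-based proxy for structural importance), then keeps only the top-scoring fraction of edges. The function name, the score definitions, and the `alpha` / `keep_ratio` parameters are assumptions made for this example.

```python
# Minimal sketch of semantic+topological edge sparsification (illustrative only;
# the scoring terms here are stand-ins, not the Equilibria Sparsification Principle).
import torch

def sparsify_edges(x, edge_index, keep_ratio=0.5, alpha=0.5):
    """Keep the top `keep_ratio` fraction of edges by a combined score.

    x          : [num_nodes, num_features] node feature matrix
    edge_index : [2, num_edges] COO edge list
    alpha      : assumed trade-off weight between semantic and topological terms
    """
    src, dst = edge_index[0], edge_index[1]

    # Semantic score: cosine similarity between endpoint features.
    sem = torch.nn.functional.cosine_similarity(x[src], x[dst], dim=-1)

    # Topological score: favor edges touching low-degree nodes so the sparse
    # graph is less likely to disconnect (a common heuristic, not the paper's metric).
    deg = torch.bincount(torch.cat([src, dst]), minlength=x.size(0)).float()
    topo = 1.0 / (deg[src] * deg[dst]).clamp(min=1.0).sqrt()

    score = alpha * sem + (1 - alpha) * topo
    k = max(1, int(keep_ratio * edge_index.size(1)))
    keep = torch.topk(score, k).indices
    return edge_index[:, keep]

# Toy usage: 4 nodes, 5 edges, keep ~60% of edges.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 0, 1, 2, 3],
                           [1, 2, 2, 3, 0]])
print(sparsify_edges(x, edge_index, keep_ratio=0.6))
```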
Abstract
Graph Neural Networks (GNNs) excel in various graph learning tasks but face computational challenges when applied to large-scale graphs. A promising solution is to remove non-essential edges to reduce the computational...