Jun, 2024
Distributed Training of Large Graph Neural Networks with Variable Communication Rates
Juan Cervino, Md Asadullah Turja, Hesham Mostafa, Nageen Himayat, Alejandro Ribeiro
TL;DR
Introduces a variable compression scheme into distributed graph neural network training that reduces communication volume without degrading the accuracy of the learned model; theoretical analysis and empirical results show it outperforms any fixed compression ratio relative to full communication.
Abstract
Training graph neural networks (GNNs) on large graphs presents unique challenges due to the large memory and computing requirements. Distributed GNN training, where the graph is partitioned across multiple machines…
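The abstract is cut off above, but the TL;DR describes compressing the messages exchanged between machines at a variable rate. As a minimal illustrative sketch, not the paper's actual method, the snippet below uses uniform quantization of node features at a configurable bit-width, so that a lower bit-width trades accuracy for reduced communication:

```python
import numpy as np

def quantize(x, bits):
    # Uniformly quantize x to 2**bits levels over its observed range,
    # then dequantize back to floats (simulating lossy communication).
    # Assumes x is not constant (hi > lo).
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    codes = np.round((x - lo) / (hi - lo) * levels)  # integer codes to transmit
    return codes * (hi - lo) / levels + lo           # receiver-side reconstruction

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))  # toy node features living on a remote partition

# Fewer bits means fewer bytes on the wire but larger reconstruction error.
for bits in (2, 4, 8):
    err = np.abs(quantize(feats, bits) - feats).max()
    print(f"{bits} bits -> max error {err:.4f}")
```

A variable-rate scheme in this spirit would adjust `bits` over the course of training, e.g. communicating coarsely in early epochs and more precisely later; the specific schedule used by the paper is not visible in this truncated abstract.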