Jun, 2024
A Pure Transformer Pretraining Framework on Text-attributed Graphs
Yu Song, Haitao Mao, Jiachen Xiao, Jingzhe Liu, Zhikai Chen...
TL;DR
The graph sequence pretraining framework GSPT leverages a unified textual representation, achieving strong transferability and empirical success in the graph domain.
Abstract
Pretraining plays a pivotal role in acquiring generalized knowledge from large-scale data, achieving remarkable successes as evidenced by large models in CV and NLP. However, progress in the graph domain remains …