Unsupervised graph representation learning (GRL) aims to distill diverse graph information into task-agnostic embeddings without label supervision. Lacking labels, recent representation learning methods usually adopt self-supervised learning, where embeddings are learned by solving pretext tasks.
Graph self-supervised learning can be further improved by leveraging multiple pretext tasks through the AutoSSL framework, which combines the tasks using homophily as guidance and significantly boosts performance on downstream tasks.
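The core idea of homophily-guided task combination can be sketched as follows. This is a minimal illustrative example, not the actual AutoSSL implementation: the graph, the candidate task weightings, and the pseudo-labels they would induce are all mocked, and the selection step simply prefers the weighting whose pseudo-labels maximize edge homophily.

```python
def edge_homophily(edges, labels):
    """Fraction of edges whose endpoints share a (pseudo-)label."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: 6 nodes, edges mostly within the two groups {0,1,2} and {3,4,5}.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]

# Hypothetical candidates: each weighting of two pretext tasks (task_a, task_b)
# yields embeddings that, after clustering, give these pseudo-labels (mocked).
candidates = {
    (0.9, 0.1): [0, 0, 0, 1, 1, 1],
    (0.5, 0.5): [0, 1, 0, 1, 0, 1],
    (0.1, 0.9): [0, 0, 1, 1, 0, 1],
}

# AutoSSL-style selection: keep the task weighting whose resulting
# pseudo-labels have the highest homophily on the graph.
best_weights = max(candidates, key=lambda w: edge_homophily(edges, candidates[w]))
print(best_weights, edge_homophily(edges, candidates[best_weights]))
```

In a real system the pseudo-labels would come from clustering the embeddings produced under each candidate weighting, and the search over weights would be continuous rather than a fixed grid; the sketch only shows why homophily can serve as a label-free selection signal.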