Jun, 2024
CorrMAE: Pre-training Correspondence Transformers with Masked Autoencoder
Tangfei Liao, Xiaoqin Zhang, Guobao Xiao, Min Li, Tao Wang...
TL;DR
We propose a pre-training method that learns general inlier-consistency representations by reconstructing masked correspondences, providing strong initial representations that yield significant improvements on downstream tasks.
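To make the TL;DR concrete, below is a minimal, hypothetical PyTorch sketch of masked-autoencoder pre-training on putative correspondences, here assumed to be 4-vectors (x1, y1, x2, y2) of matched keypoint coordinates. All module names, dimensions, masking ratio, and loss details are illustrative assumptions, not CorrMAE's actual architecture.

```python
# Hypothetical sketch: masked-autoencoder pre-training over a set of
# correspondences. Not the paper's implementation; sizes are illustrative.
import torch
import torch.nn as nn

class CorrespondenceMAE(nn.Module):
    def __init__(self, dim=128, depth=4, heads=4, mask_ratio=0.5):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(4, dim)  # lift (x1, y1, x2, y2) into token space
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        enc = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc, depth)
        dec = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec, 1)
        self.head = nn.Linear(dim, 4)   # reconstruct masked coordinates

    def forward(self, corr):            # corr: (B, N, 4) putative correspondences
        B, N, _ = corr.shape
        num_mask = int(N * self.mask_ratio)
        perm = torch.rand(B, N, device=corr.device).argsort(dim=1)
        masked_idx, visible_idx = perm[:, :num_mask], perm[:, num_mask:]

        tokens = self.embed(corr)       # (B, N, dim)
        visible = torch.gather(
            tokens, 1, visible_idx.unsqueeze(-1).expand(-1, -1, tokens.size(-1)))
        latent = self.encoder(visible)  # encode visible correspondences only

        # Fill every position with the learned mask token, then scatter the
        # encoded visible tokens back into their original slots and decode.
        full = self.mask_token.expand(B, N, -1).clone()
        full.scatter_(
            1, visible_idx.unsqueeze(-1).expand(-1, -1, latent.size(-1)), latent)
        decoded = self.decoder(full)

        # L2 reconstruction loss on the masked correspondences only.
        pred = torch.gather(
            self.head(decoded), 1, masked_idx.unsqueeze(-1).expand(-1, -1, 4))
        target = torch.gather(corr, 1, masked_idx.unsqueeze(-1).expand(-1, -1, 4))
        return ((pred - target) ** 2).mean()

# Usage: one pre-training step on a random batch of 512 correspondences.
model = CorrespondenceMAE()
loss = model(torch.rand(2, 512, 4))
loss.backward()
```

The design mirrors standard masked autoencoders: the encoder sees only visible tokens, so it must capture the global consistency structure of the correspondence set to let the decoder recover the masked ones.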
Abstract
Pre-training has emerged as a simple yet powerful methodology for representation learning across various domains. However, due to the expensive training cost and limited data, …