Jun, 2024
Domain Generalization Guided by Large-Scale Pre-Trained Priors
Zongbin Wang, Bin Pan, Shiyu Shen, Tianyang Shi, Zhenwei Shi
TL;DR
By leveraging pre-trained models during the fine-tuning stage of domain generalization algorithms, we propose a new fine-tuning method called Fine-Tuning with Large-scale Pre-trained Priors (FT-LP); experiments on multiple datasets and domain generalization models demonstrate its significant improvements and effectiveness.
Abstract
Domain generalization (DG) aims to train a model from limited source domains, allowing it to generalize to unknown target domains. Typically, DG models only employ large-scale pre-trained models during the initial …
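
The TL;DR describes fine-tuning a domain generalization model while keeping the large-scale pre-trained model as a prior. Below is a minimal sketch of that idea, assuming the prior is applied as an L2 penalty pulling the fine-tuned weights toward the frozen pre-trained parameters; the names ft_lp_step, prior_weight, and the toy backbone are illustrative assumptions, not the paper's actual FT-LP formulation.

import copy
import torch
import torch.nn as nn

def ft_lp_step(model, prior_model, batch, optimizer,
               criterion=nn.CrossEntropyLoss(), prior_weight=0.1):
    """One fine-tuning step regularized toward a frozen pre-trained prior."""
    x, y = batch
    optimizer.zero_grad()
    task_loss = criterion(model(x), y)
    # Penalize deviation of the fine-tuned weights from the pre-trained prior.
    prior_penalty = sum(
        ((p - q.detach()) ** 2).sum()
        for p, q in zip(model.parameters(), prior_model.parameters())
    )
    total = task_loss + prior_weight * prior_penalty
    total.backward()
    optimizer.step()
    return total.item()

# Usage: prior_model is a frozen copy of the pre-trained initialization.
model = nn.Linear(16, 4)            # stand-in for a DG backbone
prior_model = copy.deepcopy(model)  # frozen pre-trained prior
for p in prior_model.parameters():
    p.requires_grad_(False)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
batch = (torch.randn(8, 16), torch.randint(0, 4, (8,)))
print(ft_lp_step(model, prior_model, batch, optimizer))

The sketch only illustrates the general pattern of keeping the pre-trained model as a reference during fine-tuning; the paper's FT-LP method may encode the prior differently.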