Mar, 2024
Unleashing the Power of Meta-tuning for Few-shot Generalization Through Sparse Interpolated Experts
Shengzhuang Chen, Jihoon Tack, Yunqiao Yang, Yee Whye Teh, Jonathan Richard Schwarz...
TL;DR
Through a sparsified mixture-of-experts approach, sparse meta-tuning successfully improves the transferability of vision foundation models and establishes new state-of-the-art results in both zero-shot and gradient-based adaptation settings.
Abstract
Conventional wisdom suggests parameter-efficient fine-tuning of foundation models as the state-of-the-art method for transfer learning in vision, replacing the rich literature of alternatives such as meta-learning…