BriefGPT.xyz
Feb, 2024
Bayesian Multi-Task Transfer Learning for Soft Prompt Tuning
Haeju Lee, Minchan Jeong, Se-Young Yun, Kee-Eung Kim
TL;DR
We propose a Bayesian multi-task transfer learning approach that obtains representative source prompts by sampling from the posterior distribution and aggregates them to construct the initial target prompt, requiring no auxiliary model and achieving high parameter efficiency.
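The summary above can be illustrated with a minimal sketch. All names, shapes, and the simple mean-based aggregation here are assumptions for illustration, not the authors' exact procedure: it assumes posterior samples over each source task's soft prompt are already available as arrays, picks a representative prompt per task, and averages across tasks to initialize the target prompt.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 source tasks, each with 10 posterior samples of a
# soft prompt (5 tokens x 8-dim embeddings). In practice these samples
# would come from an approximate posterior over source-task prompts.
n_tasks, n_samples, prompt_len, emb_dim = 3, 10, 5, 8
posterior_samples = rng.normal(size=(n_tasks, n_samples, prompt_len, emb_dim))

def representative_prompt(samples):
    """Pick a representative source prompt; here, simply the posterior mean."""
    return samples.mean(axis=0)

# One representative prompt per source task.
source_prompts = np.stack([representative_prompt(s) for s in posterior_samples])

# Aggregate across tasks (a plain average, as a stand-in for the paper's
# aggregation) to form the initial target prompt, which would then be
# tuned on the target task.
initial_target_prompt = source_prompts.mean(axis=0)
print(initial_target_prompt.shape)
```

Because only the prompt embeddings (here 5 x 8 values) are transferred and tuned, no auxiliary model is needed, which is the source of the parameter efficiency claimed in the summary.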
Abstract
Prompt tuning, in which prompts are optimized to adapt large-scale pre-trained language models to downstream tasks instead of fine-tuning the full model parameters, has been shown to be particularly effective when …