BriefGPT.xyz
May 2023
Universality and Limitations of Prompt Tuning
Yihan Wang, Jatin Chauhan, Wei Wang, Cho-Jui Hsieh
TL;DR
This work studies prompt tuning of pretrained language models, analyzing its power in terms of universality and the limitations of fixed-weight pretrained transformers of bounded depth. It proves that prompt tuning has inherent limitations on transformers of bounded depth and gives a lower bound on the number of tunable prompt parameters required.
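To make the setting concrete, below is a minimal sketch of prompt tuning, not the paper's construction: a toy transformer's weights are frozen, and only a small block of "soft prompt" embeddings prepended to the input sequence is trained. All dimensions (`d_model`, `n_prompt`, `seq_len`) and the random target are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

d_model, n_prompt, seq_len = 16, 4, 8

# Toy "pretrained" transformer with all weights frozen.
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=2, batch_first=True)
model = nn.TransformerEncoder(encoder_layer, num_layers=1)
for p in model.parameters():
    p.requires_grad = False

# The soft prompt is the only tunable parameter block.
prompt = nn.Parameter(torch.randn(1, n_prompt, d_model))
opt = torch.optim.Adam([prompt], lr=1e-2)

x = torch.randn(2, seq_len, d_model)               # batch of input embeddings
target = torch.randn(2, seq_len + n_prompt, d_model)  # arbitrary regression target

for _ in range(5):
    # Prepend the prompt to every sequence in the batch, then run the frozen model.
    inp = torch.cat([prompt.expand(x.size(0), -1, -1), x], dim=1)
    loss = nn.functional.mse_loss(model(inp), target)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Number of tunable parameters: n_prompt * d_model, independent of model size.
trainable = prompt.numel()
print(trainable)  # → 64
```

The count `n_prompt * d_model` is exactly the quantity the paper's lower bound constrains: how many such prompt parameters are needed for a fixed transformer to represent a target task.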
Abstract
Despite the demonstrated empirical efficacy of prompt tuning to adapt a pretrained language model for a new task, the theoretical underpinnings of the difference between "tuning parameters before the input" against the tuning of model weights are limited.