Sep, 2021
PPT: Pre-trained Prompt Tuning for Few-shot Learning
Yuxian Gu, Xu Han, Zhiyuan Liu, Minlie Huang
TL;DR
This paper proposes a framework named PPT, which adds soft prompts during the pre-training stage to obtain a better initialization. Applying this pre-trained prompt tuning to downstream tasks can match or even outperform full-model fine-tuning, making it an effective and efficient approach for the practical use of large-scale pre-trained language models.
Abstract
Prompts for pre-trained language models (PLMs) have shown remarkable performance by bridging the gap between pre-training tasks and various downstream tasks. Among these methods, …