Dec 2022
Pre-trained Language Models can be Fully Zero-Shot Learners
Xuandong Zhao, Siqi Ouyang, Zhiguo Yu, Ming Wu, Lei Li
TL;DR
This paper proposes NPPrompt, a nonparametric prompting method built on pre-trained language models that effectively extends a single pre-trained model to many language understanding tasks, without labeled data or an additional unlabeled corpus for fine-tuning. Experimental results show that NPPrompt achieves larger absolute gains than the previous best fully zero-shot methods on tasks such as text classification and the GLUE benchmark.
Abstract
How can we extend a pre-trained model to many language understanding tasks, without labeled or additional unlabeled data?
Pre-trained language models (PLMs) have been effective for a wide range of NLP tasks. However …
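As a concrete illustration of the fully zero-shot setting described in the TL;DR, below is a minimal Python sketch of prompt-based classification with a masked language model, roughly in the spirit of NPPrompt's embedding-neighbor verbalizers. The model name, prompt template, label set, and mean aggregation are illustrative assumptions, not the authors' exact configuration.

# A minimal sketch of prompt-based fully zero-shot classification with a
# masked language model, in the spirit of NPPrompt. The model, template,
# label set, and mean aggregation are illustrative assumptions, not the
# paper's exact configuration.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")
model.eval()

labels = ["sports", "politics", "science"]  # hypothetical categories


def neighbor_words(label, k=5):
    # NPPrompt's core idea: build each label's verbalizer from the k nearest
    # neighbors of the label token in the PLM's own embedding space, so no
    # labeled data or manual label-word engineering is needed.
    emb = model.get_input_embeddings().weight  # (vocab_size, hidden_dim)
    label_id = tokenizer.encode(" " + label, add_special_tokens=False)[0]
    sims = torch.nn.functional.cosine_similarity(emb[label_id], emb, dim=-1)
    return sims.topk(k).indices.tolist()


def classify(text):
    prompt = f"{text} This topic is about {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    mask_logits = logits[0, mask_pos]
    # Score each label by aggregating mask logits over its neighbor
    # verbalizer (a plain mean here; the paper uses a weighted aggregation).
    scores = torch.stack([mask_logits[neighbor_words(l)].mean() for l in labels])
    return labels[int(scores.argmax())]


print(classify("The goalkeeper saved a penalty in the final minute."))

Because the verbalizer is derived from the embedding space itself, the same procedure transfers to new label sets without any fine-tuning, which is the paper's central claim.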