Sep, 2021
Effectiveness of Pre-training for Few-shot Intent Classification
Haode Zhang, Yuwei Zhang, Li-Ming Zhan, Jiaxin Chen, Guangyuan Shi...
TL;DR
This paper studies the effectiveness of pre-training for few-shot intent classification. It finds that simply fine-tuning BERT on a small amount of labeled data from public datasets efficiently yields a pre-trained model, IntentBERT, which outperforms existing pre-trained models and generalizes well.
Abstract
This paper investigates the effectiveness of pre-training for few-shot intent classification. While existing paradigms commonly further pre-train language models such as …
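The summary above describes evaluating a pre-trained encoder on few-shot intent classification. As a hedged illustration of the standard few-shot setup such work evaluates on, here is a minimal N-way K-shot episode with a nearest-centroid classifier over sentence embeddings. The random toy features stand in for an encoder's (e.g. IntentBERT's) utterance embeddings; all function names and dimensions are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_episode(features, labels, n_way=5, k_shot=2, n_query=3):
    """Sample an N-way K-shot episode: a labeled support set and a query set."""
    classes = rng.choice(np.unique(labels), size=n_way, replace=False)
    support_x, support_y, query_x, query_y = [], [], [], []
    for new_label, c in enumerate(classes):
        idx = rng.permutation(np.flatnonzero(labels == c))
        support_x.append(features[idx[:k_shot]])
        support_y += [new_label] * k_shot
        query_x.append(features[idx[k_shot:k_shot + n_query]])
        query_y += [new_label] * n_query
    return (np.concatenate(support_x), np.array(support_y),
            np.concatenate(query_x), np.array(query_y))

def nearest_centroid_predict(support_x, support_y, query_x):
    """Assign each query to the class whose support centroid is most similar."""
    centroids = np.stack([support_x[support_y == c].mean(axis=0)
                          for c in np.unique(support_y)])
    centroids /= np.linalg.norm(centroids, axis=1, keepdims=True)
    q = query_x / np.linalg.norm(query_x, axis=1, keepdims=True)
    return (q @ centroids.T).argmax(axis=1)  # cosine similarity

# Toy stand-in for encoder features: 20 intent classes, 10 utterances each,
# drawn around well-separated class means so the episode is learnable.
means = rng.normal(size=(20, 32)) * 3.0
features = np.concatenate([m + rng.normal(size=(10, 32)) for m in means])
labels = np.repeat(np.arange(20), 10)

sx, sy, qx, qy = sample_episode(features, labels)
preds = nearest_centroid_predict(sx, sy, qx)
accuracy = (preds == qy).mean()
```

In a real evaluation the toy features would be replaced by frozen encoder embeddings of utterances, and accuracy would be averaged over many sampled episodes.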