Nov, 2019
Learning to Few-Shot Learn Across Diverse Natural Language Classification Tasks
Trapit Bansal, Rishikesh Jha, Andrew McCallum
TL;DR
LEOPARD is a meta-learning-based approach that uses small amounts of labeled data to optimize learning across tasks. It handles NLP classification tasks with varying numbers of classes and shows better generalization than self-supervised pre-training or multi-task training.
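The core mechanism is a MAML-style meta-learner whose per-task softmax layer is generated from the support set, so the same meta-trained encoder can serve tasks with different numbers of classes. Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation: `encoder`, `generate_head`, `inner_adapt`, `meta_step`, the prototype-style head generation, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumptions, not LEOPARD's released code): a MAML-style
# meta-learner where the per-task softmax head is *generated* from the
# support set, so one meta-trained encoder handles tasks whose number of
# classes differs from task to task.
import torch
import torch.nn.functional as F


def generate_head(encoder, support_x, support_y, n_classes):
    # Generate per-class softmax weights by averaging the encoded support
    # examples of each class (prototype-style generation; assumes every
    # class appears at least once in the support set).
    feats = encoder(support_x)                          # [n_support, d]
    w = torch.stack([feats[support_y == c].mean(dim=0)
                     for c in range(n_classes)])        # [n_classes, d]
    b = torch.zeros(n_classes, requires_grad=True)
    return w, b


def inner_adapt(encoder, support_x, support_y, n_classes, lr=0.1, steps=5):
    # Inner loop: a few gradient steps on the support set, keeping the
    # graph (create_graph=True) so the outer loop can backpropagate
    # through the adaptation into the shared encoder.
    w, b = generate_head(encoder, support_x, support_y, n_classes)
    for _ in range(steps):
        logits = encoder(support_x) @ w.t() + b
        loss = F.cross_entropy(logits, support_y)
        gw, gb = torch.autograd.grad(loss, (w, b), create_graph=True)
        w, b = w - lr * gw, b - lr * gb
    return w, b


def meta_step(encoder, meta_opt, tasks):
    # Outer loop: evaluate each adapted head on its query set and update
    # the shared encoder with the accumulated query loss.
    meta_opt.zero_grad()
    total = 0.0
    for sx, sy, qx, qy, n_classes in tasks:
        w, b = inner_adapt(encoder, sx, sy, n_classes)
        loss = F.cross_entropy(encoder(qx) @ w.t() + b, qy)
        loss.backward()
        total += loss.item()
    meta_opt.step()
    return total / len(tasks)  # mean query loss, for logging
```

In the paper the encoder is a BERT model meta-trained on GLUE-style classification tasks; the sketch only captures the generated-head and episodic-update structure, not those details.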
Abstract
Self-supervised pre-training of transformer models has shown enormous success in improving performance on a number of downstream tasks. However, fine-tuning on a new task still requires large amounts of task-specific labelled data.