May, 2022
When to Use Multi-Task Learning vs Intermediate Fine-Tuning for Pre-Trained Encoder Transfer Learning
Orion Weller, Kevin Seppi, Matt Gardner
TL;DR
This paper compares the performance of three transfer learning strategies in natural language processing: STILTs (intermediate fine-tuning), pairwise MTL, and MTL-ALL. The results show that MTL-ALL underperforms the other two methods, and that when the target task has relatively little data, pairwise MTL performs best.
Abstract
Transfer learning (TL) in natural language processing (NLP) has seen a surge of interest in recent years, as pre-trained models have shown an impressive ability to transfer to novel tasks. Three main strategies have emerged for making use of multiple supervised datasets during fine-tuning: training on an intermediate task before training on the target task (STILTs), using multi-task learning (MTL) to train jointly on a supplementary task and the target task (pairwise MTL), or simply using MTL to train jointly on all available datasets (MTL-ALL).
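To make the three strategies concrete, here is a minimal sketch, not the authors' implementation, of how they differ as training procedures. The fine_tune trainer, the sample_batch/training_step interface, and the step count are hypothetical stand-ins for ordinary supervised fine-tuning of a pre-trained encoder.

    import random

    def fine_tune(model, tasks, steps=1000):
        """Hypothetical trainer: each step samples a batch from one of
        `tasks` and takes a gradient step. `sample_batch` and
        `training_step` are assumed interfaces, not a real library API."""
        for _ in range(steps):
            task = random.choice(tasks)               # uniform task sampling
            model.training_step(task.sample_batch())  # forward/backward/update
        return model

    def stilts(encoder, intermediate_task, target_task):
        """STILTs: two sequential stages of single-task fine-tuning."""
        encoder = fine_tune(encoder, [intermediate_task])  # stage 1: intermediate task
        return fine_tune(encoder, [target_task])           # stage 2: target task

    def pairwise_mtl(encoder, supplementary_task, target_task):
        """Pairwise MTL: one joint stage on supplementary + target tasks."""
        return fine_tune(encoder, [supplementary_task, target_task])

    def mtl_all(encoder, all_tasks):
        """MTL-ALL: one joint stage on every available task (target included)."""
        return fine_tune(encoder, all_tasks)

The key contrast is that STILTs separates the auxiliary and target data into two sequential stages, while the MTL variants mix them in a single joint stage and differ only in how many tasks are mixed.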