Jun, 2021
Bridging Multi-Task Learning and Meta-Learning: Towards Efficient Training and Effective Adaptation
Haoxiang Wang, Han Zhao, Bo Li
TL;DR
This paper studies the relationship between multi-task learning (MTL) and gradient-based meta-learning (GBML). Through theoretical and empirical analysis, it shows that the two are similar both in their optimization formulations and in the predictors they learn, and demonstrates through examples that MTL, as a first-order method, can replace the computationally expensive second-order GBML, making training on large-scale datasets more efficient.
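To make the first-order vs. second-order distinction concrete, here is a minimal illustrative sketch (not the paper's actual method) on toy linear-regression tasks with synthetic data. MTL minimizes the average task loss directly, while a MAML-style GBML objective evaluates each task loss after a per-task inner gradient step, so its exact gradient involves second-order terms. All names and data here are hypothetical.

```python
import numpy as np

# Toy setup: T linear-regression tasks, task t has data (X_t, y_t).
# MTL objective (first-order):   L_MTL(w)  = (1/T) * sum_t L_t(w)
# MAML-style objective:          L_MAML(w) = (1/T) * sum_t L_t(w - alpha * grad L_t(w))
# Differentiating L_MAML through the inner step brings in Hessian terms,
# which is what makes GBML a second-order method.

rng = np.random.default_rng(0)
tasks = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]

def task_loss(w, X, y):
    return 0.5 * np.mean((X @ w - y) ** 2)

def task_grad(w, X, y):
    return X.T @ (X @ w - y) / len(y)

def mtl_loss(w):
    # plain average of per-task losses
    return np.mean([task_loss(w, X, y) for X, y in tasks])

def maml_loss(w, alpha=0.1):
    # evaluate each task AFTER one inner adaptation step
    return np.mean([task_loss(w - alpha * task_grad(w, X, y), X, y)
                    for X, y in tasks])

w = np.zeros(3)
print("MTL loss:", mtl_loss(w))
print("MAML-style loss:", maml_loss(w))
```

For a small inner step size, the adapted loss is lower than the joint loss at the same initialization, since the inner step is gradient descent on the same per-task objective; the paper's point is that the cheaper MTL objective can nonetheless serve as an effective substitute at scale.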
Abstract
Multi-task learning (MTL) aims to improve the generalization of several related tasks by learning them jointly. As a comparison, in addition to the joint training scheme, modern meta-learning allows unseen tasks …