Oct, 2019
Meta-Transfer Learning through Hard Tasks
Qianru Sun, Yaoyao Liu, Zhaozheng Chen, Tat-Seng Chua, Bernt Schiele
TL;DR
This paper proposes a novel meta-transfer learning (MTL) method that transfers the weights of a deep neural network by learning per-task scaling and shifting functions for those weights, and introduces a hard task meta-batch scheme as an effective learning curriculum. Few-shot learning experiments on three challenging benchmark datasets report state-of-the-art performance on five-class few-shot recognition tasks, validating the effectiveness of the MTL method.
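Below is a minimal PyTorch sketch of the scaling-and-shifting idea summarized above: the large pre-trained convolution weights stay frozen, and only lightweight per-channel scale and shift parameters are meta-learned for each task. The class name `SSConv2d` and the per-output-channel parameterization are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SSConv2d(nn.Module):
    """Convolution whose pre-trained weights are frozen; only a per-channel
    scale and shift (the transferable parameters) are meta-learned."""
    def __init__(self, pretrained_conv: nn.Conv2d):
        super().__init__()
        # Freeze the large pre-trained weight (and bias) tensors.
        self.weight = nn.Parameter(pretrained_conv.weight.data.clone(),
                                   requires_grad=False)
        self.bias = (nn.Parameter(pretrained_conv.bias.data.clone(),
                                  requires_grad=False)
                     if pretrained_conv.bias is not None else None)
        out_ch = self.weight.shape[0]
        # Lightweight task-level parameters: one scale and one shift per
        # output channel, initialized to the identity transform.
        self.scale = nn.Parameter(torch.ones(out_ch, 1, 1, 1))
        self.shift = nn.Parameter(torch.zeros(out_ch))
        self.stride = pretrained_conv.stride
        self.padding = pretrained_conv.padding

    def forward(self, x):
        # Modulate the frozen weights by the learned scale and add the
        # learned shift to the bias, leaving the backbone itself untouched.
        scaled_w = self.weight * self.scale
        shifted_b = self.shift if self.bias is None else self.bias + self.shift
        return F.conv2d(x, scaled_w, shifted_b,
                        stride=self.stride, padding=self.padding)

# Usage sketch: wrap a (stand-in) pre-trained layer and run a forward pass.
backbone_conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
ss_conv = SSConv2d(backbone_conv)
out = ss_conv(torch.randn(4, 3, 32, 32))   # shape (4, 64, 32, 32)
```

The hard task meta-batch curriculum mentioned in the TL;DR is not shown here; roughly, it re-samples subsequent meta-training tasks from the classes on which the current learner performs worst.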
Abstract
Meta-learning has been proposed as a framework to address the challenging few-shot learning setting. The key idea is to leverage a large number of similar few-shot tasks in order to learn how to adapt a base-learner to a new task for which only a few labeled samples are available.