BriefGPT.xyz
Jun, 2023
EMO: Episodic Memory Optimization for Few-Shot Meta-Learning
Yingjun Du, Jiayi Shen, Xiantong Zhen, Cees G. M. Snoek
TL;DR
This paper proposes EMO, an external-memory-based meta-learning optimizer that learns to retain and recall the learning processes of past training tasks. When the gradients of a new task carry limited information, the recalled memory guides parameter updates toward convergence, addressing the optimization problem in few-shot meta-learning.
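The idea above can be sketched as a toy optimizer. This is a minimal illustration, not the authors' implementation: the class names, the cosine-similarity recall rule, and the fixed mixing weight are all assumptions made for the example.

```python
import numpy as np

class EpisodicMemoryOptimizer:
    """Toy sketch of an episodic-memory optimizer (illustrative only).

    It stores (task embedding, update direction) pairs from past tasks
    and mixes the current gradient with the recalled direction of the
    most similar past task before taking an SGD-style step.
    """

    def __init__(self, lr=0.1, mix=0.5, capacity=100):
        self.lr = lr              # step size
        self.mix = mix            # weight on the recalled memory direction
        self.capacity = capacity  # maximum number of stored episodes
        self.memory = []          # list of (task_embedding, direction)

    def recall(self, task_emb):
        # Return the stored direction whose task embedding is most
        # similar (cosine similarity) to the current task, or None.
        if not self.memory:
            return None
        sims = [
            emb @ task_emb
            / (np.linalg.norm(emb) * np.linalg.norm(task_emb) + 1e-8)
            for emb, _ in self.memory
        ]
        return self.memory[int(np.argmax(sims))][1]

    def step(self, params, grad, task_emb):
        # Blend the (possibly uninformative) current gradient with the
        # recalled direction, store the result, and update the parameters.
        recalled = self.recall(task_emb)
        if recalled is None:
            direction = grad
        else:
            direction = (1.0 - self.mix) * grad + self.mix * recalled
        self.memory.append((task_emb, direction))
        if len(self.memory) > self.capacity:
            self.memory.pop(0)  # drop the oldest episode
        return params - self.lr * direction
```

On the first task the memory is empty, so the step reduces to plain gradient descent; on later tasks the update is steered by the most similar past episode, which is the intuition the TL;DR describes.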
Abstract
Few-shot meta-learning presents a challenge for gradient descent optimization due to the limited number of training samples per task. To address this issue, we propose an episodic memory optimization for meta-learning …