Oct 2017
Meta-Learning and Universality: Deep Representations and Gradient Descent can Approximate any Learning Algorithm
Chelsea Finn, Sergey Levine
TL;DR
From the perspective of universality, this paper compares the expressive power of recurrent models with recent methods that embed gradient descent into the meta-learner, and asks whether gradient descent combined with deep representations has sufficient capacity to approximate any learning algorithm. The results show that gradient-based meta-learning produces learning strategies that generalize more broadly than those represented by recurrent models.
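As a rough illustration of the gradient-based approach referred to above, the sketch below shows a MAML-style setup: a deep network's parameters are adapted with one inner gradient step on a task's training data, and the meta-objective is the post-adaptation loss on held-out data from the same task. The network, task format, and hyperparameters (`net_apply`, `inner_lr`, `meta_lr`) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of gradient-based meta-learning (MAML-style) in JAX.
# All names and hyperparameters here are hypothetical, for illustration only.
import jax
import jax.numpy as jnp

def net_apply(params, x):
    # A small fully connected network: the "deep representation".
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

def loss(params, x, y):
    # Mean squared error for a regression task.
    return jnp.mean((net_apply(params, x) - y) ** 2)

def inner_adapt(params, x_train, y_train, inner_lr=0.01):
    # One step of gradient descent on the task's training set:
    # theta' = theta - alpha * grad L_train(theta)
    grads = jax.grad(loss)(params, x_train, y_train)
    return [p - inner_lr * g for p, g in zip(params, grads)]

def meta_loss(params, task):
    # Evaluate the adapted parameters on held-out data from the same task;
    # differentiating through the inner step gives the meta-gradient.
    x_tr, y_tr, x_te, y_te = task
    adapted = inner_adapt(params, x_tr, y_tr)
    return loss(adapted, x_te, y_te)

meta_grad_fn = jax.jit(jax.grad(meta_loss))

def meta_step(params, task, meta_lr=0.001):
    # Outer (meta) update of the initial parameters.
    grads = meta_grad_fn(params, task)
    return [p - meta_lr * g for p, g in zip(params, grads)]
```

Training would alternate `meta_step` calls over sampled tasks; in the paper's framing, the question is whether this inner gradient step, combined with a sufficiently deep representation, can approximate any learning algorithm.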
Abstract
Learning to learn is a powerful paradigm for enabling models to learn from data more effectively and efficiently. A popular approach to meta-learning is to train a recurrent model to read in a training dataset as …
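For contrast with the gradient-based sketch above, here is a minimal sketch of the recurrent-model approach the abstract describes: a recurrent network ingests the (input, label) pairs of a training set into its hidden state and then emits predictions for new test inputs. The cell, input encoding, and shapes are assumptions for illustration, not the paper's architecture.

```python
# Hedged sketch of a recurrent meta-learner: the hidden state summarizes the
# training set and plays the role that adapted parameters play in MAML.
# Names and shapes are illustrative assumptions.
import jax.numpy as jnp

def rnn_cell(params, h, inp):
    # Vanilla RNN cell: returns the new hidden state and a prediction.
    Wx, Wh, b, Wo, bo = params
    h_new = jnp.tanh(inp @ Wx + h @ Wh + b)
    return h_new, h_new @ Wo + bo

def recurrent_learner(params, x_train, y_train, x_test):
    # Read the training pairs (x, y) into the hidden state one by one.
    h = jnp.zeros(params[1].shape[0])
    for x, y in zip(x_train, y_train):
        h, _ = rnn_cell(params, h, jnp.concatenate([x, y]))
    # At test time the label slot is zeroed out and only predictions are used.
    preds = []
    for x in x_test:
        _, pred = rnn_cell(params, h, jnp.concatenate([x, jnp.zeros_like(y_train[0])]))
        preds.append(pred)
    return jnp.stack(preds)
```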