Sep 2019
Meta-learnt priors slow down catastrophic forgetting in neural networks
Giacomo Spigler
TL;DR
This paper proposes a meta-learning-based training scheme to mitigate the catastrophic forgetting that arises when neural networks are trained sequentially on a series of tasks. Building on this, it introduces the SeqFOMAML algorithm and validates it on classification tasks from the Omniglot and MiniImageNet datasets.
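The TL;DR names SeqFOMAML but the page gives no algorithmic detail. Below is a minimal, hypothetical sketch of a sequential first-order MAML (FOMAML) style meta-update in PyTorch, one plausible reading of the idea: the inner loop trains on a sequence of tasks one after another, and the meta-gradient comes from the final losses on all tasks in the sequence. The function name `seq_fomaml_meta_step`, the task format, and all hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a sequential FOMAML meta-update (not the paper's code).
import copy
import torch
import torch.nn as nn

def seq_fomaml_meta_step(model, task_sequence, inner_lr=0.01, meta_lr=0.001, inner_steps=5):
    """One meta-update. `task_sequence` is a list of (inputs, labels) pairs,
    one pair per task, presented in order."""
    loss_fn = nn.CrossEntropyLoss()
    learner = copy.deepcopy(model)
    opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)

    # Inner loop: adapt to each task in the sequence, one after another.
    for x, y in task_sequence:
        for _ in range(inner_steps):
            opt.zero_grad()
            loss_fn(learner(x), y).backward()
            opt.step()

    # Meta-loss: after sequential adaptation, evaluate on *all* tasks seen,
    # so the meta-gradient penalizes forgetting of earlier tasks.
    meta_loss = sum(loss_fn(learner(x), y) for x, y in task_sequence)
    grads = torch.autograd.grad(meta_loss, tuple(learner.parameters()))

    # First-order step: apply the gradient taken at the adapted parameters
    # directly to the meta-parameters (the FOMAML approximation).
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p -= meta_lr * g
```

The first-order approximation applies the gradient computed at the adapted parameters directly to the meta-parameters, avoiding the second derivatives of full MAML; evaluating the meta-loss on every task in the sequence is what pushes the learnt prior toward parameters that resist forgetting.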
Abstract
Current training regimes for deep learning usually involve exposure to a single task / dataset at a time. Here we start from the observation that in this context the trained model is not given any knowledge of an […]