BriefGPT.xyz
Mar, 2021
Learning to Continually Learn Rapidly from Few and Noisy Data
Nicholas I-Hsien Kuo, Mehrtash Harandi, Nicolas Fourrier, Christian Walder, Gabriela Ferraro...
TL;DR
This paper studies catastrophic forgetting in neural networks and continual-learning remedies for it. By combining a replay mechanism with meta-learning, the authors find that introducing meta-learning resolves the failure mode of conventional replay when the memory allotted to each task is limited, while retaining advantages in learning efficiency and accuracy.
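The replay mechanism referenced above stores a small set of past examples and interleaves them with new data during training. As a minimal sketch (not the paper's actual method), a fixed-capacity buffer filled by reservoir sampling keeps a uniform sample of everything seen so far, which is one common way to implement replay under a strict per-task memory budget:

```python
import random

class ReplayBuffer:
    """Fixed-capacity replay memory using reservoir sampling,
    so only a small, uniform sample of past examples is kept."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.seen = 0  # total examples observed so far

    def add(self, example):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            # Replace a random stored slot with probability
            # capacity / seen, keeping the sample uniform over
            # all examples observed so far.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = example

    def sample(self, k):
        # Draw a mini-batch of old examples to mix with new data.
        return random.sample(self.items, min(k, len(self.items)))

buf = ReplayBuffer(capacity=5)
for i in range(100):
    buf.add(i)
```

When the capacity is very small, the buffer holds only a few examples per task, which is exactly the regime where the paper argues plain replay breaks down and meta-learning helps.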
Abstract
Neural networks suffer from catastrophic forgetting and are unable to sequentially learn new tasks without guaranteed stationarity in the data distribution.