Mar, 2020
Adversarial Continual Learning
Sayna Ebrahimi, Franziska Meier, Roberto Calandra, Trevor Darrell, Marcus Rohrbach
TL;DR
This work tackles forgetting in continual learning with a hybrid approach that combines architecture growth, to prevent forgetting of task-specific skills, with experience replay, to preserve shared skills. Results show that this hybrid approach performs strongly on class-incremental learning on both single-dataset and multi-dataset benchmarks.
Abstract
Continual learning aims to learn new tasks without forgetting previously learned ones. We hypothesize that representations learned to solve each task in a sequence have a shared structure while containing some task-specific properties.
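
The sketch below is a minimal illustration of the hybrid idea summarized above: a shared (task-invariant) feature extractor rehearsed with a small replay buffer, plus per-task private modules and heads that are grown as new tasks arrive. Class names, layer sizes, and the 28x28 input shape are assumptions for illustration only, not the authors' released implementation.

```python
# Illustrative sketch only: shared extractor + per-task private modules (architecture
# growth) + a small replay buffer for rehearsal. Shapes assume flattened 28x28 inputs.
import random
import torch
import torch.nn as nn

class HybridContinualLearner(nn.Module):
    def __init__(self, feat_dim=64, n_classes_per_task=2):
        super().__init__()
        # Shared (task-invariant) feature extractor, trained across all tasks.
        self.shared = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, feat_dim), nn.ReLU())
        # Private (task-specific) modules and classifier heads, one per task.
        self.private = nn.ModuleList()
        self.heads = nn.ModuleList()
        self.feat_dim = feat_dim
        self.n_classes_per_task = n_classes_per_task

    def add_task(self):
        # Architecture growth: add a new private column and head for the new task.
        self.private.append(
            nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, self.feat_dim), nn.ReLU())
        )
        self.heads.append(nn.Linear(2 * self.feat_dim, self.n_classes_per_task))

    def forward(self, x, task_id):
        # Concatenate shared and task-specific features before the task head.
        z = torch.cat([self.shared(x), self.private[task_id](x)], dim=1)
        return self.heads[task_id](z)

class ReplayBuffer:
    """Keep a few (x, y, task_id) samples from past tasks and mix them into
    current-task batches so the shared extractor keeps seeing old data."""
    def __init__(self, capacity=200):
        self.capacity, self.data = capacity, []

    def add(self, x, y, task_id):
        self.data.append((x, y, task_id))
        if len(self.data) > self.capacity:
            self.data.pop(random.randrange(len(self.data)))

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))
```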