November 2024
Slowing Down Forgetting in Continual Learning
Pascal Janetzky, Tobias Schlagenhauf, Stefan Feuerriegel
TL;DR
This work addresses catastrophic forgetting in continual learning by proposing a new framework, ReCL, designed to slow down forgetting. The framework exploits the implicit bias of gradient-based neural networks to reconstruct old data and combine it with the current training data, thereby improving model performance. Experiments show that ReCL delivers significant performance gains across multiple continual learning scenarios and datasets, and it is the first method to tackle catastrophic forgetting by using the model itself as a memory buffer.
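The reconstruct-and-replay idea in the TL;DR can be illustrated with a short sketch. The code below is not the authors' implementation: the function names (`reconstruct_old_data`, `train_task`), the cross-entropy inversion objective, and all hyperparameters are assumptions chosen for illustration, and ReCL's actual reconstruction procedure, which builds on the implicit bias of gradient-based training, may differ. The sketch synthesizes inputs that the current (frozen) classifier assigns to old-task classes and mixes them into each new-task batch.

```python
# Hedged sketch, not the authors' code: using the trained network itself as a
# memory buffer. `model` is any PyTorch classifier; `old_classes` are the class
# indices of previous tasks. The inversion objective and hyperparameters are
# illustrative assumptions.
import torch
import torch.nn.functional as F

def reconstruct_old_data(model, old_classes, n_per_class=8, steps=200, lr=0.1,
                         input_shape=(1, 28, 28), device="cpu"):
    """Synthesize inputs that the frozen model assigns to old-task classes."""
    model.eval()
    labels = torch.tensor([c for c in old_classes for _ in range(n_per_class)],
                          device=device)
    x = torch.randn(len(labels), *input_shape, device=device, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)  # optimize the inputs, not the weights
    for _ in range(steps):
        opt.zero_grad()
        # Push the synthetic inputs toward confident old-class predictions.
        F.cross_entropy(model(x), labels).backward()
        opt.step()
    return x.detach(), labels

def train_task(model, loader, old_classes, epochs=1, lr=0.01, device="cpu"):
    """Train on the current task while replaying reconstructed old-task data."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    x_old, y_old = (reconstruct_old_data(model, old_classes, device=device)
                    if old_classes else (None, None))
    model.train()
    for _ in range(epochs):
        for x_new, y_new in loader:
            x_new, y_new = x_new.to(device), y_new.to(device)
            if x_old is not None:
                # Combine current training data with the reconstructed samples.
                x_new = torch.cat([x_new, x_old])
                y_new = torch.cat([y_new, y_old])
            opt.zero_grad()  # also clears any grads left from reconstruction
            F.cross_entropy(model(x_new), y_new).backward()
            opt.step()
```

Because the pseudo-samples are generated from the model itself, no raw data from previous tasks needs to be stored, which is the sense in which the model serves as its own memory buffer.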
Abstract
A common challenge in continual learning (CL) is catastrophic forgetting, where the performance on old tasks drops after new, additional tasks are learned. In this paper, we propose a novel framework called ReCL to slow down forgetting in CL.