Jul, 2022
Consistency is the key to further mitigating catastrophic forgetting in continual learning
Prashant Bhat, Bahram Zonooz, Elahe Arani
TL;DR
This work introduces consistency regularization into the Experience Replay framework as a self-supervised pretext task and studies it across various continual learning scenarios. The results show that relatively strict consistency constraints better preserve information from previous tasks.
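To make the idea concrete, below is a minimal PyTorch-style sketch of one training step combining Experience Replay with a consistency term on buffered samples. The names (`er_with_consistency`, `augment`, `lam`) and the choice of an MSE penalty on softmax outputs as the "strict" consistency constraint are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def er_with_consistency(model, cur_x, cur_y, buf_x, buf_y, augment, lam=1.0):
    """One Experience Replay step with a consistency regularizer
    on buffer samples (hypothetical sketch, not the paper's code)."""
    # Standard ER objective: cross-entropy on current task samples
    # interleaved with samples drawn from the rehearsal buffer.
    x = torch.cat([cur_x, buf_x])
    y = torch.cat([cur_y, buf_y])
    ce_loss = F.cross_entropy(model(x), y)

    # Self-supervised pretext task: two augmented views of each
    # buffered sample should yield consistent predictions. MSE on
    # the softmax outputs acts as a relatively strict constraint.
    p1 = F.softmax(model(augment(buf_x)), dim=1)
    p2 = F.softmax(model(augment(buf_x)), dim=1)
    consistency_loss = F.mse_loss(p1, p2)

    return ce_loss + lam * consistency_loss
```

A softer alternative, under the same assumptions, would be a KL-divergence term between the two predictive distributions; the TL;DR's finding suggests the stricter penalty retains more information about previous tasks.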
Abstract
Deep neural networks struggle to continually learn multiple sequential tasks due to catastrophic forgetting of previously learned tasks. Rehearsal-based methods, which explicitly store previous task samples in a buffer and interleave them with current task samples, have proven effective in mitigating forgetting.