Apr, 2024
On the Convergence of Continual Learning with Adaptive Methods
Seungyub Han, Yeongmo Kim, Taehyun Cho, Jungwoo Lee
TL;DR
This paper presents a convergence analysis for continual learning and proposes an adaptive method for nonconvex continual learning that adjusts the step sizes of the gradients from previous and current tasks. The method attains the same convergence rate as SGD while mitigating the catastrophic forgetting term at each iteration, and improves continual-learning performance on several image classification tasks.
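
To make the idea above concrete, here is a minimal sketch (in PyTorch) of one continual-learning update that combines the gradient on the current task with a gradient on replayed data from previous tasks, each scaled by its own adaptive step size. The replay memory, the inverse-gradient-norm scaling rule, and all names are illustrative assumptions, not the authors' exact algorithm.

    # Illustrative sketch (not the paper's exact method): one update that mixes
    # the current-task gradient with a previous-task (memory) gradient, each
    # with its own adaptively scaled step size.
    import torch

    def continual_step(model, loss_fn, curr_batch, memory_batch,
                       eta_curr=0.1, eta_prev=0.1, eps=1e-8):
        # Gradient on the current task.
        model.zero_grad()
        loss_fn(model(curr_batch[0]), curr_batch[1]).backward()
        g_curr = [p.grad.detach().clone() if p.grad is not None
                  else torch.zeros_like(p) for p in model.parameters()]

        # Gradient on replayed examples from previous tasks.
        model.zero_grad()
        loss_fn(model(memory_batch[0]), memory_batch[1]).backward()
        g_prev = [p.grad.detach().clone() if p.grad is not None
                  else torch.zeros_like(p) for p in model.parameters()]

        # Placeholder adaptive rule: scale each gradient by the inverse of its
        # norm so neither the current nor the previous task dominates the update.
        norm_curr = torch.sqrt(sum((g ** 2).sum() for g in g_curr))
        norm_prev = torch.sqrt(sum((g ** 2).sum() for g in g_prev))
        step_curr = eta_curr / (norm_curr + eps)
        step_prev = eta_prev / (norm_prev + eps)

        with torch.no_grad():
            for p, gc, gp in zip(model.parameters(), g_curr, g_prev):
                p -= step_curr * gc + step_prev * gp

Giving each gradient its own step size is one way to keep a large current-task gradient from drowning out the previous-task gradient, which is the intuition behind bounding the per-iteration forgetting term described in the TL;DR.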
Abstract
One of the objectives of continual learning is to prevent catastrophic forgetting in learning multiple tasks sequentially, and the existing solutions have been driven by the conceptualization of the plasticity-stability dilemma.