BriefGPT.xyz
Oct, 2021
Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima
Guangyuan Shi, Jiaxin Chen, Wenlong Zhang, Li-Ming Zhan, Xiao-Ming Wu
TL;DR
This work considers incremental few-shot learning and addresses the catastrophic forgetting problem in existing methods, proposing to search for flat optimized solutions of the base task during the base training stage so that the model's performance is maintained. Experimental results show that this method outperforms existing state-of-the-art methods and comes close to the approximate upper bound.
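The core idea in the TL;DR is to prefer flat minima of the base-task objective, since small parameter shifts during later incremental sessions then cause little damage. A minimal sketch of one way to bias training toward flat regions, assuming the paper's strategy of minimizing the expected loss under small random weight perturbations (the toy 1-D objective, step sizes, and noise scale here are illustrative, not from the paper):

```python
import random

random.seed(0)

def loss(w):
    # toy 1-D objective with its minimum at w = 2
    return (w - 2.0) ** 2

def grad(w):
    return 2.0 * (w - 2.0)

def flat_minima_step(w, lr=0.1, sigma=0.05, n_perturb=8):
    # Average the gradient over randomly perturbed copies of the weight,
    # approximating the gradient of the expected loss in a small
    # neighborhood; descending this biases the solution toward flat regions.
    g = sum(grad(w + random.gauss(0.0, sigma)) for _ in range(n_perturb))
    return w - lr * g / n_perturb

w = 0.0
for _ in range(50):
    w = flat_minima_step(w)
```

After 50 steps the weight settles near the (flat-region) minimum at 2.0; in the real method the same perturb-and-average idea is applied to a deep network's parameters during base training, before any new classes arrive.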
Abstract
This paper considers incremental few-shot learning, which requires a model to continually recognize new categories with only a few examples provided. Our study shows that existing methods severely suffer from catastrophic forgetting.