BriefGPT.xyz
Nov, 2020
Incremental Learning via Rate Reduction
Ziyang Wu, Christina Baek, Chong You, Yi Ma
TL;DR
This work proposes a white-box architecture derived from the alternative principle of rate reduction, in which every layer of the network is computed explicitly without back propagation, to address the forgetting problem faced by current deep learning architectures. Experiments show that the approach can construct a new network that emulates joint training on all past and new data classes, effectively mitigating forgetting in continual learning.
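For context, the rate-reduction principle referenced in the TL;DR maximizes the gap between the coding rate of all features and the sum of class-conditional coding rates (the MCR² objective). Below is a minimal NumPy sketch of that objective under stated assumptions: features Z of shape (d, m), integer class labels, and an illustrative distortion eps; the function names and values are mine, not from the paper's code.

import numpy as np

def coding_rate(Z, eps=0.5):
    # R(Z) = 1/2 * logdet(I + d / (m * eps^2) * Z Z^T), Z of shape (d, m)
    d, m = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (m * eps**2)) * Z @ Z.T)[1]

def class_coding_rate(Z, labels, eps=0.5):
    # R_c(Z) = sum_j (m_j / 2m) * logdet(I + d / (m_j * eps^2) * Z_j Z_j^T)
    d, m = Z.shape
    rate = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        mc = Zc.shape[1]
        rate += (mc / (2.0 * m)) * np.linalg.slogdet(
            np.eye(d) + (d / (mc * eps**2)) * Zc @ Zc.T)[1]
    return rate

def rate_reduction(Z, labels, eps=0.5):
    # Delta R = R(Z) - R_c(Z): the quantity the rate-reduction principle maximizes
    return coding_rate(Z, eps) - class_coding_rate(Z, labels, eps)

As a usage example, rate_reduction(np.random.randn(128, 500), np.random.randint(0, 10, 500)) scores how well 500 random 128-dimensional features separate into 10 classes; larger values indicate more expansive overall features with more compact class-wise subspaces.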
Abstract
Current deep learning architectures suffer from catastrophic forgetting, a failure to retain knowledge of previously learned classes when incrementally trained on new classes. The fundamental roadblock faced by …