Apr, 2019
M2KD: Multi-model and Multi-level Knowledge Distillation for Incremental Learning
Peng Zhou, Long Mai, Jianming Zhang, Ning Xu, Zuxuan Wu...
TL;DR
This paper proposes a multi-model and multi-level knowledge distillation strategy that directly leverages previous model snapshots together with auxiliary distillation, preserving knowledge of old classes while improving overall performance and effectively mitigating the performance drop on old classes.
Abstract
Incremental learning targets at achieving good performance on new categories without forgetting old ones. Knowledge distillation has been shown critical in preserving the performance on old classes. Conventional …
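
The TL;DR above describes distilling from multiple previous model snapshots (multi-model) at both the output and intermediate-feature levels (multi-level). The sketch below illustrates how such a combined loss might be assembled in PyTorch; the `(logits, features)` model interface, the helper names, and the loss weights are assumptions for illustration only, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target KL distillation between one student head and one teacher head."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def m2kd_style_loss(new_model, old_snapshots, images, labels,
                    lambda_logits=1.0, lambda_feat=0.1):
    """Classification loss on the new task plus distillation terms from every
    previous snapshot (multi-model), at both logit and feature level (multi-level).
    Assumes each model returns (logits, features) for an input batch."""
    logits, feats = new_model(images)
    loss = F.cross_entropy(logits, labels)      # standard loss on new classes

    for snapshot in old_snapshots:              # one frozen teacher per earlier stage
        with torch.no_grad():
            old_logits, old_feats = snapshot(images)
        n_old = old_logits.size(1)              # distill only the classes this snapshot knows
        loss = loss + lambda_logits * distillation_loss(logits[:, :n_old], old_logits)
        loss = loss + lambda_feat * F.mse_loss(feats, old_feats)  # feature-level term
    return loss
```

In this sketch the snapshots are kept frozen and queried under `torch.no_grad()`, so only the current model receives gradients; the relative weighting of the logit and feature terms is a tunable choice, not a value taken from the paper.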