BriefGPT.xyz
May, 2024
Forward-Backward Knowledge Distillation for Continual Clustering
Mohammadreza Sadeghi, Zihan Wang, Narges Armanfard
TL;DR
For unsupervised continual clustering (UCC), the paper introduces Forward-Backward Knowledge Distillation for Continual Clustering (FBCC) to address the catastrophic forgetting problem in continual learning, improving clustering performance and memory efficiency by pairing a single continual learner with multiple student models.
Abstract
Unsupervised continual learning (UCL) is a burgeoning field in machine learning, focusing on enabling neural networks to sequentially learn tasks without explicit label information. Catastrophic forgetting (CF),
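As a rough illustration of the knowledge-distillation idea that FBCC builds on (this is the generic teacher-student objective, not the paper's specific forward-backward procedure; the function names and toy logits below are invented for the example), a student model can be trained to match a teacher's temperature-softened output distribution:

```python
import math

def softmax(logits, temperature=1.0):
    # Softened probabilities; a higher temperature flattens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # the standard knowledge-distillation objective.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy check: a student close to the teacher incurs a smaller loss
# than one whose predictions diverge from the teacher's.
teacher = [2.0, 1.0, 0.1]
close_student = [1.9, 1.1, 0.2]
far_student = [0.1, 1.0, 2.0]
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

In a continual-learning setting such as the one the abstract describes, losses of this form can transfer knowledge both from old models to new ones and back, which is the intuition behind the "forward-backward" naming.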