May, 2022
Contrastive Supervised Distillation for Continual Representation Learning
Tommaso Barletti, Niccolo' Biondi, Federico Pernici, Matteo Bruni, Alberto Del Bimbo
TL;DR
This paper proposes a training procedure called Contrastive Supervised Distillation (CSD) to address catastrophic forgetting in continual representation learning. By exploiting label information in the distillation setting, the student model learns discriminative features from the teacher model in a contrastive manner while reducing feature forgetting. On visual retrieval tasks, CSD mitigates catastrophic forgetting and outperforms current state-of-the-art methods.
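The paper's exact loss is not given on this page, so the following is only a rough PyTorch sketch of the idea described in the TL;DR: a distillation loss in which student features are pulled toward teacher features of the same class and pushed away from those of other classes. The function name, signature, and temperature value are all assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def contrastive_supervised_distillation_loss(student_feats, teacher_feats,
                                             labels, temperature=0.1):
    """Hypothetical sketch of a supervised contrastive distillation loss.

    student_feats: (B, D) features from the model being trained.
    teacher_feats: (B, D) features from the frozen previous-task model.
    labels:        (B,)   class labels used to define positive pairs.
    """
    # Work in cosine-similarity space by normalizing both feature sets.
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)

    # Pairwise student-to-teacher similarities, scaled by a temperature.
    logits = s @ t.T / temperature  # shape (B, B)

    # Positives are pairs that share the same class label (the diagonal
    # is always positive, so every row has at least one positive).
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()

    # Softmax log-probabilities over teacher anchors for each student sample.
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # Maximize the likelihood of same-class teacher features.
    loss = -(pos_mask * log_prob).sum(dim=1) / pos_mask.sum(dim=1)
    return loss.mean()
```

In a continual-learning setting, a term like this would be added to the standard classification loss on the new task, with the teacher kept frozen so its features anchor the student's representation of old classes.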
Abstract
In this paper, we propose a novel training procedure for the continual representation learning problem in which a neural network model is sequentially learned to alleviate …