Oct, 2020
CompRess: Self-Supervised Learning by Compressing Representations
Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash
TL;DR
This work demonstrates the effectiveness of teacher-student model compression: a large, already-trained self-supervised deep model is compressed into a smaller one so that the student embeds data points similarly to the teacher, ultimately achieving better results than supervised learning methods on ImageNet classification.
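The compression objective lends itself to a short sketch. Below is a minimal, hypothetical PyTorch illustration (not the authors' code) of a similarity-distillation loss in this spirit: each image's similarities to a bank of anchor embeddings are turned into a probability distribution, and the student is trained to match the teacher's distribution for the same image. The function name, the temperature value, and the use of separate per-model anchor banks are assumptions made for illustration.

import torch
import torch.nn.functional as F

def compress_loss(student_feat, teacher_feat,
                  student_anchors, teacher_anchors,
                  temperature=0.04):
    """Hypothetical similarity-distillation loss in the spirit of CompRess.

    student_feat:    (B, Ds) L2-normalized student embeddings of the batch
    teacher_feat:    (B, Dt) L2-normalized teacher embeddings of the same batch
    student_anchors: (K, Ds) L2-normalized anchor embeddings in student space
    teacher_anchors: (K, Dt) L2-normalized anchor embeddings in teacher space
    """
    # Cosine similarity of each sample to every anchor, scaled by temperature.
    # Each model keeps its own anchor bank, so feature dimensions may differ.
    sim_s = student_feat @ student_anchors.t() / temperature   # (B, K)
    sim_t = teacher_feat @ teacher_anchors.t() / temperature   # (B, K)

    # The teacher's similarity distribution is the soft target; no gradients
    # flow back into the teacher.
    p_teacher = F.softmax(sim_t.detach(), dim=1)
    log_p_student = F.log_softmax(sim_s, dim=1)

    # KL(teacher || student), averaged over the batch.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

# Example usage (shapes only): loss = compress_loss(s(x), t(x), s_bank, t_bank)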
Abstract
Self-supervised learning aims to learn good representations with unlabeled data. Recent works have shown that larger models benefit more from self-supervised learning than smaller models. As a result, the gap bet…