Aug, 2024
CLIP-CID: Efficient CLIP Distillation via Cluster-Instance Discrimination
Kaicheng Yang, Tiancheng Gu, Xiang An, Haiqiang Jiang, Xiangzi Dai...
TL;DR
This work addresses the heavy computational cost of CLIP pre-training, which stems from its reliance on very large training corpora, by proposing a novel distillation mechanism called CLIP-CID. Through image semantic balancing and cluster-instance discrimination, the method improves the efficiency of knowledge transfer from teacher to student and achieves state-of-the-art performance on multiple downstream tasks.
Abstract
Contrastive Language-Image Pre-training (CLIP) has achieved excellent performance over a wide range of tasks. However, the effectiveness of CLIP heavily relies on a substantial corpus of pre-training data, result…
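The abstract is truncated here, but the two-level objective named in the title, distillation via cluster-level and instance-level discrimination, can be sketched generically. The sketch below is a minimal illustration under assumed conventions, not the authors' implementation: function names, the NumPy formulation, and the specific loss forms (a KL term aligning teacher and student similarity distributions, plus a cross-entropy term classifying student embeddings against teacher-derived cluster centroids) are all assumptions for exposition.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def instance_distill_loss(t_emb, s_emb, tau=0.07):
    """Instance-level term (illustrative): KL divergence between the
    teacher's and student's softmaxed in-batch similarity distributions."""
    t = t_emb / np.linalg.norm(t_emb, axis=1, keepdims=True)
    s = s_emb / np.linalg.norm(s_emb, axis=1, keepdims=True)
    p_t = softmax(t @ t.T / tau)  # teacher's pairwise similarity distribution
    p_s = softmax(s @ s.T / tau)  # student's pairwise similarity distribution
    kl = np.sum(p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8)), axis=1)
    return float(np.mean(kl))

def cluster_distill_loss(s_emb, centroids, labels, tau=0.07):
    """Cluster-level term (illustrative): cross-entropy of student embeddings
    classified against cluster centroids (e.g. from k-means on teacher features)."""
    s = s_emb / np.linalg.norm(s_emb, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    p = softmax(s @ c.T / tau)  # probability of each cluster per sample
    return float(-np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-8)))
```

In a real distillation setup both terms would be computed per batch and combined with a weighting coefficient; here they are kept separate to show the two granularities the title refers to.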