BriefGPT.xyz
Aug, 2024
Distilling Long-tailed Datasets
Zhenghao Zhao, Haoxuan Wang, Yuzhang Shang, Kai Wang, Yan Yan
TL;DR
This work analyzes why long-tailed datasets are difficult to distill and proposes a new method for long-tailed dataset distillation (LAD). By avoiding direct matching of biased expert trajectories and by jointly matching the backbone and the classifier, LAD improves performance on tail classes and significantly improves the quality of distilled long-tailed datasets, showing strong application potential.
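The "expert trajectories" mentioned above come from the trajectory-matching family of dataset distillation methods, where synthetic data is optimized so that training on it reproduces parameter trajectories recorded from a network trained on real data. The following is a minimal toy sketch of that generic idea on 1-D linear regression, not the paper's LAD method: it records an expert trajectory, then tunes two synthetic points via finite-difference gradient descent so one SGD step on them matches one expert step. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: 1-D linear regression with true slope 3.
X_real = rng.normal(size=(200, 1))
y_real = 3.0 * X_real[:, 0]

def sgd_step(w, X, y, lr=0.1):
    # One SGD step on mean squared error for a linear model.
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Record an "expert" trajectory by training on the real data.
w = np.zeros(1)
trajectory = [w.copy()]
for _ in range(20):
    w = sgd_step(w, X_real, y_real)
    trajectory.append(w.copy())

# Learnable synthetic set: just 2 points (illustrative choice).
X_syn = rng.normal(size=(2, 1))
y_syn = rng.normal(size=2)

def match_loss(X_syn, y_syn, t=0):
    # Take one student step on the synthetic data starting from the
    # expert's parameters at time t; measure distance to the expert's
    # parameters at time t + 1 (trajectory matching objective).
    w_next = sgd_step(trajectory[t].copy(), X_syn, y_syn)
    return float(np.sum((w_next - trajectory[t + 1]) ** 2))

# Optimize the synthetic data with finite-difference gradients
# (a simple stand-in for backpropagating through the student step).
eps, lr_syn = 1e-4, 0.5
for _ in range(300):
    base = match_loss(X_syn, y_syn)
    g_X = np.zeros_like(X_syn)
    g_y = np.zeros_like(y_syn)
    for i in range(X_syn.size):
        Xp = X_syn.copy(); Xp.flat[i] += eps
        g_X.flat[i] = (match_loss(Xp, y_syn) - base) / eps
    for i in range(y_syn.size):
        yp = y_syn.copy(); yp[i] += eps
        g_y[i] = (match_loss(X_syn, yp) - base) / eps
    X_syn -= lr_syn * g_X
    y_syn -= lr_syn * g_y

print("final matching loss:", match_loss(X_syn, y_syn))
```

On a long-tailed dataset the expert itself is biased toward head classes, which is why, per the TL;DR, matching such trajectories directly is exactly what LAD avoids.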
Abstract
Dataset Distillation (DD) aims to distill a small, information-rich dataset from a larger one for efficient neural network training. However, existing DD methods struggle with Long-tailed Datasets, which are prev…