BriefGPT.xyz
Sep, 2024
Data-Efficient Generation for Dataset Distillation
Zhe Li, Weitong Zhang, Sarah Cechnicka, Bernhard Kainz
TL;DR
This work addresses the high data storage and computation costs that deep learning faces in image tasks. By training a class-conditional latent diffusion model to generate readable synthetic images, the method significantly improves distilled dataset performance while reducing distillation time. It took first place on the CIFAR100 and TinyImageNet tracks of the first dataset distillation challenge at ECCV 2024.
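The core idea above is to synthesize a small distilled dataset by sampling from a class-conditional diffusion model rather than optimizing pixels directly. The sketch below illustrates class-conditional ancestral (DDPM-style) sampling in the abstract; the denoiser is a stub standing in for the paper's trained latent diffusion model, whose architecture and weights are not described here, so all function names and constants are illustrative assumptions.

```python
import numpy as np

def make_schedule(T=50, beta_start=1e-4, beta_end=0.02):
    """Linear noise schedule, as in standard DDPM setups (assumed here)."""
    betas = np.linspace(beta_start, beta_end, T)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    return betas, alphas, alpha_bars

def stub_denoiser(x, t, class_label):
    # Placeholder for the learned noise predictor eps_theta(x, t, y).
    # A real model conditions on a class embedding; this stub only makes
    # the output depend deterministically on (class_label, t) for illustration.
    rng = np.random.default_rng(class_label * 10_000 + t)
    return rng.standard_normal(x.shape) * 0.1

def sample(class_label, shape=(4, 4), T=50, seed=0):
    """Ancestral sampling: start from Gaussian noise, denoise step by step."""
    betas, alphas, alpha_bars = make_schedule(T)
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)  # x_T ~ N(0, I)
    for t in reversed(range(T)):
        eps = stub_denoiser(x, t, class_label)
        # Posterior mean of x_{t-1} given the predicted noise.
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        x = (x - coef * eps) / np.sqrt(alphas[t])
        if t > 0:  # no noise injected at the final step
            x = x + np.sqrt(betas[t]) * rng.standard_normal(shape)
    return x

# One synthetic latent per (class, seed) pair; a distilled dataset would
# collect a few such samples per class and decode them to images.
latent = sample(class_label=3)
print(latent.shape)
```

Building the distilled set then amounts to looping `sample` over class labels and a handful of seeds per class, which is why generation is fast compared to optimization-based distillation.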
Abstract
While Deep Learning techniques have proven successful in image-related tasks, the exponentially increasing data storage and computation costs have become a significant challenge. Dataset Distillation addresses these challenges…