Researchers have long sought to minimize training costs in deep learning while maintaining strong generalization across diverse datasets. Emerging research on dataset distillation aims to reduce these costs by compressing a large training set into a small number of synthetic images that models can be trained on in its place.
By leveraging the learned prior of deep generative models together with a new optimization algorithm, this dataset distillation technique synthesizes a few synthetic images from a large dataset and improves cross-architecture generalization.
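To make the idea concrete, here is a minimal sketch of distilling into the latent space of a generative prior rather than into raw pixels, using a gradient-matching objective as the distillation loss. Everything here is an illustrative assumption, not the paper's exact algorithm: `generator` stands in for the pretrained generative model, `student` for a small network used during distillation, `real_loader` for the original dataset, and the fixed label assignment is a simplification.

```python
import torch
import torch.nn.functional as F

def distill_latents(generator, student, real_loader, num_synthetic=10,
                    latent_dim=512, steps=1000, lr=0.01, device="cpu"):
    # Optimize latent codes z instead of raw pixels: the generative prior
    # keeps the decoded synthetic images on a realistic image manifold,
    # which is what is credited with better cross-architecture transfer.
    z = torch.randn(num_synthetic, latent_dim, device=device, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)

    for step in range(steps):
        real_x, real_y = next(iter(real_loader))
        real_x, real_y = real_x.to(device), real_y.to(device)

        syn_x = generator(z)            # decode latents into synthetic images
        syn_y = real_y[:num_synthetic]  # assumption: fixed label assignment

        # Gradient matching (one common distillation objective): make the
        # gradients the synthetic batch induces on the student mimic those
        # induced by a real batch.
        g_real = torch.autograd.grad(
            F.cross_entropy(student(real_x), real_y),
            student.parameters())
        g_syn = torch.autograd.grad(
            F.cross_entropy(student(syn_x), syn_y),
            student.parameters(), create_graph=True)  # keep graph to reach z

        loss = sum(F.mse_loss(a, b.detach()) for a, b in zip(g_syn, g_real))
        opt.zero_grad()
        loss.backward()   # gradients flow through the generator into z
        opt.step()

    return generator(z).detach()  # final distilled synthetic images
```

The key design choice this sketch highlights is the parameterization: because the optimizer updates latent codes and every image is re-decoded through the frozen generator, the synthetic set inherits the generator's learned image statistics instead of overfitting pixel patterns to one architecture.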