Dataset distillation, a pragmatic approach in machine learning, aims to create a small synthetic dataset from a larger existing dataset. However, existing distillation methods primarily adopt a model-based paradigm, optimizing the synthetic images against a specific model architecture, which limits how well the distilled data transfers to unseen architectures.
A dataset distillation technique that exploits the learned prior of deep generative models, combined with a new optimization algorithm, improves cross-architecture generalization while synthesizing only a few synthetic images from a large dataset: instead of optimizing raw pixels, the method optimizes latent codes of a pretrained generative model, so the synthetic images stay on the generator's learned image manifold.
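The idea of distilling into a generative model's latent space can be sketched in miniature. The example below is an illustrative toy, not the paper's actual method: the "generator" is a frozen random linear map standing in for a pretrained deep generative model, and the objective is a simple mean-matching loss standing in for the paper's distillation objective. A few latent codes are optimized by gradient descent so that the decoded synthetic samples match the mean of a large "real" dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained generator": a frozen linear map from latent space to
# image space. In practice this would be a deep generative model whose
# learned prior keeps synthetic samples on the natural-image manifold.
LATENT_DIM, IMG_DIM = 8, 32
G = rng.normal(size=(IMG_DIM, LATENT_DIM))  # frozen generator weights

def generate(z):
    """Decode latent codes (N, LATENT_DIM) into synthetic samples (N, IMG_DIM)."""
    return z @ G.T

# Large "real" dataset, summarized here by its feature mean.
real_data = rng.normal(loc=1.0, size=(1000, IMG_DIM))
real_mean = real_data.mean(axis=0)

# Distill: optimize a handful of latent codes (not pixels) so the decoded
# synthetic samples match the real data's mean (a stand-in objective).
N_SYNTHETIC, LR, STEPS = 4, 0.05, 500
z = rng.normal(size=(N_SYNTHETIC, LATENT_DIM))

def loss(z):
    diff = generate(z).mean(axis=0) - real_mean
    return float(np.sum(diff ** 2))

initial_loss = loss(z)
for _ in range(STEPS):
    # Analytic gradient of the mean-matching loss w.r.t. each latent code:
    # dL/dz_i = (2/N) * G^T (mean(G z) - real_mean), identical for all i here.
    diff = generate(z).mean(axis=0) - real_mean   # shape (IMG_DIM,)
    grad = (2.0 / N_SYNTHETIC) * diff @ G         # shape (LATENT_DIM,)
    z -= LR * grad                                # broadcast update to every latent

final_loss = loss(z)
synthetic_images = generate(z)
print(f"mean-matching loss: {initial_loss:.4f} -> {final_loss:.4f}")
```

Because the latents are constrained to the generator's output space, the residual loss is exactly the part of the target the generator cannot express; in the full method, a rich deep generative prior makes that constraint a feature, regularizing the synthetic images so they remain useful across architectures.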