Jul, 2021
Dataset Distillation with Infinitely Wide Convolutional Networks
Timothy Nguyen, Roman Novak, Lechao Xiao, Jaehoon Lee
TL;DR
Applies a novel distributed kernel-based meta-learning framework with infinitely wide convolutional networks to achieve state-of-the-art results in dataset distillation across MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100, and SVHN, and provides a preliminary analysis of the distilled datasets, shedding light on how they differ from naturally occurring data.
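The core step of such a kernel-based framework is kernel ridge regression over the neural tangent kernel (NTK) of an infinitely wide CNN: a small learnable support set is optimized so that regression from it predicts real target labels well. Below is a minimal sketch of that step using the neural_tangents library; the two-layer architecture, the regularizer value, and names such as `kip_loss` are illustrative assumptions, not the paper's exact configuration.

```python
import jax
import jax.numpy as jnp
from neural_tangents import stax

# Infinitely wide CNN: kernel_fn computes its NTK in closed form.
# (The channel/unit counts below do not affect the infinite-width kernel.)
_, _, kernel_fn = stax.serial(
    stax.Conv(1, (3, 3), padding='SAME'), stax.Relu(),
    stax.Conv(1, (3, 3), padding='SAME'), stax.Relu(),
    stax.Flatten(), stax.Dense(10),
)

def kip_loss(x_support, y_support, x_target, y_target, reg=1e-6):
    """MSE of kernel ridge regression from the support set to the targets."""
    k_ss = kernel_fn(x_support, x_support, 'ntk')
    k_ts = kernel_fn(x_target, x_support, 'ntk')
    preds = k_ts @ jnp.linalg.solve(
        k_ss + reg * jnp.eye(k_ss.shape[0]), y_support)
    return jnp.mean((preds - y_target) ** 2)

# The distilled dataset (and optionally its labels) is what gets learned:
# gradients flow through the kernel into the support images themselves.
grad_fn = jax.jit(jax.grad(kip_loss, argnums=(0, 1)))

# Toy usage with random stand-in data (NHWC images, one-hot labels).
key = jax.random.PRNGKey(0)
x_s = jax.random.normal(key, (10, 28, 28, 1))   # 10 learnable distilled images
y_s = jnp.eye(10)                                # one distilled image per class
x_t = jax.random.normal(key, (64, 28, 28, 1))    # a batch of real images
y_t = jnp.eye(10)[jnp.arange(64) % 10]
g_x, g_y = grad_fn(x_s, y_s, x_t, y_t)           # feed to any gradient optimizer
```

Because the infinite-width kernel is exact rather than sampled, each gradient step is deterministic given the target batch; the heavy cost is the kernel matrix computation, which is what motivates the distributed setup mentioned in the TL;DR.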
Abstract
The effectiveness of machine learning algorithms arises from being able to extract useful features from large amounts of data. As model and dataset sizes increase, dataset distillation methods that compress large …