BriefGPT.xyz
Oct, 2021
Network Augmentation for Tiny Deep Learning
Han Cai, Chuang Gan, Ji Lin, Song Han
TL;DR
This work proposes network augmentation (NetAug), a training method for improving the performance of tiny neural networks. Acting as a kind of "reverse dropout," it augments the network instead of adding noise: the tiny model is embedded into larger models and encouraged to work as a sub-model of them, which improves performance on tasks such as image classification and object detection.
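The core mechanism described above — a tiny model sharing weights with a wider, augmented model and being trained with an extra auxiliary loss from that augmented model — can be illustrated with a toy sketch. This is a minimal illustration of the idea, not the paper's implementation; the one-layer linear model, squared loss, and names like `aug_factor` and `alpha` are our own assumptions.

```python
# Toy sketch of the NetAug idea: the tiny model is a slice of a wider
# (augmented) model, so updating the augmented model also updates the tiny one.
import numpy as np

rng = np.random.default_rng(0)

tiny_width = 4
aug_factor = 2                       # assumed: augmented model is 2x wider
aug_width = tiny_width * aug_factor

# One shared weight vector; the tiny model uses only the first `tiny_width` entries.
W = rng.normal(size=aug_width)

def forward(x, width):
    """Linear model using the first `width` shared weights."""
    return x[:, :width] @ W[:width]

def grad(x, y, width):
    """Gradient of mean squared error w.r.t. the first `width` weights."""
    err = forward(x, width) - y
    g = np.zeros_like(W)
    g[:width] = x[:, :width].T @ err / len(y)
    return g

# Toy data: the target depends only on the tiny model's input slice.
x = rng.normal(size=(64, aug_width))
y = x[:, :tiny_width] @ rng.normal(size=tiny_width)

lr, alpha = 0.1, 0.5                 # alpha scales the auxiliary (augmented) loss
for step in range(500):
    # NetAug-style update: base (tiny) gradient plus a scaled augmented-model
    # gradient, computed through the shared weights.
    W -= lr * (grad(x, y, tiny_width) + alpha * grad(x, y, aug_width))

tiny_loss = float(np.mean((forward(x, tiny_width) - y) ** 2))
print(f"tiny-model training loss: {tiny_loss:.4f}")
```

The key design point the sketch shows is weight sharing: there is no separate "teacher" network, only extra capacity around the tiny model that contributes an auxiliary gradient during training and is discarded at inference time.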
Abstract
We introduce network augmentation (NetAug), a new training method for improving the performance of tiny neural networks. Existing regularization techniques (e.g., data augmentation, dropout) have shown much success …