Sep 2017
Knowledge Distillation via Conditional Adversarial Networks: Training Shallow and Thin Networks for Acceleration
Learning Loss for Knowledge Distillation with Conditional Adversarial Networks
Zheng Xu, Yen-Chang Hsu, Jiawei Huang
TL;DR
This paper presents a student-teacher strategy that uses conditional adversarial networks to train a small, fast student neural network with knowledge transferred from a large, accurate teacher network, and studies how network size affects the trade-off between classification accuracy and inference time.
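For context on the hand-crafted distillation loss that this work replaces with a learned, adversarially trained one, here is a minimal NumPy sketch of standard knowledge distillation (temperature-softened KL divergence between teacher and student outputs, in the style of Hinton et al.). Function names and the temperature value are illustrative, not from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened probabilities,
    # scaled by T^2 as is conventional in knowledge distillation.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return (T ** 2) * float(np.mean(kl))
```

The paper's contribution is to learn this matching criterion with a conditional adversarial network instead of fixing it by hand, so the sketch above stands in for the baseline the learned loss is compared against.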
Abstract
There is an increasing interest in accelerating neural networks for real-time applications. We study the student-teacher strategy, in which a small and fast student network is trained with the auxiliary information …