BriefGPT.xyz
Nov, 2017
Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy
Asit Mishra, Debbie Marr
TL;DR
This paper presents a method that combines low-precision computation with knowledge distillation to improve the performance of deep learning networks. The method achieves state-of-the-art accuracy for ternary-precision and 4-bit-precision variants of the ResNet architecture on the ImageNet dataset, and it proposes three schemes for applying knowledge distillation to the training and deployment pipeline.
Abstract
Deep learning networks have achieved state-of-the-art accuracies on computer vision workloads like image classification and object detection. The performant systems, however, typically involve big models with num