Model quantization, which aims to compress deep neural networks and accelerate inference, has greatly facilitated the deployment of cumbersome models on mobile and edge devices. A common assumption in prior quantization methods is that training data is available.
However, data-free quantization methods suffer from statistical imbalance in their synthetic data, which limits their use in neural network compression. This paper proposes Diverse Sample Generation (DSG) to mitigate this problem at both the distribution and sample levels; DSG can be applied to current state-of-the-art quantization methods such as AdaRound. On large-scale image classification tasks, models calibrated with synthetic data match the performance of models calibrated with real data, and in some cases even surpass it.