May 2019
Natural Compression for Distributed Deep Learning
Samuel Horváth, Chen-Yu Ho, Ľudovít Horváth, Atal Narayan Sahu, Marco Canini...
TL;DR
This paper introduces natural compression, a novel compression technique targeting the communication bottleneck in deep learning. It is applied to every entry of the update vector, rounding each entry to the nearest power of two (with a positive or negative exponent); this saves communication cost and thus reduces overall training time while leaving the model's convergence rate unchanged.
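A minimal NumPy sketch of this power-of-two rounding follows, assuming the randomized variant (randomized so the compression is unbiased in expectation); the function name natural_compression and its interface are hypothetical, not the authors' reference implementation:

```python
import numpy as np

def natural_compression(x, rng=None):
    """Round each entry of x to a nearest power of two, randomized so
    that the compression is unbiased: E[C(x)] = x.
    (Illustrative sketch; name and interface are hypothetical.)"""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=np.float64)
    sign, mag = np.sign(x), np.abs(x)
    out = np.zeros_like(mag)          # zero entries stay zero
    nz = mag > 0
    a = np.floor(np.log2(mag[nz]))    # 2**a <= |x| < 2**(a + 1)
    low = 2.0 ** a
    p_up = (mag[nz] - low) / low      # P(round up); makes E[C(x)] = x
    up = rng.random(low.shape) < p_up
    out[nz] = np.where(up, 2.0 * low, low)
    return sign * out
```

Because every output magnitude is an exact power of two, each entry can in principle be transmitted as a sign bit plus a small integer exponent rather than a full 32-bit float, which is where the communication savings come from.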
Abstract
Due to their hunger for big data, modern deep learning models are trained in parallel, often in distributed environments, where communication of model updates is the bottleneck. Various update compression (e.g., …