Feb, 2020
On Biased Compression for Distributed Learning
Aleksandr Beznosikov, Samuel Horváth, Peter Richtárik, Mher Safaryan
TL;DR
The study shows that biased compression operators for distributed learning can significantly improve communication efficiency and attain linear convergence rates, outperforming their unbiased counterparts. They can be applied to both (stochastic) gradient descent and distributed SGD, and many new biased compressors with strong theoretical guarantees and good practical performance are proposed.
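To make the notion of a biased compressor concrete, here is a minimal sketch of Top-k sparsification, one of the best-known biased operators (it keeps only the largest-magnitude coordinates, so its expectation is not the input vector). This is an illustrative example of the operator class the paper studies, not code from the paper; the function name `top_k` and the NumPy implementation are assumptions of this sketch.

```python
import numpy as np

def top_k(x, k):
    """Top-k sparsification: keep the k largest-magnitude entries, zero the rest.

    Unlike random sparsification, this operator is biased: E[top_k(x)] != x.
    """
    out = np.zeros_like(x)
    # Indices of the k entries with largest absolute value.
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

g = np.array([0.1, -3.0, 0.5, 2.0, -0.2])
print(top_k(g, 2))  # keeps -3.0 and 2.0, zeros the other coordinates
```

In a distributed setting, each worker would apply such a compressor to its gradient before communication, transmitting only the surviving entries and their indices.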
Abstract
In the last few years, various communication compression techniques have emerged as an indispensable tool helping to alleviate the communication bottleneck in distributed learning. However, despite the fact …