BriefGPT.xyz
Feb, 2020
LASG: Lazily Aggregated Stochastic Gradients for Communication-Efficient Distributed Learning
Tianyi Chen, Yuejiao Sun, Wotao Yin
TL;DR
This paper addresses communication-inefficient distributed machine learning with LASG, a new class of stochastic gradient descent methods, and shows in experiments that it saves substantial communication cost.
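The core idea behind lazily aggregated gradients is that a worker skips uploading its fresh stochastic gradient when it has changed little since the last upload, letting the server reuse the stale copy. Below is a minimal illustrative sketch of that skipping rule, not the paper's exact condition; the function name `lasg_round` and the plain norm-threshold trigger are assumptions for illustration.

```python
import numpy as np

def lasg_round(grads, last_sent, threshold):
    """One communication round of a lazy-aggregation rule (illustrative,
    not the paper's exact trigger): each worker uploads its fresh
    stochastic gradient only if it differs enough from the last gradient
    it sent; otherwise the server reuses the stale copy."""
    sent = 0
    for m, g in enumerate(grads):
        # Upload only when the gradient changed enough since the last upload.
        if np.linalg.norm(g - last_sent[m]) > threshold:
            last_sent[m] = g.copy()
            sent += 1
    # The server averages the gradients it currently holds, some of
    # which may be stale; skipped uploads are the communication savings.
    agg = np.mean(last_sent, axis=0)
    return agg, sent
```

With two workers where only one gradient has moved past the threshold, a round uploads a single gradient while still producing a full average, which is where the communication savings come from.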
Abstract
This paper targets solving distributed machine learning problems such as federated learning in a communication-efficient fashion. A class …