BriefGPT.xyz
Jul, 2021
Decentralized Federated Learning: Balancing Communication and Computing Costs
Wei Liu, Li Chen, Wenyi Zhang
TL;DR
Proposes a general decentralized stochastic gradient descent (SGD) framework for decentralized federated learning (DFL) that balances communication rounds and local updates across multiple nodes, featuring compressed communication and strong convergence guarantees.
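The balance described above can be illustrated with a minimal sketch: each node runs several local SGD steps, then gossip-averages its model with its ring neighbors, trading communication (how often nodes mix) against computation (how many local steps run in between). All names and the setup (local quadratic losses, the `tau` period, the `topk_compress` helper) are illustrative assumptions, not the paper's actual algorithm.

```python
# Hedged sketch of decentralized SGD with periodic gossip averaging.
# Hypothetical setup: node i minimizes f_i(x) = 0.5 * ||x - c_i||^2
# on a ring topology; tau controls the communication/computation trade-off.
import numpy as np

def topk_compress(v, k):
    """Keep the k largest-magnitude entries (a simple message compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def decentralized_sgd(n_nodes=4, dim=8, steps=200, tau=5, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    targets = rng.normal(size=(n_nodes, dim))   # c_i: each node's local optimum
    x = np.zeros((n_nodes, dim))                # one model copy per node
    # Doubly stochastic ring mixing matrix: average with both neighbors.
    W = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        W[i, i] = 0.5
        W[i, (i - 1) % n_nodes] = 0.25
        W[i, (i + 1) % n_nodes] = 0.25
    for t in range(steps):
        grads = x - targets                     # gradient of 0.5*||x - c_i||^2
        x = x - lr * grads                      # local SGD step (computation)
        if (t + 1) % tau == 0:                  # every tau steps: communicate
            x = W @ x                           # gossip-average with neighbors
    return x, targets.mean(axis=0)

x, consensus_target = decentralized_sgd()
# Node models should cluster around the average of the local optima.
disagreement = max(np.linalg.norm(xi - x.mean(axis=0)) for xi in x)
```

A larger `tau` means fewer communication rounds but more drift between node models before each averaging step; compressing the exchanged vectors (e.g., with `topk_compress`) reduces the per-round communication cost further, at the price of a coarser mixing signal.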
Abstract
Decentralized federated learning (DFL) is a powerful framework for distributed machine learning, and decentralized stochastic gradient descent (SGD) is a driving engine for DFL. The performance of decentralized SGD …