May 2024
Prune at the Clients, Not the Server: Accelerated Sparse Training in Federated Learning
Georg Meinhardt, Kai Yi, Laurent Condat, Peter Richtárik
TL;DR
This paper introduces Sparse-ProxSkip, a method that integrates sparse training with accelerated communication in federated learning, and demonstrates its strong performance in extensive experiments.
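The summary above only names the two ingredients, so here is a minimal, illustrative sketch of how client-side pruning can be combined with a ProxSkip-style communication-skipping loop. This is not the authors' algorithm or code: the toy quadratic objectives, the prune_top_k helper, and the hyperparameters (gamma, p_comm, top_k) are all assumptions made for the example. The only point it demonstrates is the idea in the title: pruning happens at each client before the occasional averaging (communication) round, rather than at the server afterwards.

```python
# Illustrative sketch only; NOT the paper's Sparse-ProxSkip implementation.
# Assumed pieces: toy quadratic objectives, prune_top_k, gamma, p_comm, top_k.
import numpy as np

rng = np.random.default_rng(0)

d, n_clients = 20, 5       # model dimension, number of clients
gamma, p_comm = 0.1, 0.2   # step size, probability of a communication round
top_k = 5                  # number of weights each client keeps when pruning

# Toy local objectives: f_i(x) = 0.5 * ||A_i x - b_i||^2
A = [rng.normal(size=(30, d)) / np.sqrt(30) for _ in range(n_clients)]
b = [rng.normal(size=30) for _ in range(n_clients)]

def grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

def prune_top_k(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    mask = np.zeros_like(x)
    mask[np.argsort(np.abs(x))[-k:]] = 1.0
    return x * mask

x = [np.zeros(d) for _ in range(n_clients)]  # local models
h = [np.zeros(d) for _ in range(n_clients)]  # ProxSkip control variates

for t in range(2000):
    # Local gradient step, corrected by the control variate (as in ProxSkip).
    x_hat = [x[i] - gamma * (grad(i, x[i]) - h[i]) for i in range(n_clients)]
    if rng.random() < p_comm:
        # Communication round: each CLIENT prunes its own model first,
        # so only top_k entries cross the network; the server just averages.
        pruned = [prune_top_k(x_hat[i], top_k) for i in range(n_clients)]
        avg = sum(pruned) / n_clients
        x_new = [avg.copy() for _ in range(n_clients)]
    else:
        x_new = x_hat  # skipped round: purely local progress, no communication
    # Control-variate update from ProxSkip: h_i += (p/gamma) * (x_i - x_hat_i).
    h = [h[i] + (p_comm / gamma) * (x_new[i] - x_hat[i]) for i in range(n_clients)]
    x = x_new

final = prune_top_k(sum(x) / n_clients, top_k)
print("nonzeros in final pruned model:", np.count_nonzero(final))
print("objective at final model:",
      sum(0.5 * np.linalg.norm(A[i] @ final - b[i])**2 for i in range(n_clients)))
```

In ProxSkip, the control variates h_i are what let clients take useful local steps between the rarely triggered communication rounds; pruning at the clients means every vector that is actually transmitted is already sparse, which is where the communication savings come from.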
Abstract
In the recent paradigm of federated learning (FL), multiple clients train a shared model while keeping their local data private. Resource constraints of clients and communication costs pose major problems for training …