BriefGPT.xyz
Feb, 2023
$\texttt{DoCoFL}$: Downlink Compression for Cross-Device Federated Learning
Ron Dorfman, Shay Vargaftik, Yaniv Ben-Itzhak, Kfir Y. Levy
TL;DR
This paper addresses downlink (parameter-server-to-client) communication compression in cross-device federated learning. It proposes a new framework, DoCoFL, that can be combined with various uplink compression schemes to achieve bidirectional compression, significantly reducing bandwidth requirements while reaching accuracy on par with uncompressed FedAvg.
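The TL;DR describes bidirectional compression: the server compresses the model it broadcasts (downlink) and clients compress the updates they send back (uplink). As a rough illustration only — not DoCoFL's actual scheme, whose details are not given in this summary — the sketch below pairs both directions with simple 8-bit uniform quantization; all function names are hypothetical:

```python
import numpy as np

def quantize(v, num_bits=8):
    """Uniform scalar quantization: map floats to integers in [0, 2^b - 1]."""
    lo, hi = float(v.min()), float(v.max())
    scale = (hi - lo) / (2 ** num_bits - 1)
    if scale == 0.0:  # constant vector: avoid division by zero
        scale = 1.0
    q = np.round((v - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    """Reconstruct an approximate float vector from quantized values."""
    return q.astype(np.float32) * scale + lo

rng = np.random.default_rng(0)

# Downlink: server broadcasts a compressed model to clients.
model = rng.standard_normal(1000).astype(np.float32)
q, lo, scale = quantize(model)          # 1 byte/parameter instead of 4
client_model = dequantize(q, lo, scale)

# Uplink: client computes an update and compresses it the same way.
update = (rng.standard_normal(1000) * 0.01).astype(np.float32)
qu, ulo, uscale = quantize(update)
server_update = dequantize(qu, ulo, uscale)

# Rounding error is bounded by half a quantization step.
err = float(np.abs(client_model - model).max())
```

With 8-bit quantization in both directions, each float32 tensor shrinks to roughly a quarter of its size (plus two scalars of metadata per tensor), at the cost of a bounded per-coordinate rounding error of at most half a quantization step.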
Abstract
Many compression techniques have been proposed to reduce the communication overhead of federated learning training procedures. However, these are typically designed for compressing model updates, which are expect…