BriefGPT.xyz
Sep, 2024
Rate-Constrained Quantization for Communication-Efficient Federated Learning
Shayan Mohajer Hamidi, Ali Bereyhi
TL;DR
This work addresses the high communication cost of federated learning by proposing a new framework, RC-FED, which quantizes gradients subject to both fidelity and data-rate constraints. By minimizing quantization distortion while keeping the rate of the encoded gradients below a target threshold, the method achieves superior performance over baseline quantized federated learning schemes across multiple datasets.
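The TL;DR describes a fidelity-versus-rate trade-off: minimize quantization distortion subject to the entropy-coded gradient rate staying under a budget. The following is a minimal illustrative sketch of that idea, not the paper's actual RC-FED algorithm: it uses uniform quantization of a gradient vector, estimates the coded rate via the empirical entropy of the quantization indices, and picks the lowest-distortion step size that meets a hypothetical rate budget.

```python
import numpy as np

def quantize(grad, step):
    # Uniform quantization: snap each gradient entry to the nearest
    # multiple of the step size.
    return np.round(grad / step) * step

def empirical_rate(q, step):
    # Estimate bits per entry as the empirical entropy of the quantization
    # indices -- a proxy for the rate achieved by an entropy coder
    # (e.g. Huffman coding) applied to those indices.
    idx = np.round(q / step).astype(int)
    _, counts = np.unique(idx, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def rate_constrained_quantize(grad, rate_budget, steps):
    # Among candidate step sizes, keep the quantization with the lowest
    # mean-squared distortion whose estimated rate fits the budget.
    best = None
    for step in steps:
        q = quantize(grad, step)
        if empirical_rate(q, step) <= rate_budget:
            dist = float(np.mean((grad - q) ** 2))
            if best is None or dist < best[0]:
                best = (dist, q)
    return best  # None if no candidate meets the budget

# Toy example on a synthetic "gradient" vector.
rng = np.random.default_rng(0)
grad = rng.normal(size=10_000)
result = rate_constrained_quantize(
    grad, rate_budget=3.0, steps=[0.05, 0.1, 0.2, 0.5, 1.0]
)
```

The rate budget, candidate step sizes, and use of uniform quantization here are all illustrative assumptions; the paper formulates the constrained minimization jointly rather than by grid search.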
Abstract
Quantization is a common approach to mitigate the communication cost of Federated Learning (FL). In practice, the quantized local parameters are further encoded via an entropy coding technique, such as Huffman coding.