Oct, 2021
DAdaQuant: Doubly-adaptive quantization for communication-efficient Federated Learning
Robert Hönig, Yiren Zhao, Robert Mullins
TL;DR
This work introduces DAdaQuant, a dynamically adaptive quantization algorithm that improves client-to-server compression by up to 2.8x over strong non-adaptive baselines while preserving model quality.
Abstract
Federated Learning (FL) is a powerful technique for training a model on a server with data from several clients in a privacy-preserving manner. In FL, a server sends the model to every client, who then train the …
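The abstract above is truncated, but the core setting is client-to-server compression of model updates via quantization. Below is a minimal sketch of a generic stochastic fixed-point quantizer of the kind commonly used for communication-efficient FL; it is not DAdaQuant itself. The function names (`quantize`, `dequantize`), the fixed quantization level `q`, and the bit-count estimate are illustrative assumptions, whereas DAdaQuant's point is precisely that the quantization is adapted (doubly: over training and across clients) rather than fixed.

```python
# Illustrative sketch only: generic unbiased stochastic fixed-point quantization
# of a client update, NOT the DAdaQuant algorithm (which adapts the level q).
import numpy as np

def quantize(update: np.ndarray, q: int, rng: np.random.Generator):
    """Stochastically round |update| onto q+1 evenly spaced levels in [0, s]."""
    s = np.max(np.abs(update)) + 1e-12           # per-tensor scale
    scaled = np.abs(update) / s * q              # map magnitudes to [0, q]
    lower = np.floor(scaled)
    # Round up with probability equal to the fractional part (unbiased rounding).
    levels = lower + (rng.random(update.shape) < (scaled - lower))
    return np.sign(update).astype(np.int8), levels.astype(np.uint8), s

def dequantize(signs: np.ndarray, levels: np.ndarray, s: float, q: int):
    """Reconstruct an (unbiased) approximation of the original update."""
    return signs * (levels / q) * s

rng = np.random.default_rng(0)
update = rng.normal(size=10_000).astype(np.float32)  # stand-in client update
signs, levels, s = quantize(update, q=15, rng=rng)   # 4 level bits + 1 sign bit
recon = dequantize(signs, levels, s, q=15)
print("mean abs error:", np.abs(update - recon).mean())
print("approx. compression vs. float32:", 32 / 5)
```

With a fixed `q`, the compression ratio is constant for the whole run; an adaptive scheme in the spirit of the paper would instead choose the level per round and per client to spend communication only where the model quality requires it.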