BriefGPT.xyz
Dec, 2020
Federated Nonconvex Sparse Learning
Qianqian Tong, Guannan Liang, Tan Zhu, Jinbo Bi
TL;DR
This paper proposes two iterative algorithms for nonconvex sparse learning in the federated setting, Fed-HT and FedIter-HT, and proves that both achieve linear convergence and sparse-estimation guarantees comparable to classical IHT methods, even with decentralized non-IID data. Experiments show that Fed-HT and FedIter-HT outperform their competitor, a distributed IHT, in both communication rounds and bandwidth.
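Both proposed algorithms build on iterative hard thresholding (IHT). As background, here is a minimal sketch of classical, centralized IHT for sparse least squares — not the paper's Fed-HT or FedIter-HT, just the base method they extend; the step-size choice and function names are illustrative assumptions.

```python
import numpy as np

def hard_threshold(x, k):
    # H_k: keep the k largest-magnitude entries of x, zero out the rest.
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht(A, y, k, step=None, iters=200):
    # Classical IHT for: min ||Ax - y||^2  s.t.  ||x||_0 <= k.
    # Each iteration: gradient step, then projection onto k-sparse vectors.
    n = A.shape[1]
    if step is None:
        # Conservative step size 1/L with L = ||A||_2^2 (illustrative choice).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        x = hard_threshold(x - step * grad, k)
    return x
```

In the federated variants described above, clients run local updates of this kind on their own (non-IID) data and a server aggregates the thresholded iterates, trading extra local computation for fewer communication rounds.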
Abstract
Nonconvex sparse learning plays an essential role in many areas, such as signal processing and deep network compression. Iterative hard thresholding (IHT) methods are the state-of-the-art for …