BriefGPT.xyz
Oct, 2022
Unbounded Gradients in Federated Learning with Buffered Asynchronous Aggregation
Mohammad Taha Toghani, César A. Uribe
TL;DR
This paper provides a theoretical analysis of the asynchronous federated learning algorithm FedBuff without the bounded-gradient-norm assumption, studying how factors such as data heterogeneity, batch size, and delay affect the convergence rate, thereby improving the scalability of cross-device federated learning.
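The summary above refers to buffered asynchronous aggregation. A minimal sketch of that mechanism, in the style of FedBuff (Nguyen et al., 2022): clients finish local training at different times, the server buffers their updates, and it only applies an aggregated step once the buffer fills. All names, the toy scalar model, and the quadratic loss below are illustrative assumptions, not the paper's code.

```python
import random

def local_update(w, data, lr=0.1, steps=5):
    """Client side: a few SGD steps on a toy quadratic loss (w - x)^2 / 2."""
    for _ in range(steps):
        x = random.choice(data)
        w = w - lr * (w - x)           # gradient of the quadratic loss is (w - x)
    return w

def fedbuff(client_data, buffer_size=3, rounds=20, server_lr=1.0, seed=0):
    """Server side: buffer client deltas, aggregate when the buffer is full."""
    random.seed(seed)
    w = 0.0                            # server model (a scalar, for simplicity)
    buffer = []                        # buffered client updates (deltas)
    for _ in range(rounds):
        # An arbitrary client finishes asynchronously; in a real system its
        # update may be stale, i.e. computed from an older server model.
        data = random.choice(client_data)
        delta = local_update(w, data) - w
        buffer.append(delta)
        if len(buffer) == buffer_size: # aggregate once the buffer fills
            w += server_lr * sum(buffer) / buffer_size
            buffer.clear()
    return w

clients = [[1.0, 1.2], [0.8, 1.1], [0.9, 1.3]]  # heterogeneous client data
print(fedbuff(clients))  # drifts toward the overall data mean
```

Because the server never waits for all clients, throughput does not degrade as the number of active clients grows, which is the scalability benefit the paper's analysis targets.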
Abstract
Synchronous updates may compromise the efficiency of cross-device federated learning once the number of active clients increases. The FedBuff algorithm (Nguyen et al., 2022) alleviates this problem by al…