BriefGPT.xyz
June 2023
Federated Learning You May Communicate Less Often!
Milad Sefidgaran, Romain Chor, Abdellatif Zaidi, Yijun Wan
TL;DR
This paper studies the generalization error of statistical learning models in federated learning (FL), analyzed through PAC-Bayes and rate-distortion theoretic bounds. It finds that the number of communication rounds between the clients and the parameter server affects the generalization error; when the analysis is applied to FSVM, the generalization error increases with the number of rounds, so the number of rounds should be optimized to reduce the overall risk of the FL algorithm.
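The trade-off in the TL;DR can be sketched numerically: if the empirical risk shrinks with more communication rounds R while the generalization term grows with R, the total risk is minimized at some intermediate R. The functional forms below are illustrative assumptions only, not the paper's actual bounds.

```python
import math

def empirical_risk(R: int) -> float:
    # Assumed: optimization error decays roughly like 1/R (illustrative).
    return 1.0 / R

def generalization_bound(R: int) -> float:
    # Assumed: generalization term grows like sqrt(R) (illustrative),
    # reflecting the finding that more rounds hurt generalization.
    return 0.05 * math.sqrt(R)

def total_risk(R: int) -> float:
    # Total risk = empirical risk + generalization error.
    return empirical_risk(R) + generalization_bound(R)

# Pick the number of communication rounds minimizing the toy total risk.
best_R = min(range(1, 101), key=total_risk)
print(best_R)  # an interior optimum, neither 1 nor 100
```

Under these assumed curves, running for as many rounds as possible is suboptimal, which mirrors the paper's suggestion to optimize the number of rounds.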
Abstract
We investigate the generalization error of statistical learning models in a federated learning (FL) setting. Specifically, we study the evolution of the …