BriefGPT.xyz
Mar, 2024
FAGH: Accelerating Federated Learning with Approximated Global Hessian
Mrinmay Sen, A. K. Qin, Krishna Mohan C
TL;DR
This paper proposes FAGH, a method that accelerates federated learning training with an approximated global Hessian. By exploiting the curvature of the approximated global Hessian, FAGH speeds up convergence of the global model, reducing the number of communication rounds and the training time, and it outperforms several state-of-the-art federated learning training methods in training and test loss as well as test accuracy.
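The idea of a server-side Newton-like update driven by aggregated curvature can be illustrated with a minimal sketch. This is NOT the paper's exact algorithm (the abstract does not specify how the global Hessian is approximated); here we assume, purely for illustration, a diagonal Hessian approximation averaged across clients for a least-squares model, with hypothetical helper names `client_stats` and `server_newton_step`.

```python
import numpy as np

def client_stats(w, X, y):
    # Hypothetical client computation for a least-squares model:
    # returns the local gradient and a diagonal Hessian approximation.
    residual = X @ w - y
    grad = X.T @ residual / len(y)
    hess_diag = np.mean(X * X, axis=0)  # diagonal of (X^T X) / n
    return grad, hess_diag

def server_newton_step(w, stats, damping=1e-3):
    # Average client gradients and Hessian diagonals to form an
    # approximated global curvature, then take a damped Newton-like step.
    grads, hdiags = zip(*stats)
    g = np.mean(grads, axis=0)
    h = np.mean(hdiags, axis=0)
    return w - g / (h + damping)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
w = np.zeros(2)

# Three synthetic clients holding noiseless linear data.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true))

# A few communication rounds: each client sends (gradient, curvature),
# the server applies one curvature-preconditioned update per round.
for _ in range(5):
    stats = [client_stats(w, X, y) for X, y in clients]
    w = server_newton_step(w, stats)
```

The curvature-preconditioned step converges in far fewer rounds than plain gradient averaging would at the same step count, which is the intuition behind using (approximated) second-order information to cut communication rounds.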
Abstract
In federated learning (FL), the significant communication overhead due to the slow convergence speed of training the global model poses a …