Oct, 2022
SAGDA: Achieving $\mathcal{O}(\epsilon^{-2})$ Communication Complexity in Federated Min-Max Learning
Haibo Yang, Zhuqing Liu, Xin Zhang, Jia Liu
TL;DR
This paper proposes a new algorithmic framework, SAGDA, that lowers the communication complexity of federated min-max learning, and builds on it to improve the understanding of the communication complexity of the standard FSGDA method.
Abstract
To lower the communication complexity of federated min-max learning, a natural approach is to utilize the idea of infrequent communications (through multiple local updates) same as in conventional …
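The infrequent-communication idea the abstract refers to can be illustrated with a minimal sketch of federated stochastic gradient descent-ascent (FSGDA): each client runs several local descent (min variable) / ascent (max variable) steps between communication rounds, and the server averages the results. The toy objective $f_i(x, y) = x^2/2 + a_i x y - y^2/2$, the client coefficients, and all function names below are illustrative assumptions, not the paper's SAGDA algorithm.

```python
def local_gda(x, y, a, steps, lr):
    """Run `steps` local gradient descent (on x) / ascent (on y) updates
    for the toy client objective f(x, y) = x^2/2 + a*x*y - y^2/2."""
    for _ in range(steps):
        x -= lr * (x + a * y)   # descent on the min variable x
        y += lr * (a * x - y)   # ascent on the max variable y
    return x, y

def fsgda_round(x, y, client_coeffs, steps=5, lr=0.01):
    """One communication round: every client starts from the shared
    (x, y), updates locally, and the server averages the iterates."""
    results = [local_gda(x, y, a, steps, lr) for a in client_coeffs]
    x_avg = sum(xi for xi, _ in results) / len(results)
    y_avg = sum(yi for _, yi in results) / len(results)
    return x_avg, y_avg

# 200 communication rounds, 5 local steps each: only 200 rounds of
# communication for 1000 total gradient steps per client.
x, y = 1.0, 1.0
for _ in range(200):
    x, y = fsgda_round(x, y, client_coeffs=[0.5, 1.0, 1.5])
# (x, y) approaches the common saddle point (0, 0)
```

The point of the sketch is the ratio of local steps to communication rounds: more local updates per round means fewer rounds (lower communication complexity) for the same total amount of gradient work, which is the trade-off the paper analyzes.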