Mar, 2021
Convergence and Accuracy Trade-Offs in Federated Learning and Meta-Learning
Zachary Charles, Jakub Konečný
TL;DR
We study a family of algorithms called "local update methods," which generalizes many federated and meta-learning algorithms, and prove that for quadratic models, local update methods are equivalent to first-order optimization of a surrogate loss that we exactly characterize.
Abstract
We study a family of algorithms, which we refer to as local update methods, generalizing many federated and meta-learning algorithms. We prove that for quadratic models, local update methods are equivalent to first-order optimization of a surrogate loss that we exactly characterize.
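To make the idea of a local update method concrete, below is a minimal sketch of one well-known instance, FedAvg-style local gradient descent, applied to quadratic client losses. The client matrices, vectors, step size, and number of local steps are illustrative assumptions, not taken from the paper. The sketch numerically exhibits the convergence/accuracy trade-off the title refers to: with multiple local steps per round and heterogeneous clients, the iteration converges, but to the minimizer of a surrogate loss rather than of the true average loss.

```python
import numpy as np

# Two heterogeneous clients with quadratic losses
#   f_i(x) = 0.5 * x^T A_i x - b_i^T x,  A_i symmetric positive definite.
# These specific A_i, b_i are illustrative choices, not from the paper.
A = [np.diag([1.0, 10.0]), np.diag([10.0, 1.0])]
b = [np.array([1.0, 1.0]), np.array([-1.0, 2.0])]

def local_update(x, Ai, bi, eta=0.05, K=10):
    """Run K local gradient-descent steps on client i's quadratic loss."""
    for _ in range(K):
        x = x - eta * (Ai @ x - bi)  # gradient of f_i is A_i x - b_i
    return x

def fedavg_round(x, eta=0.05, K=10):
    """One round: every client runs K local steps, the server averages."""
    return np.mean(
        [local_update(x, Ai, bi, eta, K) for Ai, bi in zip(A, b)], axis=0
    )

# Minimizer of the true average loss: solves (mean of A_i) x = (mean of b_i).
x_true = np.linalg.solve(sum(A) / 2, sum(b) / 2)

# Iterate FedAvg rounds to numerical convergence.
x = np.zeros(2)
for _ in range(2000):
    x = fedavg_round(x)

print("FedAvg fixed point:", x)
print("True minimizer:   ", x_true)
# The gap is nonzero: the local steps trade accuracy for faster convergence.
print("gap:", np.linalg.norm(x - x_true))
```

Because each local step on a quadratic is an affine map, a full round is also affine in x, which is why the iteration behaves like first-order optimization of some surrogate quadratic; the paper's contribution is characterizing that surrogate exactly, which this sketch does not attempt.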