BriefGPT.xyz
Feb, 2021
New Perspectives and Convergence Rates for Federated Multi-Task Learning with Laplacian Regularization
FedU: A Unified Framework for Federated Multi-Task Learning with Laplacian Regularization
Canh T. Dinh, Tung T. Vu, Nguyen H. Tran, Minh N. Dao, Hongyu Zhang
TL;DR
This paper proposes two algorithms, FedU and dFedU, to address the performance degradation that non-IID data distributions cause in federated learning. In experiments, both converge faster and perform better than existing algorithms, and the proposed federated multi-task learning formulation covers both conventional FL and personalized FL tasks.
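As a sketch of the setting the TL;DR describes (notation assumed here, not quoted from the paper): Laplacian-regularized federated multi-task learning typically couples each client's local loss $f_k$ through a graph whose edge weights $a_{kl}$ encode task relationships, with a regularization strength $\eta$:

```latex
\min_{w_1,\dots,w_N}\; \sum_{k=1}^{N} f_k(w_k)
  \;+\; \frac{\eta}{2} \sum_{k=1}^{N} \sum_{l=1}^{N} a_{kl}\, \lVert w_k - w_l \rVert^2
```

Under this kind of formulation, $\eta = 0$ recovers independent local training, while a large $\eta$ forces the client models $w_k$ toward one another, which is consistent with the claim that the framework spans both personalized FL and conventional FL.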
Abstract
Federated multi-task learning (FMTL) has emerged as a natural choice to capture the statistical diversity among the clients in federated learning. To unleash the potential of FMTL beyond statistical diversity, we …