Aug 2021
FedKD: Communication Efficient Federated Learning via Knowledge Distillation
Chuhan Wu, Fangzhao Wu, Ruixuan Liu, Lingjuan Lyu, Yongfeng Huang...
TL;DR
This work proposes a communication-efficient federated learning method based on knowledge distillation (FedKD). On each client, a student model and a teacher model learn mutually from each other, and only the student model is shared, which reduces communication cost. A dynamic gradient approximation method based on singular value decomposition (SVD) is further proposed to lower communication cost. Experiments show that the method effectively reduces communication cost while achieving competitive results.
Abstract
Federated learning is widely used to learn intelligent models from decentralized data. In federated learning, clients need to communicate their local model updates in each iteration of model learning. However, mo…