Aug, 2021
Dynamic Attention-based Communication-Efficient Federated Learning
Zihan Chen, Kai Fong Ernest Chong, Tony Q. S. Quek
TL;DR
This paper proposes AdaFL, an adaptive training algorithm that balances performance stability against communication efficiency through an attention mechanism and a dynamic-fraction method. Experiments show that, compared with FedAvg, AdaFL delivers substantial improvements in model accuracy, performance stability, and communication efficiency.
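To make the two ingredients of the TL;DR concrete, here is a minimal sketch of attention-weighted server aggregation plus a growing participating-client fraction. The function names, the distance-based attention score, and the linear fraction schedule are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def attention_weights(global_w, client_ws):
    """Softmax attention over clients, scored by the L2 distance between
    each client's model and the current global model (assumed scoring rule)."""
    dists = np.array([np.linalg.norm(global_w - w) for w in client_ws])
    exp = np.exp(dists - dists.max())  # shift for numerical stability
    return exp / exp.sum()

def attention_aggregate(global_w, client_ws):
    """Replace FedAvg's uniform mean with the attention-weighted average."""
    weights = attention_weights(global_w, client_ws)
    return sum(a * w for a, w in zip(weights, client_ws))

def dynamic_fraction(t, total_rounds, f_start=0.1, f_end=0.5):
    """Hypothetical schedule: linearly grow the fraction of participating
    clients from f_start to f_end over training, trading early communication
    savings for late-stage stability."""
    return f_start + (f_end - f_start) * t / total_rounds
```

A server round would then sample `dynamic_fraction(t, T)` of the clients, collect their locally trained models, and call `attention_aggregate` instead of a plain average.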
Abstract
Federated learning (FL) offers a solution to train a global machine learning model while still maintaining data privacy, without needing access to data stored locally at the clients. However, FL suffers performan →