March 2022
Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning
Lin Zhang, Li Shen, Liang Ding, Dacheng Tao, Ling-Yu Duan
TL;DR
This work proposes a data-free knowledge distillation method that uses a generator to explore the input space of the local models and transfer their knowledge into the global model. Experimental results show that the method outperforms existing federated learning algorithms in addressing data heterogeneity and improving model performance.
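The idea summarized above — pseudo-samples from a generator, the local models acting as an ensemble teacher, and the global model distilled as the student — can be illustrated with a minimal numpy sketch. Everything here is a stand-in assumption, not the paper's implementation: the "models" are plain linear classifiers, and the generator is replaced by Gaussian noise rather than a learned network.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical stand-ins: each local model and the global model is just a
# weight matrix mapping a feature vector to class logits.
n_clients, dim, n_classes = 3, 8, 5
local_models = [rng.normal(size=(dim, n_classes)) for _ in range(n_clients)]
global_model = rng.normal(size=(dim, n_classes))

# "Generator": Gaussian noise standing in for the learned generator that
# explores the local models' input space in the actual method.
pseudo_inputs = rng.normal(size=(16, dim))

# Ensemble teacher: average of the local models' predictive distributions
# on the generated pseudo-samples.
teacher = np.mean([softmax(pseudo_inputs @ w) for w in local_models], axis=0)

def kl_to_teacher(w):
    student = softmax(pseudo_inputs @ w)
    return np.sum(teacher * (np.log(teacher + 1e-12)
                             - np.log(student + 1e-12))) / len(pseudo_inputs)

kl_before = kl_to_teacher(global_model)

# Distill the ensemble into the global model: gradient descent on the
# cross-entropy between teacher and student predictions.
lr = 0.1
for _ in range(200):
    student = softmax(pseudo_inputs @ global_model)
    grad = pseudo_inputs.T @ (student - teacher) / len(pseudo_inputs)
    global_model -= lr * grad

kl_after = kl_to_teacher(global_model)
print(kl_before, kl_after)
```

After the distillation steps, the student's KL divergence to the ensemble teacher drops; in the actual method the generator is trained adversarially rather than sampled as noise, and the models are deep networks.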
Abstract
Federated learning (FL) is an emerging distributed learning paradigm under privacy constraints. Data heterogeneity is one of the main challenges in FL, resulting in slow convergence and degraded performance. [The remainder of the abstract is truncated in the source.]