BriefGPT.xyz
Nov 2024
Towards Personalized Federated Learning via Comprehensive Knowledge Distillation
Pengju Wang, Bochao Liu, Weijia Guo, Yong Li, Shiming Ge
TL;DR
This work addresses the catastrophic forgetting caused by data heterogeneity in personalized federated learning. It proposes a new method that performs knowledge distillation comprehensively from both the global model and historical local models, improving each local model's personalization and generalization. Experiments show the method prevents catastrophic forgetting while significantly improving the overall performance of personalized models.
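The idea of distilling from both the global model (for generalization) and the client's own historical model (against forgetting) can be sketched as a weighted two-teacher distillation loss. This is a minimal illustration, not the authors' exact formulation: the function names, the temperature `T`, and the mixing weight `alpha` are assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions."""
    p, q = np.asarray(p), np.asarray(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def comprehensive_kd_loss(student_logits, global_logits, historical_logits,
                          T=2.0, alpha=0.5):
    """Hypothetical two-teacher distillation loss: the local student is
    pulled toward the global model's soft labels (generalization) and
    toward its own historical model's soft labels (retaining past
    personalized knowledge). alpha balances the two terms (assumed)."""
    s = softmax(student_logits, T)
    g = softmax(global_logits, T)
    h = softmax(historical_logits, T)
    return alpha * kl_divergence(g, s) + (1 - alpha) * kl_divergence(h, s)

# Example: a student whose predictions sit between the two teachers.
loss = comprehensive_kd_loss([2.0, 1.0, 0.1], [1.8, 1.1, 0.2], [2.2, 0.9, 0.1])
```

The loss is zero when the student already matches both teachers, and grows as its softened predictions drift away from either one.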
Abstract
Federated Learning is a distributed Machine Learning paradigm designed to protect data privacy. However, data heterogeneity across various clients results in