BriefGPT.xyz
Sep, 2024
HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning
Momin Ahmad Khan, Yasra Chandio, Fatima Muhammad Anwar
TL;DR
This study addresses the degradation of global model performance caused by user data heterogeneity in federated learning. Through an empirical study, it reveals the vulnerability of knowledge distillation-based federated learning systems to model poisoning attacks, and proposes a novel hybrid knowledge distillation algorithm, HYDRA-FL, which effectively reduces the impact of such attacks while maintaining comparable performance in benign settings.
Abstract
Data heterogeneity among Federated Learning (FL) users poses a significant challenge, resulting in reduced global model performance. The community has designed various techniques to tackle this issue, among which Knowledge Distillation…
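
As a rough illustration of the kind of hybrid knowledge-distillation objective described in the TL;DR, the sketch below combines a client's cross-entropy loss with a down-weighted distillation term at the output layer and an additional distillation term at a shallow auxiliary classifier, using the received global model as a frozen teacher. The model architecture, the auxiliary-head placement, and the coefficients gamma and beta are illustrative assumptions, not the paper's exact formulation.

# Minimal sketch (illustrative, not the paper's exact method): a hybrid
# knowledge-distillation local objective for FL, distilling from the global
# model both at the final layer (with a reduced weight) and at a shallow
# auxiliary classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ClientNet(nn.Module):
    """Small MLP with an auxiliary head attached to a shallow layer."""

    def __init__(self, in_dim=32, hidden=64, num_classes=10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.block2 = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, num_classes)      # final classifier
        self.aux_head = nn.Linear(hidden, num_classes)  # shallow auxiliary classifier

    def forward(self, x):
        h1 = self.block1(x)
        h2 = self.block2(h1)
        return self.head(h2), self.aux_head(h1)         # (final logits, shallow logits)


def kd_loss(student_logits, teacher_logits, T=2.0):
    """Standard softened KL-divergence distillation term."""
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)


def hybrid_kd_local_loss(local_model, global_model, x, y, gamma=0.1, beta=0.5):
    """Cross-entropy + down-weighted final-layer KD + shallow-layer KD.
    gamma and beta are assumed hyperparameters controlling the two KD terms."""
    logits, aux_logits = local_model(x)
    with torch.no_grad():                               # global model acts as a frozen teacher
        t_logits, t_aux_logits = global_model(x)
    ce = F.cross_entropy(logits, y)
    return ce + gamma * kd_loss(logits, t_logits) + beta * kd_loss(aux_logits, t_aux_logits)


if __name__ == "__main__":
    local, global_ = ClientNet(), ClientNet()
    x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    loss = hybrid_kd_local_loss(local, global_, x, y)
    loss.backward()                                     # one local optimizer step would follow
    print(f"local loss: {loss.item():.4f}")

The intuition behind down-weighting the final-layer distillation term while adding a shallow one is to limit how strongly a poisoned global model can steer the client's output layer, while still transferring enough knowledge to cope with data heterogeneity in benign rounds.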