BriefGPT.xyz
Jul, 2023
Don't Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory
Sara Babakniya, Zalan Fabian, Chaoyang He, Mahdi Soltanolkotabi, Salman Avestimehr
TL;DR
This paper proposes a generative-model-based federated class incremental learning framework that mitigates forgetting of previously learned classes by synthesizing samples from past data distributions, without requiring direct access to past data from any user.
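The core idea, replacing stored past samples (episodic memory) with synthetic ones drawn from a generative model of earlier classes, can be sketched as follows. This is an illustrative toy, not the paper's implementation: a per-class Gaussian stands in for the trained generative model, and all class and function names are assumptions.

```python
import numpy as np

class GaussianReplayGenerator:
    """Toy stand-in for a generative model of past classes: instead of
    storing raw past samples, fit per-class feature statistics and sample
    synthetic data from them when training on later tasks."""

    def __init__(self):
        self.stats = {}  # class label -> (mean, std) of its features

    def fit(self, x, y):
        # Record simple per-class statistics of the current task's data.
        for c in np.unique(y):
            xc = x[y == c]
            self.stats[int(c)] = (xc.mean(axis=0), xc.std(axis=0) + 1e-6)

    def sample(self, n_per_class):
        # Draw synthetic samples mimicking each previously seen class.
        xs, ys = [], []
        for c, (mu, sd) in self.stats.items():
            xs.append(np.random.normal(mu, sd, size=(n_per_class, mu.shape[0])))
            ys.append(np.full(n_per_class, c))
        return np.concatenate(xs), np.concatenate(ys)

def incremental_batch(new_x, new_y, generator, replay_per_class=32):
    """Build one training batch for a new task: real new-class data mixed
    with synthetic samples that mimic the old classes' distribution."""
    syn_x, syn_y = generator.sample(replay_per_class)
    return np.concatenate([new_x, syn_x]), np.concatenate([new_y, syn_y])
```

In the federated setting described by the paper, the generative step would happen without exposing raw client data; here the Gaussian fit is only a placeholder for that mechanism.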
Abstract
Deep learning models are prone to forgetting information learned in the past when trained on new data. This problem becomes even more pronounced in the context of federated learning (FL), where data is decentralized…