Jun, 2022
Pretrained Models for Multilingual Federated Learning
Orion Weller, Marc Marone, Vladimir Braverman, Dawn Lawrie, Benjamin Van Durme
TL;DR
This paper examines how multilingual text affects federated learning. Across three tasks (language modeling, machine translation, and text classification), federated algorithms are compared against non-federated baselines. The results show that pretrained models can mitigate the negative effects of federated learning, making its performance close to, or better than, centralized (no-privacy) learning, even under non-IID partitions.
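To make the federated setup concrete, the sketch below shows FedAvg-style weighted parameter averaging, the standard federated learning baseline. This is a hypothetical toy illustration, not the paper's implementation: the paper trains pretrained language models, while here each "client" is just a flat parameter vector (for example, one client per language, with non-IID data sizes).

```python
# Minimal sketch of FedAvg-style parameter averaging (toy example; the
# paper's actual setup uses pretrained models and per-language clients).

def fedavg(client_weights, client_sizes):
    """Average client parameter vectors, weighted by local dataset size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * n / total for w, n in zip(client_weights, client_sizes))
        for i in range(n_params)
    ]

# Three simulated clients (e.g. one per language) with unequal data sizes,
# mimicking a non-IID partition.
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 20, 70]
global_model = fedavg(clients, sizes)
print(global_model)
```

Each round of FedAvg broadcasts the averaged parameters back to the clients for further local training; the comparison in the paper is between models trained this way and a single model trained centrally on the pooled data.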
Abstract
Since the advent of federated learning (FL), research has applied these methods to natural language processing (NLP) tasks. Despite a plethora of papers in FL for NLP, no previous works have studied how …