Mher Safaryan, Rustem Islamov, Xun Qian, Peter Richtárik
TL;DR: This work proposes a family of Federated Newton Learn (FedNL) methods that apply not only to generalized linear models but also to general contractive compression operators, such as those compressing local Hessians, offering benefits in privacy enhancement and communication efficiency; experiments demonstrate superior communication complexity compared to key baselines.
Abstract
Inspired by recent work of Islamov et al. (2021), we propose a family of Federated Newton Learn (FedNL) methods, which we believe is a marked step in the direction of making second-order methods applicable to FL. In contrast to the aforementioned work, FedNL employs a different