BriefGPT.xyz
Nov, 2023
Differentially Private SGD Without Clipping Bias: An Error-Feedback Approach
Xinwei Zhang, Zhiqi Bu, Zhiwei Steven Wu, Mingyi Hong
TL;DR
We propose a new error-feedback (EF) DP algorithm as an alternative to DPSGD-GC. It not only provides a diminishing utility bound without introducing a constant clipping bias but, more importantly, allows an arbitrary, problem-independent choice of the clipping threshold.
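To make the error-feedback idea concrete, here is a minimal generic sketch of one EF clipping step: the residual cut off by clipping is carried forward and added back to the next gradient before clipping. This is an illustration of the general error-feedback pattern, not necessarily the paper's exact algorithm; the function names and the noise scaling `sigma * c` are assumptions.

```python
import numpy as np

def clip(v, c):
    """Scale v so its L2 norm is at most c."""
    n = np.linalg.norm(v)
    return v if n <= c else v * (c / n)

def ef_dp_step(grad, error, c, sigma, rng):
    """One generic error-feedback DP step (illustrative sketch):
    add the accumulated clipping error back to the gradient, clip,
    privatize with Gaussian noise, and carry the new residual forward.
    """
    corrected = grad + error            # feed back past clipping error
    clipped = clip(corrected, c)
    new_error = corrected - clipped     # residual carried to next step
    noisy = clipped + sigma * c * rng.standard_normal(grad.shape)
    return noisy, new_error
```

Because the residual is never discarded, the clipped updates telescope: with zero noise, the sum of emitted updates plus the final residual equals the sum of the raw gradients, which is the mechanism behind the vanishing (rather than constant) clipping bias.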
Abstract
Differentially Private Stochastic Gradient Descent with gradient clipping (DPSGD-GC) is a powerful tool for training deep learning models using sensitive data, providing both a solid theoretical privacy guarantee
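For reference, the baseline DPSGD-GC mechanism named in the abstract clips each per-sample gradient to an L2 threshold, averages, and adds Gaussian noise calibrated to that threshold. The sketch below is a minimal illustration of this standard mechanism, assuming plain NumPy arrays for the per-sample gradients; the function name is hypothetical.

```python
import numpy as np

def dpsgd_gc_step(per_sample_grads, c, sigma, rng):
    """One DPSGD-GC step (illustrative sketch): clip each per-sample
    gradient to L2 norm at most c, average, and add Gaussian noise
    with standard deviation sigma * c / batch_size."""
    clipped = []
    for g in per_sample_grads:
        n = np.linalg.norm(g)
        clipped.append(g * min(1.0, c / n) if n > 0 else g)
    mean = np.mean(clipped, axis=0)
    noise = (sigma * c / len(per_sample_grads)) * rng.standard_normal(mean.shape)
    return mean + noise
```

Because every contribution is forced below the threshold c before averaging, the update is biased toward small-norm gradients whenever c is set below the true gradient norms — the constant clipping bias that the error-feedback approach above is designed to remove.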