Oct, 2021
Dynamic Differential-Privacy Preserving SGD
Jian Du, Song Li, Moran Feng, Siheng Chen
TL;DR
This paper proposes a dynamic DP-SGD algorithm that dynamically adjusts the clipping threshold and noise magnitude to reduce the performance loss while preserving privacy, significantly improving model accuracy.
Abstract
Differentially-private stochastic gradient descent (DP-SGD) prevents training-data privacy breaches by adding noise to the clipped gradient during SGD training to satisfy the differential privacy guarantee.
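The abstract's description of DP-SGD (clip each per-example gradient, then add calibrated noise) can be sketched as a single update step. This is a minimal illustration, not the paper's dynamic variant; the function name `dp_sgd_step` and all parameter values are illustrative assumptions.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update: clip each per-example gradient to L2 norm
    clip_norm, average, add Gaussian noise scaled to the clip norm.
    (Hypothetical sketch; the paper adapts clip_norm and noise dynamically.)
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    avg = np.mean(clipped, axis=0)
    # Gaussian noise with std proportional to the clipping sensitivity.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=avg.shape)
    return params - lr * (avg + noise)
```

The dynamic scheme in the paper would vary `clip_norm` and `noise_multiplier` over training rounds rather than keeping them fixed as here.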