Nov, 2024
Enhancing DP-SGD through Non-monotonous Adaptive Scaling Gradient Weight
Tao Huang, Qingyu Huang, Xin Shi, Jiayang Meng, Guolong Zheng...
TL;DR
This study addresses the loss of model accuracy caused by the way traditional differential privacy techniques handle gradients, and proposes a new method, DP-PSASC, which replaces conventional clipping with non-monotonous adaptive gradient scaling, improving the re-weighting of small gradients. The results show that the method maintains privacy protection while improving model performance across multiple datasets, suggesting considerable practical potential.
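The core idea is to replace the hard per-sample clipping step of DP-SGD with a smooth, non-monotonic scaling weight applied to each per-sample gradient before noise is added. The sketch below is a minimal illustration of that mechanism, assuming a hypothetical weight function w(s) = C·s/(s² + r), chosen only because it up-weights small gradients while keeping the scaled norm bounded by C; the function and the helper names `illustrative_weight` and `private_step` are assumptions for illustration, not the exact formulation used by DP-PSASC.

```python
import numpy as np

def illustrative_weight(norm, C=1.0, r=0.01):
    # Hypothetical non-monotonic weight: it rises for small gradient norms
    # (re-weighting small gradients upward) and decays for large ones, so the
    # scaled gradient norm w(s) * s = C * s^2 / (s^2 + r) stays below C.
    # Assumption for illustration, not the function proposed in DP-PSASC.
    return C * norm / (norm ** 2 + r)

def private_step(per_sample_grads, noise_multiplier=1.0, C=1.0, rng=None):
    """One DP-SGD-style update that scales per-sample gradients instead of clipping them."""
    rng = np.random.default_rng() if rng is None else rng
    scaled = [illustrative_weight(np.linalg.norm(g), C) * g for g in per_sample_grads]
    summed = np.sum(scaled, axis=0)
    # Gaussian noise calibrated to the bound C on each scaled gradient's norm.
    noise = rng.normal(0.0, noise_multiplier * C, size=summed.shape)
    return (summed + noise) / len(per_sample_grads)

# Example: privately average a toy batch of per-sample gradients.
grads = [np.array([0.05, -0.02]), np.array([3.0, 4.0]), np.array([0.5, 0.1])]
print(private_step(grads, noise_multiplier=1.0, C=1.0))
```

Because w(s)·s is bounded by C in this sketch, each sample's contribution has the same sensitivity bound as with clipping, while small gradients are no longer passed through unweighted.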
Abstract
In the domain of Deep Learning, the challenge of protecting sensitive data while maintaining model utility is significant. Traditional Differential Privacy (DP) techniques such as Differentially Private Stochastic Gradient Descent (DP-SGD) …