BriefGPT.xyz
Oct, 2021
Hyperparameter Tuning with Renyi Differential Privacy
Nicolas Papernot, Thomas Steinke
TL;DR
This paper studies the privacy leakage that arises when differentially private algorithms such as DP-SGD are run many times to tune the training algorithm's hyperparameters, and proposes hyperparameter search methods based on Renyi differential privacy. The results show that tuning hyperparameters does leak private information, but that as long as each candidate training run is itself differentially private, the leakage is modest.
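The idea in the TL;DR can be sketched in code: run a randomly chosen number of candidate trainings and release only the best hyperparameter. The sketch below is illustrative, not the paper's exact algorithm; `dp_train`, the noise scale, and the geometric stopping probability are all assumptions standing in for a real DP-SGD run and the paper's Renyi DP accounting.

```python
import random

def dp_train(lr, rng):
    # Toy stand-in for one differentially private training run (e.g. DP-SGD):
    # returns a noisy validation score. The quadratic "true score" and the
    # Gaussian noise scale are illustrative assumptions, not from the paper.
    true_score = -(lr - 0.3) ** 2
    return true_score + rng.gauss(0.0, 0.05)

def dp_tune(candidates, stop_prob=0.2, seed=0):
    # Random-stopping hyperparameter search: draw a geometrically distributed
    # number of candidate runs, train with a random candidate each time, and
    # return the best. If each run is differentially private on its own, the
    # whole tuning procedure satisfies a modestly larger privacy bound.
    rng = random.Random(seed)
    n_runs = 1
    while rng.random() > stop_prob:  # number of runs ~ Geometric(stop_prob)
        n_runs += 1
    best_lr, best_score = None, float("-inf")
    for _ in range(n_runs):
        lr = rng.choice(candidates)
        score = dp_train(lr, rng)
        if score > best_score:
            best_lr, best_score = lr, score
    return best_lr

print(dp_tune([0.01, 0.1, 0.3, 1.0]))
```

Randomizing the number of runs (rather than fixing it) is what makes the overall leakage analyzable with Renyi differential privacy: the adversary cannot condition on exactly how many candidates were tried.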
Abstract
For many differentially private algorithms, such as the prominent noisy stochastic gradient descent (DP-SGD), the analysis needed to bound the privacy leakage of a single training run is well understood. However, few studies have reasoned about the privacy leakage resulting from the multiple training runs needed to fine tune the value of the training algorithm's hyperparameters.