June 2021
Private Adaptive Gradient Methods for Convex Optimization
Hilal Asi, John Duchi, Alireza Fallah, Omid Javidbakht, Kunal Talwar
TL;DR
This work studies adaptive algorithms for differentially private convex optimization. By developing differentially private variants of the Stochastic Gradient Descent (SGD) and AdaGrad algorithms, it shows that the private version of AdaGrad outperforms adaptive SGD, which in turn outperforms traditional SGD. Regret upper bounds are given for both algorithms and shown to be optimal.
Abstract
We study adaptive methods for differentially private convex optimization, proposing and analyzing differentially private variants of a stochastic gradient descent (SGD) algorithm with adaptive stepsizes, as well as the AdaGrad algorithm. We provide upper bounds on the regret of both algorithms and show that the bounds are optimal.
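To make the general recipe concrete, the following is a minimal Python sketch of one differentially private AdaGrad-style update in the spirit the abstract describes: per-example gradients are clipped to bound sensitivity, isotropic Gaussian noise is added for privacy, and the noisy gradient feeds an AdaGrad diagonal preconditioner. This is an illustrative sketch, not the authors' algorithm; the clipping norm C, noise multiplier sigma, and stepsize eta are hypothetical hyperparameters.

    import numpy as np

    def dp_adagrad_step(w, per_example_grads, accum,
                        eta=0.1, C=1.0, sigma=1.0, delta=1e-8):
        """One noisy, clipped AdaGrad-style update (illustrative sketch,
        not the paper's exact method)."""
        n = per_example_grads.shape[0]
        # Clip each example's gradient to l2-norm at most C to bound sensitivity.
        norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
        clipped = per_example_grads * np.minimum(1.0, C / np.maximum(norms, delta))
        # Average, then add isotropic Gaussian noise scaled to the sensitivity C/n.
        noisy_grad = clipped.mean(axis=0) + np.random.normal(0.0, sigma * C / n,
                                                             size=w.shape)
        # AdaGrad: accumulate squared (noisy) gradients as a diagonal preconditioner.
        accum = accum + noisy_grad ** 2
        w = w - eta * noisy_grad / (np.sqrt(accum) + delta)
        return w, accum

Calling this step repeatedly on minibatch gradients, with accum initialized to zeros of the same shape as w, gives coordinates with consistently large gradients smaller effective stepsizes. The abstract's point is that making such adaptivity work under the isotropic noise added for privacy is the nontrivial part.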