Feb, 2019
The Complexity of Making the Gradient Small in Stochastic Convex Optimization
Dylan Foster, Ayush Sekhari, Ohad Shamir, Nathan Srebro, Karthik Sridharan...
TL;DR
This paper studies the oracle complexity of finding approximate stationary points in stochastic convex optimization and its relation to the global oracle model, proposes an extension of the recursive regularization algorithm that achieves near-optimal rates, and reveals some surprising gaps between the complexity of stochastic optimization and that of learning.
Abstract
We give nearly matching upper and lower bounds on the oracle complexity of finding $\epsilon$-stationary points ($\|\nabla F(x)\| \leq \epsilon$) in stochastic convex optimization. We jointly analyze the
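To make the target criterion concrete, here is a minimal sketch of the $\epsilon$-stationarity test $\|\nabla F(x)\| \leq \epsilon$ from the abstract, using a toy convex objective $F(x) = \tfrac{1}{2}\|x\|^2$ chosen for illustration (the objective, function names, and tolerance are assumptions, not part of the paper):

```python
import numpy as np

def is_epsilon_stationary(grad, eps):
    """Check the epsilon-stationarity criterion ||grad F(x)|| <= eps."""
    return np.linalg.norm(grad) <= eps

# Toy convex objective F(x) = 0.5 * ||x||^2, whose gradient is grad F(x) = x.
# (Hypothetical example objective; the paper treats general stochastic convex F.)
def grad_F(x):
    return x

x = np.array([3e-4, -4e-4])                        # candidate point
print(is_epsilon_stationary(grad_F(x), eps=1e-3))  # gradient norm is 5e-4, so True
```

Note that for a convex $F$ a small gradient norm certifies near-optimality, but as the paper studies, the oracle complexity of driving $\|\nabla F(x)\|$ below $\epsilon$ can differ sharply from that of driving the suboptimality $F(x) - F^*$ below $\epsilon$.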