Mar, 2016
Variance Reduction for Faster Non-Convex Optimization
Zeyuan Allen-Zhu, Elad Hazan
TL;DR
This paper studies the fundamental problem of efficiently reaching a stationary point in non-convex optimization. Building on the variance-reduction trick together with a brand-new variance-reduction analysis suited to non-convex objectives, it proposes the first first-order minibatch stochastic algorithm for non-convex optimization and demonstrates its effectiveness on non-convex loss functions and neural network training.
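The variance-reduction trick referred to here is the SVRG-style gradient estimator. Below is a minimal Python sketch of that generic estimator only; the function names, step size `eta`, epoch length `m`, and the snapshot/output rule are illustrative assumptions, not the paper's exact algorithm (which uses minibatches and its own epoch and output choices).

```python
import numpy as np

def svrg_sketch(grad_i, x0, n, epochs=10, m=None, eta=0.01, rng=None):
    """SVRG-style variance-reduced stochastic gradient sketch.

    grad_i(x, i) returns the gradient of the i-th component f_i at x,
    where the objective is f(x) = (1/n) * sum_i f_i(x).
    """
    rng = rng or np.random.default_rng(0)
    m = m or n                      # inner-loop length (one pass by default)
    x_snap = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        # Full gradient at the snapshot point, computed once per epoch.
        full_grad = np.mean([grad_i(x_snap, i) for i in range(n)], axis=0)
        x = x_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimator: unbiased for grad f(x), and its
            # variance shrinks as x stays close to the snapshot x_snap.
            v = grad_i(x, i) - grad_i(x_snap, i) + full_grad
            x = x - eta * v
        x_snap = x                  # new snapshot for the next epoch
    return x_snap
```

In the convex setting the snapshot is often an average of the inner iterates; non-convex analyses typically output a uniformly random inner iterate instead, which this sketch omits for brevity.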
Abstract
We consider the fundamental problem in non-convex optimization of efficiently reaching a stationary point. In contrast to the convex case, in the long history of this basic problem, the only known theoretical results on first-order …
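For context, the way this line of work typically formalizes "reaching a stationary point" for a smooth finite-sum objective is sketched below; the finite-sum form and the squared-gradient-norm criterion are standard assumptions in this literature, not quoted from the truncated abstract.

```latex
% Assumed setting (standard for this line of work):
% minimize a smooth, possibly non-convex finite-sum objective and
% return an approximate stationary point.
\[
  f(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_i(x),
  \qquad
  \text{find } x \ \text{with} \ \|\nabla f(x)\|^{2} \le \varepsilon .
\]
```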