Jul, 2015
Beyond Convexity: Stochastic Quasi-Convex Optimization
Elad Hazan, Kfir Y. Levy, Shai Shalev-Shwartz
TL;DR
This paper studies a stochastic version of the Normalized Gradient Descent algorithm (SNGD) and proves that it converges to a global minimum when the objective is quasi-convex and locally Lipschitz. Unlike standard Stochastic Gradient Descent, the algorithm requires a minimal minibatch size in order to guarantee convergence to a global minimum.
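A minimal Python sketch of the minibatch SNGD update described above, assuming hypothetical `grad_minibatch` and `loss_minibatch` callables that return the averaged gradient and loss of a fresh minibatch of size `batch_size`; the step size, batch size, and best-iterate bookkeeping are illustrative choices, not the paper's exact pseudocode.

```python
import numpy as np

def sngd(grad_minibatch, loss_minibatch, x0, lr=0.1, batch_size=100, n_iters=1000):
    """Sketch of minibatch Stochastic Normalized Gradient Descent (SNGD).

    grad_minibatch(x, b): average gradient over a fresh minibatch of size b.
    loss_minibatch(x, b): average loss over a fresh minibatch of size b.
    Both callables are illustrative placeholders.
    """
    x = np.asarray(x0, dtype=float).copy()
    best_x, best_loss = x.copy(), np.inf
    for _ in range(n_iters):
        g = grad_minibatch(x, batch_size)           # minibatch gradient estimate
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x - lr * g / norm                   # step along the gradient direction only
        loss = loss_minibatch(x, batch_size)
        if loss < best_loss:                        # keep the best iterate seen so far
            best_loss, best_x = loss, x.copy()
    return best_x
```

The point of the normalization is that every step has fixed length `lr`, so progress does not shrink on plateaus where the gradient magnitude is tiny; the minibatch averaging is what keeps the gradient direction reliable enough for this to converge.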
Abstract
Stochastic convex optimization is a basic and well studied primitive in machine learning. It is well known that convex and Lipschitz functions can be minimized efficiently using Stochastic Gradient Descent (SGD). The Normalized Gradient Descent (NGD) algorithm is an adaptation of Gradient Descent that updates according to the direction of the gradients rather than the gradients themselves. In this paper we analyze a stochastic version of NGD and prove its convergence to a global minimum for a wider class of functions: we require the functions to be quasi-convex and locally Lipschitz.
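To illustrate why normalizing the gradient matters for this function class, the toy objective below, f(w) = 1 − exp(−‖w − w*‖), is an example of my own (not from the paper) of a quasi-convex, Lipschitz, non-convex function: its sublevel sets are balls, yet its gradient vanishes far from the minimizer, so plain gradient descent stalls on the plateau while normalized steps keep moving.

```python
import numpy as np

# Quasi-convex, non-convex toy objective with a plateau far from w_star:
# f(w) = 1 - exp(-||w - w_star||).  (Illustrative example, not from the paper.)
w_star = np.array([0.0, 0.0])

def grad(w):
    r = np.linalg.norm(w - w_star)
    return np.exp(-r) * (w - w_star) / r if r > 0 else np.zeros_like(w)

w_gd = w_ngd = np.array([20.0, 20.0])   # start deep in the plateau
lr = 0.5
for _ in range(200):
    w_gd = w_gd - lr * grad(w_gd)                       # GD: step shrinks with the tiny gradient
    g = grad(w_ngd)
    w_ngd = w_ngd - lr * g / max(np.linalg.norm(g), 1e-12)  # NGD: unit-length direction step

print("GD distance to optimum :", np.linalg.norm(w_gd - w_star))    # barely moves
print("NGD distance to optimum:", np.linalg.norm(w_ngd - w_star))   # reaches the optimum region
```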