Jun, 2015
Accelerated Stochastic Gradient Descent for Minimizing Finite Sums
Atsushi Nitanda
TL;DR
This paper proposes an optimization method that combines the advantages of accelerated gradient descent and stochastic variance reduced gradient (SVRG). It applies to both non-strongly convex and strongly convex problems, and achieves strong efficiency and convergence rates.
Abstract
We propose an optimization method for minimizing the finite sums of smooth convex functions. Our method incorporates an accelerated gradient descent …
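To make the idea concrete, below is a minimal sketch of an SVRG loop with a Nesterov-style momentum step on a least-squares problem. The problem instance, step size, momentum coefficient, and schedule are illustrative assumptions, not the paper's exact algorithm (which, among other details, uses a mini-batch setting):

```python
import numpy as np

# Illustrative least-squares instance (assumed, not from the paper):
# minimize (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def grad_i(x, i):
    # Gradient of the i-th summand f_i(x) = 0.5 * (a_i^T x - b_i)^2
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    # Gradient of the average (1/n) * sum_i f_i(x)
    return A.T @ (A @ x - b) / n

eta = 0.02    # step size (assumed)
beta = 0.5    # momentum coefficient (assumed)
x = np.zeros(d)
x_prev = x.copy()

for epoch in range(50):       # outer loop: recompute full gradient at a snapshot
    x_snap = x.copy()
    g_snap = full_grad(x_snap)
    for _ in range(n):        # inner loop: variance-reduced stochastic steps
        y = x + beta * (x - x_prev)    # Nesterov-style extrapolation
        i = rng.integers(n)
        # SVRG gradient estimate: unbiased for full_grad(y), with variance
        # that shrinks as the iterates approach the snapshot/optimum
        g = grad_i(y, i) - grad_i(x_snap, i) + g_snap
        x_prev, x = x, y - eta * g

print(np.linalg.norm(x - x_true))
```

The variance-reduced estimate is what lets a momentum step be combined with stochastic gradients: unlike plain SGD, the gradient noise decays as the iterates converge, so the accelerated scheme is not swamped by noise near the optimum.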