Dec, 2013
Semi-Stochastic Gradient Descent Methods
Jakub Konečný, Peter Richtárik
TL;DR
This paper proposes a new method, S2GD (semi-stochastic gradient descent), for minimizing the average of a large number of smooth convex loss functions, and derives the expected computational work the method needs in a few steps.
Abstract
In this paper we study the problem of minimizing the average of a large number ($n$) of smooth convex loss functions. We propose a new method, S2GD (semi-stochastic gradient descent), which runs for one or several epochs, in each of which a single full gradient and a random number of stochastic gradients is computed, following a geometric law.
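Below is a minimal Python sketch of the epoch structure described in the abstract: each epoch computes one full gradient and then a random number of variance-reduced stochastic gradient steps, with the inner-loop length drawn according to a geometric law. All names and parameters here (`grad_i`, `stepsize`, `m`, `mu`, and the exact form of the geometric weights) are illustrative assumptions, not the paper's notation or its precise algorithm.

```python
import numpy as np

def s2gd(grad_i, n, x0, stepsize, n_epochs, m, mu, seed=0):
    """Sketch of a semi-stochastic gradient descent (S2GD)-style method.

    Assumed interface: the objective is (1/n) * sum_i f_i(x) and
    grad_i(i, x) returns the gradient of f_i at x.  `m` is the maximum
    number of inner steps per epoch and `mu` a strong-convexity estimate
    used in the geometric law for the random inner-loop length.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)

    # P(inner-loop length = t) proportional to (1 - mu * stepsize)^(m - t),
    # t = 1, ..., m -- one way to realize the "geometric law" (assumption).
    weights = (1.0 - mu * stepsize) ** (m - np.arange(1, m + 1))
    probs = weights / weights.sum()

    for _ in range(n_epochs):
        # One full (deterministic) gradient per epoch.
        full_grad = np.mean([grad_i(i, x) for i in range(n)], axis=0)

        # Random number of stochastic gradient steps in this epoch.
        t = rng.choice(np.arange(1, m + 1), p=probs)

        y = x.copy()
        for _ in range(t):
            i = rng.integers(n)
            # Semi-stochastic (variance-reduced) gradient estimate.
            g = grad_i(i, y) - grad_i(i, x) + full_grad
            y = y - stepsize * g
        x = y
    return x


# Toy usage on least squares: f_i(x) = 0.5 * (a_i @ x - b_i)^2
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 5))
    b = A @ np.ones(5)
    grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
    x_hat = s2gd(grad, n=200, x0=np.zeros(5), stepsize=0.01,
                 n_epochs=30, m=400, mu=0.05)
    print(np.linalg.norm(x_hat - np.ones(5)))
```

The key design point the sketch illustrates is that the full gradient is evaluated only once per epoch, while the cheap inner steps use the difference of two stochastic gradients plus that stored full gradient to keep the variance of the update small.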