Sep 2012
Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
Shai Shalev-Shwartz, Tong Zhang
TL;DR
This work presents a new analysis of stochastic dual coordinate ascent (SDCA) methods, showing that this class of methods enjoys theoretical guarantees comparable to, or better than, those of stochastic gradient descent (SGD), thereby justifying the effectiveness of SDCA in practice.
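To make the setting concrete, here is a minimal sketch of SDCA for one instance of regularized loss minimization: an L2-regularized linear SVM with hinge loss, using the closed-form dual coordinate update for that loss. The function name, data, and hyperparameter values are illustrative assumptions, not from the paper itself.

```python
import numpy as np

def sdca_svm(X, y, lam=0.1, epochs=20, seed=0):
    """SDCA sketch for the L2-regularized hinge loss (linear SVM):

        min_w  (1/n) * sum_i max(0, 1 - y_i <w, x_i>)  +  (lam/2) * ||w||^2

    Maintains dual variables alpha (with y folded in, so alpha_i * y_i in [0, 1])
    and the primal vector w = (1 / (lam * n)) * sum_i alpha_i * x_i.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    for _ in range(epochs):
        # One pass over the data in random order; each step maximizes the
        # dual objective over a single coordinate, in closed form.
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            step = (1.0 - margin) * lam * n / (X[i] @ X[i])
            new_ai = y[i] * min(1.0, max(0.0, step + alpha[i] * y[i]))
            # Keep w consistent with the updated dual variable.
            w += (new_ai - alpha[i]) / (lam * n) * X[i]
            alpha[i] = new_ai
    return w
```

Unlike SGD, each SDCA step requires no step-size schedule: the coordinate-wise dual maximization has a closed-form solution, and the dual objective is monotonically non-decreasing, which is what makes the stronger convergence analysis possible.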
Abstract
Stochastic gradient descent (SGD) has become popular for solving large-scale supervised machine learning optimization problems such as SVM, due to its strong theoretical guarantees. While the closely related …