Feb, 2016
SDCA without Duality, Regularization, and Individual Convexity
Shai Shalev-Shwartz
TL;DR
Proposes variants of stochastic dual coordinate ascent (SDCA) that need no explicit regularization and do not rely on duality; even for non-convex individual loss functions, a linear convergence rate is proven as long as the expected loss is strongly convex.
Abstract
Stochastic dual coordinate ascent (SDCA) is a popular method for solving regularized loss minimization for the case of convex losses. We describe variants of SDCA that do not require explicit regularization and do not rely on duality. We prove linear convergence rates even if individual loss functions are non-convex, as long as the expected loss is strongly convex.
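The "dual-free" idea can be illustrated with a short sketch: instead of maximizing a convex dual, each example keeps a pseudo-dual vector alpha_i, and the primal iterate is maintained as w = (1/(lam*n)) * sum_i alpha_i. The sketch below applies this update scheme to a simple ridge-regression objective; the function name, step size, and data are illustrative assumptions, not code from the paper.

```python
import numpy as np

def dual_free_sdca(X, y, lam=0.5, eta=0.002, epochs=500, seed=0):
    """Sketch of a dual-free SDCA-style method for ridge regression:
        min_w (lam/2)*||w||^2 + (1/n) * sum_i 0.5*(x_i.w - y_i)^2
    Maintains one pseudo-dual vector alpha_i per example, with the
    invariant w = (1/(lam*n)) * sum_i alpha_i. No convex dual problem
    is ever formed, which is the point of the dual-free variant.
    (Hyperparameters here are illustrative, not from the paper.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros((n, d))           # pseudo-dual variables
    w = alpha.sum(axis=0) / (lam * n)  # primal iterate, consistent with alpha
    for _ in range(epochs * n):
        i = rng.integers(n)
        grad_i = (X[i] @ w - y[i]) * X[i]  # gradient of the i-th loss at w
        v = grad_i + alpha[i]              # residual; zero at the optimum
        alpha[i] -= eta * lam * n * v      # pseudo-dual update
        w -= eta * v                       # preserves w = sum(alpha)/(lam*n)
    return w
```

At the optimum, alpha_i converges to -grad of the i-th loss at w*, so the residual v vanishes and the updates stop moving, which is what makes the variance of the update shrink and a linear rate possible.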