Jan, 2019
Don't Jump Through Hoops and Remove Those Loops: SVRG and Katyusha are Better Without the Outer Loop
Dmitry Kovalev, Samuel Horváth, Peter Richtárik
TL;DR
This paper designs loopless variants of the stochastic variance-reduced gradient method (SVRG) and its accelerated variant (Katyusha), proves that they enjoy the same superior theoretical convergence guarantees as the original methods, and demonstrates through numerical experiments that the new methods are substantially superior in practice.
Abstract
The stochastic variance-reduced gradient method (SVRG) and its accelerated variant (Katyusha) have attracted enormous attention in the machine learning community in the last few years due to their superior theoretical properties…
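The loopless idea the title refers to can be sketched in a few lines. Below is a minimal, hypothetical Python sketch of a loopless SVRG step, assuming the standard loopless construction: the full-gradient reference point is refreshed by a per-iteration coin flip with small probability p, rather than by a fixed-length outer loop. The function names (l_svrg, grad_i) and the choice p ≈ 1/n are illustrative assumptions, not taken from the summary above.

```python
import numpy as np

def l_svrg(grad_i, x0, n, p=None, lr=0.1, iters=1000, seed=0):
    """Sketch of loopless SVRG (hypothetical helper, not the authors'
    reference code).  grad_i(x, i) returns the gradient of the i-th
    component function at x; n is the number of components."""
    rng = np.random.default_rng(seed)
    p = 1.0 / n if p is None else p  # small refresh probability (assumed ~1/n)
    x = x0.astype(float).copy()
    w = x.copy()                                # reference point
    full_grad = np.mean([grad_i(w, i) for i in range(n)], axis=0)
    for _ in range(iters):
        i = rng.integers(n)                     # sample one component uniformly
        # variance-reduced estimator: unbiased for the full gradient at x
        g = grad_i(x, i) - grad_i(w, i) + full_grad
        x -= lr * g
        if rng.random() < p:                    # coin flip replaces the outer loop
            w = x.copy()
            full_grad = np.mean([grad_i(w, i) for i in range(n)], axis=0)
    return x
```

Because the refresh fires on average once every 1/p iterations, the expected per-iteration cost matches outer-loop SVRG, while the single flat loop is what removes the nested structure the title alludes to.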