Aug 2015
Asynchronous Distributed Semi-Stochastic Gradient Optimization
Fast Distributed Asynchronous SGD with Variance Reduction
Ruiliang Zhang, Shuai Zheng, James T. Kwok
TL;DR
This paper proposes a fast distributed machine learning algorithm based on asynchronous stochastic gradient descent. By employing variance reduction, it can use a constant learning rate and guarantees linear convergence to the optimal solution. Experiments on the Google Cloud Computing Platform show that the algorithm outperforms state-of-the-art distributed asynchronous algorithms in both wall-clock time and solution quality.
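The TL;DR refers to combining SGD with variance reduction so that a constant step size suffices for linear convergence. The sketch below illustrates a generic SVRG-style variance-reduced update, a standard realization of that idea; it is not the paper's specific distributed asynchronous algorithm, and the function names (`grad_i`, `full_grad`) and all parameters are illustrative assumptions.

```python
# Minimal sketch of a generic SVRG-style variance-reduced SGD loop with a
# constant step size (illustrative only; not the paper's distributed algorithm).
import numpy as np

def svrg_sketch(grad_i, full_grad, x0, n_samples, step=0.1, epochs=10, inner=None, rng=None):
    """grad_i(x, i): gradient of the i-th sample's loss; full_grad(x): full-batch gradient."""
    rng = np.random.default_rng() if rng is None else rng
    inner = n_samples if inner is None else inner
    x = x0.copy()
    for _ in range(epochs):
        snapshot = x.copy()           # anchor point
        mu = full_grad(snapshot)      # full gradient at the anchor
        for _ in range(inner):
            i = rng.integers(n_samples)
            # Variance-reduced stochastic gradient: unbiased, with variance that
            # shrinks near the optimum, which is what permits a constant step size.
            v = grad_i(x, i) - grad_i(snapshot, i) + mu
            x -= step * v
    return x
```

In the distributed asynchronous setting described in the TL;DR, updates of this kind would be computed by workers and applied without waiting for stragglers; the sketch only shows the sequential variance-reduction core.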
Abstract
With the recent proliferation of large-scale learning problems, there has been a lot of interest in distributed machine learning algorithms, particularly those based on stochastic gradient descent (SGD) …