Feb, 2018
SGD and Hogwild! Convergence Without the Bounded Gradients Assumption
Lam M. Nguyen, Phuong Ha Nguyen, Marten van Dijk, Peter Richtárik, Katya Scheinberg...
TL;DR
This paper analyzes the convergence of stochastic gradient descent (SGD), proposes an algorithm that uses a diminishing learning-rate scheme in an asynchronous parallel setting, and proves its convergence.
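As a rough illustration of the setting described above (not the paper's own code), the sketch below runs plain SGD with a diminishing learning rate eta_t = eta0 / t on a synthetic least-squares problem; the problem, step-size constant, and iteration count are all illustrative assumptions.

```python
# Minimal sketch of SGD with a diminishing learning-rate scheme
# eta_t = eta0 / t, the kind of schedule whose convergence is
# analyzed without a bounded-gradients assumption.
# The synthetic problem and all constants here are assumptions
# chosen for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares objective: min_w (1/n) * sum_i (a_i . w - b_i)^2
n, d = 200, 5
A = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
b = A @ w_true  # noiseless targets, so the optimum is w_true

w = np.zeros(d)
eta0 = 0.1
for t in range(1, 20001):
    i = rng.integers(n)                    # sample one data point
    grad = 2.0 * (A[i] @ w - b[i]) * A[i]  # stochastic gradient of (a_i.w - b_i)^2
    w -= (eta0 / t) * grad                 # diminishing step size

# The iterate w moves toward w_true as the step size decays.
```

The Hogwild! variant studied in the paper would run updates like the loop body above from many threads without locking; this serial version only shows the step-size scheme itself.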
Abstract
Stochastic gradient descent (SGD) is the optimization algorithm of choice in many machine learning applications, such as regularized empirical risk minimization and training deep neural networks. The classical analysis …