BriefGPT.xyz
Nov, 2016
Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
Aryan Mokhtari, Mert Gürbüzbalaban, Alejandro Ribeiro
TL;DR
This paper introduces an optimization method called the Double Incremental Aggregated Gradient (DIAG) method, applicable to large-scale machine learning problems, and proves that its convergence rate surpasses that of gradient descent.
Abstract
Recently, there has been growing interest in developing optimization methods for solving large-scale machine learning problems. Most of these problems boil down to the problem of minimizing an average of a finite
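The TL;DR names the Double Incremental Aggregated Gradient (DIAG) method for minimizing an average of finitely many component functions. As a rough illustration of the double-incremental idea (maintaining tables of both past iterates and past gradients, refreshed one index at a time in cyclic order), here is a minimal Python sketch on a toy finite sum of quadratics. The update rule, stepsize choice, and all names below are our illustrative assumptions, not the paper's exact algorithm.

```python
# Illustrative sketch (not the authors' code): a cyclic, double-incremental
# aggregated gradient iteration for f(x) = (1/n) * sum_i f_i(x), with
# quadratic components f_i(x) = 0.5 * a_i * (x - b_i)^2 so that
# grad f_i(y) = a_i * (y - b_i) and the minimizer is sum(a*b) / sum(a).
def diag_sketch(a, b, alpha, iters):
    n = len(a)
    y = [0.0] * n                                   # table of stored iterates y_i
    g = [a[i] * (y[i] - b[i]) for i in range(n)]    # table of stored gradients
    x = 0.0
    for k in range(iters):
        # Aggregated step: average of stored iterates minus
        # alpha times the average of stored gradients.
        x = sum(y) / n - alpha * sum(g) / n
        i = k % n                                   # cyclic index selection
        y[i] = x                                    # refresh BOTH tables at index i
        g[i] = a[i] * (x - b[i])
    return x

# Toy example with distinct curvatures a_i.
a = [1.0, 2.0, 4.0]
b = [1.0, 2.0, 3.0]
x_star = sum(ai * bi for ai, bi in zip(a, b)) / sum(a)  # exact minimizer
L, mu = max(a), min(a)
x = diag_sketch(a, b, alpha=2.0 / (mu + L), iters=200)
```

On this toy problem the iterate `x` approaches `x_star` at a linear (geometric) rate; the stepsize `2 / (mu + L)` is a conventional choice for strongly convex, smooth problems and is only an assumption here.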