BriefGPT.xyz
Feb, 2020
Convergence of a Stochastic Gradient Method with Momentum for Nonsmooth Nonconvex Optimization
Vien V. Mai, Mikael Johansson
TL;DR
This paper presents a stochastic subgradient method that incorporates a momentum term and, by constructing a special Lyapunov function, establishes fast convergence for a broad class of nonsmooth, nonconvex, and constrained optimization problems.
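The method summarized above combines stochastic subgradients with a heavy-ball momentum term. As a minimal illustration (not the paper's exact algorithm or analysis), the generic update x_{k+1} = x_k − γ g_k + β (x_k − x_{k−1}), with g_k a stochastic subgradient, can be sketched as follows; the function and parameter names here are assumptions for the example:

```python
import numpy as np

def stochastic_heavy_ball(subgrad, x0, lr=0.01, beta=0.9, iters=1000, seed=0):
    """Generic stochastic subgradient method with heavy-ball momentum.

    Iterates x_{k+1} = x_k - lr * g_k + beta * (x_k - x_{k-1}),
    where g_k = subgrad(x_k, rng) is a stochastic subgradient.
    This is an illustrative sketch, not the paper's specific scheme.
    """
    rng = np.random.default_rng(seed)
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = subgrad(x, rng)
        # Heavy-ball step: gradient step plus momentum from the previous move.
        x, x_prev = x - lr * g + beta * (x - x_prev), x
    return x

# Example: minimize the nonsmooth, f(x) = |x|, from noisy subgradients.
noisy_abs_subgrad = lambda x, rng: np.sign(x) + 0.1 * rng.standard_normal(x.shape)
x_star = stochastic_heavy_ball(noisy_abs_subgrad, x0=[5.0])
```

With beta = 0.9 the effective step length is roughly lr / (1 - beta), so the iterate chatters in a small neighborhood of the minimizer rather than converging exactly, which is the typical behavior of subgradient methods with a constant stepsize.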
Abstract
Stochastic gradient methods with momentum are widely used in applications and at the core of optimization subroutines in many popular machine learning libraries. However, their sample complexities have never been …