Apr, 2019
Convergence rates for the stochastic gradient descent method for non-convex objective functions
Benjamin Fehrman, Benjamin Gess, Arnulf Jentzen
TL;DR
This paper establishes local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method when the objective function is not necessarily globally convex, in particular for simple objective functions arising in machine learning.
Abstract
We prove the local convergence to minima and estimates on the rate of convergence for the stochastic gradient descent method in the case o…
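To make the setting concrete, the following is a minimal illustrative sketch of stochastic gradient descent on a simple non-convex objective, the double-well function f(x) = (x² − 1)², with a noisy gradient oracle. This is only a toy example of the method the paper analyzes, not the paper's algorithm, step-size schedule, or convergence-rate construction; the function, step size, and noise level are all assumptions chosen for illustration.

```python
import random

def grad(x):
    # Gradient of the non-convex double-well objective f(x) = (x^2 - 1)^2.
    return 4.0 * x * (x * x - 1.0)

def sgd(x0, steps=20000, lr=0.01, noise=0.1, seed=0):
    # Plain SGD: the true gradient plus zero-mean Gaussian noise models
    # the stochastic gradient oracle.
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        g = grad(x) + noise * rng.gauss(0.0, 1.0)
        x -= lr * g
    return x

# Starting between the two wells, the iterates drift to a local minimum
# at x = +1 or x = -1 and fluctuate around it.
x = sgd(x0=0.5)
print(round(abs(x), 1))
```

With a constant step size the iterates only hover near the minimum; the decreasing step sizes studied in such convergence analyses are what yield convergence in the limit.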