BriefGPT.xyz
Feb, 2024
Non-asymptotic Analysis of Biased Adaptive Stochastic Approximation
Sobihan Surendran, Antoine Godichon-Baggioni, Adeline Fermanian, Sylvain Le Corff
TL;DR
Through a non-asymptotic analysis, this work studies stochastic gradient descent with biased gradients and adaptive step sizes, covering time-dependent bias and control of the mean squared error of the gradient estimator. The results show that Adagrad and RMSProp with biased gradients converge at rates similar to the unbiased case; experiments further confirm the convergence and show that the impact of the bias can be reduced by appropriate hyperparameter tuning.
Abstract
Stochastic gradient descent (SGD) with adaptive steps is now widely used for training deep neural networks. Most theoretical results assume access to unbiased gradient estimators, which is not the case in several […]
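The setting described above (adaptive step sizes driven by a biased gradient oracle) can be illustrated with a minimal sketch. This is our own toy example, not the paper's code: Adagrad on the quadratic f(x) = ||x||²/2, whose true gradient is x, where the oracle adds an assumed time-decaying bias term 1/t plus Gaussian noise.

```python
import numpy as np

def biased_adagrad(x0, steps=2000, eta=0.5, eps=1e-8, seed=0):
    """Adagrad on f(x) = ||x||^2 / 2 using a biased, noisy gradient oracle.

    The bias 1/t decays over time, mimicking the time-dependent bias
    analyzed in the paper (our simplification; the paper's assumptions
    are more general).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    acc = np.zeros_like(x)  # running sum of squared gradients
    for t in range(1, steps + 1):
        bias = 1.0 / t                      # assumed decaying bias
        noise = 0.1 * rng.standard_normal(x.shape)
        g = x + bias + noise                # biased estimate of grad f(x) = x
        acc += g * g
        x -= eta * g / (np.sqrt(acc) + eps)  # Adagrad coordinate-wise step
    return x

x_final = biased_adagrad([5.0, -3.0])
print(np.linalg.norm(x_final))  # small: iterates approach the minimizer at 0
```

Despite the bias, the iterates settle near the minimizer, consistent with the TL;DR's claim that a suitably controlled bias does not destroy convergence of adaptive methods.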