Jul, 2013
MixedGrad: An O(1/T) Convergence Rate Algorithm for Stochastic Smooth Optimization
Mehrdad Mahdavi, Rong Jin
TL;DR
This paper proposes a new mixed optimization scheme for smooth function optimization that exploits both stochastic gradients and full gradients: using at most O(ln T) calls to the full gradient oracle and O(T) calls to the stochastic oracle, it attains an optimization error of O(1/T).
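The oracle budget above (O(ln T) full-gradient calls alongside O(T) stochastic calls) can be pictured with a small epoch-based sketch: each epoch computes one full gradient at an anchor point and then runs a geometrically growing number of cheap stochastic steps corrected by that anchor gradient. The snippet below is only a hedged illustration of this mixed-oracle idea, using a generic SVRG-style corrected update rather than the paper's exact MixedGrad procedure; the names `mixed_oracle_sketch`, `full_grad`, `stoch_grad`, the least-squares toy problem, and all hyperparameters are assumptions made for the example.

```python
import numpy as np

def mixed_oracle_sketch(full_grad, stoch_grad, n_samples, w0,
                        epochs=10, inner_steps=20, eta=0.01, seed=0):
    """Epoch-based sketch of mixing full and stochastic gradient oracles.

    Illustration only (a generic SVRG-style corrected step), not the exact
    MixedGrad procedure: each epoch spends one full-gradient oracle call plus
    a geometrically growing number of stochastic oracle calls, so T total
    stochastic calls are paired with only O(ln T) full-gradient calls.
    """
    rng = np.random.default_rng(seed)
    w_anchor = np.asarray(w0, dtype=float).copy()
    for k in range(epochs):
        g_full = full_grad(w_anchor)           # expensive: one full-gradient call per epoch
        w = w_anchor.copy()
        # epoch length doubles, so the number of full-gradient calls stays
        # logarithmic in the total number of stochastic steps
        for _ in range(inner_steps * 2 ** k):
            i = rng.integers(n_samples)        # cheap: pick one stochastic component
            # stochastic gradient at w, corrected by its value at the anchor
            # point and the full gradient at the anchor (variance reduction)
            d = stoch_grad(w, i) - stoch_grad(w_anchor, i) + g_full
            w -= eta * d
        w_anchor = w                           # the new iterate anchors the next epoch
    return w_anchor

# Toy usage on a smooth least-squares objective f(w) = (1/2n) ||A w - b||^2.
A = np.random.randn(200, 5)
b = np.random.randn(200)
full_grad = lambda w: A.T @ (A @ w - b) / len(b)
stoch_grad = lambda w, i: A[i] * (A[i] @ w - b[i])
w_hat = mixed_oracle_sketch(full_grad, stoch_grad, len(b), np.zeros(5))
```

The point the sketch is meant to convey is only the budget split: the expensive oracle is queried once per epoch while epoch lengths double, so its total call count grows logarithmically in the number of stochastic updates.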
Abstract
It is well known that the optimal convergence rate for stochastic optimization of smooth functions is $O(1/\sqrt{T})$, which is same as …