March 2021
ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method
Zhize Li
TL;DR
This paper proposes a novel accelerated gradient method called ANITA for solving the fundamental finite-sum optimization problem; it improves on the previous state-of-the-art results in both the general convex and strongly convex settings.
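For context, the finite-sum problem referenced here is conventionally written as follows (standard notation, stated for the reader's convenience rather than quoted from this page):

```latex
\min_{x \in \mathbb{R}^d} \; f(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x),
```

where each component $f_i$ is smooth and $n$ is the number of components (e.g., training examples).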
Abstract
We propose a novel accelerated variance-reduced gradient method called ANITA for finite-sum optimization. In this paper, we consider both general convex and strongly convex settings. In the general convex setting …
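To illustrate what "loopless" variance reduction means, below is a minimal sketch of the generic loopless template (in the style of L-SVRG) on a toy least-squares instance. This is not ANITA's actual update rule, which additionally incorporates acceleration; the problem data, step size, and snapshot probability are all assumed values for the demo.

```python
import numpy as np

# Sketch of a *loopless* variance-reduced gradient step (L-SVRG-style
# template; NOT the exact ANITA update, which adds acceleration on top).
# Problem: min_x f(x) = (1/n) * sum_i f_i(x) with least-squares components
# f_i(x) = 0.5 * (a_i^T x - b_i)^2, so grad f_i(x) = a_i * (a_i^T x - b_i).

rng = np.random.default_rng(0)
n, d = 100, 10
A = rng.standard_normal((n, d))   # assumed toy data
b = rng.standard_normal(n)

def grad_i(x, i):
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
w = x.copy()                # snapshot point
mu = full_grad(w)           # full gradient at the snapshot
eta, p = 0.01, 1.0 / n      # step size and snapshot probability (assumed)

for _ in range(5000):
    i = rng.integers(n)
    # Unbiased variance-reduced estimator: its variance shrinks
    # as both x and the snapshot w approach the optimum.
    g = grad_i(x, i) - grad_i(w, i) + mu
    x = x - eta * g
    # "Loopless": a coin flip replaces SVRG's nested outer loop.
    if rng.random() < p:
        w = x.copy()
        mu = full_grad(w)

print("final ||grad f(x)|| =", np.linalg.norm(full_grad(x)))
```

The loopless design replaces the nested outer loop of SVRG-type methods with a coin flip that occasionally refreshes the snapshot, which simplifies both the algorithm and its analysis.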