Stochastic gradient algorithms estimate the gradient from only one or a few samples per iteration and thus enjoy low per-iteration computational cost. They have been widely used in large-scale optimization problems. However, stochastic gradient algorithms are usually slow to converge, achieving only a sublinear convergence rate.
CheapSVRG is proposed as a new stochastic variance-reduction optimization scheme: it achieves a linear convergence rate by computing an inexpensive surrogate of the full gradient, thereby balancing convergence speed against per-iteration computational complexity.
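To make the variance-reduction idea concrete, below is a minimal sketch of an SVRG-style update in which the full snapshot gradient is replaced by an average over a random subset of samples, one plausible reading of the "surrogate computation" above. This is a sketch under stated assumptions, not the paper's actual algorithm: the least-squares loss, the function name `cheap_svrg_sketch`, and parameters such as `snapshot_batch` are all illustrative.

```python
import numpy as np

def cheap_svrg_sketch(X, y, w0, step=0.1, epochs=10, inner_iters=100,
                      snapshot_batch=256, seed=None):
    """SVRG-style update with a subsampled ("surrogate") snapshot gradient.

    Assumes a least-squares loss f_i(w) = 0.5 * (x_i @ w - y_i)**2.
    All names and choices here are illustrative; the actual CheapSVRG
    surrogate may be constructed differently.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = w0.astype(float).copy()

    def grad(i, v):
        # Per-sample gradient of the least-squares loss.
        return (X[i] @ v - y[i]) * X[i]

    for _ in range(epochs):
        w_snap = w.copy()
        # Surrogate of the full gradient: average over a random subset
        # of samples (plain SVRG would average over all n samples).
        batch = rng.choice(n, size=min(snapshot_batch, n), replace=False)
        mu = np.mean([grad(i, w_snap) for i in batch], axis=0)

        for _ in range(inner_iters):
            i = rng.integers(n)
            # Variance-reduced gradient estimate:
            # g = grad_i(w) - grad_i(w_snap) + mu
            g = grad(i, w) - grad(i, w_snap) + mu
            w -= step * g
    return w
```

The key design point is the control variate `grad(i, w) - grad(i, w_snap) + mu`: it is an unbiased estimate of the gradient only when `mu` is the exact full gradient, so subsampling the snapshot trades a small bias (or extra variance) for a much cheaper outer loop, which is the complexity-versus-convergence balance described above.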