We show that the exponential convergence rate of stochastic gradient descent for smooth, strongly convex objectives can be markedly improved by perturbing the row selection rule toward sampling components with probability proportional to the Lipschitz constants of their gradients. That is,