BriefGPT.xyz
May 2016
A Multi-Batch L-BFGS Method for Machine Learning
Albert S. Berahas, Jorge Nocedal, Martin Takáč
TL;DR
This paper studies a new algorithm for parallel optimization that combines second-order information with batch methods. In the multi-batch setting, it achieves stable quasi-Newton updates, and the behavior and convergence properties of the algorithm are studied on a distributed computing platform.
Abstract
The question of how to parallelize the stochastic gradient descent (SGD) method has received much attention in the literature. In this paper, we focus instead on batch methods that use a sizeable fraction of the …