BriefGPT.xyz
Nov, 2018
The Promises and Pitfalls of Stochastic Gradient Langevin Dynamics
Nicolas Brosse, Alain Durmus, Eric Moulines
TL;DR
This paper studies a key MCMC algorithm for Bayesian learning from large-scale datasets. It shows that the widely used SGLD algorithm has shortcomings in this regime, but that the SGLD Fixed Point (SGLDFP) variant, obtained by introducing control variates, effectively mitigates them while incurring a lower computational cost than the Langevin Monte Carlo algorithm, offering useful guidance for such applications.
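To make the TL;DR concrete, here is a minimal sketch of the two updates it contrasts: plain SGLD, and the control-variate variant (SGLDFP) that recentres the stochastic gradient at a mode estimate. The toy Gaussian model, the step size, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy Bayesian model (assumption for illustration): x_i ~ N(theta, 1) with a
# flat prior, so the posterior is N(mean(data), 1/N) and its mode is the mean.
rng = np.random.default_rng(0)
N, n = 1000, 32                         # dataset size, minibatch size
data = rng.normal(1.0, 1.0, size=N)

def grad_i(theta, x):
    # Gradient of the i-th negative log-likelihood term 0.5*(x_i - theta)^2.
    return theta - x

def sgld_step(theta, gamma):
    # Plain SGLD: rescaled minibatch gradient plus injected Gaussian noise.
    batch = rng.choice(data, size=n, replace=False)
    g = (N / n) * grad_i(theta, batch).sum()
    return theta - gamma * g + np.sqrt(2 * gamma) * rng.normal()

theta_star = data.mean()                        # mode used as control-variate anchor
full_grad_star = grad_i(theta_star, data).sum() # full gradient at the mode (= 0 here)

def sgldfp_step(theta, gamma):
    # SGLDFP: the control variate subtracts the minibatch gradient at theta_star,
    # shrinking the variance of the gradient estimate when theta is near the mode.
    batch = rng.choice(data, size=n, replace=False)
    g = full_grad_star + (N / n) * (grad_i(theta, batch)
                                    - grad_i(theta_star, batch)).sum()
    return theta - gamma * g + np.sqrt(2 * gamma) * rng.normal()

gamma = 0.1 / N                         # constant step size (illustrative choice)
theta, samples = 0.0, []
for k in range(5000):
    theta = sgldfp_step(theta, gamma)
    if k >= 1000:                       # discard burn-in
        samples.append(theta)
print(np.mean(samples))                 # ≈ data.mean(), the posterior mean
```

Swapping `sgldfp_step` for `sgld_step` in the loop runs plain SGLD on the same problem; the only difference between the two is the control-variate correction in the gradient estimate.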
Abstract
Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC algorithm for Bayesian learning from large scale datasets. While SG…