Oct 2023
Resampling Stochastic Gradient Descent Cheaply for Efficient Uncertainty Quantification
Henry Lam, Zitong Wang
TL;DR
By running multiple stochastic gradient descents, each on a resampled dataset, along with an online variant, we construct confidence intervals for SGD solutions. Leveraging the recently proposed cheap bootstrap idea and Berry-Esseen-type bounds for SGD, we substantially reduce the required computation and bypass the intricate mixing conditions in existing batching methods.
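The cheap bootstrap idea referenced above needs only a handful of resampled runs: it compares the original-data SGD solution against B resample solutions and forms an interval using a t critical value with B degrees of freedom. The sketch below illustrates that construction on a toy least-squares problem; the `sgd` and `cheap_bootstrap_ci` functions, step sizes, and data are illustrative assumptions, not the authors' implementation, and the paper's online variant and theoretical guarantees are not reproduced here.

```python
# Minimal sketch of cheap-bootstrap confidence intervals for an SGD solution.
# Assumes a simple linear-regression loss; all hyperparameters are illustrative.
import numpy as np
from scipy import stats

def sgd(X, y, lr=0.01, epochs=5, seed=0):
    """Plain SGD for least squares: one pass over shuffled samples per epoch."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            grad = (X[i] @ theta - y[i]) * X[i]   # gradient of 0.5*(x'theta - y)^2
            theta -= lr * grad
    return theta

def cheap_bootstrap_ci(X, y, B=5, alpha=0.05):
    """Confidence intervals from only B resampled SGD runs.

    Cheap-bootstrap construction: theta_hat +/- t_{B,1-alpha/2} * s, where
    s^2 averages the squared deviations of the B resample estimates from
    theta_hat, coordinate-wise.
    """
    n = len(y)
    rng = np.random.default_rng(123)
    theta_hat = sgd(X, y)                          # SGD solution on original data
    resampled = []
    for b in range(B):
        idx = rng.integers(0, n, size=n)           # nonparametric resample with replacement
        resampled.append(sgd(X[idx], y[idx], seed=b + 1))
    resampled = np.array(resampled)
    s = np.sqrt(np.mean((resampled - theta_hat) ** 2, axis=0))
    t_crit = stats.t.ppf(1 - alpha / 2, df=B)      # only B degrees of freedom
    return theta_hat - t_crit * s, theta_hat + t_crit * s

# Example: recover a known coefficient vector and print the interval endpoints.
rng = np.random.default_rng(7)
X = rng.normal(size=(2000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.5, size=2000)
lo, hi = cheap_bootstrap_ci(X, y)
print("lower:", lo)
print("upper:", hi)
```

Because only B (here, five) extra SGD runs are needed, the added cost stays small, in contrast to standard bootstraps that require hundreds of resample runs.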
Abstract
Stochastic gradient descent (SGD) or stochastic approximation has been widely used in model training and stochastic optimization. While there is a huge literature on analyzing its convergence, inference on the ob…