BriefGPT.xyz
May, 2018
Loss-Calibrated Approximate Inference in Bayesian Neural Networks
Adam D. Cobb, Stephen J. Roberts, Yarin Gal
TL;DR
This paper proposes a new loss-calibrated evidence lower bound, grounded in Bayesian decision theory, for approximating the true posterior distribution of Bayesian neural networks. The bound depends on a utility function, ensuring that the approximation achieves higher utility in applications with asymmetric utility functions. The resulting method has the same objective as a standard dropout neural network, but with an additional utility-dependent penalty term.
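The penalty term described above can be illustrated with a small sketch (the function names and the specific utility values below are illustrative assumptions, not the authors' code): a standard dropout objective, a negative log-likelihood plus L2 weight decay, is augmented with the negative log expected utility of the decisions made under the model's predictive distribution.

```python
import numpy as np

def asymmetric_utility(y_true, decision):
    # Hypothetical asymmetric utility: missing a positive is far worse
    # than raising a false alarm (e.g. a medical screening setting).
    if y_true == 1 and decision == 0:
        return 0.05  # false negative: very low utility
    if y_true == 0 and decision == 1:
        return 0.60  # false positive: mildly penalised
    return 1.0       # correct decision: full utility

def expected_utility(p_pos, decision, utility):
    # E_y[u(y, h)] under the predictive distribution p(y = 1 | x) = p_pos.
    return p_pos * utility(1, decision) + (1.0 - p_pos) * utility(0, decision)

def loss_calibrated_objective(nll, weights, p_pos, decisions, utility,
                              weight_decay=1e-4):
    # Standard dropout objective: data-fit term plus L2 weight decay ...
    l2 = weight_decay * sum(np.sum(w ** 2) for w in weights)
    # ... plus the utility-dependent penalty: negative log expected utility
    # of the chosen decisions. The penalty is small when the decisions align
    # with high-utility outcomes under the model's predictive distribution.
    utility_penalty = -np.mean(np.log([
        expected_utility(p, h, utility) for p, h in zip(p_pos, decisions)
    ]))
    return nll + l2 + utility_penalty
```

With the asymmetric utility above, a decision that ignores the costly false-negative case incurs a larger penalty than one that accounts for it, which is the calibration effect the bound is designed to produce.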
Abstract
Current approaches in approximate inference for Bayesian neural networks minimise the Kullback-Leibler divergence to approximate the true posterior over the weights. However, this approximation is without knowledge […]
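The KL minimisation referred to in the abstract is the standard variational inference setup; writing ω for the network weights and q(ω) for the approximating distribution (symbols assumed here, as the excerpt does not fix notation), it is roughly:

```latex
\min_{q(\omega)} \, \mathrm{KL}\!\left( q(\omega) \,\|\, p(\omega \mid \mathcal{D}) \right)
\;\Longleftrightarrow\;
\max_{q(\omega)} \; \mathbb{E}_{q(\omega)}\!\left[ \log p(\mathcal{D} \mid \omega) \right]
- \mathrm{KL}\!\left( q(\omega) \,\|\, p(\omega) \right)
```

The right-hand side is the usual evidence lower bound; the paper's contribution is to modify it with a utility-dependent term rather than optimise it task-agnostically.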