BriefGPT.xyz
Jul, 2021
Batch Inverse-Variance Weighting: Deep Heteroscedastic Regression
Vincent Mai, Waleed Khamies, Liam Paull
TL;DR
This paper adapts inverse-variance weighted mean squared error to the optimization of neural network parameters, proposing Batch Inverse-Variance (BIV) as a loss function that also keeps the effective learning rate under control, and shows that BIV significantly outperforms the L2 loss, plain inverse-variance weighting, and a filtering-based baseline on two noisy datasets.
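The weighting scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each squared error is weighted by the inverse of its label's noise variance (with a small constant `eps`, an assumed hyperparameter, bounding the weights), and the weights are normalized over the mini-batch so the loss scale, and hence the effective learning rate, stays stable.

```python
import numpy as np

def biv_loss(preds, targets, label_vars, eps=1e-2):
    """Batch Inverse-Variance (BIV) weighted MSE (illustrative sketch).

    preds, targets: arrays of predictions and noisy labels.
    label_vars: per-label noise variances (assumed known).
    eps: assumed regularizer bounding the largest weight.
    """
    weights = 1.0 / (label_vars + eps)
    weights = weights / weights.sum()  # batch normalization of weights
    return float(np.sum(weights * (preds - targets) ** 2))

preds = np.array([1.0, 2.0, 3.0])
targets = np.array([1.5, 2.0, 2.5])
equal_vars = np.ones(3)
```

Note that when all label variances are equal, the normalized weights are uniform and BIV reduces to the ordinary mean squared error.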
Abstract
Heteroscedastic regression is the task of supervised learning where each label is subject to noise from a different distribution. This noise can be caused by the …
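The setting defined in the abstract can be made concrete with a small synthetic example (my own illustration, not from the paper): each label is corrupted by Gaussian noise whose variance differs per sample.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=5)

# Heteroscedastic setting: every label has its *own* noise variance,
# e.g. reflecting a labelling process of varying quality.
label_vars = rng.uniform(0.01, 1.0, size=5)
y = x ** 2 + rng.normal(0.0, np.sqrt(label_vars))
```

In ordinary (homoscedastic) regression, `label_vars` would be a single shared constant; here it varies per sample, which is exactly what BIV exploits.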