Oct 2022
Stochastic Constrained DRO with a Complexity Independent of Sample Size
Qi Qi, Jiameng Lyu, Kung-Sik Chan, Er-Wei Bai, Tianbao Yang
TL;DR
This paper proposes and analyzes stochastic algorithms for solving Kullback-Leibler (KL) divergence constrained distributionally robust optimization (DRO). The method applies to both convex and non-convex loss functions and achieves a competitive, more practical iteration complexity with constant batch sizes.
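To make the problem setting concrete, the sketch below evaluates the standard KL-constrained DRO objective via its well-known Lagrangian dual. This is an illustrative formulation only, not the paper's proposed stochastic algorithm; the grid search over the dual variable, the function name `kl_dro_objective`, and the sample losses are all assumptions for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): for per-sample losses
# l_1..l_n, the KL-constrained DRO objective
#     max_{p in simplex, KL(p || uniform) <= rho}  sum_i p_i * l_i
# has the standard dual form
#     min_{lam > 0}  lam * rho + lam * log( (1/n) * sum_i exp(l_i / lam) ),
# which we evaluate here by a crude 1-D grid search over lam.

def kl_dro_objective(losses, rho):
    losses = np.asarray(losses, dtype=float)

    def dual(lam):
        z = losses / lam
        m = z.max()                       # shift for a stable log-mean-exp
        lse = m + np.log(np.mean(np.exp(z - m)))
        return lam * rho + lam * lse

    lams = np.logspace(-4, 4, 400)        # grid over the dual variable
    return min(dual(lam) for lam in lams)

losses = [0.2, 0.5, 1.5, 0.1]             # hypothetical per-sample losses
mean = float(np.mean(losses))
robust = kl_dro_objective(losses, rho=0.1)
# The robust value upper-bounds the empirical mean loss and (up to grid
# error) cannot exceed the worst single-sample loss.
print(mean, "<=", robust)
```

As `rho` grows, the worst-case distribution concentrates on the largest losses and the robust objective moves from the empirical mean toward the maximum loss.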
Abstract
Distributionally robust optimization (DRO), as a popular method to train robust models against distribution shift between training and test sets, has received tremendous attention in recent years. In this paper, we propose and analyze …