BriefGPT.xyz
May, 2018
Learning with Non-Convex Truncated Losses by SGD
Yi Xu, Shenghuo Zhu, Sen Yang, Chi Zhang, Rong Jin...
TL;DR
This paper studies a class of objective functions formed by truncating conventional loss functions and investigates non-convex learning with truncated losses. It proves excess risk bounds and statistical error guarantees, establishes the statistical error of the approximate minima found by stochastic gradient descent, and demonstrates the effectiveness of the approach through experiments.
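To make the idea concrete, here is a minimal sketch of SGD on a truncated loss. The specific loss form (squared error capped at a level tau) and all hyperparameters are illustrative assumptions, not the paper's exact formulation; the point is only that truncation flattens the loss on large residuals, so gross outliers stop contributing gradients.

```python
import numpy as np

def truncated_sq_grad(w, x, y, tau):
    """(Sub)gradient of min((x@w - y)**2, tau): zero in the truncated
    (flat) region, the usual squared-loss gradient otherwise."""
    r = x @ w - y
    if r ** 2 >= tau:          # residual too large: loss is capped, gradient is 0
        return np.zeros_like(w)
    return 2.0 * r * x

def sgd_truncated(X, y, tau=1.0, lr=0.01, epochs=50, seed=0):
    """Plain SGD over the truncated squared loss (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            w -= lr * truncated_sq_grad(w, X[i], y[i], tau)
    return w

# Usage: linear data with a few gross outliers that a convex loss would chase.
rng = np.random.default_rng(1)
w_star = np.array([1.0, -2.0])
X = rng.normal(size=(200, 2))
y = X @ w_star + 0.1 * rng.normal(size=200)
y[:5] += 50.0                  # corrupted labels; truncation ignores them
w_hat = sgd_truncated(X, y, tau=1.0)
```

Because the capped loss is flat on the corrupted points, their gradient is zero and the iterates stay close to the clean-data solution; the price is that the overall objective becomes non-convex, which is exactly the regime the paper analyzes.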
Abstract
Learning with a convex loss function has been a dominating paradigm for many years. It remains an interesting question how non-convex loss functions help improve the generalization of learning with broad ap…