Apr, 2019
Stability and Optimization Error of Stochastic Gradient Descent for Pairwise Learning
Wei Shen, Zhenhuan Yang, Yiming Ying, Xiaoming Yuan
TL;DR
This work studies the stability of stochastic gradient descent (SGD) and its trade-off with optimization error in pairwise learning. It establishes stability results for convex, strongly convex, and non-convex pairwise learning, derives generalization bounds from them, and obtains bounds on the optimization error and excess risk of SGD.
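For intuition only, here is a minimal sketch (not taken from the paper) of SGD for a pairwise objective: at each step a pair of examples is sampled and a (sub)gradient step is taken on a pairwise loss. The hinge-type loss, linear model, sampling scheme, and step-size schedule below are illustrative assumptions; the paper analyzes general convex, strongly convex, and non-convex pairwise losses.

```python
import numpy as np

def pairwise_hinge_grad(w, xi, yi, xj, yj):
    """Subgradient of an illustrative hinge-type pairwise loss
    ell(w; (xi, yi), (xj, yj)) = max(0, 1 - 0.5 * (yi - yj) * w @ (xi - xj))."""
    diff = xi - xj
    margin = 0.5 * (yi - yj) * (w @ diff)
    if 1.0 - margin > 0.0:
        return -0.5 * (yi - yj) * diff
    return np.zeros_like(w)

def sgd_pairwise(X, y, steps=1000, eta=0.01, seed=0):
    """SGD for pairwise learning: sample a pair (i, j) per step and update
    the linear model w with a decaying step size (a common, assumed choice)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, steps + 1):
        i, j = rng.choice(n, size=2, replace=False)
        g = pairwise_hinge_grad(w, X[i], y[i], X[j], y[j])
        w -= (eta / np.sqrt(t)) * g
    return w
```

The stability question analyzed in the paper asks how much the learned w changes when a single training example is replaced, and how that sensitivity trades off against the optimization error of running SGD for a given number of steps.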
Abstract
In this paper we study the stability and its trade-off with optimization error for stochastic gradient descent (SGD) algorithms in the pairwise learning setting.