June 2014
Simultaneous Model Selection and Optimization through Parameter-free Stochastic Learning
Francesco Orabona
TL;DR
This paper proposes a kernel-based stochastic gradient descent algorithm that performs model selection during training, with no cross-validation or parameter tuning of any kind. Using online learning theory, it derives data-dependent estimates of the regularization, and it proves optimal convergence rates under standard smoothness assumptions.
Abstract
Stochastic gradient descent algorithms for training linear and kernel predictors are gaining more and more importance, thanks to their scalability. While various methods have been proposed to speed up their convergence…
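
To make the TL;DR concrete, below is a minimal, illustrative Python sketch of a parameter-free kernel SGD pass in the spirit of the paper's PiSTOL-style update: the predictor's scale (b/α)·exp(‖θ‖²/(2α)) grows with the accumulated gradients, so no learning rate or regularization constant needs tuning. The specific constants (a, b, L), the hinge loss, the RBF kernel, and all function names here are assumptions chosen for illustration, not the paper's exact algorithm.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d = (np.sum(X1**2, axis=1)[:, None]
         + np.sum(X2**2, axis=1)[None, :]
         - 2.0 * X1 @ X2.T)
    return np.exp(-gamma * d)

def parameter_free_kernel_sgd(X, y, gamma=1.0, a=1.0, b=1.0, L=1.0):
    """One pass of a PiSTOL-style parameter-free kernel SGD (toy sketch).

    The predictor is kept in dual form f(x) = scale * sum_i c_i k(x_i, x);
    the scale (b/alpha) * exp(||theta||^2 / (2*alpha)) adapts to the data,
    so neither a step size nor a regularizer is tuned. Constants a, b, L
    are illustrative defaults, not the theoretically prescribed values.
    """
    n = X.shape[0]
    coeffs = np.zeros(n)   # dual coefficients of theta
    sq_norm = 0.0          # ||theta||_H^2, maintained incrementally
    alpha = a * L
    for t in range(n):
        k_t = rbf_kernel(X[:t + 1], X[t:t + 1], gamma).ravel()
        theta_xt = coeffs[:t] @ k_t[:t]             # <theta, k(x_t, .)>
        scale = (b / alpha) * np.exp(sq_norm / (2.0 * alpha))
        margin = y[t] * scale * theta_xt            # labels y in {-1, +1}
        s = -y[t] if margin < 1.0 else 0.0          # hinge-loss subgradient
        # theta <- theta - s * k(x_t, .): update norm, then dual coefficient
        sq_norm += -2.0 * s * theta_xt + s * s * k_t[t]
        coeffs[t] = -s
        alpha += a * abs(s) * np.sqrt(k_t[t])       # data-dependent scaling
    final_scale = (b / alpha) * np.exp(sq_norm / (2.0 * alpha))
    return coeffs * final_scale
```

Given the returned coefficients `c`, a new point `x` would be classified as `np.sign(rbf_kernel(X, x[None, :]).ravel() @ c)`; the one-pass, no-tuning training loop is what "simultaneous model selection and optimization" buys here.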