BriefGPT.xyz
Mar, 2013
Margins, Shrinkage, and Boosting
Matus Telgarsky
TL;DR
This paper shows that AdaBoost and its immediate variants can produce approximate maximum margin classifiers simply by scaling their step size choices by a fixed small constant, and it also provides guarantees, as well as improved guarantees, for the optimization procedure of gradient boosting. These results hold for the exponential loss and similar losses, most notably the logistic loss.
Abstract
This manuscript shows that AdaBoost and its immediate variants can produce approximate maximum margin classifiers simply by scaling step size choices with a fixed small constant. In this way, when the unscaled st…
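The central idea above, multiplying each boosting step by a fixed small constant (the "shrinkage" factor), can be sketched in code. The following is a minimal illustrative AdaBoost over decision stumps, not the manuscript's own algorithm or analysis: the stump learner, the parameter names `nu` and `T`, and the helper functions are all assumptions made for the example.

```python
import numpy as np

def adaboost_shrunk(X, y, T=200, nu=0.1):
    """AdaBoost over axis-aligned decision stumps, with each
    exponential-loss line-search step scaled by a fixed small
    constant nu (the shrinkage factor). Illustrative sketch only;
    y must take values in {-1, +1}.
    """
    n, d = X.shape
    w = np.ones(n) / n                     # example weights
    ensemble = []                          # (alpha, feature, threshold, sign)
    for _ in range(T):
        best = None
        # exhaustive search over stumps h(x) = s * sign(x[f] > thr)
        for f in range(d):
            for thr in np.unique(X[:, f]):
                for s in (+1, -1):
                    pred = s * np.where(X[:, f] > thr, 1, -1)
                    err = w @ (pred != y)
                    if best is None or err < best[0]:
                        best = (err, f, thr, s, pred)
        err, f, thr, s, pred = best
        err = np.clip(err, 1e-12, 1 - 1e-12)
        # unscaled step = exact line search for the exponential loss;
        # shrinkage multiplies it by the fixed constant nu
        alpha = nu * 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, f, thr, s))
        w = w * np.exp(-alpha * y * pred)  # exponential-loss reweighting
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    agg = np.zeros(len(X))
    for alpha, f, thr, s in ensemble:
        agg += alpha * s * np.where(X[:, f] > thr, 1, -1)
    return np.sign(agg)

def margins(ensemble, X, y):
    """Normalized margins y * F(x) / sum|alpha|; the paper's claim is
    that shrinkage drives the minimum of these toward the maximum
    achievable margin."""
    agg = np.zeros(len(X))
    total = sum(abs(a) for a, *_ in ensemble)
    for alpha, f, thr, s in ensemble:
        agg += alpha * s * np.where(X[:, f] > thr, 1, -1)
    return y * agg / max(total, 1e-12)
```

On a linearly separable toy set, decreasing `nu` trades more boosting rounds for a larger minimum normalized margin, which is the intuition the guarantees formalize.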