BriefGPT.xyz
Aug, 2024
The Many Faces of Optimal Weak-to-Strong Learning
Mikael Møller Høgsgaard, Kasper Green Larsen, Markus Engelund Mathiasen
TL;DR
This paper closes a gap in sample-complexity optimality by proposing a novel and remarkably simple Boosting algorithm that is proven to have optimal sample complexity. The algorithm splits the training data into five equal parts, runs AdaBoost on each part separately, and combines the results by majority vote; preliminary experiments suggest it may outperform previous algorithms on large datasets.
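The summarized procedure (split the data into five equal parts, run AdaBoost on each, combine by majority vote) can be sketched in plain Python. This is an illustrative sketch, not the authors' implementation: the decision-stump weak learner, the toy 1-D data, and all function names (`train_stump`, `adaboost`, `majority_of_adaboosts`) are assumptions made for demonstration.

```python
import math

def train_stump(xs, ys, weights):
    """Weak learner: best threshold stump on 1-D data with +/-1 labels."""
    best = None  # (weighted error, threshold, sign)
    for thr in set(xs):
        for sign in (1, -1):
            err = sum(w for x, y, w in zip(xs, ys, weights)
                      if (sign if x >= thr else -sign) != y)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    err, thr, sign = best
    return err, (lambda x, t=thr, s=sign: s if x >= t else -s)

def adaboost(xs, ys, rounds=10):
    """Standard AdaBoost over threshold stumps; returns a voting classifier."""
    n = len(xs)
    weights = [1.0 / n] * n
    hypotheses = []
    for _ in range(rounds):
        err, h = train_stump(xs, ys, weights)
        err = max(err, 1e-10)          # avoid division by zero on perfect fits
        if err >= 0.5:                 # weak learner failed; stop boosting
            break
        alpha = 0.5 * math.log((1 - err) / err)
        hypotheses.append((alpha, h))
        # Reweight: misclassified points gain weight, then renormalize.
        weights = [w * math.exp(-alpha * y * h(x))
                   for w, x, y in zip(weights, xs, ys)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return lambda x: 1 if sum(a * h(x) for a, h in hypotheses) >= 0 else -1

def majority_of_adaboosts(xs, ys, k=5, rounds=10):
    """Split data into k equal parts, run AdaBoost on each, majority-vote."""
    n = len(xs)
    voters = [adaboost(xs[i * n // k:(i + 1) * n // k],
                       ys[i * n // k:(i + 1) * n // k], rounds)
              for i in range(k)]
    return lambda x: 1 if sum(v(x) for v in voters) >= 0 else -1

# Toy 1-D data: label +1 iff x >= 0.5, ordered so each fifth spans the range.
xs = [b + 0.01 * i for i in range(5) for b in (0.0, 0.2, 0.4, 0.6, 0.8)]
ys = [1 if x >= 0.5 else -1 for x in xs]
predict = majority_of_adaboosts(xs, ys, k=5)
print(predict(0.9), predict(0.1))  # 1 -1
```

The intuition behind the construction is that each of the five voters is trained on an independent slice of the sample, so the majority vote averages away the failure modes of any single AdaBoost run.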
Abstract
Boosting is an extremely successful idea, allowing one to combine multiple low accuracy classifiers into a much more accurate voting classifier. In this work, we present a new and surprisingly simple Boosting algorithm […]