TL;DR: The paper proposes Noisy Concurrent Training (NCT), in which two CNN models learn collaboratively from each other so that, under label noise, neither memorizes random labels, improving generalization. It additionally regularizes training with a target variability technique. The method's effectiveness is demonstrated on both synthetic and real-world noisy datasets.
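As a rough illustration (not the paper's implementation), the target variability idea can be sketched as randomly reassigning a fraction of the one-hot training targets each epoch, so neither network can fit any fixed labeling exactly. The function name and flip rate below are placeholders, not taken from the paper:

```python
import numpy as np

def perturb_targets(labels, num_classes, flip_rate, rng):
    """Sketch of target variability: randomly reassign a fraction
    of integer class labels to a different class each epoch."""
    labels = labels.copy()
    n = len(labels)
    # Pick a random subset of samples whose targets will be perturbed.
    flip_idx = rng.choice(n, size=int(flip_rate * n), replace=False)
    for i in flip_idx:
        # Reassign to a uniformly random *other* class.
        choices = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(choices)
    return labels

rng = np.random.default_rng(0)
labels = np.zeros(100, dtype=int)  # toy dataset: 100 samples, all class 0
noisy = perturb_targets(labels, num_classes=10, flip_rate=0.3, rng=rng)
print((noisy != labels).sum())  # 30 targets were reassigned
```

In the actual method this perturbation would be applied on top of the mutual-learning setup between the two networks; the sketch only shows the label-perturbation step in isolation.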
Abstract
Deep neural networks (DNNs) fail to learn effectively under label noise and have been shown to memorize random labels, which affects their generalization performance. We consider learning in isolation, using one-ho