BriefGPT.xyz
November 2019
Knowledge Distillation for Incremental Learning in Semantic Segmentation
Umberto Michieli, Pietro Zanuttigh
TL;DR
This paper focuses on applying knowledge distillation techniques to the problem of incremental learning in semantic segmentation. Experiments on the Pascal VOC2012 and MSRC-v2 datasets show that the method is clearly effective across several incremental learning scenarios.
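To make the idea concrete, below is a minimal PyTorch sketch of a distillation-style objective of the kind commonly used for incremental semantic segmentation: a frozen copy of the previously trained model supervises the responses on the old classes, while standard cross-entropy handles the newly added ones. The function name, tensor shapes, and hyperparameters (lambda_kd, T) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative only: a generic distillation loss for incremental semantic
# segmentation, not the exact objective proposed in the paper.
import torch
import torch.nn.functional as F

def incremental_distillation_loss(new_logits, old_logits, labels,
                                  num_old_classes, lambda_kd=1.0, T=1.0):
    """new_logits: (B, C_new, H, W) from the model currently being trained.
    old_logits:    (B, C_old, H, W) from the frozen previous-step model.
    labels:        (B, H, W) ground truth containing the newly added classes."""
    # Supervised term: standard cross-entropy on the current annotations.
    ce = F.cross_entropy(new_logits, labels, ignore_index=255)

    # Distillation term: keep the new model's predictions on the old
    # classes close to the frozen teacher's soft targets at temperature T.
    student_old = F.log_softmax(new_logits[:, :num_old_classes] / T, dim=1)
    teacher_old = F.softmax(old_logits / T, dim=1)
    kd = F.kl_div(student_old, teacher_old, reduction="batchmean") * (T * T)

    return ce + lambda_kd * kd

if __name__ == "__main__":
    # Toy shapes: 16 old classes, 21 total classes after the increment.
    B, C_old, C_new, H, W = 2, 16, 21, 32, 32
    new_logits = torch.randn(B, C_new, H, W)
    old_logits = torch.randn(B, C_old, H, W)
    labels = torch.randint(0, C_new, (B, H, W))
    print(incremental_distillation_loss(new_logits, old_logits, labels, C_old))
```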
Abstract
Although deep learning architectures have shown remarkable results in scene understanding problems, they exhibit a critical drop of overall performance due to catastrophic forgetting when they are required to incrementally learn new tasks. […]