Despite their effectiveness in a wide range of tasks, deep architectures suffer from some important limitations. In particular, they are vulnerable to catastrophic forgetting, i.e. they perform poorly when they are required to learn new classes incrementally without access to the original training data.
This paper proposes a new method, Inherit with Distillation and Evolve with Contrast (IDEC), which tackles catastrophic forgetting and semantic drift in class-incremental semantic segmentation (CISS) through a Dense Knowledge Distillation on all Aspects (DADA) module and an Asymmetric Region-wise Contrastive Learning (ARCL) module, and demonstrates superior performance across multiple CISS tasks.
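To make the distillation side of this concrete, the sketch below shows a generic dense, feature-level distillation loss between a frozen old model and the current model, a common ingredient for mitigating forgetting in incremental segmentation. The function name, the choice of layers, and the channel-wise normalization are illustrative assumptions, not the paper's exact DADA formulation.

```python
# Minimal sketch (assumed, not the paper's DADA): pixel-wise feature
# distillation between a frozen old model and the current model.
import torch
import torch.nn.functional as F

def dense_distillation_loss(old_feats, new_feats):
    """Pixel-wise L2 distillation over intermediate feature maps.

    old_feats / new_feats: lists of tensors of shape (B, C, H, W) taken from
    corresponding layers of the frozen old model and the current model.
    """
    loss = 0.0
    for f_old, f_new in zip(old_feats, new_feats):
        # Normalize along the channel dimension so the loss compares
        # feature directions rather than raw magnitudes.
        f_old = F.normalize(f_old.detach(), dim=1)
        f_new = F.normalize(f_new, dim=1)
        loss = loss + F.mse_loss(f_new, f_old)
    return loss / len(old_feats)
```

In practice such a term is added to the segmentation loss of the new step with a weighting coefficient, so the current model inherits dense representations from the old model while still adapting to the newly introduced classes.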