Abstract
We introduce a novel and general loss function, called Symmetric Contrastive (Sy-CON) loss, for effective
continual self-supervised learning (CSSL). We first argue that the conventional loss form of continual learning, which consists of a single task-specific loss (for