Jan, 2024
Beyond Anti-Forgetting: Multimodal Continual Instruction Tuning with Positive Forward Transfer
TL;DR: Multimodal Continual Instruction Tuning (MCIT) enables Multimodal Large Language Models (MLLMs) to meet continuously emerging requirements without expensive retraining; the proposed Fwd-Prompt method addresses both catastrophic forgetting and negative forward transfer.