BriefGPT.xyz
Mar, 2025
Injecting Imbalance Sensitivity for Multi-Task Learning
Zhipeng Zhou, Liu Liu, Peilin Zhao, Wei Gong
TL;DR
This work addresses the task-imbalance problem in multi-task learning and improves existing baseline methods by imposing constraints on them. The key innovation is the Imbalance-sensitive Gradient descent (IMGrad) method. Experiments show strong performance on several mainstream multi-task learning benchmarks, indicating significant practical value.
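The TL;DR refers to imbalance-sensitive gradient updates for multi-task learning. As a hedged illustration only (this is a generic gradient-balancing heuristic, not the paper's IMGrad algorithm; the function name and weighting rule are assumptions for the sketch), one common way to counter task imbalance is to reweight per-task gradients by the inverse of their magnitudes before combining them:

```python
import numpy as np

def balanced_update(grads, eps=1e-8):
    """Combine per-task gradients so no single task dominates.

    Illustrative heuristic only (not the paper's IMGrad): each task's
    gradient is weighted by the inverse of its L2 norm, then the
    weights are normalized to sum to 1.
    """
    norms = np.array([np.linalg.norm(g) + eps for g in grads])
    weights = 1.0 / norms
    weights /= weights.sum()  # normalize weights to sum to 1
    return sum(w * g for w, g in zip(weights, grads))

# Toy example: two tasks with very different gradient magnitudes.
g_small = np.array([0.1, 0.0])   # weak task gradient
g_large = np.array([0.0, 10.0])  # dominant task gradient
update = balanced_update([g_small, g_large])
```

In this toy case the combined update contributes equally along both tasks' directions, whereas a plain sum would be dominated by the larger gradient.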
Abstract
Multi-task Learning (MTL) has emerged as a promising approach for deploying Deep Learning models in real-life applications. Recent studies have proposed optimization-based learning paradigms to establish task-sha