BriefGPT.xyz
Jun, 2021
Neural Supervised Domain Adaptation by Augmenting Pre-trained Models with Random Units
Sara Meftah, Nasredine Semmar, Youssef Tamaazousti, Hassane Essafi, Fatiha Sadat
TL;DR
This paper addresses a limitation of neural transfer learning with standard fine-tuning: its restricted ability to learn patterns specific to the target domain. The proposed solution augments pre-trained models with normalized, weighted, and randomly initialized units so they adapt better to the target domain. Experiments show significant improvements on four natural language processing tasks.
Abstract
Neural transfer learning (TL) is becoming ubiquitous in natural language processing (NLP), thanks to its high performance on many tasks, especially in low-resourced scenarios. Notably, TL is widely used for neural …