Dec, 2022
MicroBERT: Effective Training of Low-resource Monolingual BERTs through Parameter Reduction and Multitask Learning
Luke Gessler, Amir Zeldes
TL;DR
This study investigates two techniques for training monolingual TLMs in a low-resource setting; the results show that MicroBERT models achieve significant improvements on downstream task evaluations.
Abstract
Transformer language models (TLMs) are critical for most NLP tasks, but they are difficult to create for low-resource languages because of how much pretraining data they require. In this work, we investigate two techniques for training monolingual TLMs in a low-resource setting: parameter reduction and multitask learning.
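
The first technique, parameter reduction, amounts to pretraining a BERT far smaller than bert-base so that a small corpus can support it. Below is a minimal sketch of that idea using the Hugging Face transformers library; the library choice and every hyperparameter value here are illustrative assumptions, not the paper's reported configuration.

# A hypothetical reduced-size BERT for low-resource pretraining.
# All sizes below are assumptions for illustration, not the paper's settings.
from transformers import BertConfig, BertForMaskedLM

small_config = BertConfig(
    vocab_size=8000,           # assumed small tokenizer for a low-resource language
    hidden_size=128,           # vs. 768 in bert-base
    num_hidden_layers=3,       # vs. 12 in bert-base
    num_attention_heads=4,     # vs. 12 in bert-base
    intermediate_size=512,     # vs. 3072 in bert-base
    max_position_embeddings=512,
)

model = BertForMaskedLM(small_config)
# Orders of magnitude fewer parameters than bert-base's ~110M
print(f"parameters: {model.num_parameters():,}")

A model this size can then be pretrained with the standard masked language modeling objective on the available monolingual corpus; the paper's second technique additionally combines that objective with supervised multitask learning.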