March 2022
PERT: Pre-training BERT with Permuted Language Model
Yiming Cui, Ziqing Yang, Ting Liu
TL;DR
This paper proposes PERT, a new pre-trained language model. It is an autoencoding model trained with the permuted language model (PerLM) objective, and it additionally applies whole word masking and N-gram masking to improve performance. Experimental results show that PERT outperforms comparable models on some tasks.
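Since the TL;DR describes the permuted language model objective only at a high level, a minimal sketch may help make it concrete. The snippet below shuffles a span of tokens and labels each shuffled slot with the source position of the token now occupying it. The function name permute_for_perlm, the single-contiguous-span selection, and the permute_prob default are illustrative assumptions, not the paper's actual implementation, which selects permutation targets via whole word masking and N-gram masking.

```python
import random

def permute_for_perlm(tokens, permute_prob=0.15):
    """Build one PerLM-style training example (illustrative sketch).

    A contiguous span of tokens is shuffled in place; each shuffled
    slot is labelled with the position its token originally came from,
    so a model can be trained to recover the original order.
    `permute_prob` is an assumed default, not the paper's value.
    """
    tokens = list(tokens)
    n = len(tokens)
    labels = [-100] * n  # -100 = ignore in the loss (PyTorch convention)

    # Choose a contiguous span to permute. PERT itself picks targets
    # with whole word masking and N-gram masking, which is more
    # involved than this single-span simplification.
    span_len = min(max(2, int(n * permute_prob)), n)
    start = random.randrange(0, n - span_len + 1)
    slots = list(range(start, start + span_len))

    # Shuffle the span and record, for every slot, where its new token
    # originally lived; predicting that source position is the objective.
    sources = slots[:]
    random.shuffle(sources)
    snapshot = tokens[:]
    for dst, src in zip(slots, sources):
        tokens[dst] = snapshot[src]
        labels[dst] = src

    return tokens, labels

# Tiny demonstration on whitespace tokens (a real setup would use
# subword IDs from a BERT-style tokenizer).
random.seed(0)
toks = "the quick brown fox jumps over the lazy dog".split()
permuted, labels = permute_for_perlm(toks, permute_prob=0.4)
print(permuted)
print(labels)
```

A position-prediction head over the input sequence can then be trained against these labels; because the objective recovers word order rather than masked tokens, no artificial mask token needs to be introduced into the input.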
Abstract
Pre-trained language models (PLMs) have been widely used in various natural language processing (NLP) tasks, owing to their powerful text representations trained on large-scale corpora. In this paper, we propose …